Abdullah-Al Nahid, Yinan Kong.
Abstract
Breast cancer is one of the leading causes of death among women worldwide. Advanced engineering of natural image classification techniques and Artificial Intelligence methods have largely been used for the breast-image classification task. The involvement of digital image classification provides doctors and physicians with a second opinion and saves their time. Despite the various publications on breast image classification, very few review papers are available which provide a detailed description of breast cancer image classification techniques, feature extraction and selection procedures, classification measuring parameterizations, and image classification findings. We have put a special emphasis on the Convolutional Neural Network (CNN) method for breast image classification. Along with the CNN method we have also described the involvement of the conventional Neural Network (NN), logic-based classifiers such as the Random Forest (RF) algorithm, Support Vector Machines (SVM), Bayesian methods, and a few of the semisupervised and unsupervised methods which have been used for breast image classification.
Year: 2017 PMID: 29463985 PMCID: PMC5804413 DOI: 10.1155/2017/3781951
Source DB: PubMed Journal: Comput Math Methods Med ISSN: 1748-670X Impact factor: 2.238
Figure 1. Number of new cancer cases in Australia from 2007 to 2018 [5].
Figure 2. Number of cancer deaths in Australia from 2007 to 2018 [5].
Figure 3. Anatomy of the female breast (for the National Cancer Institute 2011; Terese Winslow, US Government, has certain rights).
Figure 4. (a, b) show benign and malignant mammogram images (examples of noninvasive imaging) and (c, d) show benign and malignant histopathological images (examples of invasive imaging).
Figure 5. A very basic breast image classification model.
Available breast image databases for biomedical investigation.
| Database | Number of images | Database size (GB) | Image capture technique | Image type | Total patients |
|---|---|---|---|---|---|
| MIAS | 322 | 2.3 | Mammogram | | 161 |
| DDSM | | | Mammogram | | 2620 |
| CBIS-DDSM | 4067 | 70.5 | MG | DICOM | 237 |
| ISPY1 | 386,528 | 76.2 | MR, SEG | | 237 |
| Breast-MRI-NACT-Pilot | 99,058 | 19.5 | MRI | | 64 |
| QIN-Breast | 100,835 | 11.286 | PET/CT, MR | DICOM | 67 |
| Mouse-Mammary | 23,487 | 8.6 | MRI | DICOM | 32 |
| TCGA-BRCA | 230,167 | 88.1 | MR, MG | DICOM | 139 |
| QIN Breast DCE-MRI | 76,328 | 15.8 | CT | DICOM | 10 |
| BREAST-DIAGNOSIS | 105,050 | 60.8 | MRI/PET/CT | DICOM | 88 |
| RIDER Breast MRI | 1500 | 0.401 | MR | DICOM | 5 |
| BCDR | | | Mammogram | | 1734 |
| TCGA-BRCA | | 53.92 (TB) | Histopathology | | 1098 |
| BreakHis | 7909 | | Histopathology | | 82 |
| Inbreast | 419 | | Mammogram | | 115 |
Figure 6. Number of papers published based on the MIAS and DDSM databases.
Figure 7. Classification of features for breast image classification.
Feature descriptor.
| Feature category | Feature description |
|---|---|
| Texture | Haralick texture features [ |
| (1) Angular Second Moment (ASM), (2) Contrast, (3) Correlation, (4) Sum of Squares of Variances (SSoV), (5) Inverse of Difference (IoD), (6) Sum of Average (SoA), (7) Sum of Variances (SoV), (8) Sum of Entropy (SoE), (9) Entropy, (10) Difference of Variance (DoV), (11) Difference of Entropy (DoE), (12) Gray-Level Co-occurrence Matrix (GLCM). | |
| Tamura features [ | |
| (1) Coarseness, (2) Contrast, (3) Directionality, (4) Line-likeness, (5) Roughness, (6) Regularity. | |
| Global texture descriptor | |
| (1) Fractal dimension (FD), (2) Coarseness, (3) Entropy, (4) Spatial Gray-Level Statistics (SGLS), (5) Circular Moran Autocorrelation Function (CMAF). | |
|
| |
| Detector | Single scale detector |
| (1) Moravec's Detector (MD) [ | |
| Multiscale detector [ | |
| (1) Laplacian of Gaussian (LoG) [ | |
|
| |
| Structural | (1) Area, (2) bounding box, (3) centroid, (4) Convex Hull (CH), (5) eccentricity, (6) Convex Image (CI), (7) compactness, (8) Aspect Ratio (AR), (9) moments, (10) extent, (11) extrema, (12) Major Axis Length (MaAL), (13) Minor Axis Length (MiAL), (14) Maximum Intensity (MaI), (15) Minimum Intensity (MiI), (16) Mean Intensity (MI), (17) orientation, (18) solidity. |
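The Haralick texture features listed above are all derived from the Gray-Level Co-occurrence Matrix (GLCM), which counts how often pairs of gray levels occur at a fixed pixel offset. A minimal numpy sketch (illustrative only, a single offset; not any surveyed author's implementation):

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Gray-Level Co-occurrence Matrix for a single pixel offset (dx, dy),
    normalized to a joint probability table."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for yy in range(h - dy):
        for xx in range(w - dx):
            P[img[yy, xx], img[yy + dy, xx + dx]] += 1
    return P / P.sum()

def haralick_stats(P):
    """Three of the Haralick statistics from the table."""
    i, j = np.indices(P.shape)
    nz = P[P > 0]
    return {
        "ASM": (P ** 2).sum(),                 # (1) Angular Second Moment
        "contrast": ((i - j) ** 2 * P).sum(),  # (2) Contrast
        "entropy": -(nz * np.log2(nz)).sum(),  # (9) Entropy
    }

# A tiny 4-level image patch.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, levels=4)
stats = haralick_stats(P)
```

In practice several offsets and angles are accumulated and the GLCM is made symmetric before the statistics are computed.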
Feature descriptor.
| Feature category | Feature description |
|---|---|
| Statistical | (1) Mean, (2) Median, (3) Standard Deviation, (4) Skewness, (5) Kurtosis, (6) Range |
|
| |
| Descriptor | (1) Scale Invariant Feature Transform (SIFT) [ |
|
| |
| BI-RADS [ | (1) Margin Integrality (MarI), (2) Margin Ambiguity (MarA), (3) Echo Pattern Posterior Feature (EPPF), (4) Calcification in Mass (CM), (5) Architectural Distortion (AD), (6) Edema, (7) Lymph Nodes Axillary (ENA), (8) Ducts Changes (DC), (9) Skin Thickening (ST), (10) Postsurgical Fluid Collection (PSFC), (11) Skin Retraction (SR1), (12) Fat Necrosis (FN), (13) Lymph Nodes Intramammary (LNI). |
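The statistical descriptors in the table (Mean, Median, Standard Deviation, Skewness, Kurtosis, Range) can be computed directly from a flattened image region. A minimal sketch using population formulas (an assumption; some papers use sample-corrected versions):

```python
import numpy as np

def statistical_features(region):
    """The six statistical descriptors from the table, computed over a
    flattened image region (population formulas for Skewness/Kurtosis)."""
    x = np.asarray(region, dtype=float).ravel()
    mu, sigma = x.mean(), x.std()
    return {
        "mean": mu,
        "median": float(np.median(x)),
        "std": sigma,
        "skewness": ((x - mu) ** 3).mean() / sigma ** 3,
        "kurtosis": ((x - mu) ** 4).mean() / sigma ** 4,
        "range": float(x.max() - x.min()),
    }

feats = statistical_features([1, 2, 2, 3, 4])
```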
Figure 8. A summary of feature selection methods.
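Filter-style feature selection, one family summarized in Figure 8, scores each feature independently of the classifier; the Fisher score (one of the five methods used by Alharbi et al. below) is a typical example. An illustrative numpy sketch:

```python
import numpy as np

def fisher_score(X, y):
    """Score each feature by between-class variance over within-class
    variance; higher means the feature separates the classes better."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    mu = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - mu) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / within

# Feature 0 separates the two classes; feature 1 is pure noise.
X = [[1.0, 5.1], [1.2, 4.9], [0.9, 5.3],
     [4.0, 5.0], [4.2, 5.2], [3.9, 4.8]]
y = [0, 0, 0, 1, 1, 1]
scores = fisher_score(X, y)
```

The top-ranked features are then kept and the rest discarded before training the classifier.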
A simplified hierarchy of classification.
| Learning technique | Algorithm | ||
|---|---|---|---|
| Supervised | Conventional | (a) Logic based | (1) ID3, (2) C4.5, (3) bagging, |
| (4) random trees, (5) Random Forest, | |||
| (6) boosting, (7) advanced boosting, | |||
| (8) Extreme Boosting (XGBoosting). | |||
| (b) Bayesian | (1) Naive Bayes | ||
| (2) Bayesian Network | |||
| (c) Conventional Neural Network | |||
| (d) Support Vector Machine | |||
| DNN-based | (a) Convolutional Neural Network (CNN), | ||
| (b) Deep Belief Network (DBN), | |||
| (c) Generative Adversarial Network (GAN). | | | |
|
| |||
| Unsupervised | Conventional | (a) K-means | |
| (b) Self-Organizing Map (SOM) | | | |
| (c) Fuzzy C-means | | | |
| DNN-based | (a) Deep Belief Network (DBN) | ||
|
| |||
| Semisupervised | Conventional | (a) Self-training | |
| (b) Graph Based | |||
| (c) S3VM | | | |
| (d) Multiview | |||
| (e) Generative model | |||
Figure 9. Confusion Matrix.
Figure 10. A generalized supervised classifier model.
Figure 11. A model of a biological neuron.
Figure 12. Working principle of a simple Neural Network technique.
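The Accuracy, Sensitivity, and Specificity values quoted throughout the following tables are all derived from the confusion matrix of Figure 9. A minimal sketch for the binary (benign/malignant) case:

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Confusion-matrix cells plus the three scores quoted throughout the
    tables; class 1 = malignant (positive), class 0 = benign (negative)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(((y_true == 1) & (y_pred == 1)).sum())
    tn = int(((y_true == 0) & (y_pred == 0)).sum())
    fp = int(((y_true == 0) & (y_pred == 1)).sum())
    fn = int(((y_true == 1) & (y_pred == 0)).sum())
    return {
        "confusion": [[tn, fp], [fn, tp]],
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
    }

m = binary_metrics([1, 1, 1, 0, 0, 0, 0, 0],
                   [1, 1, 0, 0, 0, 0, 0, 1])
```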
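The working principle of Figure 12 — weighted sums passed through a nonlinearity, layer by layer — can be sketched as a single forward pass (illustrative only; the weights here are random, not trained):

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """One hidden layer, as in Figure 12: weighted sums feed a sigmoid
    nonlinearity, whose outputs feed the output unit."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h = sigmoid(W1 @ x + b1)        # hidden-layer activations
    return sigmoid(W2 @ h + b2)     # output in (0, 1), e.g. P(malignant)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                          # three input features
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # 3 -> 4 hidden units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # 4 -> 1 output unit
out = forward(x, W1, b1, W2, b2)
```

Training (backpropagation) adjusts `W1`, `b1`, `W2`, `b2` to minimize the error between this output and the true label.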
Neural Network for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Rajakeerthana et al. [ | (1) GLCM, GLDM, SRDM, NGLCM, GLRM | Mammogram | 322 | (1) The classifier achieved 99.20% Accuracy. |
|
| ||||
| Lessa and Marengoni [ | (1) Mean, Median, Standard Deviation, Skewness, Kurtosis, Entropy, Range | Thermographic | 94 | (1) Achieved Sensitivity, Specificity, and Accuracy are 87.00%, 83.00%, and 85.00%, respectively. |
|
| ||||
| Wan et al. [ | (1) ALBP (2) BBLBP | OCM | 46 | (1) Achieved Sensitivity and Specificity are 100.00% and 85.20%, respectively. |
| (2) ROC value obtained 0.959. | ||||
|
| ||||
| Chen et al. [ | (1) 19 BI-RADS features have been used | Ultrasound | 238 | (1) Chi squared method has been utilized for the feature selection. |
| (2) Achieved Accuracy, Sensitivity, and Specificity are 96.10%, 96.70%, and 95.70%, respectively. | ||||
|
| ||||
| de Lima et al. [ | (1) Total 416 features have been used | Mammogram | 355 | (1) Multiresolution wavelet and Zernike moment have been utilized for the feature extraction. |
|
| ||||
| Abirami et al. [ | (1) 12 statistical measures such as Mean, Median, and Max have been utilized as the features | Mammogram | 322 | (1) Wavelet transform has been utilized for the feature extraction. |
| (2) The achieved Accuracy, Sensitivity, and Specificity are 95.50%, 95.00%, and 96.00%, respectively. | ||||
|
| ||||
| El Atlas et al. [ | (1) 13 morphological features have been utilized | Mammogram | 410 | (1) Firstly the edge information has been utilized for the mass segmentation and then the morphological features were extracted. |
| (2) Achieved best Accuracy is 97.5%. | ||||
Neural Network for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Alharbi et al. [ | (1) 49 features have been utilized. | Mammogram | 1100 | (1) Five feature selection methods: Fisher score, Minimum Redundancy-Maximum Relevance, Relief-f, Sequential Forward Feature Selection, and Genetic Algorithm have been used. |
| (2) Achieved Accuracy, Sensitivity, and Specificity are 94.20%, 98.36%, and 99.27%, respectively. | | | | |
|
| ||||
| Peng et al. [ | (1) Haralick and Tamura features have been utilized | Mammogram | 322 | (1) Feature reduction has been performed by Rough-Set theory and selected 5 prioritized features. |
| (2) The best Accuracy, Sensitivity, and Specificity achieved were 96.00%, 98.60%, and 89.30% | ||||
|
| ||||
| Jalalian et al. [ | (1) GLCM | Mammogram | (1) The obtained classifier Accuracy, Sensitivity, and Specificity are 95.20%, 92.40%, and 98.00%, respectively. | |
| (2) Compactness | ||||
|
| ||||
| Li et al. [ | (1) Four feature vectors have been calculated | Mammogram | 322 | (1) 2D contour of breast mass in mammography has been converted into 1D signature. |
| (2) The Accuracy achieved by NN techniques is 99.60% when the RMS slope is utilized. | | | | |
|
| ||||
| Chen et al. [ | (1) Autocorrelation features | Ultrasound | 242 | (1) The overall achieved Accuracy, Sensitivity, and Specificity are 95.00%, 98.00%, and 93%, respectively. |
|
| ||||
| Chen et al. [ | (1) Autocorrelation features | Ultrasound | 1020 | (1) The obtained ROC area is 0.9840 ± 0.0072. |
Neural Network for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Chen et al. [ | (1) Variance Contrast of Wavelet Coefficient | Ultrasound | 242 | (1) The achieved ROC area is 0.9396 ± 0.0183. |
| (2) Autocorrelation of Wavelet Coefficient | ||||
|
| ||||
| Silva et al. [ | (1) 22 different morphological features such as convexity and lobulation have been utilized | Ultrasound | — | (1) The best obtained Accuracy and ROC area are 96.98% and 0.98, respectively. |
|
| ||||
| Saritas [ | (1) Age of patient, (2) mass shape, (3) mass border, (4) mass density, (5) BI-RADS | Mammogram | — | (1) The disease prediction rate is 90.5%. |
| (2) Neural Network utilized 5 neurons in input layers and one hidden layer. | ||||
|
| ||||
| López-Meléndez et al. [ | (1) Area, perimeter, etc. have been utilized | Mammogram | 322 | (1) The achieved Sensitivity and Specificity are 96.29% and 99.00%, respectively. |
Figure 13. ReLU operation.
Figure 14. Max pooling and average pooling.
Figure 15. Workflow of a Convolutional Neural Network.
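The CNN workflow of Figure 15 chains convolution, the ReLU of Figure 13, and the pooling of Figure 14. A minimal numpy sketch of these three building blocks (single channel, "valid" convolution, non-overlapping pooling windows; illustrative only):

```python
import numpy as np

def conv2d(img, kernel):
    """'Valid' 2-D convolution (really cross-correlation, as in most CNNs)."""
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

def relu(x):
    """Figure 13: keep positive responses, clamp negatives to zero."""
    return np.maximum(x, 0)

def pool2d(x, size=2, mode="max"):
    """Figure 14: non-overlapping size x size pooling windows."""
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]            # trim to a multiple
    win = x.reshape(x.shape[0] // size, size, x.shape[1] // size, size)
    op = np.max if mode == "max" else np.mean
    return op(op(win, axis=3), axis=1)

# Tiny pipeline: vertical-edge kernel, then ReLU, then pooling.
img = np.arange(25.0).reshape(5, 5)
fmap = conv2d(img, np.array([[1.0, -1.0], [1.0, -1.0]]))   # shape (4, 4)
act = np.array([[1, -2, 3, 0],
                [4, 5, -6, 1],
                [0, 2, 2, 4],
                [1, 1, 3, 3]], dtype=float)
mx = pool2d(relu(act), 2, "max")      # [[5, 3], [2, 4]]
avg = pool2d(relu(act), 2, "mean")    # [[2.5, 1.0], [1.0, 3.0]]
```

A real CNN stacks many such convolution/ReLU/pooling stages, followed by fully connected layers, with all kernels learned from data.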
Available software for deep learning analysis.
| Software | Interface and backend | Provider |
|---|---|---|
| Caffe [ | Python, MATLAB, C++ | Berkeley Vision and Learning Centre, University of California, Berkeley |
| Torch [ | C, LuaJIT | |
| MatConvNet [ | MATLAB, C | Visual Geometry Group, Department of Engineering, University of Oxford |
| Theano [ | Python | Montreal Institute for Learning Algorithms |
| University of Montreal | ||
| TensorFlow [ | C++, Python | Google |
| CNTK [ | C++ | Microsoft |
| Keras [ | Theano, TensorFlow | MIT |
| dl4j [ | Java | Skymind Engineering |
| DeeBNET [ | MATLAB | Information Technology Department, Amirkabir University of Technology |
Convolutional Neural Network.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Wu et al. [ | (1) Global Features | Mammogram | 40 | (1) Achieved Sensitivity 75.00% and Specificity 75.00%. |
|
| ||||
| Sahiner et al. [ | (1) Global Features | Mammogram | 168 | (1) The achieved ROC score is 0.87. |
|
| ||||
| Lo et al. [ | (1) Density, size, shape, margin | Mammogram | 144 | (1) The achieved ROC area is 0.89. |
|
| ||||
| Fonseca et al. [ | (1) Global Features | Mammogram | — | (1) Breast density classification has been performed utilizing HT-L3 convolution. |
| (2) The average obtained Kappa value is 0.58. | | | | |
|
| ||||
| Arevalo et al. [ | (1) Global Features | Mammogram | 736 | (1) The achieved ROC area is 0.826. |
|
| ||||
| Su et al. [ | (1) Global Features | Mammogram | 92 | (1) Fast Scanning CNN (fCNN) method has been utilized to reduce the information loss. |
| (2) The average Precision, Recall, and | ||||
|
| ||||
| Sharma and Preet [ | (1) GLCM, GLDM Geometrical | Mammogram | 40 | (1) The best Accuracy achieved is 75.23% and 72.34%, respectively, for fatty and dense tissue classification. |
|
| ||||
| Spanhol et al. [ | (1) Global Features | Histopathology | 7909 | (1) The best Accuracy achieved is 89 ± 6.6%. |
|
| ||||
| Rezaeilouyeh et al. [ | (1) Local and Global Features | Histopathology | — | (1) Shearlet transform has been utilized for extracting local features. |
| (2) Using the RGB image together with the magnitude of the Shearlet transform, the achieved Sensitivity, Specificity, and Accuracy were 84.00 ± 1.00%, 91.00 ± 2.00%, and 84.00 ± 4.00%; using the RGB image together with both the phase and magnitude of the Shearlet transform, they were 89.00 ± 1.00%, 94.00 ± 1.00%, and 88.00 ± 5.00%. | | | | |
Convolutional Neural Network.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Albayrak and Bilgin [ | (1) Global Features | Histopathology | 100 | (1) Cluster-based segmentation has been performed to find out the cellular structure. |
| (2) Blob analysis has been performed on the segmented images. | ||||
| (3) To reduce the high dimensionality, Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) methods have been utilized. | ||||
| (4) Before the dimensionality reduction the Precision, Recall, and | ||||
| (5) The best average Accuracy is 73.00% (without dimensionality reduction) and 96.8% (with dimensionality reduction). | ||||
|
| ||||
| Jiao et al. [ | (1) Global and Local Features. | Mammogram | — | (1) They performed their experiments on the DDSM database. |
| (2) The total number of required parameters is 5.8 × 10⁷ and the per-image processing time is 1.10 ms. | | | | |
| (3) The best classification Accuracy achieved is 96.70%; the VGG model reached 97.00%, which is slightly better than their model. | | | | |
| However, in terms of memory size and per-image processing time, their model outperforms the VGG model. | | | | |
|
| ||||
| Zejmo et al. [ | (1) Global Features | Cytology | 40 | (1) GoogleNet and AlexNet models have been utilized. |
| (2) The best Accuracy obtained when they utilized GoogleNet model was 83.00%. | ||||
Convolutional Neural Network.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Jiang et al. [ | (1) Global Features | Mammogram | — | (1) Image preprocessing was performed to enhance tissue characteristics. |
| (2) Transfer learning was performed and the obtained AUC was 0.88, whereas when the system was trained from scratch the best AUC was 0.82. | | | | |
|
| ||||
| Suzuki et al. [ | (1) Global Features | Mammogram | 198 | (1) The achieved Sensitivity is 89.90%. |
| (2) Transfer learning techniques have been utilized. | ||||
|
| ||||
| Qiu et al. [ | (1) Global Features | Mammogram | 270 | (1) Average achieved Accuracy is 71.40%. |
|
| ||||
| Samala et al. [ | (1) Global Features | — | 92 | (1) They utilized Deep Learning CNN (DLCNN) and CNN models for classification. |
| (2) The AUC of CNN and DLCNN model is 0.89 and 0.93, respectively. | ||||
|
| ||||
| Sharma and Preet [ | (1) Global Features | Mammogram | 607 | (1) Transfer learning and ensemble techniques utilized. |
| (2) When using ensemble techniques the soft voting method has been used. | ||||
| (3) The best ROC score is 0.86. | ||||
|
| ||||
| Kooi et al. [ | (1) Global and Local features | Mammogram | 44090 | (1) Transfer learning method utilized (VGG model). |
|
| ||||
| Geras et al. [ | (1) Global Features | Mammogram | 102800 | (1) They investigated the relation of the Accuracy with the database size and image size. |
|
| ||||
| Arevalo et al. [ | (1) Global Features | Mammogram | 736 | (1) The best ROC value was 0.822. |
Figure 16. A general structure of a tree.
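Figure 16 shows the tree structure behind the logic-based classifiers listed earlier (ID3, C4.5, bagging, Random Forest). As an illustrative sketch of the bagging idea, not any cited author's method, depth-1 trees (stumps) can be trained on bootstrap resamples and combined by majority vote:

```python
import numpy as np

def fit_stump(X, y):
    """Depth-1 tree: exhaustively pick the (feature, threshold, side-label)
    split with the fewest training errors. Labels are 0/1."""
    best = None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            left, right = y[X[:, f] <= t], y[X[:, f] > t]
            for pl in (0, 1):
                err = (left != pl).sum() + (right != 1 - pl).sum()
                if best is None or err < best[0]:
                    best = (err, f, t, pl)
    _, f, t, pl = best
    return f, t, pl

def predict_stump(stump, X):
    f, t, pl = stump
    return np.where(X[:, f] <= t, pl, 1 - pl)

def bagged_predict(X_train, y_train, X_test, n_trees=15, seed=0):
    """Bagging: each stump sees a bootstrap resample; majority vote decides."""
    rng = np.random.default_rng(seed)
    votes = np.zeros(len(X_test))
    for _ in range(n_trees):
        idx = rng.integers(0, len(X_train), len(X_train))  # bootstrap sample
        votes += predict_stump(fit_stump(X_train[idx], y_train[idx]), X_test)
    return (votes > n_trees / 2).astype(int)

X = np.array([[0.5], [1.0], [1.5], [4.0], [4.5], [5.0]])
y = np.array([0, 0, 0, 1, 1, 1])
pred = bagged_predict(X, y, X)
```

A Random Forest additionally samples a random subset of features at each split and grows much deeper trees.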
Logic Based.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Beura et al. [ | (1) Two-dimensional discrete orthonormal | Mammogram | — | (1) The achieved Accuracy and AUC values on the MIAS database are 98.3% and 0.9985, respectively. |
|
| ||||
| Diz et al. [ | (1) GLCM | Mammogram | 410 | (1) Their achieved Accuracy value is 76.60% |
| (2) GLRLM | (2) Mean false positive value is 81.00%. | |||
|
| ||||
| Zhang et al. [ | (1) 133 features (mass based and content based) | Mammogram | 400 | (1) A computer model has been created that is able to find locations not detected by trainees. |
|
| ||||
| Ahmad and Yusoff [ | (1) Nine features selected | Biopsy | 700 | (1) Achieved Sensitivity, Specificity, and Accuracy are 75.00%, 70.00%, and 72.00%, respectively. |
|
| ||||
| Paul et al. [ | (1) Haralick texture feature | Histopathological | 50 | (1) Their achieved Recall and Precision are 81.13% and 83.50%, respectively. |
|
| ||||
| Chen et al. [ | (1) Dual-tree complex wavelet transform (DT-CWT) has been used for the feature extraction. | Mammogram | — | (1) The achieved Receiver Operating Characteristic (ROC) value is 0.764. |
|
| ||||
| Zhang et al. [ | (1) Curvelet Transform | Histopathological | 50 | (1) Random Subspace Ensemble (RSE) utilized. |
Logic Based.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Angayarkanni and Kamal [ | (1) GLCM | Mammogram | 322 | (1) The Achieved Sensitivity and Accuracy are 93.40% and 99.50%, respectively. |
|
| ||||
| Wang et al. [ | (1) Horizontal Weighted Sum | Mammogram | 322 | (1) Surrounding Region Dependence Method (SRDM) utilized for region detection. |
|
| ||||
| Tambasco Bruno et al. [ | (1) Curvelet Transform | Mammogram | — | (1) ANOVA method utilized for feature prioritization. |
|
| ||||
| Muramatsu et al. [ | (1) Radial Local Ternary Pattern (RLTP) | Mammogram | 376 | (1) Textural features have been extracted from the regions of interest (ROIs) using RLTP. |
|
| ||||
| Dong et al. [ | (1) NRL margin gradient | Mammogram | — | (1) Chain code utilized for extraction of regions of interest (ROIs). |
|
| ||||
| Piantadosi et al. [ | (1) Local Binary Pattern-Three Orthogonal Projections (LBP-TOP) | Mammogram | — | (1) Their achieved Accuracy, Sensitivity, and Specificity values are 84.60%, 80.00%, and 90.90%. |
Figure 17. SVM finds the hyperplane which separates two classes.
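The separating hyperplane of Figure 17 can be found by minimizing the L2-regularized hinge loss. A sub-gradient descent sketch on toy 2-D data (illustrative; production SVM solvers use dual or SMO-style optimization and kernels):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.05, iters=3000):
    """Full-batch sub-gradient descent on the L2-regularized hinge loss
    (1/n) * sum max(0, 1 - y_i (w.x_i + b)) + (lam/2) * ||w||^2.
    Labels must be in {-1, +1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    n = len(X)
    for _ in range(iters):
        margins = y * (X @ w + b)
        viol = margins < 1                                 # inside the margin
        gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        gb = -y[viol].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Two linearly separable clusters of feature pairs.
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)
```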
SVM for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Malik et al. [ | (1) Speed of sound | QTUS | — | (1) Glands, fat, skin, and connective tissue have been classified. |
|
| ||||
| Chang et al. [ | (1) Textural features such as | Ultrasound | 250 | (1) Benign and malignant images have been classified. |
|
| ||||
| Akbay et al. [ | (1) 52 features have been extracted | Mammogram | — | (1) Microcalcification (MC) Classification Accuracy 94.00% |
|
| ||||
| Levman et al. [ | (1) Relative Signal Intensities | MRI | 76 | (1) Benign and malignant lesions are investigated. |
|
| ||||
| de Oliveira Martins et al. [ | (1) Ripley's | Mammogram | 390 | (1) Benign and malignant image classification. |
SVM for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Zhang et al. [ | (1) Fractional Fourier transform information utilized as features | Mammogram | 200 | (1) They selected ROI for avoiding redundant complexity. |
|
| ||||
| Shirazi and Rashedi [ | (1) GLCM | Ultrasound | 322 | (1) ROI extracted for reducing redundant complexity. |
|
| ||||
| Sewak et al. [ | (1) Radius, perimeter, area, compactness, smoothness, concavity, concave points, symmetry, fractal dimension, and texture of nuclei calculated | Biopsies | 569 | (1) Achieved Accuracy, Sensitivity, and Specificity are 99.29%, 100.00%, and 98.11%, respectively. |
|
| ||||
| Dheeba and Tamil Selvi [ | (1) The laws texture features utilized | Mammogram | 322 | (1) The achieved Accuracy is 86.10%. |
SVM for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Taheri et al. [ | (1) Intensity information | Mammogram | 600 | (1) Classified images into normal and abnormal images. |
|
| ||||
| Tan et al. [ | (1) Shape, fat, presence of calcification texture, spiculation, Contrast, Isodensity type features selected | Mammogram | 1200 | (1) Features have been selected from the region of interest. |
|
| ||||
| Kavitha and Thyagharajan [ | (1) Histogram of the intensity has been used as a statistical feature. | Mammogram | 322 | (1) When using SVM with the linear kernel the obtained Accuracy, Sensitivity, and Specificity are 98%, 100%, and 96%, respectively. |
Bayesian classifier.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Kendall and Flynn [ | (1) Features extracted using DCT method. | Mammogram | (1) Bayesian classifier obtained 100.00% sensitivity with 64.00% specificity. | |
|
| ||||
| Oleksyuk et al. [ | — | — | (1) The Bayesian method obtained 86.00% Sensitivity with 80.00% Specificity. | |
|
| ||||
| Burling-Claridge et al. [ | (1) Statistical and LBP features extracted. | Mammogram | 322/410 | (1) Bayesian method obtained 67.07 ± 0.73% and 67.61 ± 0.83% Accuracy on MIAS and Inbreast image datasets (using statistical features). |
|
| ||||
| Raghavendra et al. [ | (1) Gabor wavelet transform utilized for feature extraction. | Mammogram | 690 | (1) Locality Sensitive Discriminant Analysis (LSDA) for the data reduction. |
|
| ||||
| Pérez et al. [ | (1) 23 features utilized. | Mammogram | — | (1) UFilter feature selection methods utilized and its efficiency verified by Wilcoxon statistical test. |
|
| ||||
| Rashmi et al. [ | (1) 10 features utilized. | — | — | (1) Benign and malignant tumors have been classified. |
|
| ||||
| Gatuha and Jiang [ | (1) 10 features utilized. | — | — | (1) They built an android based benign and malignant tumor classifier. |
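The Naive Bayes classifiers cited above score each class by its prior times a product of per-feature likelihoods, under a feature-independence assumption. A minimal Gaussian Naive Bayes sketch (illustrative, not any cited author's implementation):

```python
import numpy as np

class GaussianNB:
    """Per-class Gaussian likelihood for each feature; classes scored by
    log prior + sum of per-feature log likelihoods (independence assumption)."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.stats = {c: (X[y == c].mean(axis=0),
                          X[y == c].var(axis=0) + 1e-9,  # avoid zero variance
                          np.mean(y == c))
                      for c in self.classes}
        return self

    def predict(self, X):
        scores = []
        for c in self.classes:
            mu, var, prior = self.stats[c]
            ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var)
            scores.append(np.log(prior) + ll.sum(axis=1))
        return self.classes[np.argmax(scores, axis=0)]

# Toy two-feature data standing in for, e.g., two image measurements.
X = np.array([[1.0, 2.0], [1.2, 1.8], [0.8, 2.2],
              [5.0, 6.0], [5.2, 5.8], [4.8, 6.2]])
y = np.array([0, 0, 0, 1, 1, 1])
model = GaussianNB().fit(X, y)
pred = model.predict(X)
```

A Bayesian Network generalizes this by allowing explicit dependence edges between features instead of full independence.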
Bayesian classifier.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Benndorf et al. [ | (1) BI-RADS features utilized. | — | 2766 | (1) For the training data the AUC value is 0.959 for the inclusive model, whereas AUC value is 0.910 for the descriptor model. |
|
| ||||
| Rodríguez-López and Cruz-Barbosa [ | (1) Eight image feature nodes utilized. | — | — | (1) NB model obtained 79.00% Accuracy, 80.00% Sensitivity. |
|
| ||||
| Nugroho et al. [ | (1) Eight image feature nodes utilized. | Mammogram | — | (1) Naive Bayes model used along with SMO; the obtained ROC value is 0.903. |
|
| ||||
| Rodríguez-López and Cruz-Barbosa [ | (1) Eight image features have been | — | 231 | (1) Bayesian Network model obtained 82.00% Accuracy, 80.00% Sensitivity, and 83.00% Specificity when they utilized only three features. |
|
| ||||
| Shivakumari et al. [ | — | 231 | (1) Analyzed the Ljubljana breast image dataset. | |
|
| ||||
| Rodríguez-López and Cruz-Barbosa [ | (1) Seven different clinical features extracted. | Mammogram | 690 | (1) Obtained Accuracy, Sensitivity, and Specificity are 82.00%, 80.00%, and 83.00%, respectively. |
K-means Cluster Algorithm and Self-Organizing Map for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Moftah et al. [ | (1) Intensity distribution used as feature. | MRI | — | (1) Three types of evaluation measures performed: |
|
| ||||
| Lee et al. [ | (1) 1734 signal patterns. | MRI | 322 | (1) Available signal patterns have been classified into 10 classes. |
|
| ||||
| Dalmiya et al. [ | (1) Discrete Wavelet Transform. | Mammogram | — | (1) Cancer tumor masses have been segmented. |
|
| ||||
| Elmoufidi et al. [ | (1) Local Binary Pattern. | Mammogram | 322 | (1) Image enhancement performed. |
|
| ||||
| Samundeeswari et al. [ | Ultrasound | — | (1) Utilizing ant colony and regularization parameters. | |
|
| ||||
| Rezaee [ | Discrete Wavelet Transform. | Mammogram | 120 | (1) Early detection of tumors from the breast image. |
|
| ||||
| Chandra et al. [ | (1) Gray intensity values. | Mammogram | — | (1) Mammogram image has been clustered using SOM along with the Quadratic Neural Network. |
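The K-means clustering used in several of the rows above can be sketched with Lloyd's algorithm: alternate between assigning samples to the nearest centroid and recomputing each centroid as the mean of its members (illustrative numpy sketch):

```python
import numpy as np

def kmeans(X, k=2, iters=50, seed=1):
    """Lloyd's algorithm: assign each sample to its nearest centroid, then
    move each centroid to the mean of its members; repeat."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):            # keep empty clusters fixed
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated groups of feature vectors.
X = np.array([[0.0, 0.1], [0.2, 0.0], [0.1, 0.2],
              [9.0, 9.1], [9.2, 9.0], [9.1, 9.2]])
labels, centers = kmeans(X)
```

Fuzzy C-means replaces the hard assignment with per-cluster membership weights; a Self-Organizing Map additionally arranges the centroids on a low-dimensional grid.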
K-means Cluster Algorithm and Self-Organizing Map for breast image classification.
| Reference | Descriptor | Image Type | No. of Images | Key Findings |
|---|---|---|---|---|
| Lashkari and Firouzmand [ | Thermogram | 23 | (1) Both FCM method and Adaboost method utilized separately to classify images. | |
|
| ||||
| Nattkemper et al. [ | MRI | — | (1) | |
|
| ||||
| Salazar-Licea et al. [ | ⋯ | — | (1) Fuzzy c-means clustering utilized. | |
|
| ||||
| Marcomini et al. [ | (1) 24 morphological features | Ultrasound | 144 | (1) Minimizing noise using Wiener filter, equalized and Median filter |
|
| ||||
| Chen et al. [ | (1) 24 autocorrelation texture features | Ultrasound | 243 | (1) Obtained ROC area 0.9357 ± 0.0152. Accuracy 85.60%, Specificity 70.80%. |
|
| ||||
| Iscan et al. [ | (1) Two-dimensional discrete cosine transform | Ultrasound | — | (1) Automated threshold scheme introduced to increase the robustness of the SOM algorithm. |
Semisupervised algorithm for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key finding |
|---|---|---|---|---|
| Cordeiro et al. [ | (1) Zernike moments have been used for the feature extraction. | — | 685 | (1) Semisupervised Fuzzy GrowCut algorithm utilized. |
|
| ||||
| Cordeiro et al. [ | — | Mammogram | 322 | (1) Semisupervised Fuzzy GrowCut as well as the Fuzzy GrowCut algorithm utilized for tumor region segmentation. |
|
| ||||
| Nawel et al. [ | — | — | — | (1) Semisupervised Support Vector Machine (S3VM) utilized. |
|
| ||||
| Zemmal et al. [ | — | Mammogram (DDSM) | — | (1) Transductive semisupervised learning technique (TSVM) utilized for classification along with different features. |
|
| ||||
| Zemmal et al. [ | — | — | 200 | (1) Semisupervised Support Vector Machine (S3VM) utilized with various kernels. |
|
| ||||
| Zemmal et al. [ | (1) GLCM (2) Hu moments (3) Central Moments | Mammogram | — | (1) Transductive Semisupervised learning technique used for image classification. |
|
| ||||
| Peikari et al. [ | (1) Mean, Mode, Standard Deviation, Media, Skewness, Kurtosis | Histopathological | 322 | (1) The Ordering Points to Identify the Clustering Structure (OPTICS) method utilized for image classification [ |
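Self-training, the first semisupervised technique in the hierarchy table, fits a model on the labeled pool, pseudo-labels its most confident unlabeled samples, and repeats. A minimal sketch with a nearest-centroid base learner (an illustrative choice, not from the cited papers):

```python
import numpy as np

def fit_centroids(X, y):
    """Base learner: one centroid per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(centroids, X):
    """Return (predicted class, distance to that centroid) per sample."""
    cs = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in cs])
    return np.array(cs)[d.argmin(axis=0)], d.min(axis=0)

def self_train(X_lab, y_lab, X_unlab, rounds=4):
    """Each round: fit on the labeled pool, pseudo-label the single most
    confident (nearest) unlabeled sample, and move it into the pool."""
    X_lab, y_lab, X_unlab = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    for _ in range(min(rounds, len(X_unlab))):
        model = fit_centroids(X_lab, y_lab)
        pred, dist = predict_centroids(model, X_unlab)
        i = int(dist.argmin())                  # most confident sample
        X_lab = np.vstack([X_lab, X_unlab[i:i + 1]])
        y_lab = np.append(y_lab, pred[i])
        X_unlab = np.delete(X_unlab, i, axis=0)
    return fit_centroids(X_lab, y_lab)

# One labeled example per class plus four unlabeled samples.
X_lab = np.array([[0.0, 0.0], [8.0, 8.0]])
y_lab = np.array([0, 1])
X_unlab = np.array([[0.5, 0.4], [7.6, 8.1], [0.3, 0.2], [8.2, 7.9]])
model = self_train(X_lab, y_lab, X_unlab)
pred, _ = predict_centroids(model, np.array([[0.1, 0.1], [7.9, 7.9]]))
```

S3VM follows the same spirit but pushes the SVM decision boundary through low-density regions of the unlabeled data instead of iterating pseudo-labels.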
Semisupervised algorithm for breast image classification.
| Reference | Descriptor | Image type | Number of images | Key findings |
|---|---|---|---|---|
| Zhu et al. [ | (1) Relative local intensity | Ultrasound | 144 | (1) One important microenvironment inside the tumor is vasculature, which has been classified in this paper. |
|
| ||||
| Liu et al. [ | — | Ultrasound | — | (1) Iterated Laplacian regularization based semisupervised algorithm for robust feature selection (Iter-LR-CRFS) utilized. |