Jose-Agustin Almaraz-Damian, Volodymyr Ponomaryov, Sergiy Sadovnychiy, Heydy Castillejos-Fernandez.
Abstract
In this paper, a new Computer-Aided Detection (CAD) system for the detection and classification of dangerous skin lesions (melanoma type) is presented, based on a fusion of handcraft features related to the medical algorithm ABCD rule (Asymmetry, Borders, Colors, Dermatoscopic structures) and deep learning features, using Mutual Information (MI) measurements. The steps of the CAD system can be summarized as preprocessing, feature extraction, feature fusion, and classification. During the preprocessing step, a lesion image is enhanced, filtered, and segmented to obtain the Region of Interest (ROI); in the next step, feature extraction is performed. Handcraft features such as shape, color, and texture represent the ABCD rule, and deep learning features are extracted using a Convolutional Neural Network (CNN) architecture pre-trained on ImageNet (the ILSVRC ImageNet task). MI measurement is used as the fusion rule, gathering the most important information from both types of features. Finally, at the classification step, several methods are employed, such as Linear Regression (LR), Support Vector Machines (SVMs), and Relevance Vector Machines (RVMs). The designed framework was tested using the ISIC 2018 public dataset. The proposed framework demonstrates improved performance in comparison with other state-of-the-art methods in terms of the accuracy, specificity, and sensibility obtained in the training and test stages. Additionally, we propose and justify a novel procedure for adjusting the evaluation metrics for the imbalanced datasets that are common among different kinds of skin lesions.
Keywords: balance; computer-aided systems; convolutional neural networks; data; deep learning; fusion; handcraft; melanoma; mutual information; transfer learning
Year: 2020 PMID: 33286257 PMCID: PMC7516968 DOI: 10.3390/e22040484
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. Block diagram of the novel Computer-Aided Detection (CAD) system.
Figure 2. (a) Original image; (b) image (a) processed with a Gaussian filter.
Figure 3. (a) Original image, (b) channel L, (c) channel a*, and (d) channel b*.
Figure 4. Results of the thresholding stage: (a) original image, (b) binary image obtained from the threshold of the L channel, (c) binary image obtained from the threshold of the a* channel, (d) binary image obtained from the threshold of the b* channel.
Figure 5. Results of the preprocessing stage: (a) Region of Interest (ROI) obtained from the original image, (b) segmented image, (c) obtained asymmetry at 0°, and (d) obtained asymmetry at 90°.
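The preprocessing pipeline illustrated in Figures 2–5 (smoothing, per-channel thresholding, and asymmetry measurement on the resulting binary mask) can be sketched compactly. The following is a minimal illustration on a synthetic single-channel image, assuming Otsu-style thresholding and a flip-and-XOR asymmetry score; it is a sketch of the idea, not the authors' exact implementation.

```python
import numpy as np

def otsu_threshold(channel, nbins=256):
    """Find the threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(channel, bins=nbins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(hist)                # cumulative weight of the "background" class
    w1 = 1.0 - w0                       # weight of the "lesion" class
    mu0 = np.cumsum(hist * centers)     # unnormalized background mean
    mu_total = mu0[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = w0 * w1 * ((mu0 / w0) - ((mu_total - mu0) / w1)) ** 2
    between = np.nan_to_num(between)
    return centers[np.argmax(between)]

def asymmetry_score(mask, axis):
    """Fraction of pixels that do NOT overlap when the mask is flipped about `axis`."""
    flipped = np.flip(mask, axis=axis)
    return np.logical_xor(mask, flipped).sum() / max(mask.sum(), 1)

# Synthetic "L channel": dark elliptical lesion on a brighter background.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:128, 0:128]
lesion = ((xx - 64) / 30.0) ** 2 + ((yy - 64) / 18.0) ** 2 <= 1.0
image = np.where(lesion, 60.0, 200.0) + rng.normal(0, 5, (128, 128))

t = otsu_threshold(image)
mask = image < t                        # the lesion is darker than the skin
print(round(asymmetry_score(mask, axis=0), 3))  # small value: the ellipse is nearly symmetric
```

In the actual system the thresholding is applied per L*a*b* channel (Figures 3 and 4) and the binary results are combined into the ROI; the asymmetry score at 0° and 90° then feeds the "A" component of the ABCD rule.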
Number of features extracted by each architecture.
| CNN Architecture | Number of Features |
|---|---|
| VGG19 | 4096 |
| VGG16 | 4096 |
| ResNET-50 | 2048 |
| Inception v3 | 2048 |
| Mobilenet v1 | 1024 |
| Mobilenet v2 | 1280 |
| DenseNET-201 | 1920 |
| Xception | 2048 |
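The feature counts in the table correspond to the width of each network's final pooled layer: MobileNet v1, for instance, ends in a 7 × 7 × 1024 activation map that global average pooling collapses into a 1024-dimensional descriptor. A minimal numpy sketch of that pooling step (the random feature map is a stand-in for a real activation, not actual network output):

```python
import numpy as np

# Stand-in for the final convolutional activation of MobileNet v1 on one image:
# a 7x7 spatial grid with 1024 channels.
rng = np.random.default_rng(42)
feature_map = rng.random((7, 7, 1024))

# Global average pooling: average each channel over the spatial grid,
# yielding the fixed-length deep-feature vector used later in the fusion step.
deep_features = feature_map.mean(axis=(0, 1))
print(deep_features.shape)  # (1024,)
```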
Example of features that expose the lowest Mutual Information (MI) values (Inception v3 + handcraft features).
| Feature | MI Value |
|---|---|
| 1448 | 3.813956037657107 × 10 |
| 648 | 1.4314941993776031 × 10 |
| 1020 | 3.4236070477255964 × 10 |
| 804 | 3.515106075036023 × 10 |
| 333 | 4.213506368255793 × 10 |
| 562 | 4.4971751217204314 × 10 |
| 91 | 5.133156269931938 × 10 |
| 852 | 6.514623080855486 × 10 |
| 1689 | 7.53828133426282 × 10 |
| 1788 | 7.605629690621285 × 10 |
Example of features that expose the highest MI values (Inception v3 + handcraft features).
| Feature | MI Value |
|---|---|
| Mean_b | 0.09829037284938669 |
| Min_b | 0.09536305749234275 |
| Max_b | 0.06834317593527395 |
| 578 | 0.06131147578510121 |
| Min_G | 0.05817685924318594 |
| 116 | 0.055293703553799256 |
| 389 | 0.05446628875169646 |
| 464 | 0.05424503079140042 |
| Var_L | 0.05420063226575533 |
| 288 | 0.053949117014718606 |
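The two tables above show how the MI fusion rule separates features: handcraft color statistics such as Mean_b and Min_b top the ranking, while many individual deep features carry almost no information about the class label. A histogram-based sketch of this ranking idea follows; `mutual_info`, `informative`, and `noise` are illustrative names, not the paper's code, and the histogram plug-in estimator is only one way to approximate MI.

```python
import numpy as np

def mutual_info(feature, labels, bins=16):
    """Histogram (plug-in) estimate of I(X;Y) in nats between a scalar feature
    and binary class labels."""
    # Discretize the feature into `bins` bins.
    edges = np.histogram_bin_edges(feature, bins=bins)
    x = np.digitize(feature, edges[1:-1])          # bin index 0..bins-1
    joint = np.zeros((bins, 2))
    for xi, yi in zip(x, labels):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 2000)
informative = y + rng.normal(0, 0.5, 2000)   # behaves like a useful color statistic
noise = rng.normal(0, 1.0, 2000)             # behaves like an uninformative deep feature

# The fusion rule keeps the features whose MI with the class label is highest.
print(mutual_info(informative, y) > mutual_info(noise, y))  # True
```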
Figure 6. Distribution of classes in the ISIC2018/HAM10000 dataset.
Performance results of the proposed method using selected deep learning architectures fused with handcraft features.
| CNN Architecture | Acc. Train (%) | Acc. Test (%) | Sensibility (%) | Specificity | Precision (%) | F-Score (%) | AUC (%) | G-Mean | IBA | MCC |
|---|---|---|---|---|---|---|---|---|---|---|
| VGG16 | 88.60 | 84.90 | 79.23 | 0.85 | 88.74 | 83.71 | 84.79 | 0.85 | 0.72 | 0.7012 |
| VGG19 | 90.23 | 87.14 | 82.46 | 0.87 | 90.44 | 86.26 | 87.05 | 0.87 | 0.76 | 0.7451 |
| Mobilenet v1 | 91.48 | 89.32 | 84.04 | 0.89 | 93.49 | 88.51 | 89.21 | 0.89 | 0.79 | 0.7898 |
| Mobilenet v2 | | | | | | | | | | |
| ResNET-50 | 90.67 | 87.86 | 81.24 | 0.88 | 93.09 | 86.76 | 87.72 | 0.87 | 0.77 | 0.7624 |
| DenseNET-201 | 91.10 | 88.54 | 83.25 | 0.88 | 92.61 | 87.68 | 88.44 | 0.88 | 0.78 | 0.5985 |
| Inception V3 | 91.33 | 88.10 | 84.87 | 0.88 | 90.59 | 87.42 | 88.02 | 0.88 | 0.77 | 0.7632 |
| Xception | 90.47 | 87.53 | 83.19 | 0.87 | 90.58 | 86.73 | 87.44 | 0.87 | 0.76 | 0.7525 |
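The imbalance-aware columns in the table (G-Mean, IBA, MCC) complement plain accuracy, which can look good even when the minority melanoma class is largely misclassified. A short sketch of G-Mean and MCC computed from binary confusion counts; the counts below are hypothetical, chosen only for illustration.

```python
import math

def imbalance_metrics(tp, fp, tn, fn):
    """G-Mean and Matthews Correlation Coefficient from a binary confusion matrix."""
    sens = tp / (tp + fn)          # sensibility (true positive rate)
    spec = tn / (tn + fp)          # specificity (true negative rate)
    g_mean = math.sqrt(sens * spec)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return g_mean, mcc

# Hypothetical counts for an imbalanced melanoma test split (illustrative only).
g, m = imbalance_metrics(tp=86, fp=10, tn=90, fn=14)
print(round(g, 2), round(m, 2))  # → 0.88 0.76
```

Because G-Mean is the geometric mean of sensibility and specificity, it collapses toward zero if either class is ignored, which is why it (together with IBA and MCC) is a fairer summary than accuracy on imbalanced skin-lesion data.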
Comparison between our novel CAD and state-of-the-art CADs.
| Metric | [ ] | [ ] | [ ] | Proposed Method Using |
|---|---|---|---|---|
| Accuracy | 86.07 | 95 | 85.55 | 92.40 |
| Sensibility | 78.93 | 93 | 86 | 86.41 |
| Specificity | 93.25 | - | - | 90 |
| Precision | - | 93 | 85 | 92.08 |
| F-Score | - | - | 86 | 89.16 |
| G-Mean | - | - | - | 0.90 |
| IBA | - | - | - | 0.80 |
| MCC | - | - | - | 0.7953 |
| Imbalanced Data | Yes | No | No | No |
| Fused Data | Yes | Yes | Yes | Yes |
| Type of Classification | Binary | Binary | Multiclass | Binary |