Marco La Salvia, Emanuele Torti, Raquel Leon, Himar Fabelo, Samuel Ortega, Francisco Balea-Fernandez, Beatriz Martinez-Vega, Irene Castaño, Pablo Almeida, Gregorio Carretero, Javier A Hernandez, Gustavo M Callico, Francesco Leporati.
Abstract
Cancer originates from the uncontrolled growth of healthy cells into a mass. Chromophores such as hemoglobin and melanin characterize the spectral properties of skin, allowing lesions to be classified into different etiologies. Hyperspectral imaging systems capture skin-reflected and transmitted light across several wavelength ranges of the electromagnetic spectrum, enabling skin-lesion differentiation through machine learning algorithms. Challenged by limited data availability and subtle inter- and intra-tumoral variability, we introduce a pipeline based on deep neural networks to diagnose hyperspectral skin cancer images, targeting a handheld device equipped with a low-power graphics processing unit for routine clinical testing. Enhanced by data augmentation, transfer learning, and hyperparameter tuning, the proposed architectures aim to match and improve upon well-known dermatologist-level detection performance on both benign-malignant and multiclass classification tasks, while diagnosing hyperspectral data under real-time constraints. Experiments show 87% sensitivity and 88% specificity for benign-malignant classification and specificity above 80% for the multiclass scenario. AUC measurements suggest classification performance above 90% with adequate thresholding. For binary segmentation, we measured skin DICE and IOU higher than 90%. Segmenting the epidermal lesions with the U-Net++ architecture took at most 1.21 s while consuming 5 W, meeting the imposed time limit. Hence, we can diagnose hyperspectral epidermal data under real-time constraints.
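The abstract reports skin DICE and IOU above 90% for binary segmentation. As a minimal, hypothetical sketch (not the authors' implementation), these two overlap metrics for binary lesion masks can be computed as follows; the function names and the toy 4x4 masks are illustrative assumptions:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """DICE = 2*|A & B| / (|A| + |B|) for binary masks."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def iou_score(pred, target, eps=1e-7):
    """IOU (Jaccard index) = |A & B| / |A | B| for binary masks."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

# Toy 4x4 predicted and ground-truth lesion masks (1 = lesion pixel).
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
target = np.array([[1, 1, 0, 0],
                   [1, 0, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
dice = dice_coefficient(pred, target)  # 2*3/(4+3) ~= 0.857
iou = iou_score(pred, target)          # 3/4 = 0.75
```

Note that DICE is always at least as large as IOU for the same masks, which is why both are commonly reported together for segmentation benchmarks.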
Keywords: deep learning; disease diagnosis; high-performance computing; hyperspectral imaging; skin cancer
Year: 2022 PMID: 36236240 PMCID: PMC9571453 DOI: 10.3390/s22197139
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1. Proposed experimental framework. (a) Taxonomy of the epidermal lesions included in the HS database, including the number of subjects and images in each category; (b) Distribution of images for the binary (left) and multilabel (right) classification problems; (c) Different elements of the HS acquisition system; (d) HS cube characteristics; (e) HS dataset ground truths; (f) Proposed processing framework based on k-fold cross-validation, including data augmentation and aggregated model evaluation; (g) Low-power NVIDIA Jetson GPU for algorithm deployment to reach real-time performance.
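Figure 1f describes a processing framework based on k-fold cross-validation with aggregated model evaluation. As a hedged illustration (not the authors' code), a simple index generator that shuffles the image indices, partitions them into k folds, and yields one train/validation split per fold could be sketched as; the sample count of 76 and the seed are placeholder assumptions:

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Yield (train, val) index arrays for k-fold cross-validation.

    Each sample lands in the validation set of exactly one fold, so
    aggregating per-fold predictions evaluates the model on all data.
    """
    rng = np.random.default_rng(seed)
    indices = rng.permutation(n_samples)     # shuffle once up front
    folds = np.array_split(indices, k)       # k near-equal partitions
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

# Example: 76 images split into 5 folds.
splits = list(kfold_indices(76, 5))
```

In practice, data augmentation would be applied only to each fold's training subset, so that no augmented copy of a validation image leaks into training.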
Figure 2. CUDA execution logic and data-transfer flow.
Figure 3. Performance of the epidermal lesion classification. (a,b) Binary and multilabel classification performance of the four different approaches, respectively.
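The abstract reports 87% sensitivity and 88% specificity for the benign-malignant task. As a hedged sketch of how such figures are derived from binary predictions (not the authors' evaluation code), sensitivity and specificity follow from the confusion-matrix counts; treating the malignant class as the positive label is an assumption here:

```python
import numpy as np

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).

    Assumes label 1 marks the positive (e.g., malignant) class.
    """
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_true & y_pred)    # malignant, predicted malignant
    fn = np.sum(y_true & ~y_pred)   # malignant, missed
    tn = np.sum(~y_true & ~y_pred)  # benign, predicted benign
    fp = np.sum(~y_true & y_pred)   # benign, falsely flagged
    return tp / (tp + fn), tn / (tn + fp)

# Toy labels: 4 malignant, 4 benign; one miss and one false alarm.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)  # 0.75, 0.75
```

Sweeping the classification threshold over the model's output scores trades sensitivity against specificity, which is the mechanism behind the abstract's AUC-based claim that adequate thresholding can push performance above 90%.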
Figure 4. Performance of the epidermal lesion segmentation. (a,b) Binary and multilabel segmentation performance of the four different approaches, respectively. The acronyms have the following meanings: Skin (S), Benign (B), Malignant (M), Benign Epithelial (BE), Benign Melanocytic (BM), Malignant Epithelial (ME), and Malignant Melanocytic (MM).
Figure 5. Deployment performance. (a) Processing time and (b) power consumption comparisons of the different NVIDIA GPUs considered in this study. Jetson Nano M1 and Jetson Nano M2 indicate the two possible power configurations of the Jetson Nano board, with power budgets of 10 and 5 W, respectively.
Figure 6. Mean and standard deviation (std) of the spectral signatures of the HS dataset. (a) Spectral signatures of the Skin, Benign, and Malignant classes; (b) Spectral signatures of the Benign Epithelial, Benign Melanocytic, Malignant Epithelial, and Malignant Melanocytic classes. S: Skin; B: Benign; M: Malignant; BE: Benign Epithelial; BM: Benign Melanocytic; ME: Malignant Epithelial; MM: Malignant Melanocytic.