Miguel Mascarenhas1,2,3, João Afonso1,2, Tiago Ribeiro1,2, Hélder Cardoso1,2,3, Patrícia Andrade1,2,3, João P S Ferreira4,5, Miguel Mascarenhas Saraiva6, Guilherme Macedo1,2,3.
Abstract
BACKGROUND: Colon capsule endoscopy (CCE) is an alternative for patients who are unwilling to undergo conventional colonoscopy or who have contraindications to it. Colorectal cancer screening may benefit greatly from widespread acceptance of a non-invasive tool such as CCE. However, reviewing CCE exams is a time-consuming process, with a risk of overlooking important lesions.
AIM: We aimed to develop an artificial intelligence (AI) algorithm using a convolutional neural network (CNN) architecture for the automatic detection of colonic protruding lesions in CCE images.
METHODS: An anonymized database of CCE images collected from a total of 124 patients was used. This database included images of patients with colonic protruding lesions and of patients with normal colonic mucosa or other pathologic findings. A total of 5715 images were extracted for CNN development, and two image datasets were created for training and validation of the CNN.
RESULTS: The AUROC for the detection of protruding lesions was 0.99. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) were 90.0%, 99.1%, 98.6% and 93.2%, respectively. The overall accuracy of the network was 95.3%.
CONCLUSION: The developed deep learning algorithm accurately detected protruding lesions in CCE images. The introduction of AI technology to CCE may increase its diagnostic accuracy and acceptance for the screening of colorectal neoplasia.
Keywords: artificial intelligence; colon capsule endoscopy; colorectal neoplasia; convolutional neural network
Year: 2022 PMID: 35741255 PMCID: PMC9222144 DOI: 10.3390/diagnostics12061445
Source DB: PubMed Journal: Diagnostics (Basel) ISSN: 2075-4418
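The AUROC of 0.99 reported in the abstract summarizes how well the network's per-image probabilities separate protruding lesions from normal mucosa across all thresholds. The paper does not describe its AUROC computation; a minimal, library-free sketch of the standard rank-based (Mann-Whitney) estimate, shown here on toy scores rather than the study's data:

```python
def auroc(labels, scores):
    """Rank-based estimate of the area under the ROC curve: the probability
    that a randomly chosen positive receives a higher score than a randomly
    chosen negative. Ties receive average ranks."""
    pairs = sorted(zip(scores, labels))
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    rank_sum_pos = 0.0
    i = 0
    while i < len(pairs):
        # find the run of tied scores starting at position i
        j = i
        while j < len(pairs) and pairs[j][0] == pairs[i][0]:
            j += 1
        avg_rank = (i + 1 + j) / 2  # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            if pairs[k][1] == 1:
                rank_sum_pos += avg_rank
        i = j
    u = rank_sum_pos - n_pos * (n_pos + 1) / 2  # Mann-Whitney U statistic
    return u / (n_pos * n_neg)

# toy example: two negatives, two positives
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
```

A perfectly separating classifier scores 1.0 on this measure, while random scoring yields 0.5, which is why an AUROC of 0.99 indicates near-perfect discrimination.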
Figure 1. Summary of study design for the training and validation phases. PR—protruding lesion; N—normal mucosa or other findings.
Figure 2. Heatmaps (A) and output (B) obtained from the application of the convolutional neural network. (A) Examples of heatmaps showing CCE features of protruding lesions as identified by the CNN. (B) The bars represent the probability estimated by the network.
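The paper does not specify how the Figure 2 heatmaps are generated; class activation mapping (CAM) is one common technique for localizing the regions driving a CNN's prediction. A minimal NumPy sketch under assumed shapes (random stand-ins for the last convolutional layer's activations and the classifier weights of the "protruding lesion" output):

```python
import numpy as np

rng = np.random.default_rng(0)
feature_maps = rng.random((128, 7, 7))  # stand-in: last conv layer, (channels, H, W)
class_weights = rng.random(128)         # stand-in: fc weights for the target class

# CAM: weighted sum of channel activations, one weight per channel
cam = np.tensordot(class_weights, feature_maps, axes=1)  # shape (7, 7)

# normalise to [0, 1] so the map can be rendered as a colour overlay
cam = (cam - cam.min()) / (cam.max() - cam.min())
# upsampling cam to the input resolution then yields the heatmap overlay
```

This is a sketch of the general CAM idea, not the authors' method; gradient-based variants such as Grad-CAM follow the same weighted-sum structure with weights derived from back-propagated gradients.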
Figure 3. Evolution of the accuracy of the convolutional neural network during the training and validation phases, as the training and validation datasets were repeatedly fed to the network.
Confusion matrix and performance metrics.

| | | Expert classification | |
|---|---|---|---|
| | | Protruding lesion | Normal mucosa |
| CNN classification | Protruding lesion | 434 | 6 |
| | Normal mucosa | 48 | 655 |

| Sensitivity | 90.0% |
|---|---|
| Specificity | 99.1% |
| PPV | 98.6% |
| NPV | 93.2% |
| Accuracy | 95.3% |
Abbreviations: CNN—convolutional neural network; PPV—positive predictive value; NPV—negative predictive value.
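The performance metrics above follow directly from the four confusion-matrix counts. A short check that reproduces the reported values:

```python
# Counts from the confusion matrix above (expert labels vs. CNN output)
tp, fp = 434, 6    # CNN classified as "protruding lesion"
fn, tn = 48, 655   # CNN classified as "normal mucosa"

sensitivity = tp / (tp + fn)                 # 434 / 482
specificity = tn / (tn + fp)                 # 655 / 661
ppv = tp / (tp + fp)                         # 434 / 440
npv = tn / (tn + fn)                         # 655 / 703
accuracy = (tp + tn) / (tp + fp + fn + tn)   # 1089 / 1143

print(f"{sensitivity:.1%} {specificity:.1%} {ppv:.1%} {npv:.1%} {accuracy:.1%}")
# → 90.0% 99.1% 98.6% 93.2% 95.3%
```

Note the asymmetry: the high PPV (98.6%) reflects how few normal images were flagged as lesions (6), while the 48 missed lesions pull sensitivity down to 90.0%.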
Figure 4. ROC analysis of the network's performance in the detection of protruding lesions vs. normal colonic mucosa/other findings. ROC—receiver operating characteristic; PR—protruding lesion.
Three-fold cross-validation.

| | Sensitivity (%) | Specificity (%) | PPV (%) | NPV (%) | Accuracy (%) | AUC |
|---|---|---|---|---|---|---|
| Fold 1 | 82.8 | 97.5 | 62.6 | 99.1 | 96.9 | 0.980 |
| Fold 2 | 87.4 | 95.9 | 57.1 | 99.2 | 95.4 | 0.970 |
| Fold 3 | 92.1 | 94.7 | 48.4 | 99.6 | 94.6 | 0.980 |
| Mean ± SD | 87.4 ± 4.6 | 96.1 ± 1.4 | 56.0 ± 7.1 | 99.3 ± 0.2 | 95.6 ± 1.1 | 0.976 ± 0.006 |
Abbreviations: ±SD—±standard deviation; PPV—positive predictive value; NPV—negative predictive value; AUC—area under the receiver operating characteristic curve.
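The summary row can be checked against the per-fold values, with one caveat: the per-fold numbers in the table are themselves rounded to one decimal, so summaries recomputed from them may differ from the published mean ± SD in the last digit (the authors presumably used unrounded values). A sketch using the sensitivity column:

```python
import statistics

# Per-fold sensitivities (%) as printed in the table above (already rounded)
fold_sensitivity = [82.8, 87.4, 92.1]

mean = statistics.mean(fold_sensitivity)
sd = statistics.stdev(fold_sensitivity)  # sample SD (n - 1 denominator)

print(f"{mean:.1f} ± {sd:.1f}")
```

With three folds the sample SD (dividing by n − 1 = 2) rather than the population SD is the appropriate estimate of between-fold variability, and it is the convention that best matches the table's ±SD values.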