Tao Yan, Pak Kin Wong, Ye-Ying Qin.
Abstract
Upper gastrointestinal (GI) cancers are the leading cause of cancer-related deaths worldwide. Early identification of precancerous lesions has been shown to minimize the incidence of GI cancers and substantiate the vital role of screening endoscopy. However, unlike GI cancers, precancerous lesions in the upper GI tract can be subtle and difficult to detect. Artificial intelligence techniques, especially deep learning algorithms with convolutional neural networks, might help endoscopists identify the precancerous lesions and reduce interobserver variability. In this review, a systematic literature search was undertaken of the Web of Science, PubMed, Cochrane Library and Embase, with an emphasis on the deep learning-based diagnosis of precancerous lesions in the upper GI tract. The status of deep learning algorithms in upper GI precancerous lesions has been systematically summarized. The challenges and recommendations targeting this field are comprehensively analyzed for future research. ©The Author(s) 2021. Published by Baishideng Publishing Group Inc. All rights reserved.
Keywords: Artificial intelligence; Convolutional neural network; Deep learning; Endoscopy; Precancerous lesions
Year: 2021 PMID: 34092974 PMCID: PMC8160615 DOI: 10.3748/wjg.v27.i20.2531
Source DB: PubMed Journal: World J Gastroenterol ISSN: 1007-9327 Impact factor: 5.742
Figure 1. Infographic with icons and timeline for artificial intelligence, machine learning and deep learning.
Figure 2. Illustration of the diagnostic process of physician, machine learning and deep learning. A: Physician diagnostic process; B: Machine learning; C: Deep learning. Conv: Convolutional layer; FC: Fully connected layer; GIM: Gastric intestinal metaplasia.
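The deep learning pipeline of Figure 2C (convolutional layers feeding a fully connected classifier) can be sketched in a few lines. This is a minimal illustrative forward pass with made-up, untrained weights, not any of the reviewed models; the function names and the 16×16 "image" are assumptions for the sketch.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (the 'Conv' stage), no padding or stride."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def forward(image, kernels, fc_weights, fc_bias):
    """Conv -> ReLU -> global average pooling -> FC -> softmax."""
    feats = np.array([np.maximum(conv2d(image, k), 0.0) for k in kernels])
    pooled = feats.mean(axis=(1, 2))        # one scalar per feature map
    logits = fc_weights @ pooled + fc_bias  # fully connected ('FC') layer
    exp = np.exp(logits - logits.max())     # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
image = rng.random((16, 16))              # stand-in for an endoscopic image
kernels = rng.standard_normal((4, 3, 3))  # four 3x3 filters (untrained here)
fc_w = rng.standard_normal((2, 4))        # two classes, e.g. GIM vs. normal
fc_b = np.zeros(2)
probs = forward(image, kernels, fc_w, fc_b)
```

In a real CNN the kernels and FC weights are learned by backpropagation over labeled endoscopic images; here they are random, so only the data flow is meaningful.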
Summary of studies using deep learning for detection of esophageal precancerous lesions

| Author | Year | Endoscopy | Study design | Objective | DL algorithm | Dataset | Performance |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Cai | 2019 | WLE | Retrospective | Detection of precancerous lesions and early ESCC | -- | 2615 images | Sensitivity: 97.8%. Specificity: 85.4%. Accuracy: 91.4% |
| Guo | 2020 | NBI, M-NBI | Retrospective | Detection of precancerous lesions and early ESCC | SegNet | 13144 images and 168865 video frames | Sensitivity: 96.10% for M-NBI videos, 60.80% for non-M-NBI videos, 98.04% for images. Specificity: 99.90% for non-M-NBI/M-NBI videos, 95.30% for images |
| de Groof | 2020 | WLE | Retrospective | Detection of Barrett’s neoplasia | ResNet/U-Net | 1544 images | Sensitivity: 91%. Specificity: 89%. Accuracy: 90% |
| de Groof | 2020 | WLE | Retrospective | Detection of Barrett’s neoplasia | ResNet/U-Net | 494364 unlabeled images and 1704 labeled images | Sensitivity: 90%. Specificity: 88%. Accuracy: 89% |
| Struyvenberg | 2021 | NBI | Retrospective | Detection of Barrett’s neoplasia | ResNet/U-Net | 2677 images | Sensitivity: 88%. Specificity: 78%. Accuracy: 84% |
| Hashimoto | 2020 | WLE, NBI | Retrospective | Recognition of early neoplasia in BE | Inception-ResNet-v2, YOLO-v2 | 2290 images | Sensitivity: 96.4%. Specificity: 94.2%. Accuracy: 95.4% |
| Hussein | 2020 | WLE | Retrospective | Diagnosis of early neoplasia in BE | Resnet101 | 266930 video frames | Sensitivity: 88.26%. Specificity: 80.13% |
| Ebigbo | 2020 | WLE | Retrospective | Diagnosis of early EAC in BE | DeepLab V.3+, Resnet101 | 191 images | Sensitivity: 83.7%. Specificity: 100%. Accuracy: 89.9% |
| Liu | 2020 | WLE | Retrospective | Detection of esophageal cancer from precancerous lesions | Inception-ResNet | 1272 images | Sensitivity: 94.23%. Specificity: 94.67%. Accuracy: 85.83% |
| Wu | 2021 | WLE | Retrospective | Automatic classification and segmentation for esophageal lesions | ELNet | 1051 images | Classification sensitivity: 90.34%. Classification specificity: 97.18%. Classification accuracy: 96.28%. Segmentation sensitivity: 80.18%. Segmentation specificity: 96.55%. Segmentation accuracy: 94.62% |
| Ghatwary | 2021 | WLE | Retrospective | Detection of esophageal abnormalities from endoscopic videos | DenseConvLstm, Faster R-CNN | 42425 video frames | Sensitivity: 93.7%. F-measure: 93.2% |
BE: Barrett’s esophagus; DL: Deep learning; EAC: Esophageal adenocarcinoma; ESCC: Esophageal squamous cell carcinoma; M-NBI: Magnifying narrow band imaging; NBI: Narrow band imaging; WLE: White light endoscopy.
Summary of studies using deep learning for detection of gastric precancerous lesions

| Author | Year | Endoscopy | Study design | Objective | DL algorithm | Dataset | Performance |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Shichijo | 2017 | WLE | Retrospective | Diagnosis of H. pylori infection | GoogLeNet | 43689 images | Sensitivity: 88.9%; Specificity: 87.4%; Accuracy: 87.7% |
| Itoh | 2018 | WLE | Retrospective | Analysis of H. pylori infection | GoogLeNet | 179 images | Sensitivity: 86.7%; Specificity: 86.7% |
| Zheng | 2019 | WLE | Retrospective | Evaluation of H. pylori infection | ResNet-50 | 15484 images | Sensitivity: 91.6%; Specificity: 98.6%; Accuracy: 93.8% |
| Nakashima | 2018 | BLI-bright, LCI | Prospective | Prediction of H. pylori infection | GoogLeNet | 666 images | Sensitivity: 96.7%; Specificity: 86.7% |
| Nakashima | 2020 | WLE, LCI | Prospective | Diagnosis of H. pylori infection | -- | 13127 images | For currently infected patients, sensitivity and specificity were 62.5% and 92.5%, respectively |
| Guimarães | 2020 | WLE | Retrospective | Diagnosis of atrophic gastritis | VGG16 | 270 images | Accuracy: 93% |
| Zhang | 2020 | WLE | Retrospective | Diagnosis of atrophic gastritis | DenseNet121 | 5470 images | Sensitivity: 94.5%; Specificity: 94.0%; Accuracy: 94.2% |
| Horiuchi | 2020 | M-NBI | Retrospective | Differentiation between early gastric cancer and gastritis | GoogLeNet | 2826 images | Sensitivity: 95.4%; Specificity: 71.0%; Accuracy: 85.3% |
| Wang | 2019 | WLE | Retrospective | Localization and identification of GIM | DeepLab V.3+ | 200 images | Accuracy: 89.51% |
| Zheng | 2020 | WLE | Retrospective | Detection of atrophic gastritis and GIM | ResNet-50 | 3759 images | Sensitivity for atrophic gastritis: 87.2%; Specificity for atrophic gastritis: 91.1%; Sensitivity for GIM: 90.3%; Specificity for GIM: 93.7% |
| Yan | 2020 | NBI, M-NBI | Retrospective | Diagnosis of GIM | EfficientNetB4 | 2357 images | Sensitivity: 91.9%; Specificity: 86.0%; Accuracy: 88.8% |
| Cho | 2019 | WLE | Prospective | Classification of multiclass gastric neoplasms | Inception-Resnet-v2 | 5217 images | Accuracy: 84.6% |
| Inoue | 2020 | WLE, NBI | Retrospective | Detection of duodenal adenomas and high-grade dysplasias | Single-Shot Multibox Detector | 1511 images | For high-grade dysplasia, sensitivity and specificity were both 100% |
| Lui | 2020 | NBI | Retrospective | Classification of gastric lesions | ResNet | 3000 images | Sensitivity: 97.1%; Specificity: 85.9%; Accuracy: 91.0% |
BLI-bright: Blue laser imaging-bright; DL: Deep learning; GIM: Gastric intestinal metaplasia; H. pylori: Helicobacter pylori; LCI: Linked color imaging; M-NBI: Magnifying narrow band imaging; NBI: Narrow band imaging; WLE: White light endoscopy.
Figure 3. Data augmentation for a typical magnifying narrow band image for training a convolutional neural network model. This is performed by using a variety of image transformations and their combinations. A: Original image; B: Flip horizontal and random rotation; C: Flip vertical and magnification; D: Random rotation and shift; E: Flip horizontal, minification and shift; F: Flip vertical, rotation and shift.
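The transformation combinations in Figure 3 (flips, rotations, shifts) can be sketched as a random augmentation function. This is a minimal NumPy illustration on an array, not the preprocessing code of any cited study; it restricts itself to 90-degree rotations and integer shifts so no interpolation is needed, which is an assumption of the sketch.

```python
import numpy as np

def augment(image, rng):
    """Randomly compose horizontal/vertical flips, 90-degree rotations
    and small circular shifts, as in Figure 3's panels B-F."""
    out = image
    if rng.random() < 0.5:
        out = np.fliplr(out)            # flip horizontal
    if rng.random() < 0.5:
        out = np.flipud(out)            # flip vertical
    out = np.rot90(out, k=int(rng.integers(0, 4)))  # random rotation
    dy, dx = rng.integers(-2, 3, size=2)
    out = np.roll(out, shift=(int(dy), int(dx)), axis=(0, 1))  # shift
    return out

rng = np.random.default_rng(0)
img = np.arange(64, dtype=np.uint8).reshape(8, 8)  # toy "image"
augmented = [augment(img, rng) for _ in range(6)]  # six variants, as in Figure 3
```

Each call yields a different view of the same image, multiplying the effective size of a small endoscopic training set without changing the lesion content.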
Figure 4. Informative features (partially related to lesion areas) acquired by the convolutional neural networks, where warmer colors mean higher contributions to decision making. A: Original endoscopic images; B: Corresponding attention maps. BE: Barrett’s esophagus; GIM: Gastric intestinal metaplasia.
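Heatmaps like those in Figure 4 are commonly produced by class-activation-map (CAM) techniques: the final convolutional feature maps are weighted by the fully connected weights of the predicted class and normalized into a per-pixel contribution map. The sketch below assumes random stand-in feature maps and weights and is not the attention method of any specific cited study.

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights, class_idx):
    """Weight the last-layer conv feature maps by one class's FC weights
    and rescale to [0, 1]; higher ('warmer') values mark regions that
    contributed more to that class score."""
    # feature_maps: (C, H, W); fc_weights: (num_classes, C)
    cam = np.tensordot(fc_weights[class_idx], feature_maps, axes=1)  # (H, W)
    cam = np.maximum(cam, 0.0)      # keep positive evidence only
    if cam.max() > 0:
        cam = cam / cam.max()       # normalize for display as a heatmap
    return cam

rng = np.random.default_rng(1)
feature_maps = rng.random((4, 8, 8))      # C=4 maps from the last conv layer
fc_weights = rng.standard_normal((2, 4))  # 2 classes x 4 channels
cam = class_activation_map(feature_maps, fc_weights, class_idx=0)
```

In practice the normalized map is upsampled to the input resolution and overlaid on the endoscopic image, which is how panels like Figure 4B are typically rendered.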