Yuxue Zhao1, Bo Hu2, Ying Wang1, Xiaomeng Yin3, Yuanyuan Jiang4, Xiuli Zhu1.
Abstract
The identification of diseases is inseparable from artificial intelligence. As an important branch of artificial intelligence, convolutional neural networks play an important role in the identification of gastric cancer. We conducted a systematic review to summarize the current applications of convolutional neural networks in the gastric cancer identification. The original articles published in Embase, Cochrane Library, PubMed and Web of Science database were systematically retrieved according to relevant keywords. Data were extracted from published papers. A total of 27 articles were retrieved for the identification of gastric cancer using medical images. Among them, 19 articles were applied in endoscopic images and 8 articles were applied in pathological images. 16 studies explored the performance of gastric cancer detection, 7 studies explored the performance of gastric cancer classification, 2 studies reported the performance of gastric cancer segmentation and 2 studies analyzed the performance of gastric cancer delineating margins. The convolutional neural network structures involved in the research included AlexNet, ResNet, VGG, Inception, DenseNet and Deeplab, etc. The accuracy of studies was 77.3 - 98.7%. Good performances of the systems based on convolutional neural networks have been showed in the identification of gastric cancer. Artificial intelligence is expected to provide more accurate information and efficient judgments for doctors to diagnose diseases in clinical work.Entities:
The identification of diseases is inseparable from artificial intelligence. As an important branch of artificial intelligence, convolutional neural networks play an important role in the identification of gastric cancer. We conducted a systematic review to summarize the current applications of convolutional neural networks in gastric cancer identification. Original articles were systematically retrieved from the Embase, Cochrane Library, PubMed and Web of Science databases according to relevant keywords, and data were extracted from the published papers. A total of 27 articles on the identification of gastric cancer from medical images were retrieved: 19 applied convolutional neural networks to endoscopic images and 8 to pathological images. Sixteen studies evaluated gastric cancer detection, 7 evaluated gastric cancer classification, 2 reported gastric cancer segmentation and 2 analyzed the delineation of gastric cancer margins. The network architectures involved included AlexNet, ResNet, VGG, Inception, DenseNet and DeepLab, among others. Reported accuracies ranged from 77.3% to 98.7%. Systems based on convolutional neural networks have shown good performance in the identification of gastric cancer. Artificial intelligence is expected to provide doctors with more accurate information and more efficient judgments for diagnosing diseases in clinical work.
Keywords: Classification; Convolutional neural network; Detection; Diagnosis; Gastric cancer
Year: 2022 PMID: 35221775 PMCID: PMC8856868 DOI: 10.1007/s11042-022-12258-8
Source DB: PubMed Journal: Multimed Tools Appl ISSN: 1380-7501 Impact factor: 2.577
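All of the architectures named in the abstract (AlexNet, ResNet, VGG, Inception, DenseNet, DeepLab) are built from the same core operation: sliding a small learned kernel over the image. A minimal pure-Python sketch of that operation follows; the 3×4 patch and the vertical-edge kernel are illustrative values, not data from any reviewed study.

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the basic operation of a CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Sum of elementwise products between the kernel and the
            # image window anchored at (i, j).
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# Hypothetical grayscale patch: dark left half, bright right half.
patch = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# A vertical-edge detector; it responds only where intensity changes
# between adjacent columns.
kernel = [
    [-1, 1],
    [-1, 1],
]
print(conv2d(patch, kernel))  # → [[0, 2, 0], [0, 2, 0]]
```

In a trained network the kernel weights are learned from labeled endoscopic or pathological images rather than hand-crafted, and many such filtered maps are stacked and passed through nonlinearities.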
Fig. 1 The relationship between artificial intelligence, machine learning, deep learning and convolutional neural networks
CNN: Convolutional neural network
Fig. 2 PRISMA flowchart of the search for articles applying convolutional neural networks to gastric cancer images
Studies on convolutional neural networks and gastric cancer
| Ref | Year | Study design | Location | Patients | Images or Lesions | Images | Lesions | Types of medical images or lesions | Task | Independent test set |
|---|---|---|---|---|---|---|---|---|---|---|
| An et al. | 2020 | Retrospective | China | 1095 | Images | 2488 | NR | Endoscopic | Delineating margins | No |
| Cho et al. | 2020 | Retrospective | Korea | 846 | Images | 3105 | NR | Endoscopic | Detect invasion depth | Yes |
| Cho et al. | 2019 | Retrospective | Korea | 1269 | Images | 5017 | NR | Endoscopic | Classification | Yes |
| Hirasawa et al. | 2018 | Prospective | Japan | 69 | Lesions | NR | 77 | Endoscopic | Detection | Yes |
| Horiuchi et al. | 2019 | Retrospective | Japan | NR | Images | 2828 | NR | Endoscopic | Detection | Yes |
| Ikenoyama et al. | 2020 | Retrospective | Japan | 2779 | Images | 16,524 | NR | Endoscopic | Detection | Yes |
| Lee et al. | 2019 | Retrospective | Korea | NR | Images | 787 | NR | Endoscopic | Detection | No |
| Li et al. | 2020 | Retrospective | China | NR | Images | 2429 | NR | Endoscopic | Detection | Yes |
| Ling et al. | 2020 | Retrospective | China | 342 | Images | 4969 | NR | Endoscopic | Delineating margins | Yes |
| Liu et al. | 2018 | Retrospective | China | NR | Images | 3871 | NR | Endoscopic | Classification | Yes |
| Lui et al. | 2019 | Retrospective | Hong Kong, China | NR | Images | 3000 | NR | Endoscopic | Detection | Yes |
| Luo et al. | 2019 | Retrospective | China | 84,424 | Images | 1,036,496 | NR | Endoscopic | Detection | Yes |
| Nagao et al. | 2020 | Retrospective | Japan | 1084 | Images/Lesions | 16,557 | 2434 | Endoscopic | Detect invasion depth | Yes |
| Shibata et al. | 2020 | Retrospective | Japan | 135 | Images/Lesions | 1741 | 94 | Endoscopic | Detection | No |
| Ueyama et al. | 2020 | Retrospective | Japan | 349 | Images | 7874 | NR | Endoscopic | Detection | Yes |
| Wang et al. | 2019 | Retrospective | China | NR | Images | 104,864 | NR | Endoscopic | Detection | Yes |
| Wu et al. | 2019 | Retrospective | China | NR | Images | 9351 | NR | Endoscopic | Detection | Yes |
| Yoon et al. | 2019 | Retrospective | Korea | 800 | Images | 11,539 | NR | Endoscopic | Detection/Detect invasion depth | Yes |
| Zhu et al. | 2020 | Retrospective | China | 993 | Images | 993 | NR | Endoscopic | Detect invasion depth | Yes |
| Qu et al. | 2018 | Retrospective | Japan | NR | Images | 48,000 | NR | Pathological | Classification | No |
| Cho et al. | 2020 | Retrospective | Korea | 432 | Images | 803 | NR | Pathological | Classification | Yes |
| Iizuka et al. | 2020 | Retrospective | Japan | NR | Images | 5103 | NR | Pathological | Classification | Yes |
| Sun et al. | 2019 | Retrospective | China | NR | Images | 500 | NR | Pathological | Segmentation | No |
| Liang et al. | 2019 | Retrospective | China | NR | Images | 1900 | NR | Pathological | Segmentation | No |
| Hu et al. | 2019 | Retrospective | China | 30 | Images | 65,328 | NR | Pathological | Classification | No |
| Li et al. | 2019 | Prospective | China | 120 | Images | 48,000 | NR | Pathological | Classification | No |
| Song et al. | 2020 | Retrospective | China | 4210 | Images | 6917 | NR | Pathological | Detection | Yes |
Note: NR: Not reported
Evaluation indices reported by the included studies
| Ref | Precision (%) | Accuracy (%) | Sensitivity (%) | Specificity (%) | AUC (%) |
|---|---|---|---|---|---|
| An et al. | N/A | 88.9 | N/A | N/A | N/A |
| Cho et al. | N/A | 77.3 | 80.4 | 80.7 | 88.7 |
| Cho et al. | N/A | 81.9 | 75.9 | 85.3 | 87.7 |
| Hirasawa et al. | N/A | 98.6 | 92.2 | N/A | N/A |
| Horiuchi et al. | N/A | 85.3 | 95.4 | 71.0 | N/A |
| Ikenoyama et al. | N/A | N/A | 58.40 | 87.30 | 75.7 |
| Lee et al. | N/A | 96.49 | N/A | N/A | 97 |
| Li et al. | N/A | 90.91 | 91.18 | 90.64 | N/A |
| Ling et al. | N/A | 88.1 | N/A | N/A | N/A |
| Liu et al. | 99 | 96 | 99 | N/A | N/A |
| Lui et al. | N/A | 91.0 | 97.1 | 85.9 | 91 |
| Luo et al. | N/A | 92.8 | 94.2 | 92.3 | N/A |
| Nagao et al. | N/A | N/A | N/A | N/A | 95.90 |
| Shibata et al. | N/A | N/A | 96 | N/A | N/A |
| Ueyama et al. | N/A | 98.70 | 98 | 100 | N/A |
| Wang et al. | N/A | N/A | 79.622 | 78.48 | N/A |
| Wu et al. | N/A | 92.5 | 94.0 | 91.0 | N/A |
| Yoon et al. | N/A | N/A | 91.0 | 97.6 | 98.1 |
| Zhu et al. | N/A | 89.16 | 76.47 | 95.56 | 94 |
| Qu et al. | 86.9 | N/A | N/A | N/A | 96.3 |
| Cho et al. | N/A | 78 | 100 | 56 | 100 |
| Iizuka et al. | N/A | N/A | N/A | N/A | 98 |
| Sun et al. | 91.6 | N/A | N/A | N/A | N/A |
| Liang et al. | N/A | 91.09 | N/A | N/A | N/A |
| Hu et al. | N/A | 94.38 | 94.99 | 93.76 | N/A |
| Li et al. | N/A | 96.5 | 96.6 | 96.7 | N/A |
| Song et al. | N/A | 87.3 | 99.6 | 84.3 | 98.6 |
Note: N/A: Not available
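The accuracy, sensitivity, specificity and precision values in the table above all derive from a binary confusion matrix (true/false positives and negatives); AUC, by contrast, summarizes performance across all decision thresholds and cannot be computed from a single matrix. A minimal sketch of the matrix-derived indices, using hypothetical counts rather than numbers from any included study:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Evaluation indices for a binary gastric cancer test.

    tp/fn: cancer images called positive/negative;
    tn/fp: non-cancer images called negative/positive.
    """
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),  # all correct calls
        "sensitivity": tp / (tp + fn),  # cancers correctly detected (recall)
        "specificity": tn / (tn + fp),  # non-cancer images correctly cleared
        "precision": tp / (tp + fp),    # positive calls that were cancer
    }

# Hypothetical test set: 105 cancer images, 95 non-cancer images.
m = diagnostic_metrics(tp=90, fp=10, tn=85, fn=15)
print({k: round(v, 3) for k, v in m.items()})
# → {'accuracy': 0.875, 'sensitivity': 0.857, 'specificity': 0.895, 'precision': 0.9}
```

This dependence on the class mix of the test set is one reason the reviewed studies report several indices rather than accuracy alone: a detector can score high accuracy on an unbalanced set while missing many cancers (low sensitivity).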
Fig. 3 The literature quality evaluation results. High: the relevant entry in each domain is high risk (risk of bias) or high concern (applicability); Unclear: the relevant entry is unclear risk (risk of bias) or unclear concern (applicability); Low: the relevant entry is low risk (risk of bias) or low concern (applicability). Cho 2020 a refers to the study whose first author is Bum-Joo Cho [7]; Cho 2020 b refers to the study whose first author is Kyung-Ok Cho [8]
Fig. 4 Word cloud of keywords from the included papers