Literature DB >> 35873509

Implementation of artificial intelligence in upper gastrointestinal endoscopy.

Sayaka Nagao1,2, Yasuhiro Tani3, Junichi Shibata4, Yosuke Tsuji1, Tomohiro Tada4,5,6, Ryu Ishihara3, Mitsuhiro Fujishiro1.   

Abstract

The application of artificial intelligence (AI) using deep learning has significantly expanded in the field of esophagogastric endoscopy. Recent studies have shown promising results in detecting and differentiating early gastric cancer using AI tools built from white light, magnified, or image-enhanced endoscopic images. Some studies have reported the use of AI tools to predict the depth of early gastric cancer based on endoscopic images. Similarly, studies using AI to detect early esophageal cancer have also been reported, with an accuracy comparable to that of endoscopy specialists. Moreover, an AI system developed to diagnose pharyngeal cancer has shown promising performance with high sensitivity. These reports suggest that, if introduced for regular use in clinical settings, AI systems can significantly reduce the burden on physicians. This review summarizes the current status of AI applications in the upper gastrointestinal tract and presents directions for clinical practice implementation and future research.
© 2022 The Authors. DEN Open published by John Wiley & Sons Australia, Ltd on behalf of Japan Gastroenterological Endoscopy Society.

Keywords:  adenocarcinoma of the esophagus; artificial intelligence; esophageal squamous cell carcinoma; pharyngeal neoplasms; stomach neoplasms

Year:  2022        PMID: 35873509      PMCID: PMC9302271          DOI: 10.1002/deo2.72

Source DB:  PubMed          Journal:  DEN open        ISSN: 2692-4609


INTRODUCTION

In recent years, the application of artificial intelligence (AI) technology using deep learning, especially convolutional neural network technology, has been expanding in various medical fields. A similar trend is seen in the field of gastrointestinal (GI) endoscopy. AI systems for detecting colorectal polyps are commercially available in Japan, the United States, and some European countries. In addition, AI for detecting early esophageal cancer in Barrett's esophagus (BE) has been commercialized and is scheduled to be released in European countries. The field is collectively moving from the stage of developing AI systems to the stage of implementing them. In this literature review, we discuss the latest findings from papers on convolutional neural network-based imaging AI for detecting and diagnosing gastric, esophageal, and pharyngeal cancers. In addition, we discuss the role of AI in diagnosing Helicobacter pylori (H. pylori) gastritis and the anatomical classification of the stomach based on endoscopic images. Based on these research papers, we discuss the prospects of endoscopic diagnosis using AI in the field of upper GI tract endoscopy.

AI FOR DETECTION OF GASTRIC CANCER

Gastric cancer is one of the major cancer types diagnosed globally and is the third leading cause of cancer-related deaths worldwide. Even in Japan, where mass screening for gastric cancer has long been established, the 5-year overall survival rate of node-negative stage IA early gastric cancer is reported to be 91.5%. Therefore, early detection and treatment of gastric cancer are mandatory. Endoscopy plays an important role in diagnosing and treating early gastric cancer: endoscopic diagnosis is imperative, and endoscopic submucosal dissection (ESD) is widely used to treat early gastric cancer. In recent years, studies have reported the effectiveness of endoscopy using AI support systems (Table 1).
TABLE 1

Summary of artificial intelligence for diagnosis in the stomach

| Name (year) [Ref] | Study design | Imaging modality | Training dataset (images) | Test dataset (images) | AUC | Accuracy (%) | Sensitivity (%) | Specificity (%) |
|---|---|---|---|---|---|---|---|---|
| Detection | | | | | | | | |
| Hirasawa T (2018) [6] | Retrospective | WLI | Abnormal 13,584 | 2296 | n/a | n/a | 92.2 | n/a |
| Sakai Y (2018) [68] | Retrospective | WLI | 58 patients | 58 patients | 0.958 | 87.6 | 80 | 94.8 |
| Ishioka M (2019) [7] | Retrospective | WLI | Abnormal 13,584 | 68 videos | n/a | n/a | 94.1 | n/a |
| Wu L (2019) [69] | Retrospective | WLI, NBI, BLI | 9151 (abnormal 3170) | 200 | n/a | 92.5 | 94 | 91 |
| Yoon HJ (2019) [21] | Retrospective | WLI | 11,539 (abnormal 1705) | – | 0.981 | n/a | 91 | 97.6 |
| Luo H (2019) [9] | Multicenter, case-control (including esophageal cancer) | WLI | 141,570 (abnormal 35,531) | 66,750 (abnormal 4317) | 0.974 | 92.7 | 94.6 | 92.6 |
| Tang D (2020) [70] | Retrospective | WLI | 35,823 (abnormal 26,172) | 9417 (abnormal 4153) | 0.94 | 87.8 | 95.5 | 81.7 |
| Ikenoyama Y (2021) [8] | Retrospective | WLI | Abnormal 13,584 | 2940 (abnormal 209) | 0.757 | n/a | 58.4 | 87.3 |
| Wu L (2021) [10] | Randomized controlled trial | WLI, NBI, BLI | 7321 (abnormal 2530) | 302,692 | n/a | 84.7 | 100 | 84.3 |
| H. pylori infection | | | | | | | | |
| Huang CR (2004) [71] | Prospective | WLI | 30 patients | 74 patients | n/a | n/a | 85.4 | 90.9 |
| Shichijo S (2017) [11] | Retrospective | WLI | 32,208 | 11,481 | 0.93 | 87.7 | 88.9 | 87.4 |
| Itoh T (2018) [72] | Prospective | WLI | 149 | 30 | 0.956 | n/a | 86.7 | 86.7 |
| Nakashima H (2018) [73] | Prospective | WLI, BLI-bright, LCI | 162 patients | 60 patients | 0.66 (WLI); 0.96 (BLI-bright); 0.95 (LCI) | n/a | 66.7 (WLI); 96.7 (BLI-bright); 96.7 (LCI) | 60 (WLI); 86.7 (BLI-bright); 83.3 (LCI) |
| Shichijo S (2019) [12] | Retrospective | WLI | 98,564 | 23,699 | n/a | 80 (H. pylori-negative); 48 (-positive); 84 (-eradicated) | n/a | n/a |
| Zheng W (2019) [74] | Retrospective | WLI | 11,729 | 3755 | 0.97 | 93.8 | 91.6 | 98.6 |
| Guimarães P (2020) [75] | Retrospective | WLI | 200 | 70 | 0.981 | 92.9 | 100 | 87.5 |
| Yasuda T (2020) [76] | Retrospective | LCI | 32 patients | 105 patients | n/a | 87.6 | 90.5 | 85.7 |
| Zhang Y (2020) [77] | Retrospective | WLI | 5470 | – | 0.99 | 94.24 | 94.58 | 94.01 |
| Nakashima H (2020) [13] | Prospective | WLI, LCI | 12,887 | 120 videos | 0.90 (LCI, uninfected); 0.82 (LCI, currently infected); 0.77 (LCI, post-eradication) | 75.0/84.2 (WLI/LCI, uninfected); 77.5/82.5 (WLI/LCI, currently infected); 74.2/79.2 (WLI/LCI, post-eradication) | 95.0/92.5 (WLI/LCI, uninfected); 60.0/62.5 (WLI/LCI, currently infected); 35.0/65.0 (WLI/LCI, post-eradication) | 65.0/80.0 (WLI/LCI, uninfected); 86.2/92.5 (WLI/LCI, currently infected); 93.8/86.2 (WLI/LCI, post-eradication) |
| Xu M (2021) [78] | Prospective | ME-NBI, ME-BLI | 354 patients | 77 patients | 0.878 | 87.8 | 96.7 | 73 |
| Invasion depth | | | | | | | | |
| Kubota K (2012) [79] | Retrospective | WLI | 902 | – | n/a | 64.7 | n/a | n/a |
| Zhu Y (2019) [20] | Retrospective | WLI | 790 | 203 | 0.94 | 89.16 | 76.47 | 95.56 |
| Yoon HJ (2019) [21] | Retrospective | WLI | 1705 | – | 0.851 | n/a | 79.2 | 77.8 |
| Nagao S (2020) [22] | Retrospective | WLI, NBI, indigo-carmine dye contrast imaging | 13,628 | 2929 | 0.9590 (WLI); 0.9048 (NBI); 0.9491 (indigo-carmine) | 94.49 (WLI); 94.30 (NBI); 95.50 (indigo-carmine) | 84.42 (WLI); 75.00 (NBI); 87.50 (indigo-carmine) | 99.37 (WLI); 100 (NBI); 100 (indigo-carmine) |
| Cho BJ (2020) [80] | Retrospective | WLI | 2899 | 206 | 0.887 | 77.3 | 80.4 | 80.7 |
| Tang D (2021) [81] | Retrospective | WLI | 3407 | 228 | 0.942 | 88.16 | 90.48 | 85.29 |

Abbreviations: AUC, area under the curve; BLI, blue laser imaging; H. pylori, Helicobacter pylori; LCI, linked color imaging; ME, magnifying endoscopy; NBI, narrow-band imaging; WLI, white light imaging.

Although endoscopy is known to help detect gastric cancer early, a meta-analysis revealed that 6.4% and 11.3% of upper GI cancers are missed within 1 and 3 years before diagnosis, respectively, indicating a non-negligible probability of missed cases. To minimize missed cases and to detect early gastric cancer with stable performance, researchers have in recent years developed AI support systems for gastric cancer detection. In 2018, Hirasawa et al. developed a gastric cancer detection AI using 13,584 gastric cancer images as a training set. Its diagnostic ability on a test set of 2296 images showed a very high sensitivity of 92.2%. In addition, they demonstrated that the developed AI system achieved a sensitivity as high as 94.1% using video images of 68 lesions. According to a report comparing gastric cancer diagnosis rates of AI against endoscopists, the AI showed a sensitivity of 58.4%, exceeding that of endoscopists (31.9%). These results imply that using an AI support system might improve the detection rate of gastric cancer. A multicenter, case-control study conducted by Luo et al. in 2019 to evaluate gastric and esophageal cancers showed an accuracy of 92.7% for cancer detection in the prospective validation set. Wu et al. developed an AI system to reduce the number of blind spots and detect gastric cancer (ENDOANGEL) and conducted a randomized controlled study to verify its diagnostic effectiveness. In their study, the AI achieved an accuracy of 84.7%, sensitivity of 100%, and specificity of 84.3% for detecting gastric cancer, demonstrating that the diagnostic ability of the AI-assisted endoscopy group was better than that of the control group. Several reports have suggested the effectiveness of AI-assisted endoscopy for the early detection of gastric cancer. These reports might accelerate the adoption of AI-based tools in real-world clinical practice in the future.

AI FOR DIAGNOSIS OF H. PYLORI INFECTION

H. pylori infection is one of the most critical risk factors for gastric cancer. Determining the presence or absence of H. pylori infection by endoscopy can help identify high- or low-risk populations for gastric cancer and contribute to its early diagnosis. Shichijo et al., in 2017, reported the use of AI to detect H. pylori infection from gastric mucosal findings on endoscopy. The AI was trained on 32,208 images, and its discriminative ability was evaluated on a test set of 11,481 images. The accuracy of detecting H. pylori infection was 87.7%, with a sensitivity of 88.9% and specificity of 87.4%. This indicated excellent diagnostic performance, superior to that of beginner endoscopists. In 2019, Shichijo et al. developed an AI system to discriminate among H. pylori-positive, H. pylori-negative, and H. pylori-eradicated status using a training set of 98,564 images. Evaluated on a test set of 23,699 images, the diagnostic accuracy was 80% (H. pylori-negative), 48% (H. pylori-positive), and 84% (H. pylori-eradicated), respectively. Nakashima et al. also evaluated the accuracy of H. pylori diagnosis using white light imaging (WLI) and linked color imaging (LCI), a type of equipment-based image-enhanced endoscopy (IEE). The accuracy was 75.0% (WLI, uninfected), 84.2% (LCI, uninfected), 77.5% (WLI, currently infected), 82.5% (LCI, currently infected), 74.2% (WLI, post-eradication), and 79.2% (LCI, post-eradication), respectively, indicating higher accuracy with LCI than with WLI. These studies suggest the usefulness of AI support systems in diagnosing H. pylori infection. Combining AI screening with IEE will be an interesting topic for future exploration.

AI FOR DIAGNOSIS OF THE INVASION DEPTH OF GASTRIC CANCER

Since the 2000s, ESD has been developed as an improved version of endoscopic mucosal resection. ESD has made it possible to perform en bloc resection of many lesions regardless of the presence of ulcer scars or lesion size, achieving a good long-term prognosis comparable to surgical treatment. It allowed clinicians to investigate the risk of lymph node metastasis in surgically resected gastric cancer, thereby expanding the range of lesions amenable to ESD. This further established ESD as a minimally invasive and curative treatment for early gastric cancer. ESD is an excellent treatment method that preserves organs and ensures the patient's quality of life in terms of early recovery of pain and function and subsequent appetite and nutrition. Among several relevant factors, including histological type, tumor size, presence or absence of lymphovascular infiltration, and presence or absence of ulcerative findings, invasion depth is essential in determining the curability of ESD. In most cases of intramucosal cancer (M cancer) and submucosal cancer invading < 500 μm from the muscularis mucosae (SM1 cancer), follow-up after ESD is acceptable. However, additional surgical resection is needed for submucosal invasive cancer deeper than 500 μm (SM2 cancer). Therefore, discriminating between M-SM1 cancer and cancer SM2 or deeper is an essential criterion in determining the treatment strategy for gastric cancer. In recent years, AI tools have been used to diagnose the invasion depth of gastric cancer. Zhu et al. assessed the efficacy of AI tools for assessing the invasion depth of gastric cancer (M-SM1 vs. SM2 or deeper). They observed a sensitivity of 76.5%, specificity of 95.6%, and accuracy of 89.2%, with higher accuracy and specificity than endoscopists. Yoon et al. investigated the same topic and reported a sensitivity of 79.2% and specificity of 77.8% for invasion depth. Nagao et al. reported that their AI system accurately predicted the invasion depth of gastric cancer (M-SM1 vs. SM2 or deeper), with a per-lesion sensitivity of 84.4%, specificity of 99.4%, and accuracy of 94.5% (Figure 1). They also evaluated the diagnostic ability of AI systems dedicated to narrow-band imaging (NBI) and indigo-carmine dye contrast imaging: in NBI, the per-lesion sensitivity, specificity, and accuracy were 75.0%, 100.0%, and 94.3%, respectively; for indigo-carmine dye contrast imaging, they were 87.5%, 100.0%, and 95.5%, respectively. There were no significant differences in diagnostic ability among the three AI systems. These reports suggest that AI support systems may be helpful in diagnosing invasion depth. Whether prediction becomes more accurate when an AI system is combined with an endoscopist's judgment must be verified in real-world clinical practice. Improving the accuracy of AI-supported diagnosis of invasion depth can help select the most appropriate treatment, improving the standard of care for patients.
FIGURE 1

Gastric cancer depth prediction using artificial intelligence (AI) support system. (a) The AI support system correctly predicted intramucosal cancer. (b) The AI support system correctly predicted submucosal invasive cancer deeper than 500 μm

AI FOR DIAGNOSIS OF ESOPHAGEAL SQUAMOUS CELL CARCINOMA

Esophageal cancer is the seventh most common cancer and the sixth most common cause of cancer-related mortality worldwide. Squamous cell carcinoma is the predominant type of esophageal cancer in Asia, Africa, and South America. The prognosis for advanced esophageal squamous cell carcinoma (ESCC) is poor. However, if detected at an early stage and resected endoscopically, a favorable prognosis can be expected. IEE, such as NBI, helps detect early ESCC. However, this can be challenging for less experienced endoscopists. Even experienced endoscopists may miss early ESCC for several reasons, including physical condition and carelessness. As a result, patients with missed early ESCC can lose the opportunity for endoscopic treatment. In such cases, an AI system can potentially reduce the chances of early ESCC being overlooked due to human factors. The usefulness of AI in detecting and characterizing ESCC has already been reported in many studies (Table 2). Several studies have used video images as validation sets, which is more realistic and challenging than still images. Waki et al. evaluated the detection performance of an AI system using 100 video images (Figure 2). In this study, the AI system had high sensitivity (85.7%, 54 of 63 early ESCCs) for detecting ESCC and increased endoscopists' sensitivity without reducing specificity. Shiroma et al. evaluated the efficiency of an AI system using slow- and high-speed video images. The sensitivity of the AI system was 100% (32 of 32 early ESCCs) in the slow-speed videos and 85% (17 of 20 cases) in the high-speed videos. Moreover, the sensitivity of endoscopists improved with the real-time assistance of the AI diagnostic system. These studies were unique in that the validation video images were captured by passing the endoscope through the esophagus at a constant speed, without focusing on the lesions or any particular areas, to simulate situations in which ESCC might be overlooked.
TABLE 2

Summary of artificial intelligence in the detection of early esophageal squamous cell carcinoma (ESCC) with non‐magnified endoscopy

| Name (year) [Ref] | Study design | Histology of cases | AI algorithm | Endoscopic images | Data category | Number of cases in test dataset | Number of controls in test dataset | TP | FP | FN | TN |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Cai (2019) [27] | Retrospective | ESCC/HGIN/LGIN | CNN | WLI | Still images | 91 images | 96 normal images | 89 | 14 | 2 | 82 |
| Fukuda (2020) [28] | Retrospective | ESCC | CNN | NBI/BLI | Video images | 45 ESCCs | 99 normal and noncancerous lesions | 41 | 48 | 4 | 51 |
| Ohmori (2020) [30] | Retrospective | ESCC | CNN | WLI | Still images | 52 ESCCs | 83 normal and noncancerous lesions | 47 | 20 | 5 | 63 |
| | | | | NBI/BLI | Still images | 52 ESCCs | 83 normal and non-cancerous lesions | 52 | 31 | 0 | 52 |
| Yang (2020) [31] | Retrospective | ESCC | CNN | WLI/OE/iodine stain | Still images | 76 ESCCs | 780 normal/benign lesions | 74 | 11 | 2 | 769 |
| | | | | WLI/OE | Video images | 20 ESCCs | 28 video images of normal esophagus | 19 | 2 | 1 | 26 |
| Li (2021) [32] | Retrospective | ESCC/HGIN/LGIN | CNN | WLI/NBI | Still images | 266 images | 366 normal images | 252 | 37 | 14 | 329 |
| Shiroma (2021) [33] | Retrospective | ESCC | CNN | WLI | Video images | 20 ESCC patients | 20 patients without ESCC | 15 | 14 | 5 | 6 |
| | | | | NBI | Video images | 20 ESCC patients | 20 patients without ESCC | 11 | 4 | 9 | 16 |
| Waki (2021) [34] | Retrospective | ESCC | CNN | NBI/BLI | Video images | 63 ESCCs (50 video images) | 50 video images of normal and noncancerous lesions | 54 | 30 | 9 | 20 |
| Wang (2021) [35] | Retrospective | ESCC/HGD/LGD | CNN | WLI/NBI | Still images | 210 images | 54 images of normal esophagus | 202 | 16 | 8 | 38 |

Abbreviations: AI, artificial intelligence; BLI, blue‐laser imaging; CNN, convolutional neural network; ESCC, esophageal squamous cell carcinoma; FN, false negative; FP, false positive; HGD, high‐grade dysplasia; HGIN, high‐grade intraepithelial neoplasia; LGD, low‐grade dysplasia; LGIN, low‐grade intraepithelial neoplasia; NBI, narrow‐band imaging; OE, optical enhancement; TN, true negative; TP, true positive; WLI, white‐light imaging.
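The per-image sensitivity and specificity figures quoted throughout this review follow directly from the TP/FP/FN/TN counts in Table 2. A minimal sketch of these standard definitions (function names are illustrative, not from the original paper), using the Waki (2021) counts from the table:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true lesions the system flags (TP / (TP + FN))."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of lesion-free images correctly passed (TN / (TN + FP))."""
    return tn / (tn + fp)

def accuracy(tp: int, fp: int, fn: int, tn: int) -> float:
    """Fraction of all images classified correctly."""
    return (tp + tn) / (tp + fp + fn + tn)

# Waki (2021) counts from Table 2: TP 54, FP 30, FN 9, TN 20
print(f"sensitivity: {sensitivity(54, 9):.1%}")   # 85.7%, i.e. 54 of 63 ESCCs, as reported
print(f"specificity: {specificity(20, 30):.1%}")  # 40.0%
```

The low specificity this yields for the video-based ESCC studies is exactly the limitation discussed in the Future Prospects section.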

FIGURE 2

Detection of esophageal squamous cell carcinoma (ESCC) by artificial intelligence (AI) system. (a) The lesion was brownish and slightly depressed in narrow‐band imaging. (b) The lesion was indicated in pink by the AI system

An accurate diagnosis of the invasion depth is essential when determining the treatment strategy for ESCC because cancers clinically diagnosed as epithelium (EP)/lamina propria mucosa (LPM) cancers, or as muscularis mucosae (MM)/submucosal cancers invading up to 200 μm (SM1), are indications for endoscopic resection. In contrast, esophagectomy or chemoradiotherapy is mainly indicated for SM2-3 ESCC. Magnified endoscopy (ME) and endoscopic ultrasonography are preferable to non-ME for diagnosing invasion depth in ESCC. However, extensive knowledge and experience are essential to master these modalities. Furthermore, evaluating the invasion depth using these techniques is susceptible to interobserver differences. Objective evaluation using a high-performance AI system may help less experienced endoscopists, as well as experienced endoscopists, reach an appropriate diagnosis. There are several reports on the diagnosis of the invasion depth of superficial ESCC using AI. Tokai et al. developed an AI system to distinguish EP-SM1 ESCC from ESCC deeper than SM2 using non-ME still images. The accuracy was 80.9%, with an AUC greater than that of 13 board-certified endoscopists. Nakagawa et al. developed an AI system to distinguish EP-SM1 ESCC from SM2-3 ESCC using non-ME and ME still images. The accuracy was 91.0%, with a performance similar to that of 16 experienced endoscopists. Shimamoto et al. developed an AI system to distinguish EP-SM1 from SM2-3 in superficial ESCC using 102 video images of two types: non-ME with WLI and ME with NBI/blue-laser imaging. The accuracy of the AI system was 87.3% in non-ME videos and 89.2% in ME videos, higher than that of 14 board-certified endoscopists.

AI FOR DIAGNOSIS OF ESOPHAGEAL ADENOCARCINOMA

Esophageal adenocarcinoma (EAC) is the predominant esophageal cancer in North America and Europe. BE is a known risk factor for EAC, and endoscopic surveillance of BE is recommended. Advanced EAC requires invasive treatment and has a poor prognosis. In contrast, T1 EAC can be cured with less invasive endoscopic treatment. Early detection is vital to reduce mortality related to EAC. However, early detection remains a challenging task for non-experts. An AI tool could possibly support the endoscopic diagnosis of EAC. Several studies on AI systems for diagnosing early EAC have been reported in the West, and a few of them addressed real-time diagnosis (Table 3). de Groof et al. developed an AI system to detect Barrett's neoplasia, which achieved higher accuracy than any of the 53 endoscopists. Furthermore, this AI system detected Barrett's neoplasia with high accuracy during live endoscopic procedures in a prospective pilot study. Ebigbo et al. developed an AI system to capture random images from a real-time camera and differentiate between normal BE and early EAC; the sensitivity, specificity, and accuracy of this system were 83.7%, 100.0%, and 89.9%, respectively. These studies highlighted the usefulness of AI systems for early EAC. However, most of the studies were performed in Western countries. The characteristics of EAC differ between the West and Asia; therefore, it is questionable whether an AI system developed using a training set based on Western cases is acceptable for clinical practice in Asia. As the number of EACs in Asia is expected to increase in the coming years, developing an AI system trained with Asian EAC cases is imperative. Iwagami et al. developed an AI system based on Japanese cases to detect esophageal and esophagogastric junctional adenocarcinoma. They observed a sensitivity, specificity, and accuracy of 94%, 42%, and 66%, respectively.
TABLE 3

Summary of artificial intelligence in the detection of early esophageal adenocarcinoma (EAC) with non‐magnified endoscopy

| Reference (year) [Ref] | Study design | Histology of cases | AI algorithm | Endoscopic images | Data category | Number of cases in test dataset | Number of controls in test dataset | TP | FP | FN | TN |
|---|---|---|---|---|---|---|---|---|---|---|---|
| de Groof (2020) [48] | Retrospective | EAC/HGD | CNN | WLI | Still images | 209 images | 248 images of non-dysplastic BE | 186 | 31 | 23 | 217 |
| de Groof (2020) [46] | Prospective | EAC/HGD | CNN | WLI | Still images | 33 images | 111 images of non-dysplastic BE | 25 | 15 | 8 | 96 |
| Hashimoto (2020) [55] | Retrospective | EAC/HGD | CNN | WLI (+near focus) | Still images | 146 images | 79 images of non-dysplastic BE | 144 | 12 | 2 | 95 |
| | | | | NBI (+near focus) | Still images | 79 images | 126 images of non-dysplastic BE | 73 | 1 | 6 | 125 |
| Iwagami (2021) [51] | Retrospective | EAC (EGJ) | CNN | WLI/NBI/BLI | Still images | 36 EACs | 43 non-cancerous | 34 | 25 | 2 | 18 |

Abbreviations: AI, artificial intelligence; BE, Barrett's esophagus; BLI, blue‐laser imaging; CNN, convolutional neural network; EAC, esophageal adenocarcinoma; EGJ, esophagogastric junction; FN, false negative; FP, false positive; HGD, high‐grade dysplasia; NBI, narrow‐band imaging; TN, true negative; TP, true positive; WLI, white‐light imaging.

AI FOR DETECTION OF PHARYNGEAL CANCER

Pharyngeal cancer has a poor prognosis because it is often detected at an advanced stage. Patients with advanced pharyngeal cancer require surgery and chemoradiotherapy, which decrease their quality of life. On the other hand, patients with superficial pharyngeal cancer (SPC) can be cured by endoscopic resection, which is less invasive than surgery and chemoradiotherapy. IEE, such as NBI, can help detect SPC. However, it is challenging for less experienced endoscopists. An AI system could improve the detection of SPC in such cases. Tamashiro et al. evaluated an AI system using 1912 still images from 35 patients with 40 pharyngeal cancers and 40 patients without pharyngeal cancer. The AI system detected all pharyngeal lesions, and the per-image sensitivity and specificity were 79.7% and 57.1%, respectively. Kono et al. evaluated an AI system using 25 video images of pharyngeal cancer and 36 video images without pharyngeal cancer. In this study, the sensitivity, specificity, and accuracy for detecting cancer were 92%, 47%, and 66%, respectively.

FUTURE PROSPECTS

The development of AI in the gastric region has progressed significantly, and it is expected to be introduced into real-world clinical practice in the near future. With the help of diagnostic support from AI tools, trainee endoscopists might reach endoscopic diagnoses comparable to those of expert endoscopists, regardless of their skill level. How AI is used in clinical practice remains an important issue. For example, it remains to be determined whether video-based or still-image-based diagnosis is better for AI-assisted endoscopy. While real-time diagnosis is essential for detection, still images might be appropriate for diagnosing H. pylori infection and invasion depth in clinical practice. In addition, it is necessary to investigate how many functions should be included in a single AI system for future clinical use. The usefulness of AI systems in diagnosing ESCC has been reported in many studies. However, several problems remain for clinical practice. Most of these studies are single-center retrospective studies, and the images used in validation sets are edited to some extent; therefore, selection bias cannot be ruled out. Well-designed prospective studies in a multicenter setting are required. The specificity of AI systems for detecting ESCC in studies using video images as validation sets remains very low. This is a further problem in clinical practice because the proportion of ESCC patients in validation sets is higher than in the real world; therefore, the positive predictive value would decrease considerably in clinical practice. One strategy to solve this problem is to combine two AI systems: a sensitivity-oriented non-ME AI system focused on detection and an accuracy-oriented ME AI system focused on characterization.
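The dependence of positive predictive value on prevalence can be made concrete with Bayes' rule. The sensitivity, specificity, and prevalence values below are hypothetical, chosen only to illustrate how sharply PPV falls when moving from an enriched validation set to a screening population:

```python
def ppv(sens: float, spec: float, prevalence: float) -> float:
    """Positive predictive value, P(cancer | AI positive), via Bayes' rule."""
    true_pos = sens * prevalence               # expected true positives per image
    false_pos = (1 - spec) * (1 - prevalence)  # expected false positives per image
    return true_pos / (true_pos + false_pos)

# Hypothetical detector: 90% sensitivity, 50% specificity
# (video-based ESCC detectors in this review show similarly low specificity).
# Enriched validation set, where half the images contain cancer:
print(f"{ppv(0.90, 0.50, 0.50):.1%}")  # 64.3%
# Screening population, where roughly 1 in 100 images contains a lesion:
print(f"{ppv(0.90, 0.50, 0.01):.1%}")  # 1.8%
```

At a screening-level prevalence, most AI alarms would be false, which is why the review argues for pairing a sensitivity-oriented detector with a specificity-improving characterization step.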
Although further improvement of AI systems and prospective multicenter studies are needed, we believe that the coming years will witness the use of AI systems for ESCC diagnosis. There are many reports on the usefulness of AI systems for diagnosing EAC, and such systems will soon help endoscopists diagnose early EAC. However, as with ESCC, there are several concerns regarding their use in clinical practice. Most of these were retrospective studies, and the number of cases in the validation sets was small. Prospective studies with larger numbers of cases in a multicenter setting are needed to obtain a better and more accurate algorithm. In these AI systems, still images were used as validation sets. Because the length of BE is short, an AI system based on still images may be helpful in clinical practice. However, an AI system based on video images may be more appropriate for detecting EAC, as it may reduce the chances of overlooking lesions. Tamashiro et al. and Kono et al. showed high sensitivity in AI-based diagnosis; however, the performance in terms of specificity was not satisfactory. As Kono et al. mentioned, the complicated structure of the pharyngeal area and poor observation conditions due to the presence of saliva, mucus, or the gag reflex might affect specificity, and further training with cancer images and normal structural images under various conditions is required to improve it. An AI system using magnified endoscopic images for characterization may also improve specificity. However, it is difficult to accumulate sufficient SPC cases in a single institution. It is necessary to train and evaluate AI systems with more SPC and normal structural images from multiple facilities for practical use in clinical practice.

Implementation of AI systems in upper GI endoscopy

AI tools for endoscopic devices, especially for the lower GI tract, have already been certified by regulatory authorities in various countries. Several companies have commercialized AI devices for the real‐time detection of colorectal polyps in Europe. The device authorized for marketing by the US Food and Drug Administration, which uses AI to detect colon polyps and suspected colon tumors in real‐time has been commercialized. In addition, AI devices to detect colorectal polyps and those to differentiate colorectal polyps and to evaluate ulcerative colitis using super‐magnifying endoscopes have been approved by regulatory authorities in Japan. However, there are few authorized AI products for the upper GI tract. AI tools for detecting neoplasia in BE have already obtained CE markings in Europe. However, there are no AI products certified by regulatory authorities to detect gastric cancer or neoplastic lesions of the stomach. As this situation suggests, there are fewer randomized controlled trials and prospective studies on the upper GI tract than on the lower GI tract. , , , , , , , One possible reason for this is the difference in the difficulty of detecting lesions. It has been reported that the false‐negative rate of detection by gastroscopy is higher than that of detection by colonoscopy. Gastric cancer is difficult to recognize, unlike colorectal cancer, and may be overlooked even if the lesion is visible on endoscopic images. ESCC has been reported to be more difficult to detect with white light than with NBI and Lugol chromoendoscopy, which may also be a reason for fewer studies conducted. Moreover, differences in disease incidence by region may have influenced the decision to conduct a major clinical study. The incidence of gastric cancer is high in East Asia, corresponding to the high prevalence of H. pylori. There are two major histological types of esophageal cancer: ESCC and EAC. 
ESCC is more common in Asia, Africa, and South America, while EAC is more common in North America and Europe. However, as described in this review, there have been various reports of AI systems for the upper GI tract, and many products certified by regulatory authorities are expected to emerge in the future.

CONCLUSION

This review outlines recent research and the prospects of AI application for the endoscopic diagnosis of the upper GI tract. Unlike the detection of colorectal polyps, the early detection of upper GI cancers by AI can significantly impact prognosis, and its usefulness is highly anticipated. Employing AI-based endoscopes is expected to enable early cancer detection and, consequently, improve patient prognosis. Because diagnostic ability differs among endoscopists, owing to differences in experience or subjective bias, using an AI tool as an accessory can help reduce the risk of overlooking malignant lesions and equalize diagnostic ability. An AI tool can recognize lesions in endoscopic images and estimate their probability; however, it cannot perform endoscopy or reach a final diagnosis. Thus, the demand for digestive endoscopists will not diminish with the introduction of AI tools. In the future, endoscopists will be required to understand the capabilities and handling of AI and to use endoscopes accordingly to navigate and observe the GI tract, including the pharynx.

CONFLICT OF INTEREST

Tada T is a shareholder of AI Medical Service Inc. The authors have no other relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript apart from those disclosed.

FUNDING INFORMATION

None.
