| Literature DB >> 33167043 |
Omer F Ahmad1, Yuichi Mori2,3, Masashi Misawa2, Shin-Ei Kudo2, John T Anderson4, Jorge Bernal5, Tyler M Berzin6, Raf Bisschops7, Michael F Byrne8, Peng-Jen Chen9, James E East10,11, Tom Eelbode12, Daniel S Elson13,14, Suryakanth R Gurudu15, Aymeric Histace16, William E Karnes17, Alessandro Repici18,19, Rajvinder Singh20, Pietro Valdastri21, Michael B Wallace22, Pu Wang23, Danail Stoyanov1, Laurence B Lovat1,24.
Abstract
BACKGROUND: Artificial intelligence (AI) research in colonoscopy is progressing rapidly but widespread clinical implementation is not yet a reality. We aimed to identify the top implementation research priorities. METHODS: An established modified Delphi approach for research priority setting was used. Fifteen international experts, including endoscopists and translational computer scientists/engineers, from nine countries participated in an online survey over 9 months. Questions related to AI implementation in colonoscopy were generated as a long-list in the first round, and then scored in two subsequent rounds to identify the top 10 research questions. RESULTS: The top 10 ranked questions were categorized into five themes. Theme 1: clinical trial design/end points (4 questions), related to optimum trial designs for polyp detection and characterization, determining the optimal end points for evaluation of AI, and demonstrating impact on interval cancer rates. Theme 2: technological developments (3 questions), including improving detection of more challenging and advanced lesions, reduction of false-positive rates, and minimizing latency. Theme 3: clinical adoption/integration (1 question), concerning the effective combination of detection and characterization into one workflow. Theme 4: data access/annotation (1 question), concerning more efficient or automated data annotation methods to reduce the burden on human experts. Theme 5: regulatory approval (1 question), related to making regulatory approval processes more efficient. CONCLUSIONS: This is the first reported international research priority setting exercise for AI in colonoscopy. The study findings should be used as a framework to guide future research with key stakeholders to accelerate the clinical implementation of AI in endoscopy.
Year: 2021 PMID: 33167043 PMCID: PMC8390295 DOI: 10.1055/a-1306-7590
Source DB: PubMed Journal: Endoscopy ISSN: 0013-726X Impact factor: 9.776
The nine themes and the number of questions generated for each.
| Research theme | Number of questions |
| Data (access, sharing/privacy, curation) | 8 |
| Technological developments | 11 |
| Clinical adoption and integration into the endoscopy suite | 10 |
| Performance metrics, clinical trial design, and end points | 10 |
| Clinical applications | 5 |
| Training and education of workforce | 3 |
| Regulatory approval | 3 |
| Ethical and legal issues | 6 |
| Health economics | 3 |
Questions in rank order following the final round 3 process.
| Question | Rank | Total score | Mean score | Percentage of responses scored as very high priority (9 or 10) |
| What is the optimum clinical trial design to demonstrate efficacy for polyp detection AI/CAD software? | 1 | 132 | 8.80 | 53 |
| How do we improve the performance of AI/CAD to detect more challenging and advanced lesions (e.g. subtle flat lesions and sessile serrated lesions)? | 2 | 126 | 8.40 | 47 |
| How do we reduce false-positive rates for detection systems to avoid the user developing “alert fatigue”? | 3 | 118 | 7.87 | 47 |
| What are the optimal clinical end points for evaluation of AI/CAD? | 4 | 118 | 7.87 | 27 |
| Can we effectively combine polyp detection and characterization into one workflow? | 5 | 116 | 7.73 | 60 |
| Can we produce more efficient or automated annotation methods for data to reduce the burden on human experts? | 6 | 115 | 7.67 | 40 |
| How do we make the regulatory approval process more efficient and overcome hurdles? | 7 | 113 | 7.53 | 33 |
| How do we demonstrate that AI/CAD detection systems have an impact on interval colorectal cancer rates? | 8 | 112 | 7.47 | 40 |
| What is the optimum clinical trial design to demonstrate efficacy for polyp characterization (optical diagnosis) AI/CAD software? | 9 | 112 | 7.47 | 27 |
| How do we optimize CAD/AI so that it can be used in real time with minimal latency? | 10 | 111 | 7.40 | 53 |
| What impact might AI/CAD detection and characterization systems have on colonoscopy surveillance intervals and what are the associated costs? | 11 | 111 | 7.40 | 20 |
| Can AI/CAD make endoscopy workflow more efficient (e.g. automated report writing)? | 12 | 109 | 7.27 | 20 |
| Can AI/CAD be used effectively to measure the quality of colonoscopy? | 13 | 107 | 7.13 | 27 |
| How should regulatory agencies deal with the iterative nature of software improvements in AI/CAD? | 14 | 107 | 7.13 | 13 |
| How do we develop quality assurance for annotation/labelling of data? | 15 | 106 | 7.07 | 27 |
| What impact will AI/CAD have on endoscopy training and performance? | 16 | 105 | 7.00 | 33 |
| How do we address data privacy, consent, and ownership issues to effectively share data across different countries and centers for AI/CAD development? | 17 | 105 | 7.00 | 27 |
| What effect will AI/CAD have on colonoscopy outcomes in relation to health economics (e.g. faster workflow, fewer colonoscopies, reduction in colorectal cancer rates) and how do we measure this? | 18 | 105 | 7.00 | 20 |
| How do we define standardized metrics for directly comparing the performance characteristics of different AI software? | 19 | 104 | 6.93 | 13 |
| How do we obtain enough data for categories that might be important for clinical application but are under-represented (e.g. dysplasia detection in inflammatory bowel disease)? | 20 | 102 | 6.80 | 7 |
| What performance thresholds (e.g. ASGE PIVI) are necessary to consider a resect-and-discard strategy when employing computer-aided diagnosis tools during colonoscopy? | 21 | 100 | 6.67 | 27 |
| Who owns the intellectual property in AI/CAD model development and can this be protected? | 22 | 100 | 6.67 | 13 |
| How do we audit AI/CAD systems once they are deployed in the clinical environment? | 23 | 100 | 6.67 | 7 |
| How do we train AI/CAD systems once they are deployed in order for them to improve and learn continuously in a clinical environment? | 24 | 99 | 6.60 | 13 |
| How do we develop large collaborative, standardized datasets for external validation of AI/CAD systems? | 25 | 98 | 6.53 | 20 |
| Could AI/CAD polyp detection and characterization systems distract endoscopists and impair performance? | 26 | 97 | 6.47 | 13 |
| How do we best train users/clinicians to critically evaluate the AI/CAD system including awareness of limitations to safeguard against incorrect AI/CAD decisions? | 27 | 92 | 6.13 | 13 |
| What is the best type of training data (videos, static images, or both) that should be used for developing polyp detection systems? | 28 | 92 | 6.13 | 7 |
AI, artificial intelligence; CAD, computer-aided diagnosis/detection; ASGE, American Society for Gastrointestinal Endoscopy; PIVI, preservation and incorporation of valuable endoscopic innovations.
Final top 10 questions grouped by theme.
| Theme | Questions |
| Performance metrics, clinical trial design, and end points | What is the optimum clinical trial design to demonstrate efficacy for polyp detection AI/CAD software? |
| | What are the optimal clinical end points for evaluation of AI/CAD? |
| | How do we demonstrate that AI/CAD detection systems have an impact on interval colorectal cancer rates? |
| | What is the optimum clinical trial design to demonstrate efficacy for polyp characterization (optical diagnosis) AI/CAD software? |
| Technological developments | How do we improve the performance of AI/CAD to detect more challenging and advanced lesions (e.g. subtle flat lesions and sessile serrated lesions)? |
| | How do we reduce false-positive rates for detection systems to avoid the user developing “alert fatigue”? |
| | How do we optimize CAD/AI so that it can be used in real time with minimal latency? |
| Clinical adoption and integration into endoscopy | Can we effectively combine polyp detection and characterization into one workflow? |
| Data (access, sharing/privacy, curation, and annotation) | Can we produce more efficient or automated annotation methods for data to reduce the burden on human experts? |
| Regulatory approval | How do we make the regulatory approval process more efficient and overcome hurdles? |
AI, artificial intelligence; CAD, computer-aided diagnosis/detection.