| Literature DB >> 34248245 |
Chaoran Yu1, Ernest Johann Helwig2.
Abstract
Artificial intelligence (AI) is a rapidly developing technology that incorporates machine learning and neural networks to improve existing technologies or create new ones. This review introduces potential applications of AI in the fight against colorectal cancer (CRC), including how AI will affect the epidemiology of colorectal cancer and new methods of large-scale information gathering such as GeoAI, digital epidemiology, and real-time information collection. It also examines how existing diagnostic tools, including CT/MRI, endoscopy, genetic testing, and pathological assessment, have benefitted greatly from the implementation of deep learning. Finally, it discusses how treatment and treatment approaches for CRC can be enhanced by AI. The power of AI in making therapeutic recommendations for colorectal cancer shows much promise in the clinical and translational fields of oncology, which means better, more personalized treatment for those in need.
Keywords: AI technology; Colorectal cancer; Diagnosis; Prediction; Treatment
Year: 2021 PMID: 34248245 PMCID: PMC8255052 DOI: 10.1007/s10462-021-10034-y
Source DB: PubMed Journal: Artif Intell Rev ISSN: 0269-2821 Impact factor: 8.139
Fig. 1 Comparison between artificial intelligence (AI), machine learning (ML) and deep learning (DL)
Potential implementation of Artificial intelligence (AI) for epidemiology of colorectal cancer
| Method | Definition | Function example | In colorectal cancer |
|---|---|---|---|
| GeoAI (Janowicz) | Subfield of spatial data science that processes geographic information using AI | Image classification, object detection, geo-enrichment | Etiological studies, such as food consumption, genetic predisposition, healthcare variance |
| Digital epidemiology—Global Public Health Intelligence Network (Tarkoma et al.) | Increases situational awareness of public health events and strengthens global network links | Early detection of SARS | Increase global capacity for early detection of risks and tumor burdens |
| Digital epidemiology—HealthMap (Tarkoma et al.) | Utilizes online informal sources for disease surveillance | Real-time surveillance of COVID-19 | Achieve a comprehensive view of global tumor burden |
| Digital epidemiology—Program for Monitoring Emerging Diseases (Tarkoma et al.) | Exploits the Internet as an early-warning system | Early reports on COVID-19 | Monitoring of emerging etiological factors associated with colorectal cancer |
SARS: Severe Acute Respiratory Syndrome; COVID-19: Coronavirus disease 2019
Fig. 2 Introduction of an artificial neural network (ANN) with input layer (I), hidden layer (H), bias layer (B) and output layer (O). Connections between nodes are dynamically adjusted according to feedback from the training process. Positive correlations are shown as black lines and negative correlations as grey lines; line thickness is proportional to relative significance. Such an ANN accepts new input and generates a corresponding output
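The network described in the caption above can be illustrated with a minimal forward pass. This is a sketch only: the layer sizes, weight values, and sigmoid activation are hypothetical choices for illustration and are not taken from the review; positive weights correspond to the black (positive-correlation) connections and negative weights to the grey ones.

```python
import math

def sigmoid(x):
    """Squash a weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical weights: each row maps both inputs to one hidden node.
W_hidden = [[0.8, -0.3],   # positive and negative connections, as in Fig. 2
            [0.5, 0.4]]
b_hidden = [0.1, -0.2]     # contribution of the bias layer (B)
W_out = [0.9, -0.6]        # hidden -> output connections
b_out = 0.05

def forward(inputs):
    # Weighted sum plus bias at each hidden node, then activation
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(W_hidden, b_hidden)]
    # The output node combines the hidden activations the same way
    return sigmoid(sum(w * h for w, h in zip(W_out, hidden)) + b_out)

print(forward([1.0, 0.5]))  # a value in (0, 1)
```

Training would adjust `W_hidden`, `b_hidden`, `W_out`, and `b_out` from feedback (e.g. by backpropagation), which is what the caption means by connections being "dynamically adjusted".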
Endoscopic studies involving artificial intelligence for diagnosis and prediction of colorectal cancer
| Study | AI algorithm/system | Study design | Sensitivity | Specificity | AUC | Accuracy |
|---|---|---|---|---|---|---|
| Ichimasa et al. | SVM | Prediction of lymph node metastasis after endoscopic resection of T1 colorectal cancer | 100% | 66% | – | 69% |
| Nakajima et al. | CNN | Automatic diagnosis system by computer-aided diagnosis (CAD) based on plain endoscopic images | 81% | 87% | 0.888 | 84% |
| Lai et al. | DNN | Improve polyp detection and discrimination by CAD | 100% | 100% | – | 74–95% |
| Yamada et al. | CNN | Develop a real-time detection system for colorectal neoplasms | 97.3% | 99% | 0.975 | Good and excellent* |
| Chen et al. | DNN | Develop a CAD system to analyze narrow-band images | 96.3% | 78.1% | – | 90.1% |
| Repici et al. | CNN | Assess the safety and efficacy of a computer-aided detection (CADe) system | – | – | – | – |
| Kudo et al. | CNN | Determine the diagnostic accuracy of EndoBRAIN | 96.9% | 100% | – | 98% |
| Mori et al. | SVM | Evaluate the performance of real-time CADe with an endocytoscope | 91.3–95.2% | 65.6–95.9% | – | – |
| Nguyen et al. | CNN | Pre-classify in vivo endoscopic images | 19.6–87.4% | 42.5–90.6% | – | 52.6–68.9% |
| Deding et al. | – | Investigate the relative sensitivity of colon capsule endoscopy compared with computed tomography colonography | 2.67# | – | – | – |
SVM: support vector machines; CNN: convolutional neural network; DNN: deep neural network; AUC: area under the curve from the receiver operating characteristics; *shown using intersection over the union (IOU); #relative sensitivity
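The sensitivity, specificity, accuracy, and AUC columns above all derive from comparing model predictions against ground truth. The following sketch shows how each metric is computed; the labels and scores are made up for illustration and are not data from the cited studies.

```python
# Hypothetical ground truth (1 = neoplasm) and binary predictions
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
y_pred = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

sensitivity = tp / (tp + fn)          # true positive rate -> 0.8
specificity = tn / (tn + fp)          # true negative rate -> 0.8
accuracy = (tp + tn) / len(y_true)    # overall agreement  -> 0.8

# AUC: probability that a random positive case receives a higher
# model score than a random negative case (hypothetical scores).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2, 0.6, 0.1, 0.85, 0.05]
pos = [s for s, t in zip(scores, y_true) if t == 1]
neg = [s for s, t in zip(scores, y_true) if t == 0]
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))
# -> 0.96
```

Note that sensitivity and specificity depend on the decision threshold applied to the scores, while AUC summarizes ranking quality across all thresholds, which is why the table reports them separately.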
Fig. 3 Typical flow diagram of the WFO procedure; WFO: Watson for Oncology
Fig. 4 Specific terms defined in automated performance metrics (APMs). It contains five major categories: overall kinematic metrics, dominant instrument metrics, nondominant instrument metrics, camera metrics and other event metrics
AI algorithms/models utilized in surgery
| Model | Categorization | Function | Training data |
|---|---|---|---|
| SVM (Law et al.) | Deep learning | Categorize surgical performance into high/low based on video training data | Video data |
| dVLogger tool (Hung et al.) | NA | Recording data and synchronizing surgical footage and APMs | Video data |
| Random forest-50 model (Hung et al.) | Machine learning | Predict clinical outcomes in the form of hospital stay | Objective metrics |
| DeepSurv (Hung et al.) | Deep learning | Predict postoperative urinary continence | Objective metrics |
| Random survival forest (Hung et al.) | Deep learning | Predict postoperative urinary continence | Objective metrics |
| CNN RP-Net (Zia et al.) | Deep learning | A methodology to recognize key steps in surgery | Objective metrics |
| CNN RP-Net-V2 (Zia et al.) | Deep learning | An improved version of RP-Net | Objective metrics |
SVM: support vector machine; APMs: automated performance metrics; CNN: convolutional neural network
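To make the high/low performance categorization above concrete, a trained linear classifier such as an SVM reduces to a weighted sum of features plus a bias, with the sign deciding the label. The feature names and weights below are entirely hypothetical and are not taken from the cited studies; they only illustrate the shape of the decision function.

```python
# Hypothetical linear decision function over APM-style features.
# A real SVM would learn these weights from labeled surgical data.
weights = {
    "camera_moves_per_min": -0.4,        # frequent repositioning penalized
    "instrument_path_smoothness": 0.7,   # smoother motion rewarded
    "idle_time_fraction": -0.5,          # long idle periods penalized
}
bias = 0.1

def classify(apm):
    """Label a case 'high' or 'low' by the sign of the decision score."""
    score = sum(weights[k] * apm[k] for k in weights) + bias
    return "high" if score >= 0 else "low"

print(classify({"camera_moves_per_min": 0.2,
                "instrument_path_smoothness": 0.9,
                "idle_time_fraction": 0.1}))   # -> high
```

The video-based approaches in the table differ mainly in how such features are obtained: rather than hand-engineered metrics, a CNN extracts them automatically from surgical footage before a similar classification or prediction step.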