Indrani Bhattacharya1,2, Yash S Khandwala2, Sulaiman Vesal2, Wei Shao3, Qianye Yang4,5, Simon J C Soerensen2,6, Richard E Fan2, Pejman Ghanouni3,2, Christian A Kunder7, James D Brooks2, Yipeng Hu4,5, Mirabela Rusu3, Geoffrey A Sonn3,2.
Abstract
A multitude of studies have explored the role of artificial intelligence (AI) in providing diagnostic support to radiologists, pathologists, and urologists in prostate cancer detection, risk-stratification, and management. This review provides a comprehensive overview of relevant literature regarding the use of AI models in (1) detecting prostate cancer on radiology images (magnetic resonance and ultrasound imaging), (2) detecting prostate cancer on histopathology images of prostate biopsy tissue, and (3) assisting in supporting tasks for prostate cancer detection (prostate gland segmentation, MRI-histopathology registration, MRI-ultrasound registration). We discuss both the potential of these AI models to assist in the clinical workflow of prostate cancer diagnosis, as well as the current limitations including variability in training data sets, algorithms, and evaluation criteria. We also discuss ongoing challenges and what is needed to bridge the gap between academic research on AI for prostate cancer and commercial solutions that improve routine clinical care.
Keywords: artificial intelligence; histopathology images; magnetic resonance imaging; prostate cancer diagnosis; registration; ultrasound images
Year: 2022 PMID: 36249889 PMCID: PMC9554123 DOI: 10.1177/17562872221128791
Source DB: PubMed Journal: Ther Adv Urol ISSN: 1756-2872
Figure 1. Potential of AI to assist prostate cancer diagnosis on imaging. AI models can help in detecting and characterizing cancer aggressiveness on non-invasive radiology images (MRI and ultrasound), as well as on histopathology images acquired through prostate biopsy. Aggressive cancer is shown in yellow, and indolent cancer in green in the ‘AI for cancer diagnosis’ panel. AI models can also help in supporting tasks for cancer detection, namely prostate gland segmentation, MRI-ultrasound registration, and MRI-histopathology registration.
Figure 2. AI models for prostate cancer detection on MRI can be subdivided into two major tasks: lesion classification and lesion detection. Lesion classification involves classifying a radiologist-outlined lesion (region of interest) into categories (cancer vs benign, clinically significant cancer vs benign or indolent, or Gleason grade groups). Lesion detection involves detecting and characterizing cancer aggressiveness on the entire prostate MRI.
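As a rough, hedged illustration of the lesion classification task described in Figure 2 (not the pipeline of any particular study tabulated below), a radiologist-outlined region of interest can be reduced to simple intensity features and scored with a conventional classifier. The feature set and the function names (`roi_features`, `classification_auc`) are assumptions made for this sketch only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

def roi_features(t2w, adc, mask):
    """First-order intensity statistics inside a radiologist-outlined lesion mask
    (a hypothetical, deliberately simple feature set)."""
    feats = []
    for vol in (t2w, adc):
        vals = vol[mask > 0]
        feats += [vals.mean(), vals.std(), np.percentile(vals, 10), np.percentile(vals, 90)]
    return np.asarray(feats)

def classification_auc(X, y):
    """Cross-validated ROC-AUC for clinically significant cancer vs benign/indolent."""
    clf = LogisticRegression(max_iter=1000)
    probs = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    return roc_auc_score(y, probs)

# X = np.stack([roi_features(t2, adc, m) for (t2, adc, m) in lesion_rois])
# y = biopsy-confirmed labels (1 = clinically significant cancer, 0 = benign/indolent)
```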
AI models for prostate lesion classification on MRI.
| Study | Input data | Cohort size | Data type | Algorithm | Training labels | Evaluation labels | Evaluation metric | Source code availability |
|---|---|---|---|---|---|---|---|---|
| Algohary | T2w, ADC | 231 | Retrospective, 4 inst. | TML | Biopsy | Biopsy | ROC-AUC, Acc. | No |
| Antonelli | T2w, ADC, DWI, DCE | 164 | Retrospective, 1 inst. | TML | Biopsy | Biopsy | ROC-AUC, Se. at 50% Sp. threshold | No |
| Bleker | T2w, ADC, DWI, DCE | 206 | Retrospective, public data set | TML | Biopsy | Biopsy | ROC-AUC, Se., Sp. | No |
| Bonekamp | T2w, ADC, DWI | 316 | Retrospective, public data set | TML | Biopsy | Biopsy | ROC-AUC, Se., Sp. | No |
| Chen | T2w, ADC | 381 | Retrospective, 1 inst. | TML | Biopsy | Biopsy | ROC-AUC, Acc., Se., Sp. | No |
| Akamine | DWI, DCE | 52 | Retrospective, 1 inst. | Hierarchical clustering | RP | RP | Acc. | No |
| Kwon | T2w, ADC, DWI, DCE | 344 | Retrospective, public data set | TML | Biopsy | Biopsy | ROC-AUC, Se., PPV | No |
| Chaddad | T2w, ADC | 112 | Retrospective, 1 inst., public data set | TML | Biopsy | Biopsy | ROC-AUC | No |
| Hectors | T2w | 64 | Retrospective, 1 inst. | TML | RP | RP | ROC-AUC | No |
| Xu | T2w | 331 | Retrospective, 1 inst. | TML | RP | RP | ROC-AUC, decision curve analysis | No |
| Viswanath | T2w | 85 | Retrospective, 3 inst. | TML | RP | RP | ROC-AUC | No |
| Transin | ADC, DCE | 74 | Retrospective, 1 inst. | TML | Biopsy/RP | Biopsy/RP | ROC-AUC, Se., Sp. | No |
| Zhang | T2w, ADC | 159 | Retrospective, 2 inst. | TML | Biopsy | Biopsy | ROC-AUC | No |
| Deniffel | T2w, ADC, DWI | 499 | Retrospective, 1 inst. | DL | Biopsy | Biopsy | ROC-AUC, decision-curve analysis | No |
| Song | T2w, ADC, DWI | 185 | Retrospective, public data set | DL | Biopsy | Biopsy | ROC-AUC, Se., Sp., PPV | No |
| Takeuchi | T2w, ADC, DWI | 334 | Retrospective, 1 inst. | DL | Biopsy | Biopsy | ROC-AUC, Net-benefit curve, NPV | No |
| Yuan | T2w, ADC | 244 | Retrospective, 2 inst. | DL | Biopsy | Biopsy | Acc., Prec., Recall, F1-score | No |
| Aldoj | T2w, ADC, DWI, DCE | 200 | Retrospective, public data set | DL | Biopsy | Biopsy | ROC-AUC, Se., Sp. | No |
| Zhong | T2w, ADC | 140 | Retrospective, 1 inst. | DL | RP | RP | ROC-AUC, Acc., Se., Sp. | No |
| Abraham and Nair | T2w, ADC, DWI | 112 | Retrospective, public data set | DL | Biopsy | Biopsy | ROC-AUC, quadratic wtd. kappa, PPV | No |
Acc, accuracy; ADC, apparent diffusion coefficient; AI, artificial intelligence; DCE, dynamic contrast enhanced; DL, deep learning; DWI, diffusion weighted imaging; inst., institution; MRI, magnetic resonance imaging; NPV, negative predictive value; PPV, positive predictive value; Prec, precision; ROC-AUC, receiver operating characteristics – area under the curve; RP, radical prostatectomy; Se, sensitivity; Sp, specificity; T2w, T2-weighted MRI; TML, traditional machine learning.
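Several studies in the table above (e.g. Xu, Deniffel, Takeuchi) complement ROC-AUC with decision-curve (net-benefit) analysis. A minimal sketch of the underlying net-benefit calculation follows; the threshold range and variable names are illustrative and not taken from those papers.

```python
import numpy as np

def net_benefit(y_true, y_prob, thresholds):
    """Net benefit = TP/n - FP/n * pt / (1 - pt) at each threshold probability pt."""
    y_true = np.asarray(y_true)
    y_prob = np.asarray(y_prob)
    n = len(y_true)
    out = []
    for pt in thresholds:
        pred = y_prob >= pt
        tp = np.sum(pred & (y_true == 1))
        fp = np.sum(pred & (y_true == 0))
        out.append(tp / n - fp / n * pt / (1 - pt))
    return np.asarray(out)

# Compared against the "biopsy all" strategy (probability 1 for everyone) and the
# "biopsy none" strategy (net benefit 0) across clinically reasonable thresholds:
# thresholds = np.linspace(0.05, 0.5, 10)
# nb_model = net_benefit(y, p, thresholds)
# nb_all = net_benefit(y, np.ones_like(p), thresholds)
```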
AI models for prostate lesion detection on MRI.
| Study | Input data | Cohort size | Data type | Algorithm | Training labels | Evaluation labels | Evaluation granularity | Evaluation metric | Source code availability |
|---|---|---|---|---|---|---|---|---|---|
| Saha | T2w, ADC, DWI | 2732 | Retrospective, 2 inst., PIRADS or biopsy | DL | Radiologist, w/o path. confirm. | Radiologist, w/o & with path. confirm. from biopsy | Lesion-level, patient-level | ROC, FROC | Yes |
| Yu | T2w, ADC, DWI | 1745 | Retrospective, 4 inst., PIRADS or biopsy, external validation on public data set | DL | Radiologist, w/o path. confirm. | Radiologist, w/o & with path. confirm. from biopsy | Lesion-level, patient-level | FROC, DSC, ROC-AUC | No |
| Schelb | T2w, DWI | 312 | Retrospective, 1 inst., biopsy | DL | Radiologist, path confirm. from biopsy | Radiologist, path confirm. from biopsy | Sextant-level, patient-level | Se, Sp, Prec, NPV, ROC | Yes |
| Sumathipala | T2w, ADC, DWI | 186 | Retrospective, 6 inst., RP or biopsy | DL | Radiologist, path. confirm. from RP or biopsy | Radiologist, path. confirm. from RP or biopsy | Patient-level | ROC-AUC | No |
| Bhattacharya | T2w, ADC | 75 | Retrospective, 1 inst., RP | DL | Pathologist, automated registration | Pathologist, automated registration | Pixel-level, lesion-level | ROC-AUC, Se, Sp | No |
| Sanyal | T2w, ADC, DWI | 77 | Retrospective, 1 inst., biopsy | DL | Radiologist, path. confirm. from biopsy | Radiologist, path. confirm. from biopsy | Pixel-level | ROC-AUC | Yes |
| Jin | T2w, ADC, DWI, DCE | 34 | Retrospective, 1 inst. | TML | Pathologist, automated registration | Pathologist, automated registration | Pixel-level | ROC-AUC | Yes |
| McGarry | T2w, ADC, DWI, DCE | 48 | Prospectively recruited, 1 inst., RP | TML | Pathologist, automated registration | Pathologist, automated registration | Lesion-level | ROC-AUC | No |
| Cao | T2w, ADC | 417 | Retrospective, 1 inst., 4 scanners, RP | DL | Radiologist, path confirm., cognitive registration with RP | Radiologist, path confirm., cognitive registration with RP | Lesion-level | FROC | No |
| De Vente | T2w, ADC | 162 | Retrospective, 1 inst., public data set, biopsy | DL | Semi-automated region growing from targeted biopsy centroid | Semi-automated region growing from targeted biopsy centroid | Pixel-level, lesion-level | Quadratic weighted kappa-score | No |
| Seetharaman | T2w, ADC | 424 | Retrospective, 1 inst., biopsy & RP | DL | Automated Gleason patterns from RP, automated registration | Automated Gleason patterns from RP, radiologist labels with path confirm. from targeted biopsy | Pixel-level, lesion-level, Patient-level | ROC-AUC, Se, Sp | Yes |
| Bhattacharya | T2w, ADC | 443 | Retrospective, 1 inst., biopsy & RP | DL | Pathologist & automated Gleason patterns from RP, automated registration | Automated Gleason patterns from RP, radiologist labels with path. confirm. from targeted biopsy | Pixel-level, lesion-level, patient-level | ROC-AUC, PR-AUC, Se, Sp, Prec, NPV, F1-score, DSC, Acc | Soon to be released |
| Zhang | T2w, ADC | 358 | Retrospective, 1 inst., biopsy | DL | Retrospective radiologist outline from biopsy path. | Retrospective radiologist outline from biopsy path. | Pixel-level | DSC, Se, Prec, VOE, RVD | No |
| Alkadi | T2w | 19 | Retrospective, 1 inst. (public), biopsy | DL | Radiologist, path. confirm. from biopsy | Radiologist, path. confirm. from biopsy | Pixel-level | Acc, IoU, Recall, DSC | No |
| Arif | T2w, DWI, ADC | 292 | Retrospective, 1 inst., biopsy | DL | Radiologist, path confirm. from biopsy | Radiologist, path confirm. from biopsy | Patient-level | Acc, IoU, Recall, DSC | No |
| Mehralivand | T2w, DWI, ADC | 236 | Retrospective, multi inst., biopsy | TML | Radiologist, path confirm. from biopsy or RP | Radiologist, path confirm. from biopsy or RP | Lesion-level | AUC, Se, PPV | No |
| Netzer | T2w, DWI | 1488 | Retrospective, multi inst., multi scanner, biopsy | DL | Radiologist, path. confirm. from biopsy or RP | Radiologist, path. confirm. from biopsy or RP | Patient-level, sextant-level | ROC-AUC, Se, Sp | No |
| Duran | T2w, ADC | 318 | Retrospective, 2 inst., different scanners, external validation on public data set | DL | Radiologist, cognitive alignment with RP | Radiologist, cognitive alignment with RP | Lesion-level | FROC, Cohen’s Quadratic kappa | Yes (claimed) |
Acc, accuracy; ADC, apparent diffusion coefficient; AI, artificial intelligence; confirm., confirmation; DCE, dynamic contrast enhanced; DL, deep learning; DSC, Dice similarity coefficient; DWI, diffusion-weighted imaging; FROC, free-response receiver operating characteristics; inst., institution; IoU, intersection over union; MRI, magnetic resonance imaging; NPV, negative predictive value; path., pathology; PIRADS, Prostate Imaging-Reporting and Data System; PPV, positive predictive value; PR-AUC, precision recall–area under the curve; Prec, precision; ROC-AUC, receiver operating characteristics–area under the curve; RP, radical prostatectomy; RVD, relative volume difference; Se, sensitivity; Sp, specificity; T2w, T2-weighted MRI; TML, traditional machine learning; VOE, volumetric overlap error; w/o, without.
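Two of the evaluation granularities used in the detection table can be made concrete with a short sketch: pixel-level Dice overlap (DSC) and a simple lesion-level sensitivity. The 10% overlap criterion for counting a lesion as detected is an assumption chosen for illustration, not a convention shared by the listed studies.

```python
import numpy as np
from scipy import ndimage

def dice(pred, gt):
    """Dice similarity coefficient (DSC) between two binary masks."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum() + 1e-8)

def lesion_sensitivity(pred, gt, min_overlap=0.1):
    """Fraction of ground-truth lesions 'hit' by the prediction, where a hit means
    at least min_overlap of the lesion's voxels are covered (illustrative rule)."""
    labels, n_lesions = ndimage.label(gt)
    hits = 0
    for i in range(1, n_lesions + 1):
        lesion = labels == i
        if np.logical_and(pred, lesion).sum() / lesion.sum() >= min_overlap:
            hits += 1
    return hits / max(n_lesions, 1)
```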
AI models for prostate cancer detection on ultrasound.
| Study | Cohort size | Input data | Data type | Algorithm | Training labels | Evaluation granularity | Evaluation metric | Source code availability | Task |
|---|---|---|---|---|---|---|---|---|---|
| Sedghi | 157 | TeUS | Retrospective, 1 inst. | DL | Radiologist, path. confirm. from biopsy | Lesion level | Se, Sp, Acc, AUC | No | Lesion |
| Azizi | 163 | TeUS | Retrospective, 2 inst. | DL | Radiologist, path. confirm. from biopsy | Lesion level | Se, Sp, Acc, AUC | No | Lesion |
| Azizi | 157 | TeUS | Retrospective, 1 inst. | DL | Radiologist, path. confirm. from biopsy | Lesion level | Se, Sp, Acc, AUC | No | Lesion |
| Azizi | 155 | TeUS | Retrospective, 1 inst. | DL | Radiologist, path confirm. from biopsy, biopsy length | Patient level | AUC, MSE | No | Lesion |
| Han | 51 | TRUS | N/A | TML | Biopsy | Patient level, Lesion level | Se, Sp, Acc, ROC-AUC | No | Lesion |
| Wildeboer | 50 | TRUS, SWE, DCE-US | Retrospective, 1 inst. | TML | RP, Biopsy | Pixel level, Lesion level | ROC-AUC | No | Lesion |
| Moradi | 16 | RF time series | Retrospective, 1 inst. | TML | RP | Patient level | Se, Sp, Acc, ROC-AUC | No | Lesion |
| Imani | 14 | RF time series | Retrospective, 1 inst. | TML | RP, Biopsy | Patient level | Se, Sp, Acc, ROC-AUC | No | Lesion |
| Hassan | 1151 | TRUS | Retrospective, 1 inst., public data set | TML & DL | Biopsy | Patient level | Acc | No | Lesion |
Acc, accuracy; AI, artificial intelligence; AUC, area under the curve; confirm., confirmation; DCE-US, dynamic contrast-enhanced ultrasound; DL, deep learning; inst., institution; MSE, mean square error; path., pathology; RF, radio frequency; ROC-AUC, receiver operating characteristics–area under the curve; RP, radical prostatectomy; Se, sensitivity; Sp, specificity; SWE, shear-wave elastography; TeUS, temporal enhanced ultrasound; TML, traditional machine learning; TRUS, transrectal ultrasound.
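As a hedged toy example of the temporal enhanced ultrasound (TeUS) idea used by several of the studies above, a tissue region can be characterized by the spectrum of its mean intensity over time across frames and scored with a conventional classifier. The spectral features, classifier choice, and names (`teus_features`) are illustrative only, not the published pipelines.

```python
import numpy as np
from sklearn.svm import SVC

def teus_features(frames, roi, n_coeffs=8):
    """frames: (T, H, W) ultrasound sequence; roi: (H, W) boolean mask.
    Returns low-frequency magnitudes of the mean temporal signal inside the ROI."""
    signal = frames[:, roi].mean(axis=1)                  # (T,) mean intensity per frame
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    return spectrum[:n_coeffs]

# X = np.stack([teus_features(seq, roi) for seq, roi in biopsy_cores])
# y = biopsy-confirmed labels; SVC(probability=True).fit(X, y) gives per-core probabilities
```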
AI models for cancer detection and Gleason grading on prostate histopathology whole slide images (WSI).
| Study | Input data | Cohort size | Data type | Algorithm | Training labels | Evaluation labels | Evaluation granularity | Evaluation metric | Code availability |
|---|---|---|---|---|---|---|---|---|---|
| Lucas | WSI, biopsy | 38 slides | Retrospective, 1 inst. | DL | Pathologist, pixel-level | Pathologist, pixel-level | Patch-based | Se, Sp, F1-score | No |
| Campanella | WSI, biopsy | 15,187 slides | Retrospective, multiple inst. | DL | Reported diagnosis | Reported diagnosis | Slide-level | ROC-AUC | Yes |
| Bulten | WSI, biopsy | 1410 slides | Retrospective, multiple inst. | DL | Pathologists’ reports | Pathologists’ reports, consensus reference standard by 3 expert urologic pathologists | Slide-level | ROC-AUC, F1-score, Acc, Prec, Rec, Sp, NPV | Yes |
| Nagpal | WSI, RP | 1557 slides | Retrospective, multiple inst. | DL | Slide-level & region-level annotations by pathologists | Slide-level & region-level annotations by pathologists | Slide-level | ROC-AUC | No |
| Pinckaers | WSI, biopsy | 5949 slides | Retrospective, multiple inst. | DL | Pathologists’ reports | Pathologists’ reports, consensus reference standard by 3 expert urologic pathologists | Slide-level | ROC-AUC | Yes |
| Ström | WSI, biopsy | 1474 patients, 9001 slides | Prospectively collected, multiple inst. | DL | Annotations by a single experienced urological pathologist | Annotations by 23 experienced urological pathologists | Slide-level | ROC-AUC, Se, Sp, cancer length measurement, Cohen’s kappa | No |
| Marginean | WSI, biopsy | 195 patients, 735 slides | Retrospective, 1 inst., same slides, different scanners | DL | Pixel-level annotations by 2 experienced pathologists | Pixel-level annotations by 2 experienced pathologists | Pixel-level, slide-level | Correlation, Se, Sp | No |
| Kott | WSI, biopsy | 80 patients, 85 slides | Retrospective, 1 inst. | DL | Pixel-level annotations by pathologists | Pixel-level annotations by pathologists | Patch-level | Acc, Se, Sp, Prec | No |
| Li | WSI, RP | 70 patients, 543 slides | Retrospective, 1 inst. | DL & TML | Pixel-level annotations by pathologists | Pixel-level annotations by pathologists | Pixel-level | Overall pixel Acc, IoU | No |
| Ryu | WSI, biopsy | 1833 slides | Retrospective, 2 inst. | DL | Pixel-level annotations by 1 experienced pathologist | Slide-level annotations by 3 experienced pathologists, difficulty-level | Slide-level | Cohen’s kappa, Tumor length | No |
Acc, accuracy; AI, artificial intelligence; DL, deep learning; inst., institution; IoU, intersection over union; NPV, negative predictive value; Prec, precision; Rec, recall; ROC-AUC, receiver operating characteristics–area under the curve; RP, radical prostatectomy; Se, sensitivity; Sp, specificity; TML, traditional machine learning; WSI, whole slide images.
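Most of the whole-slide-image studies above operate on patches rather than the full slide. A minimal sketch of that workflow follows, assuming the slide has already been loaded as an RGB array and that a patch-level scoring model is available; the patch size and max-pooling aggregation are assumptions for illustration, not the published pipelines.

```python
import numpy as np

def tile(slide, size=256):
    """Yield non-overlapping (size, size, 3) patches from an RGB slide array."""
    h, w, _ = slide.shape
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            yield slide[y:y + size, x:x + size]

def slide_score(slide, patch_model):
    """Slide-level cancer probability as the maximum patch probability
    (max-pooling aggregation, a simple multiple-instance-learning rule)."""
    probs = [patch_model(patch) for patch in tile(slide)]
    return float(max(probs)) if probs else 0.0
```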
Figure 3. The AI-predicted [60] automated aggressive (Gleason pattern 4, green) and indolent (Gleason pattern 3, blue) cancers visually match the manual cancer annotations by an expert pathologist (black, yellow, orange, red). (a) Whole-mount histopathology image with (b–d) close-ups of the two cancer lesions. (c) Cancer labels manually outlined by an expert pathologist (black outline) show high agreement with the overall cancer extent (combined blue and green) predicted by the AI model. (b, d) It is impractically time-consuming for a human pathologist to manually assign pixel-level Gleason patterns (yellow, orange, red) to each gland in the detail achieved by the AI model (blue, green).
Figure 4. AI can help in supporting tasks for cancer detection, such as prostate gland segmentation on MRI and ultrasound (left) and MRI-ultrasound registration (right). The AI-predicted prostate segmentations on MRI and ultrasound can help in automated MRI-ultrasound registration, which aligns the two modalities and maps lesions from MRI onto ultrasound. MRI-ultrasound registration helps guide systematic and targeted fusion biopsy procedures.
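As a deliberately crude sketch of how gland segmentations can drive MRI-ultrasound alignment, the example below estimates only a translation from the segmentation centroids and uses it to map an MRI lesion mask into ultrasound space. The methods in the table below are far more sophisticated (rigid, affine, and deformable); this is illustration only, and the function names are hypothetical.

```python
import numpy as np
from scipy import ndimage

def centroid(mask):
    """Center of mass of a binary segmentation."""
    return np.array(ndimage.center_of_mass(mask))

def map_lesion_to_us(lesion_mri, gland_mri, gland_us):
    """Shift an MRI lesion mask into ultrasound space using the offset between the
    gland centroids (assumes both volumes were resampled to the same voxel grid)."""
    offset = centroid(gland_us) - centroid(gland_mri)
    shifted = ndimage.shift(lesion_mri.astype(float), shift=offset, order=0)
    return shifted > 0.5
```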
AI models for registration between MRI (T2w) and ultrasound (TRUS) images.
| Study | Number of subjects | Data type | Approach | Prostate segmentation | Evaluation metric | Source code availability |
|---|---|---|---|---|---|---|
| Hu | 143 | Retrospective | DL | No | TREs, TDR, RMSE | No |
| Hu | 76 | Retrospective | DL | Yes; manual | TREs, DSC | Yes |
| Hu | 76 | Retrospective | DL | Yes; manual | TREs, DSC | No |
| Ghavami | 59 | Retrospective | DL + TML | Yes; DL | DSC, GVE, TREs | No |
| Hu | 80 | Retrospective | DL | Yes; manual | TREs, DSC | Yes |
| Haskins | 679 | Retrospective | DL + TML | No | TREs | No |
| Guo | 679 | Retrospective | DL | Yes; manual | TREs, SRE | Yes |
| Saeed | 320 | Retrospective | DL | Yes; manual | MAE | No |
| Baum | 108 | Retrospective | DL | Yes; manual | TREs, CD, HD | No |
| Zeng | 36 | Retrospective | DL | Yes; manual | TREs, DSC | No |
| Zeng | 36 | Retrospective | DL | Yes; DL | TREs, DSC | No |
| Song | 528 | Retrospective | DL | Yes; manual | SRE | Yes |
| Fu | 50 | Retrospective | DL | Yes; DL | TREs, DSC, MSD, HD | No |
| Guo | 619 | Retrospective | DL | No | TREs, NCC | Yes |
AI, artificial intelligence; CD, chamfer distance; DL, deep learning; DSC, Dice score coefficient; GVE, gland volume error; HD, Hausdorff distance; MAE, mean absolute error; MRI, magnetic resonance imaging; MSD, mean square distance; NCC, normalized cross correlation; RMSE, root mean square error; SRE, surface registration error; T2w, T2-weighted MRI; TDR, tumor detection rate; TML, traditional machine learning; TREs, target registration errors; TRUS, transrectal ultrasound.
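Target registration error (TRE), the most common metric in the table above, measures the residual distance between corresponding anatomical landmarks after the estimated transform is applied. A short sketch, with the transform passed in as a callable and the rigid example in the comment purely illustrative:

```python
import numpy as np

def tre(moving_pts, fixed_pts, transform):
    """moving_pts/fixed_pts: (N, 3) corresponding landmarks in mm;
    transform: callable mapping a moving-space point into fixed space."""
    warped = np.asarray([transform(p) for p in moving_pts])
    return np.linalg.norm(warped - fixed_pts, axis=1)    # per-landmark error in mm

# e.g. for a rigid transform with rotation R (3x3) and translation t (3,):
# errors = tre(pts_mri, pts_trus, lambda p: R @ p + t); errors.mean()
```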
MRI-histopathology registration approaches (not exhaustive) for generating ground truth cancer labels on MRI.
| Study | Number of subjects | Pathology type | Registration type | Intermediate modality | Requires 2D slice correspondences | Prostate sectioning | Source code availability |
|---|---|---|---|---|---|---|---|
| Chappelow | 25 | Whole-mount | Traditional automated | None | Yes | Manual | No |
| Ward | 13 | Whole-mount | Traditional automated | Fiducial markers | Yes | Image-guided | No |
| Kalavagunta | 35 | Pseudo-whole mount | Traditional automated | Manual landmarks | Yes | Sectioning box | No |
| Reynolds | 6 | Whole-mount | Traditional automated | Ex vivo MRI + Manual landmarks | Yes | Sectioning box | No |
| Li | 19 | Pseudo-whole mount | Traditional automated | None | Yes | Manual | No |
| Losnegård | 12 | Whole-mount | Traditional automated | None | No | Manual | No |
| Wu | 17 | Whole-mount | Traditional automated | Ex vivo MRI + fiducial markers | Yes | 3D-printed mold | No |
| Rusu | 157 | Whole-mount | Traditional automated | None | Yes | 3D-printed mold | Yes |
| Shao | 152 | Whole-mount | Deep learning | None | Yes | 3D-printed mold | Yes |
| Sood | 106 | Whole-mount | Traditional automated | None | No | 3D-printed mold | No |
| Shao | 183 | Whole-mount | Deep learning | None | Yes | 3D-printed mold | No |
MRI, magnetic resonance imaging.
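Many of the approaches above assume known 2D correspondences between histopathology sections and MRI slices. As a hedged sketch of the simplest possible slice-level mapping, the prostate masks on the two slices can define a translation plus isotropic scale that is then used to resample pathologist cancer labels onto the MRI grid; real pipelines add rotation and deformable refinement, and the function names here are hypothetical.

```python
import numpy as np
from scipy import ndimage

def slice_similarity(hist_mask, mri_mask):
    """Isotropic scale from the mask area ratio and translation from the centroid
    offset, defining mri_coord = scale * hist_coord + shift."""
    scale = float(np.sqrt(mri_mask.sum() / hist_mask.sum()))
    c_hist = np.array(ndimage.center_of_mass(hist_mask))
    c_mri = np.array(ndimage.center_of_mass(mri_mask))
    return scale, c_mri - scale * c_hist

def labels_to_mri(hist_labels, scale, shift, mri_shape):
    """Resample pathologist cancer labels onto the MRI slice grid (nearest-neighbour).
    affine_transform needs the inverse mapping: hist_coord = (mri_coord - shift) / scale."""
    return ndimage.affine_transform(
        hist_labels.astype(float),
        matrix=np.eye(2) / scale,
        offset=-np.asarray(shift) / scale,
        output_shape=mri_shape,
        order=0,
    )
```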