Nader Aldoj1, Steffen Lukas2, Marc Dewey3,4, Tobias Penzkofer2,5. 1. Department of Radiology, Charité - Universitätsmedizin Berlin, Freie Universität Berlin, Humboldt-Universität zu Berlin, Charitéplatz 1, 10117, Berlin, Germany. nader.aldoj@charite.de. 2. Department of Radiology, Charité - Universitätsmedizin Berlin, Freie Universität Berlin, Humboldt-Universität zu Berlin, Charitéplatz 1, 10117, Berlin, Germany. 3. Department of Radiology, Charité - Universitätsmedizin Berlin, Freie Universität Berlin, Humboldt-Universität zu Berlin, Charitéplatz 1, 10117, Berlin, Germany. dewey@charite.de. 4. Berlin Institute of Health (BIH), Anna-Louisa-Karsch-Str. 2, 10178, Berlin, Germany. dewey@charite.de. 5. Berlin Institute of Health (BIH), Anna-Louisa-Karsch-Str. 2, 10178, Berlin, Germany.
Abstract
OBJECTIVE: To present a deep learning-based approach for semi-automatic prostate cancer classification on multi-parametric magnetic resonance (MR) imaging using a 3D convolutional neural network (CNN). METHODS: Two hundred patients with a total of 318 lesions for which histological correlation was available were analyzed. A novel CNN was designed, trained, and validated using different combinations of distinct MRI sequences as input (T2-weighted, apparent diffusion coefficient (ADC), diffusion-weighted images, and K-trans), and the effect of each sequence on the network's performance was tested and discussed. The particular choice of modeling approach was justified by testing all relevant data combinations. The model was trained and validated using eightfold cross-validation. RESULTS: For the detection of clinically significant prostate cancer, with biopsy results as the reference standard, the 3D CNN achieved an area under the receiver operating characteristic curve (AUC) ranging from 0.89 (sensitivity 88.6%, specificity 90.0%) to 0.91 (sensitivity 81.2%, specificity 90.5%), with an average AUC of 0.897 for the ADC, DWI, and K-trans input combination. The other input combinations performed worse in overall performance and average AUC; the difference was significant for T2w and K-trans (p = 0.02) and for T2w, ADC, and DWI (p = 0.00025). Prostate cancer classification performance is thus comparable to that reported for experienced radiologists using the Prostate Imaging Reporting and Data System (PI-RADS). Lesion size and largest diameter had no effect on the network's performance. CONCLUSION: The diagnostic performance of the 3D CNN in detecting clinically significant prostate cancer is characterized by a good AUC, good sensitivity, and high specificity.
KEY POINTS: • Prostate cancer classification using a deep learning model is feasible and allows direct processing of MR sequences without prior lesion segmentation. • Prostate cancer classification performance as measured by AUC is comparable to that of an experienced radiologist. • Perfusion MR images (K-trans), followed by DWI and ADC, have the greatest effect on overall performance, whereas T2w images contribute hardly any improvement.
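The evaluation protocol described in METHODS — multi-parametric volumes stacked as input channels, eightfold cross-validation, per-fold AUC — can be sketched as follows. This is a minimal illustration only: the synthetic data, patch size, and the logistic-regression stand-in for the paper's 3D CNN are all assumptions made to keep the example self-contained and fast.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: 318 lesions, three co-registered sequence
# volumes (ADC, DWI, K-trans) cropped to 8x8x8 patches and stacked
# as channels, mimicking the best-performing input combination.
n_lesions, channels, d = 318, 3, 8
X = rng.normal(size=(n_lesions, channels, d, d, d))
y = rng.integers(0, 2, size=n_lesions)  # 1 = clinically significant

# Inject a weak signal so the toy classifier has something to learn.
X[y == 1] += 0.3

# Eightfold stratified cross-validation; the paper's 3D CNN is
# replaced here by logistic regression on flattened voxels purely
# for illustration of the evaluation loop.
Xf = X.reshape(n_lesions, -1)
aucs = []
cv = StratifiedKFold(n_splits=8, shuffle=True, random_state=0)
for train_idx, test_idx in cv.split(Xf, y):
    clf = LogisticRegression(max_iter=1000).fit(Xf[train_idx], y[train_idx])
    scores = clf.predict_proba(Xf[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"mean AUC over 8 folds: {np.mean(aucs):.3f}")
```

In the actual study a 3D CNN would replace the flattened-voxel classifier, but the fold splitting and per-fold AUC aggregation would follow the same pattern.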