Alexander Ushinsky1,2, Michelle Bardis3, Justin Glavis-Bloom1, Edward Uchio4, Chanon Chantaduly3, Michael Nguyentat5, Daniel Chow1,3, Peter D Chang1,3, Roozbeh Houshyar1. 1. Department of Radiological Sciences, University of California, Irvine, CA. 2. Present affiliation: Mallinckrodt Institute of Radiology, Washington University Saint Louis, Saint Louis, MO. 3. Center for Artificial Intelligence in Diagnostic Medicine, California Institute for Telecommunications and Information Technology, University of California, 4100 E Peltason Dr, Calit2 Bldg, Ste 4500, Irvine, CA 92617. 4. Department of Urology, University of California, Irvine, CA. 5. Department of Radiology, University of Colorado Anschutz Medical Center, Aurora, CO.
Abstract
OBJECTIVE: Prostate cancer is the most commonly diagnosed cancer in men in the United States, with more than 200,000 new cases in 2018. Multiparametric MRI (mpMRI) is increasingly used for prostate cancer evaluation. Prostate organ segmentation is an essential step of surgical planning for prostate fusion biopsies. Deep learning convolutional neural networks (CNNs) are the predominant method of machine learning for medical image recognition. In this study, we describe a deep learning approach, a subset of artificial intelligence, for automatic localization and segmentation of prostates from mpMRI. MATERIALS AND METHODS: This retrospective study included patients who underwent prostate MRI and ultrasound-MRI fusion transrectal biopsy between September 2014 and December 2016. Axial T2-weighted images were manually segmented by two abdominal radiologists; these segmentations served as the ground truth. The manually segmented images were used to train a customized hybrid 3D-2D U-Net CNN architecture in a fivefold cross-validation paradigm for neural network training and validation. The Dice score, a measure of overlap between manually segmented and automatically derived segmentations, and the Pearson linear correlation coefficient of prostate volume were used for statistical evaluation. RESULTS: The CNN was trained on 299 MRI examinations (total number of MR images = 7774) of 287 patients. The customized hybrid 3D-2D U-Net had a mean Dice score of 0.898 (range, 0.890-0.908) and a Pearson correlation coefficient for prostate volume of 0.974. CONCLUSION: A deep learning CNN can automatically segment the prostate organ from clinical MR images. Further studies should examine developing pattern recognition for lesion localization and quantification.
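The Dice score used for evaluation above can be illustrated with a short sketch. This is a hypothetical example of the overlap metric for binary segmentation masks, not the authors' actual evaluation code; the function name and toy masks are assumptions for illustration.

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks:
    2 * |A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1 (identical)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Toy 4x4 masks: each has 4 foreground pixels, 3 of which overlap,
# so Dice = 2 * 3 / (4 + 4) = 0.75.
a = np.zeros((4, 4), dtype=int); a[1:3, 1:3] = 1
b = np.zeros((4, 4), dtype=int); b[1:3, 1:3] = 1; b[1, 1] = 0; b[0, 0] = 1
print(dice_score(a, b))  # 0.75
```

In volumetric segmentation the same formula is applied to 3D masks; a mean Dice of 0.898, as reported here, indicates close agreement between the automatic and manual contours.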