Olmo Zavala-Romero1, Adrian L Breto1, Isaac R Xu1, Yu-Cherng C Chang2, Nicole Gautney1, Alan Dal Pra1, Matthew C Abramowitz1, Alan Pollack1, Radka Stoyanova3. 1. Department of Radiation Oncology, Sylvester Comprehensive Cancer Center, University of Miami Miller School of Medicine, Miami, FL, USA. 2. University of Miami Miller School of Medicine, Miami, FL, USA. 3. Department of Radiation Oncology, Sylvester Comprehensive Cancer Center, University of Miami Miller School of Medicine, Miami, FL, USA. rstoyanova@med.miami.edu.
Abstract
PURPOSE: To develop a deep-learning-based segmentation algorithm for the prostate and its peripheral zone (PZ) that is reliable across multiple MRI vendors. METHODS: This is a retrospective study. The dataset consisted of 550 MRIs (Siemens: 330; General Electric [GE]: 220). A multistream 3D convolutional neural network was used for automatic segmentation of the prostate and its PZ on T2-weighted (T2-w) MRI. The prostate and PZ were manually contoured on the axial T2-w series. The network takes the axial, coronal, and sagittal T2-w series as input. Preprocessing of the input data includes bias correction, resampling, and image normalization. Data from the two MRI vendors (Siemens and GE) were used to test the proposed network. Six models were trained, three for the prostate and three for the PZ; for each target, two models were trained on data from each vendor separately, and a third (Combined) on the aggregate of both datasets. The Dice coefficient (DSC) was used to compare the manual and predicted segmentations. RESULTS: For prostate segmentation, the Combined model obtained DSCs of 0.893 ± 0.036 and 0.825 ± 0.112 (mean ± standard deviation) on Siemens and GE, respectively. For PZ, the best DSCs were from the Combined model: 0.811 ± 0.079 and 0.788 ± 0.093. While the Siemens model underperformed on the GE dataset and vice versa, the Combined model achieved robust performance on both datasets. CONCLUSION: The proposed network achieves performance comparable to the interexpert variability in segmenting the prostate and its PZ. Combining images from different MRI vendors in the training of the network is of paramount importance for building a universal model for prostate and PZ segmentation.
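The abstract's preprocessing pipeline (bias correction, resampling, intensity normalization) is only named, not specified. As a minimal sketch, the normalization and resampling steps might look like the following in pure NumPy; the bias-correction step (typically N4, e.g. via SimpleITK) is omitted here, and the function names, z-score normalization choice, and nearest-neighbor interpolation are assumptions, not the paper's exact implementation.

```python
import numpy as np

def zscore_normalize(volume: np.ndarray) -> np.ndarray:
    """Standardize voxel intensities to zero mean, unit variance."""
    v = volume.astype(np.float64)
    std = v.std()
    if std == 0:
        return v - v.mean()  # constant volume: only center it
    return (v - v.mean()) / std

def resample_nearest(volume: np.ndarray, factors) -> np.ndarray:
    """Nearest-neighbor resampling by per-axis scale factors."""
    out_shape = tuple(int(round(s * f)) for s, f in zip(volume.shape, factors))
    # For each output index, pick the nearest source index along that axis.
    idx = [np.minimum((np.arange(n) / f).astype(int), s - 1)
           for n, f, s in zip(out_shape, factors, volume.shape)]
    return volume[np.ix_(*idx)]
```

In practice a production pipeline would resample to a fixed voxel spacing with higher-order interpolation; nearest-neighbor is used here only to keep the sketch dependency-free.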
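The Dice coefficient used to score the segmentations has a standard definition: twice the overlap of the two binary masks divided by the sum of their sizes. A minimal NumPy implementation (the function name and the empty-mask convention are assumptions for illustration, not code from the paper):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks.

    DSC = 2 * |pred AND truth| / (|pred| + |truth|), ranging from
    0 (no overlap) to 1 (perfect agreement).
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total
```

Applied slice-by-slice or per-volume to the manual and predicted masks, this yields the per-case DSCs that are averaged into the reported mean ± standard deviation.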
Keywords:
Convolutional neural network; Deep learning; Peripheral zone; Prostate segmentation