PURPOSE: Transrectal ultrasound (TRUS) is a versatile, real-time imaging modality commonly used in image-guided prostate cancer interventions (e.g., biopsy and brachytherapy). Accurate segmentation of the prostate is key to biopsy needle placement, brachytherapy treatment planning, and motion management. Manual segmentation during these interventions is time-consuming and subject to inter- and intraobserver variation. To address these drawbacks, we aimed to develop a deep learning-based method that integrates deep supervision into a three-dimensional (3D) patch-based V-Net for prostate segmentation.

METHODS AND MATERIALS: We developed a multidirectional deep learning-based method to automatically segment the prostate for ultrasound-guided radiation therapy. A 3D supervision mechanism is integrated into the V-Net stages to address the optimization difficulties of training a deep network with limited training data. We combine a binary cross-entropy (BCE) loss and a batch-based Dice loss into a stage-wise hybrid loss function for deep supervision training. During the segmentation stage, patches extracted from a newly acquired ultrasound image are fed to the trained network, which adaptively labels the prostate tissue. The final segmented prostate volume is reconstructed by patch fusion and further refined through contour refinement.

RESULTS: TRUS images from 44 patients were used to test our segmentation method. Segmentation results were compared with manually segmented contours (ground truth). The mean prostate volume Dice similarity coefficient (DSC), Hausdorff distance (HD), mean surface distance (MSD), and residual mean surface distance (RMSD) were 0.92 ± 0.03, 3.94 ± 1.55 mm, 0.60 ± 0.23 mm, and 0.90 ± 0.38 mm, respectively.

CONCLUSION: We developed a novel deeply supervised deep learning-based approach with reliable contour refinement to automatically segment the prostate on TRUS, demonstrated its clinical feasibility, and validated its accuracy against manual segmentation. The proposed technique could be a useful tool for diagnostic and therapeutic applications in prostate cancer.
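The stage-wise hybrid loss is the core of the deep-supervision scheme described above. The sketch below is a minimal PyTorch illustration, assuming sigmoid-activated binary outputs at each decoder stage; the class name `HybridLoss`, the per-stage weights, and the smoothing constant are illustrative assumptions, not values reported in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridLoss(nn.Module):
    """Sketch of a BCE + batch-based Dice hybrid loss (assumed form)."""

    def __init__(self, dice_weight=1.0, smooth=1e-5):
        super().__init__()
        self.bce = nn.BCEWithLogitsLoss()
        self.dice_weight = dice_weight  # relative weight of the Dice term (assumption)
        self.smooth = smooth            # smoothing constant (assumption)

    def forward(self, logits, target):
        # BCE computed on raw logits for numerical stability.
        bce = self.bce(logits, target)
        # Batch-based Dice: overlap and union are pooled over the whole
        # batch rather than averaged per sample, which stabilizes training
        # when some patches contain little or no prostate tissue.
        prob = torch.sigmoid(logits)
        inter = (prob * target).sum()
        union = prob.sum() + target.sum()
        dice = (2.0 * inter + self.smooth) / (union + self.smooth)
        return bce + self.dice_weight * (1.0 - dice)

def deep_supervision_loss(stage_logits, target, criterion, stage_weights):
    """Apply the hybrid loss at every supervised V-Net stage.

    stage_logits : list of decoder-stage outputs (N, 1, D, H, W)
    stage_weights: per-stage weights, e.g. deeper stages down-weighted (assumption)
    """
    total = 0.0
    for logits, w in zip(stage_logits, stage_weights):
        # Upsample coarse stage outputs to the target resolution.
        if logits.shape[2:] != target.shape[2:]:
            logits = F.interpolate(logits, size=target.shape[2:],
                                   mode="trilinear", align_corners=False)
        total = total + w * criterion(logits, target)
    return total
```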
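Reconstructing the full volume from patch-wise predictions can be done by averaging overlapping patch outputs. The abstract does not specify the fusion rule, so the NumPy sketch below assumes a regular sliding-window grid with simple average fusion and a 0.5 threshold; all names and parameters are hypothetical.

```python
import numpy as np

def fuse_patches(patch_preds, corners, volume_shape, patch_size):
    """Average overlapping patch predictions back into a full volume.

    patch_preds : list of (D, H, W) probability arrays, one per patch
    corners     : list of (z, y, x) front-top-left corners for each patch
    volume_shape: shape of the full ultrasound volume
    patch_size  : (D, H, W) extent of each patch
    """
    acc = np.zeros(volume_shape, dtype=np.float32)  # summed probabilities
    cnt = np.zeros(volume_shape, dtype=np.float32)  # overlap counts
    d, h, w = patch_size
    for pred, (z, y, x) in zip(patch_preds, corners):
        acc[z:z + d, y:y + h, x:x + w] += pred
        cnt[z:z + d, y:y + h, x:x + w] += 1.0
    cnt[cnt == 0] = 1.0                 # avoid division by zero in uncovered voxels
    prob = acc / cnt                    # mean probability per voxel
    return (prob >= 0.5).astype(np.uint8)  # binary prostate mask (assumed threshold)
```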