
Esophagus segmentation in CT via 3D fully convolutional neural network and random walk.

Tobias Fechter, Sonja Adebahr, Dimos Baltas, Ismail Ben Ayed, Christian Desrosiers, Jose Dolz.

Abstract

PURPOSE: Precise delineation of organs at risk is a crucial task in radiotherapy treatment planning for delivering high doses to the tumor while sparing healthy tissues. In recent years, automated segmentation methods have shown an increasingly high performance for the delineation of various anatomical structures. However, this task remains challenging for organs like the esophagus, which have a versatile shape and poor contrast to neighboring tissues. For human experts, segmenting the esophagus from CT images is a time-consuming and error-prone process. To tackle these issues, we propose a random walker approach driven by a 3D fully convolutional neural network (CNN) to automatically segment the esophagus from CT images.
METHODS: First, a soft probability map is generated by the CNN. Then, an active contour model (ACM) is fitted to the CNN soft probability map to get a first estimation of the esophagus location. The outputs of the CNN and ACM are then used in conjunction with a probability model based on CT Hounsfield (HU) values to drive the random walker. Training and evaluation were done on 50 CTs from two different datasets, with clinically used peer-reviewed esophagus contours. Results were assessed regarding spatial overlap and shape similarity.
RESULTS: Compared to the reference contours, the esophagus contours generated by the proposed algorithm showed a mean Dice coefficient of 0.76 ± 0.11, an average symmetric square distance of 1.36 ± 0.90 mm, and an average Hausdorff distance of 11.68 ± 6.80 mm. These results indicate very good agreement with the reference contours and improved accuracy over existing methods. Furthermore, on the publicly available Synapse dataset, our method outperformed all approaches reported in the literature, suggesting that it represents the current state of the art for automatic esophagus segmentation.
CONCLUSION: We show that a CNN can yield accurate estimations of esophagus location, and that the results of this model can be refined by a random walk step taking pixel intensities and neighborhood relationships into account. One of the main advantages of our network over previous methods is that it performs 3D convolutions, thus fully exploiting the 3D spatial context and performing an efficient volume-wise prediction. The whole segmentation process is fully automatic and yields esophagus delineations in very good agreement with the gold standard, showing that it can compete with previously published methods.
© 2017 American Association of Physicists in Medicine.
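The refinement step described in METHODS (seeding a random walk from the CNN's soft probability map, then letting pixel intensities and neighborhood relationships decide the uncertain voxels) can be sketched with scikit-image's `random_walker`. This is an illustrative reconstruction, not the authors' implementation: the seed thresholds `hi`/`lo`, `beta`, and the toy volume are assumptions, and the paper's active contour fit and HU-based probability model are omitted.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.segmentation import random_walker

def refine_with_random_walker(prob_map, volume, hi=0.9, lo=0.1, beta=130):
    """Refine a soft probability map with a random walk over image intensities.

    Voxels where prob_map >= hi seed the foreground (label 1) and voxels
    where prob_map <= lo seed the background (label 2); the walker assigns
    the remaining voxels from intensity-weighted neighborhood relationships.
    """
    seeds = np.zeros(prob_map.shape, dtype=np.int32)
    seeds[prob_map >= hi] = 1   # confident foreground seeds
    seeds[prob_map <= lo] = 2   # confident background seeds
    labels = random_walker(volume, seeds, beta=beta, mode='bf')
    return labels == 1          # boolean mask of the target structure

# Toy 3D stand-in for a CT: a bright tube in a dark, slightly noisy volume.
zz, yy, xx = np.mgrid[0:16, 0:32, 0:32]
tube = (yy - 16) ** 2 + (xx - 16) ** 2 < 36
vol = tube.astype(float)
vol += 0.05 * np.random.default_rng(0).standard_normal(vol.shape)
vol = np.clip(vol, 0.0, 1.0)

# Fake "CNN output": a blurred copy of the true mask, so an uncertain band
# of mid-range probabilities is left for the walker to resolve.
prob = gaussian_filter(tube.astype(float), sigma=2)

mask = refine_with_random_walker(prob, vol)
```

In the paper, the walker is additionally driven by the active contour estimate and a Hounsfield-value probability model; the sketch above shows only the core idea of turning confident CNN probabilities into seeds and letting the intensity-weighted walk resolve the boundary region.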

Keywords:  CT; convolutional neural network; esophagus; image processing; segmentation

Year:  2017        PMID: 28940372     DOI: 10.1002/mp.12593

Source DB:  PubMed          Journal:  Med Phys        ISSN: 0094-2405            Impact factor:   4.071


  10 in total

1.  Automated Segmentation of Tissues Using CT and MRI: A Systematic Review.

Authors:  Leon Lenchik; Laura Heacock; Ashley A Weaver; Robert D Boutin; Tessa S Cook; Jason Itri; Christopher G Filippi; Rao P Gullapalli; James Lee; Marianna Zagurovskaya; Tara Retson; Kendra Godwin; Joey Nicholson; Ponnada A Narayana
Journal:  Acad Radiol       Date:  2019-08-10       Impact factor: 3.173

2.  [Review] Artificial Intelligence: reshaping the practice of radiological sciences in the 21st century.

Authors:  Issam El Naqa; Masoom A Haider; Maryellen L Giger; Randall K Ten Haken
Journal:  Br J Radiol       Date:  2020-02-01       Impact factor: 3.039

3.  3D convolutional neural networks for detection and severity staging of meniscus and PFJ cartilage morphological degenerative changes in osteoarthritis and anterior cruciate ligament subjects.

Authors:  Valentina Pedoia; Berk Norman; Sarah N Mehany; Matthew D Bucknor; Thomas M Link; Sharmila Majumdar
Journal:  J Magn Reson Imaging       Date:  2018-10-10       Impact factor: 4.813

4.  Deep convolutional neural networks for automatic segmentation of thoracic organs-at-risk in radiation oncology - use of non-domain transfer learning.

Authors:  Charles C Vu; Zaid A Siddiqui; Leonid Zamdborg; Andrew B Thompson; Thomas J Quinn; Edward Castillo; Thomas M Guerrero
Journal:  J Appl Clin Med Phys       Date:  2020-06       Impact factor: 2.102

5.  [Review] Applications and limitations of machine learning in radiation oncology.

Authors:  Daniel Jarrett; Eleanor Stride; Katherine Vallis; Mark J Gooding
Journal:  Br J Radiol       Date:  2019-06-05       Impact factor: 3.629

6.  Magnetic resonance-guided high-intensity focused ultrasound of uterine fibroids: whole-tumor quantitative perfusion for prediction of immediate ablation response.

Authors:  Chenxia Li; Chao Jin; Ting Liang; Xiang Li; Rong Wang; Yuelang Zhang; Jian Yang
Journal:  Acta Radiol       Date:  2019-11-28       Impact factor: 1.990

7.  Learning to detect boundary information for brain image segmentation.

Authors:  Afifa Khaled; Jian-Jun Han; Taher A Ghaleb
Journal:  BMC Bioinformatics       Date:  2022-08-11       Impact factor: 3.307

8.  COVID-19 ground-glass opacity segmentation based on fuzzy c-means clustering and improved random walk algorithm.

Authors:  Guowei Wang; Shuli Guo; Lina Han; Zhilei Zhao; Xiaowei Song
Journal:  Biomed Signal Process Control       Date:  2022-09-12       Impact factor: 5.076

9.  Multi-Scale Squeeze U-SegNet with Multi Global Attention for Brain MRI Segmentation.

Authors:  Chaitra Dayananda; Jae-Young Choi; Bumshik Lee
Journal:  Sensors (Basel)       Date:  2021-05-12       Impact factor: 3.576

10.  [Review] Review of Deep Learning Based Automatic Segmentation for Lung Cancer Radiotherapy.

Authors:  Xi Liu; Kai-Wen Li; Ruijie Yang; Li-Sheng Geng
Journal:  Front Oncol       Date:  2021-07-08       Impact factor: 6.244

