
Training deep-learning segmentation models from severely limited data.

Yao Zhao, Dong Joo Rhee, Carlos Cardenas, Laurence E Court, Jinzhong Yang.

Abstract

PURPOSE: To enable generation of high-quality deep learning segmentation models from severely limited contoured cases (e.g., ~10 cases).
METHODS: Thirty head and neck computed tomography (CT) scans with well-defined contours were deformably registered to 200 CT scans of the same anatomic site without contours. The acquired deformation vector fields were used to train a principal component analysis (PCA) model for each of the 30 contoured CT scans by capturing the mean deformation and the most prominent variations. Each PCA model can produce an infinite number of synthetic CT scans and corresponding contours by applying random deformations. We used 300, 600, 1000, and 2000 synthetic CT scans and contours generated from one PCA model to train V-Net, a 3D convolutional neural network architecture, to segment parotid and submandibular glands. We repeated the training using the same numbers of training cases generated from 7, 10, 20, and 30 PCA models, with the data distributed evenly across the PCA models. Performance of the segmentation models was evaluated with Dice similarity coefficients between auto-generated contours and physician-drawn contours on 162 test CT scans for parotid glands and another 21 test CT scans for submandibular glands.
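The augmentation step described above (a PCA model over deformation vector fields, then random sampling of mode coefficients to produce new deformations) can be sketched in NumPy. The function names and the Gaussian sampling of per-mode coefficients are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fit_pca_deformation_model(dvfs, n_components=3):
    """Fit a PCA model to deformation vector fields (DVFs).

    dvfs: array of shape (n_registrations, n_voxels * 3); each row is a
    flattened DVF from registering one contoured CT to an uncontoured CT.
    Returns the mean deformation, the top principal modes, and the
    standard deviation of the data along each mode.
    """
    mean = dvfs.mean(axis=0)
    centered = dvfs - mean
    # SVD of the centered data yields the principal modes of deformation.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    modes = vt[:n_components]                      # (k, n_voxels * 3)
    stds = s[:n_components] / np.sqrt(len(dvfs))   # per-mode std dev
    return mean, modes, stds

def sample_synthetic_dvf(mean, modes, stds, rng):
    """Draw a random deformation: mean plus randomly weighted modes.

    Applying the returned DVF to the contoured CT (and its contours)
    yields one synthetic training case.
    """
    coeffs = rng.normal(0.0, stds)   # one random coefficient per mode
    return mean + coeffs @ modes
```

Because the coefficients are drawn continuously, a single fitted model can generate an unlimited number of distinct synthetic deformations, which is what allows 300 to 2000 training cases per PCA model.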
RESULTS: Dice values varied with the number of synthetic CT scans and the number of PCA models used to train the network. By using 2000 synthetic CT scans generated from 10 PCA models, we achieved Dice values of 82.8% ± 6.8% for right parotid, 82.0% ± 6.9% for left parotid, and 74.2% ± 6.8% for submandibular glands. These results are comparable with those obtained from state-of-the-art auto-contouring approaches, including a deep learning network trained from more than 1000 contoured patients and a multi-atlas algorithm from 12 well-contoured atlases. Improvement was marginal when >10 PCA models or >2000 synthetic CT scans were used.
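The evaluation metric reported above, the Dice similarity coefficient, is defined as 2|A∩B| / (|A| + |B|) for two binary masks A and B. A minimal NumPy implementation (the empty-mask convention is an assumption for illustration):

```python
import numpy as np

def dice_coefficient(a, b):
    """Dice similarity coefficient between two binary segmentation masks."""
    a = np.asarray(a).astype(bool)
    b = np.asarray(b).astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Two 3-voxel masks overlapping in 2 voxels: Dice = 2*2 / (3+3) ~= 0.667
print(dice_coefficient([1, 1, 1, 0], [0, 1, 1, 1]))
```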
CONCLUSIONS: We demonstrated an effective data augmentation approach to train high-quality deep learning segmentation models from a limited number of well-contoured patient cases.
© 2021 American Association of Physicists in Medicine.

Keywords:  Auto-segmentation; convolutional neural networks; data augmentation; deep learning; principal component analysis

Year:  2021        PMID: 33474727      PMCID: PMC8058262          DOI: 10.1002/mp.14728

Source DB:  PubMed          Journal:  Med Phys        ISSN: 0094-2405            Impact factor:   4.071


References (26 in total; first 10 shown)

1.  Heterogeneity in head and neck IMRT target design and clinical practice.

Authors:  Theodore S Hong; Wolfgang A Tomé; Paul M Harari
Journal:  Radiother Oncol       Date:  2012-03-09       Impact factor: 6.280

2.  A pseudoinverse deformation vector field generator and its applications.

Authors:  C Yan; H Zhong; M Murphy; E Weiss; J V Siebers
Journal:  Med Phys       Date:  2010-03       Impact factor: 4.071

3.  Discriminative Unsupervised Feature Learning with Exemplar Convolutional Neural Networks.

Authors:  Alexey Dosovitskiy; Philipp Fischer; Jost Tobias Springenberg; Martin Riedmiller; Thomas Brox
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2015-10-29       Impact factor: 6.226

4.  Advances in Auto-Segmentation. [Review]

Authors:  Carlos E Cardenas; Jinzhong Yang; Brian M Anderson; Laurence E Court; Kristy B Brock
Journal:  Semin Radiat Oncol       Date:  2019-07       Impact factor: 5.934

5.  Fully Convolutional Networks for Semantic Segmentation.

Authors:  Evan Shelhamer; Jonathan Long; Trevor Darrell
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2016-05-24       Impact factor: 6.226

6.  Generative adversarial network in medical imaging: A review.

Authors:  Xin Yi; Ekta Walia; Paul Babyn
Journal:  Med Image Anal       Date:  2019-08-31       Impact factor: 8.545

7.  Brain Tumor Segmentation Using Convolutional Neural Networks in MRI Images.

Authors:  Sergio Pereira; Adriano Pinto; Victor Alves; Carlos A Silva
Journal:  IEEE Trans Med Imaging       Date:  2016-03-04       Impact factor: 10.048

8.  Segmentation of organs-at-risks in head and neck CT images using convolutional neural networks.

Authors:  Bulat Ibragimov; Lei Xing
Journal:  Med Phys       Date:  2017-02       Impact factor: 4.071

9.  Performance evaluation of automatic anatomy segmentation algorithm on repeat or four-dimensional computed tomography images using deformable image registration method.

Authors:  He Wang; Adam S Garden; Lifei Zhang; Xiong Wei; Anesa Ahamad; Deborah A Kuban; Ritsuko Komaki; Jennifer O'Daniel; Yongbin Zhang; Radhe Mohan; Lei Dong
Journal:  Int J Radiat Oncol Biol Phys       Date:  2008-09-01       Impact factor: 7.038

10.  Automatic detection of contouring errors using convolutional neural networks.

Authors:  Dong Joo Rhee; Carlos E Cardenas; Hesham Elhalawani; Rachel McCarroll; Lifei Zhang; Jinzhong Yang; Adam S Garden; Christine B Peterson; Beth M Beadle; Laurence E Court
Journal:  Med Phys       Date:  2019-09-26       Impact factor: 4.071

Cited by (3 in total)

1.  Feasibility of Continual Deep Learning-Based Segmentation for Personalized Adaptive Radiation Therapy in Head and Neck Area.

Authors:  Nalee Kim; Jaehee Chun; Jee Suk Chang; Chang Geol Lee; Ki Chang Keum; Jin Sung Kim
Journal:  Cancers (Basel)       Date:  2021-02-09       Impact factor: 6.639

2.  Knowledge-based planning for the radiation therapy treatment plan quality assurance for patients with head and neck cancer.

Authors:  Wenhua Cao; Mary Gronberg; Adenike Olanrewaju; Thomas Whitaker; Karen Hoffman; Carlos Cardenas; Adam Garden; Heath Skinner; Beth Beadle; Laurence Court
Journal:  J Appl Clin Med Phys       Date:  2022-04-30       Impact factor: 2.243

3.  Auto-segmentation for total marrow irradiation.

Authors:  William Tyler Watkins; Kun Qing; Chunhui Han; Susanta Hui; An Liu
Journal:  Front Oncol       Date:  2022-08-30       Impact factor: 5.738

