Literature DB >> 28035663

Using deep learning to segment breast and fibroglandular tissue in MRI volumes.

Mehmet Ufuk Dalmış, Geert Litjens, Katharina Holland, Arnaud Setio, Ritse Mann, Nico Karssemeijer, Albert Gubern-Mérida.

Abstract

PURPOSE: Automated segmentation of breast and fibroglandular tissue (FGT) is required for various computer-aided applications of breast MRI. Traditional image analysis and computer vision techniques, such as atlas-based methods, template matching, or edge and surface detection, have been applied to solve this task. However, the applicability of these methods is usually limited by the characteristics of the images in the study datasets, while breast MRI varies with respect to the different MRI protocols used, in addition to the variability in breast shapes. All this variability, together with various MRI artifacts, makes developing a robust breast and FGT segmentation method using traditional approaches a challenging task. Therefore, in this study, we investigated the use of a deep-learning approach known as "U-net."
MATERIALS AND METHODS: We used a dataset of 66 breast MRI scans randomly selected from our scientific archive, which includes five different MRI acquisition protocols and breasts from four breast density categories in a balanced distribution. To prepare reference segmentations, we manually segmented breast and FGT for all images using an in-house developed workstation. We experimented with the application of U-net in two different ways for breast and FGT segmentation. In the first method, following the same pipeline used in traditional approaches, we trained two consecutive (2C) U-nets: the first for segmenting the breast in the whole MRI volume and the second for segmenting FGT inside the segmented breast. In the second method, we used a single 3-class (3C) U-net, which performs both tasks simultaneously by segmenting the volume into three regions: nonbreast, fat inside the breast, and FGT inside the breast. For comparison, we applied two existing and published methods to our dataset: an atlas-based method and a sheetness-based method. We used the Dice Similarity Coefficient (DSC) to measure the performance of the automated methods with respect to the manual segmentations. Additionally, we computed Pearson's correlation between the breast density values computed from manual and automated segmentations.
RESULTS: The average DSC values for breast segmentation were 0.933, 0.944, 0.863, and 0.848 for the 3C U-net, 2C U-nets, atlas-based method, and sheetness-based method, respectively. The average DSC values for FGT segmentation obtained from the 3C U-net, 2C U-nets, and atlas-based method were 0.850, 0.811, and 0.671, respectively. The correlation between breast density values based on 3C U-net and manual segmentations was 0.974. This value was significantly higher than the 0.957 obtained from the 2C U-nets (P < 0.0001, Steiger's Z-test with Bonferroni correction) and the 0.938 obtained from the atlas-based method (P = 0.0016).
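The evaluation metrics described above can be sketched in a few lines. The following is a minimal illustration, not the authors' code: `dice_similarity` and `breast_density` are hypothetical helper names, and the toy 2D masks stand in for the 3D MRI volumes used in the study.

```python
import numpy as np

def dice_similarity(seg_a: np.ndarray, seg_b: np.ndarray) -> float:
    """Dice Similarity Coefficient between two binary masks:
    DSC = 2 * |A intersect B| / (|A| + |B|)."""
    a = seg_a.astype(bool)
    b = seg_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

def breast_density(fgt_mask: np.ndarray, breast_mask: np.ndarray) -> float:
    """Breast density as the fraction of breast voxels labeled FGT."""
    return fgt_mask.astype(bool).sum() / breast_mask.astype(bool).sum()

# Toy 2D masks standing in for segmented MRI volumes.
manual = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 0]])
auto   = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]])
print(round(dice_similarity(manual, auto), 3))  # 0.857

# Agreement of per-case density values across a dataset would then be
# summarized with Pearson's correlation, e.g.:
dens_manual = np.array([0.10, 0.25, 0.40, 0.60])
dens_auto   = np.array([0.12, 0.24, 0.43, 0.58])
r = np.corrcoef(dens_manual, dens_auto)[0, 1]
```

The overlap in the toy example is 3 voxels against mask sizes of 4 and 3, giving DSC = 6/7 ≈ 0.857; identical masks give 1.0.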
CONCLUSIONS: We applied a deep-learning method, U-net, to segment breast and FGT in MRI on a dataset that includes a variety of MRI protocols and breast densities. Our results showed that the U-net-based methods significantly outperformed the existing algorithms and yielded significantly more accurate breast density computation.
© 2016 American Association of Physicists in Medicine.

Keywords:  MRI; breast segmentation; deep learning

Year:  2017        PMID: 28035663     DOI: 10.1002/mp.12079

Source DB:  PubMed          Journal:  Med Phys        ISSN: 0094-2405            Impact factor:   4.071


Citing articles: 46 in total

1.  Convolutional Neural Networks for the Detection and Measurement of Cerebral Aneurysms on Magnetic Resonance Angiography.

Authors:  Joseph N Stember; Peter Chang; Danielle M Stember; Michael Liu; Jack Grinband; Christopher G Filippi; Philip Meyers; Sachin Jambawalikar
Journal:  J Digit Imaging       Date:  2019-10       Impact factor: 4.056

2.  Automated Segmentation of Tissues Using CT and MRI: A Systematic Review.

Authors:  Leon Lenchik; Laura Heacock; Ashley A Weaver; Robert D Boutin; Tessa S Cook; Jason Itri; Christopher G Filippi; Rao P Gullapalli; James Lee; Marianna Zagurovskaya; Tara Retson; Kendra Godwin; Joey Nicholson; Ponnada A Narayana
Journal:  Acad Radiol       Date:  2019-08-10       Impact factor: 3.173

3.  A 3D deep convolutional neural network approach for the automated measurement of cerebellum tracer uptake in FDG PET-CT scans.

Authors:  Xiaofan Xiong; Timothy J Linhardt; Weiren Liu; Brian J Smith; Wenqing Sun; Christian Bauer; John J Sunderland; Michael M Graham; John M Buatti; Reinhard R Beichel
Journal:  Med Phys       Date:  2020-01-06       Impact factor: 4.071

4.  Automatic Breast and Fibroglandular Tissue Segmentation in Breast MRI Using Deep Learning by a Fully-Convolutional Residual Neural Network U-Net.

Authors:  Yang Zhang; Jeon-Hor Chen; Kai-Ting Chang; Vivian Youngjean Park; Min Jung Kim; Siwa Chan; Peter Chang; Daniel Chow; Alex Luk; Tiffany Kwong; Min-Ying Su
Journal:  Acad Radiol       Date:  2019-01-31       Impact factor: 3.173

5.  A deep learning framework for efficient analysis of breast volume and fibroglandular tissue using MR data with strong artifacts.

Authors:  Tatyana Ivanovska; Thomas G Jentschke; Amro Daboul; Katrin Hegenscheid; Henry Völzke; Florentin Wörgötter
Journal:  Int J Comput Assist Radiol Surg       Date:  2019-03-06       Impact factor: 2.924

6.  Quantitative Volumetric K-Means Cluster Segmentation of Fibroglandular Tissue and Skin in Breast MRI.

Authors:  Anton Niukkanen; Otso Arponen; Aki Nykänen; Amro Masarwah; Anna Sutela; Timo Liimatainen; Ritva Vanninen; Mazen Sudah
Journal:  J Digit Imaging       Date:  2018-08       Impact factor: 4.056

7. [Review] Deep learning beyond cats and dogs: recent advances in diagnosing breast cancer with deep neural networks.

Authors:  Jeremy R Burt; Neslisah Torosdagli; Naji Khosravan; Harish RaviPrakash; Aliasghar Mortazi; Fiona Tissavirasingham; Sarfaraz Hussein; Ulas Bagci
Journal:  Br J Radiol       Date:  2018-04-10       Impact factor: 3.039

8.  Context-aware stacked convolutional neural networks for classification of breast carcinomas in whole-slide histopathology images.

Authors:  Babak Ehteshami Bejnordi; Guido Zuidhof; Maschenka Balkenhol; Meyke Hermsen; Peter Bult; Bram van Ginneken; Nico Karssemeijer; Geert Litjens; Jeroen van der Laak
Journal:  J Med Imaging (Bellingham)       Date:  2017-12-14

9.  Fully automated detection of breast cancer in screening MRI using convolutional neural networks.

Authors:  Mehmet Ufuk Dalmış; Suzan Vreemann; Thijs Kooi; Ritse M Mann; Nico Karssemeijer; Albert Gubern-Mérida
Journal:  J Med Imaging (Bellingham)       Date:  2018-01-11

10.  Fully automated multiorgan segmentation in abdominal magnetic resonance imaging with deep neural networks.

Authors:  Yuhua Chen; Dan Ruan; Jiayu Xiao; Lixia Wang; Bin Sun; Rola Saouaf; Wensha Yang; Debiao Li; Zhaoyang Fan
Journal:  Med Phys       Date:  2020-08-30       Impact factor: 4.071

