Wei Zha1, Sean B Fain1,2,3, Mark L Schiebler2, Michael D Evans4, Scott K Nagle1,2,5, Fang Liu2. 1. Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin, USA. 2. Department of Radiology, University of Wisconsin-Madison, Madison, Wisconsin, USA. 3. Department of Biomedical Engineering, University of Wisconsin-Madison, Madison, Wisconsin, USA. 4. Clinical and Translational Science Institute, University of Minnesota Twin Cities, Minneapolis, Minnesota, USA. 5. Department of Pediatrics, University of Wisconsin-Madison, Madison, Wisconsin, USA.
Abstract
BACKGROUND: Ultrashort echo time (UTE) proton MRI has gained popularity for assessing lung structure and function in pulmonary imaging; however, the development of rapid biomarker extraction and regional quantification has lagged behind due to labor-intensive lung segmentation. PURPOSE: To evaluate a deep learning (DL) approach for automated lung segmentation to extract image-based biomarkers from functional lung imaging using 3D radial UTE oxygen-enhanced (OE) MRI. STUDY TYPE: Retrospective study aimed to evaluate a technical development. POPULATION: Forty-five human subjects, including 16 healthy volunteers, 5 asthma, and 24 patients with cystic fibrosis. FIELD STRENGTH/SEQUENCE: 1.5T MRI, 3D radial UTE (TE = 0.08 msec) sequence. ASSESSMENT: Two 3D radial UTE volumes were acquired sequentially under normoxic (21% O2 ) and hyperoxic (100% O2 ) conditions. Automated segmentation of the lungs using 2D convolutional encoder-decoder based DL method, and the subsequent functional quantification via adaptive K-means were compared with the results obtained from the reference method, supervised region growing. STATISTICAL TESTS: Relative to the reference method, the performance of DL on volumetric quantification was assessed using Dice coefficient with 95% confidence interval (CI) for accuracy, two-sided Wilcoxon signed-rank test for computation time, and Bland-Altman analysis on the functional measure derived from the OE images. RESULTS: The DL method produced strong agreement with supervised region growing for the right (Dice: 0.97; 95% CI = [0.96, 0.97]; P < 0.001) and left lungs (Dice: 0.96; 95% CI = [0.96, 0.97]; P < 0.001). The DL method averaged 46 seconds to generate the automatic segmentations in contrast to 1.93 hours using the reference method (P < 0.001). Bland-Altman analysis showed nonsignificant intermethod differences of volumetric (P ≥ 0.12) and functional measurements (P ≥ 0.34) in the left and right lungs. 
DATA CONCLUSION: DL provides rapid, automated, and robust lung segmentation for quantification of regional lung function using UTE proton MRI. LEVEL OF EVIDENCE: 2 Technical Efficacy: Stage 1 J. Magn. Reson. Imaging 2019;50:1169-1181.
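The segmentation-accuracy metric reported above, the Dice coefficient, measures overlap between two binary masks as 2|A ∩ B| / (|A| + |B|). A minimal sketch of how such a comparison between an automatic and a reference lung mask could be computed is shown below; the `dice_coefficient` helper and the toy masks are illustrative assumptions, not the authors' code.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks:
    2 * |A intersect B| / (|A| + |B|)."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy example: two overlapping "lung" masks on a small grid
auto_mask = np.zeros((4, 4), dtype=bool)
manual_mask = np.zeros((4, 4), dtype=bool)
auto_mask[1:3, 1:3] = True    # 4 voxels
manual_mask[1:3, 1:4] = True  # 6 voxels, 4 of them shared
print(dice_coefficient(auto_mask, manual_mask))  # 2*4/(4+6) = 0.8
```

In practice the same computation would be applied per lung (left and right separately, as in the study) on the 3D masks, with the 95% CI estimated across subjects.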