Max Torop1, Satya V V N Kothapalli2, Yu Sun1, Jiaming Liu3, Sayan Kahali2, Dmitriy A Yablonskiy2, Ulugbek S Kamilov1,3. 1. Department of Computer Science and Engineering, Washington University in St. Louis, St. Louis, MO, USA. 2. Department of Radiology, Washington University in St. Louis, St. Louis, MO, USA. 3. Department of Electrical and Systems Engineering, Washington University in St. Louis, St. Louis, MO, USA.
Abstract
PURPOSE: To introduce a novel deep learning method for Robust and Accelerated Reconstruction (RoAR) of quantitative and B0-inhomogeneity-corrected R2* maps from multi-gradient recalled echo (mGRE) MRI data. METHODS: RoAR trains a convolutional neural network (CNN) to generate quantitative R2* maps free from field inhomogeneity artifacts by adopting a self-supervised learning strategy given (a) mGRE magnitude images, (b) the biophysical model describing mGRE signal decay, and (c) a preliminarily evaluated F-function accounting for the contribution of macroscopic B0 field inhomogeneities. Importantly, no ground-truth R2* images are required, and the F-function is needed only during RoAR training, not during application. RESULTS: We show that RoAR preserves all features of R2* maps while offering significant improvements over existing methods in computation speed (seconds vs. hours) and reduced sensitivity to noise. Even for data with SNR = 5, RoAR produced R2* maps with an accuracy of 22%, while voxel-wise analysis accuracy was 47%. For SNR = 10, RoAR accuracy improved to 17% vs. 24% for direct voxel-wise analysis. CONCLUSIONS: RoAR is trained to recognize macroscopic magnetic field inhomogeneities directly from the input magnitude-only mGRE data and to eliminate their effect on R2* measurements. RoAR training is based on the biophysical model and does not require ground-truth R2* maps. Since RoAR utilizes signal information not just from individual voxels but also accounts for spatial patterns of the signals in the images, it reduces the sensitivity of R2* maps to noise in the data. These features, plus high computational speed, provide significant benefits for the potential use of RoAR in clinical settings.
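The self-supervised strategy described above fits the biophysical mGRE decay model, S(TE) = S0 · exp(−R2*·TE) · F(TE), where F(TE) accounts for macroscopic B0 inhomogeneities. A minimal numpy sketch of this forward model and the resulting self-supervised residual loss is shown below; the function names, array shapes, and the Gaussian form used here for F are illustrative assumptions, not the paper's implementation (in RoAR, a CNN would predict the S0 and R2* maps and be trained to minimize such a loss):

```python
import numpy as np

def mgre_forward(s0, r2s, F, tes):
    """Biophysical mGRE model: S(TE) = S0 * exp(-R2* * TE) * F(TE).

    s0, r2s : (H, W) parameter maps (S0 in signal units, R2* in 1/s)
    F       : (H, W, N) macroscopic-field-inhomogeneity factor per echo
    tes     : (N,) echo times in seconds
    """
    return s0[..., None] * np.exp(-r2s[..., None] * tes) * F

def self_supervised_loss(s0_pred, r2s_pred, F, tes, measured):
    """Mean squared residual between model-predicted and measured magnitudes.

    No ground-truth R2* map is used -- only the measured mGRE magnitudes
    and the F-function, matching the training setup described in the abstract.
    """
    pred = np.abs(mgre_forward(s0_pred, r2s_pred, F, tes))
    return np.mean((pred - np.abs(measured)) ** 2)

# Synthetic check: maps consistent with the measured signal give ~zero loss.
tes = np.linspace(0.004, 0.040, 10)                  # 10 echoes, 4-40 ms
s0 = np.full((8, 8), 100.0)
r2s = np.full((8, 8), 20.0)                          # 20 1/s
# Illustrative Gaussian stand-in for the F-function, same for every voxel.
F = np.tile(np.exp(-0.5 * (tes * 5.0) ** 2), (8, 8, 1))
measured = mgre_forward(s0, r2s, F, tes)
loss = self_supervised_loss(s0, r2s, F, tes, measured)
```

In the actual method, `s0_pred` and `r2s_pred` would come from the CNN given the magnitude mGRE images; because F appears only inside the loss, it is needed during training but not when applying the trained network, consistent with the abstract.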