P Su1,2, S Guo1,3, S Roys1,3, F Maier4, H Bhat2, E R Melhem1, D Gandhi1, R P Gullapalli1,3, J Zhuo1,3. 1. Department of Diagnostic Radiology and Nuclear Medicine (P.S., S.G., S.R., E.R.M., D.G., R.G., J.Z.), University of Maryland School of Medicine, Baltimore, Maryland. 2. Siemens Medical Solutions USA (P.S., H.B.), Malvern, Pennsylvania. 3. Center for Metabolic Imaging and Therapeutics (S.G., S.R., R.G., J.Z.), University of Maryland Medical Center, Baltimore, Maryland. 4. Siemens Healthcare GmbH (F.M.), Erlangen, Germany. Correspondence: J. Zhuo, jzhuo@umm.edu.
Abstract
BACKGROUND AND PURPOSE: Transcranial MR imaging-guided focused ultrasound is a promising technique for treating multiple disorders and diseases. Treatment planning requires both a CT scan, for skull density estimation and treatment-planning simulation, and an MR imaging scan, for target identification. Simplifying this clinical workflow is desirable. The purpose of this study was to examine the feasibility of using deep learning to convert MR imaging ultrashort TE images directly into synthetic CT images of the skull for use in transcranial MR imaging-guided focused ultrasound treatment planning.

MATERIALS AND METHODS: A U-Net neural network was trained and tested on data obtained from 41 subjects (mean age, 66.4 ± 11.0 years; 15 women). The derived model was evaluated with k-fold cross-validation. Derived acoustic properties were verified by comparing the whole-skull density ratio from the deep learning-synthesized CT of the skull with that from the reference CT of the skull. In addition, acoustic and temperature simulations were performed with the deep learning CT to predict the target temperature rise during transcranial MR imaging-guided focused ultrasound.

RESULTS: The derived deep learning model generates synthetic CT images of the skull that are highly comparable with the true CT images. Their intensities in Hounsfield units have a spatial correlation coefficient of 0.80 ± 0.08, a mean absolute error of 104.57 ± 21.33 HU, and a subject-wise correlation coefficient of 0.91. Furthermore, the deep learning CT of the skull is reliable for skull-density ratio estimation (r = 0.96). A simulation study showed that both the peak target temperatures and the temperature distribution from the deep learning CT are comparable with those from the reference CT.
CONCLUSIONS: The deep learning method can be used to simplify the workflow associated with transcranial MR imaging-guided focused ultrasound treatment planning.
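The evaluation metrics reported above (mean absolute error in Hounsfield units, voxel-wise correlation between synthetic and reference CT, and the skull-density ratio) can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' code: the function names, the skull mask, and the simplified single-path SDR (minimum-to-maximum HU along a beam path through the skull, echoing the clinical definition averaged over transducer elements) are all assumptions.

```python
import numpy as np


def evaluate_synthetic_ct(sct, ct, mask):
    """Compare a synthetic CT against the reference CT within a skull mask.

    sct, ct : arrays of HU values (same shape); mask : boolean skull mask.
    Returns (mean absolute error in HU, Pearson correlation coefficient).
    """
    s, c = sct[mask], ct[mask]
    mae = float(np.mean(np.abs(s - c)))       # voxel-wise MAE in HU
    r = float(np.corrcoef(s, c)[0, 1])        # spatial correlation
    return mae, r


def skull_density_ratio(hu_profile):
    """Simplified SDR along one beam path through the skull:
    minimum (trabecular) over maximum (cortical) HU."""
    hu_profile = np.asarray(hu_profile, dtype=float)
    return float(np.min(hu_profile) / np.max(hu_profile))
```

Comparing the SDR computed this way on the deep learning CT against the reference CT, path by path, would yield the kind of per-subject agreement (r = 0.96) reported in the Results.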