| Literature DB >> 30009283 |
Medical Image Synthesis with Context-Aware Generative Adversarial Networks
Dong Nie, Roger Trullo, Jun Lian, Caroline Petitjean, Su Ruan, Qian Wang, Dinggang Shen.
Abstract
Computed tomography (CT) is critical for various clinical applications, e.g., radiation treatment planning and PET attenuation correction in MRI/PET scanners. However, CT acquisition exposes patients to ionizing radiation, which may cause side effects. Compared to CT, magnetic resonance imaging (MRI) is much safer and does not involve radiation. Therefore, researchers have recently been motivated to estimate a CT image from the corresponding MR image of the same subject for radiation treatment planning. In this paper, we propose a data-driven approach to address this challenging problem. Specifically, we train a fully convolutional network (FCN) to generate a CT image given the MR image. To better model the nonlinear mapping from MRI to CT and produce more realistic images, we propose an adversarial training strategy for the FCN. Moreover, we propose an image-gradient-difference-based loss function to alleviate the blurriness of the generated CT. We further apply the Auto-Context Model (ACM) to implement a context-aware generative adversarial network. Experimental results show that our method is accurate and robust for predicting CT images from MR images, and also outperforms three state-of-the-art methods under comparison.
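The abstract's image-gradient-difference loss penalizes mismatches between the spatial gradients of the synthesized and ground-truth CT, discouraging over-smoothed output. A minimal NumPy sketch of one common formulation (the paper's exact loss may differ in detail, e.g., in how gradients and norms are combined):

```python
import numpy as np

def gradient_difference_loss(pred, target):
    """Sum over axes of squared differences between absolute
    forward-difference gradients of prediction and target."""
    loss = 0.0
    for axis in range(pred.ndim):
        # Forward differences approximate the spatial gradient along this axis.
        gp = np.diff(pred, axis=axis)
        gt = np.diff(target, axis=axis)
        # Penalize mismatch of gradient magnitudes (sharp edges must match).
        loss += np.sum((np.abs(gt) - np.abs(gp)) ** 2)
    return loss
```

Used as an auxiliary term alongside the voxel-wise reconstruction loss and the adversarial loss, this pushes the generator to reproduce edges rather than average them away.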
Keywords: Auto-context; Deep learning; GAN; Generative models; Image synthesis
Year: 2017 PMID: 30009283 PMCID: PMC6044459 DOI: 10.1007/978-3-319-66179-7_48
Source DB: PubMed Journal: Med Image Comput Comput Assist Interv