Zheng Xu, Sheng Wang, Yeqing Li, Feiyun Zhu, Junzhou Huang.
Abstract
Much of the recent work in parallel Magnetic Resonance Imaging (pMRI) has been devoted to reducing acquisition time. The joint total variation (JTV) regularized model has proven a powerful tool for increasing sampling speed in pMRI, but its major bottleneck is the inefficiency of the optimization method: all present state-of-the-art optimizers for the JTV model reach only a sublinear convergence rate. In this paper, we propose a linearly convergent optimization method for the JTV model, based on the Iterative Reweighted Least Squares (IRLS) algorithm. Because of the complexity of the coupled JTV objective, we design a novel preconditioner that further accelerates the proposed method. Extensive experiments demonstrate the superior performance of the proposed algorithm for pMRI in both accuracy and efficiency compared with state-of-the-art methods.
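To illustrate the IRLS idea behind the abstract, the sketch below applies it to a toy joint-TV denoising problem: multi-channel 1D signals sharing edge locations, where each IRLS step recomputes joint weights from the channel gradients and solves a weighted least-squares system. This is a minimal illustration under simplifying assumptions (identity forward operator, dense solves instead of preconditioned conjugate gradient, 1D finite differences); it is not the paper's actual algorithm or preconditioner, and all function and variable names here are invented for the example.

```python
import numpy as np

def jtv_irls_denoise(B, lam=1.0, eps=1e-6, iters=30):
    """Toy IRLS solver for joint-TV denoising of multi-channel 1D signals.

    Minimizes  sum_c ||x_c - b_c||^2 + lam * sum_i sqrt(sum_c (D x_c)_i^2),
    where D is a forward-difference operator shared by all channels.
    B: (C, N) array of noisy channels; returns a (C, N) denoised array.
    """
    C, N = B.shape
    # Forward-difference matrix D of shape (N-1, N): (Dx)_i = x_{i+1} - x_i.
    D = np.eye(N - 1, N, k=1) - np.eye(N - 1, N)
    X = B.copy()
    for _ in range(iters):
        # Joint weights couple the channels: one weight per gradient
        # location, computed from the gradient magnitude across channels.
        G = X @ D.T                                    # (C, N-1) gradients
        w = 1.0 / np.sqrt((G ** 2).sum(axis=0) + eps)  # eps avoids 1/0
        # Reweighted least-squares normal equations, solved per channel.
        # (The paper solves this step with preconditioned CG instead.)
        H = np.eye(N) + lam * D.T @ (w[:, None] * D)
        X = np.linalg.solve(H, B.T).T
    return X

# Two piecewise-constant channels with shared edge locations, plus noise.
rng = np.random.default_rng(0)
truth = np.vstack([np.repeat([0.0, 1.0, 0.2], 20),
                   np.repeat([0.5, -0.5, 1.0], 20)])
noisy = truth + 0.3 * rng.standard_normal(truth.shape)
denoised = jtv_irls_denoise(noisy, lam=1.0)
print("noisy MAE:   ", np.abs(noisy - truth).mean())
print("denoised MAE:", np.abs(denoised - truth).mean())
```

Sharing one weight per location across all channels is what makes the penalty *joint*: an edge detected in any channel lowers the smoothing weight there for every channel, so common edges survive while independent noise is averaged out.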
Keywords: Iterative reweighted least squares; Joint total variation; Parallel MRI; Preconditioning conjugate gradient descent
Year: 2018 PMID: 29423650 DOI: 10.1007/s12021-017-9354-9
Source DB: PubMed Journal: Neuroinformatics ISSN: 1539-2791