
Few-view CT reconstruction with group-sparsity regularization.

Peng Bao, Jiliu Zhou, Yi Zhang.

Abstract

The classical total variation (TV)-based iterative reconstruction algorithm is effective for reconstructing piecewise-smooth images, but it oversmooths textured regions in the reconstructed image. To address this problem, this work presents a novel computed tomography (CT) reconstruction method for the few-view problem: a group-sparsity-regularized simultaneous algebraic reconstruction technique (SART). Group-based sparse representation, which takes a group rather than a single patch as the basic unit of sparse representation, is introduced as the image-domain prior regularization term to eliminate the oversmoothing effect. By clustering nonlocal patches according to their similarity, measured by Euclidean distance, the sparsity and nonlocal self-similarity within a single image are exploited simultaneously. The split Bregman iteration algorithm is applied to derive the numerical scheme. Experimental results demonstrate that the proposed method outperforms, both qualitatively and quantitatively, several existing reconstruction methods, including filtered back projection (FBP), SART, total variation-based projections onto convex sets (TV-POCS), and SART-based dictionary learning.
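The grouping step described in the abstract (collecting nonlocal patches whose similarity is measured by Euclidean distance) can be illustrated with a minimal block-matching sketch. This is not the authors' implementation; the function names, patch size, and group size below are illustrative assumptions.

```python
import numpy as np

def extract_patches(image, patch_size=8, stride=4):
    """Extract overlapping patches and return them as flattened row vectors,
    together with their top-left coordinates."""
    h, w = image.shape
    patches, coords = [], []
    for i in range(0, h - patch_size + 1, stride):
        for j in range(0, w - patch_size + 1, stride):
            patches.append(image[i:i + patch_size, j:j + patch_size].ravel())
            coords.append((i, j))
    return np.array(patches), coords

def group_similar_patches(patches, ref_idx, group_size=16):
    """For a reference patch, find the `group_size` nearest patches by
    Euclidean distance and stack them into one group matrix, which would
    then serve as the unit of sparse coding in group-based methods."""
    dists = np.linalg.norm(patches - patches[ref_idx], axis=1)
    nearest = np.argsort(dists)[:group_size]
    return patches[nearest], nearest

# Toy usage on a random "image" standing in for an intermediate reconstruction.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
patches, coords = extract_patches(img)
group, members = group_similar_patches(patches, ref_idx=0, group_size=5)
```

In a full group-sparsity scheme, each such group matrix would be sparsely coded (e.g., via a low-rank or dictionary-based model) inside the SART iteration loop, with the split Bregman algorithm handling the variable splitting.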
Copyright © 2018 John Wiley & Sons, Ltd.

Keywords:  computed tomography; few-view reconstruction; sparse representation; total variation

MeSH:

Year:  2018        PMID: 29737620     DOI: 10.1002/cnm.3101

Source DB:  PubMed          Journal:  Int J Numer Method Biomed Eng        ISSN: 2040-7939            Impact factor:   2.747


Related articles:  1 in total

1.  Accelerated Stimulated Raman Projection Tomography by Sparse Reconstruction From Sparse-View Data.

Authors:  Xueli Chen; Shouping Zhu; Huiyuan Wang; Cuiping Bao; Defu Yang; Chi Zhang; Peng Lin; Ji-Xin Cheng; Yonghua Zhan; Jimin Liang; Jie Tian
Journal:  IEEE Trans Biomed Eng       Date:  2019-08-14       Impact factor: 4.538


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.