
Convolutional Dictionary Learning: Acceleration and Convergence.

Il Yong Chun, Jeffrey A Fessler.   

Abstract

Convolutional dictionary learning (CDL, or sparsifying CDL) has many applications in image processing and computer vision. There has been growing interest in developing efficient algorithms for CDL, mostly relying on the augmented Lagrangian (AL) method or its variant, the alternating direction method of multipliers (ADMM). When their parameters are properly tuned, AL methods have shown fast convergence in CDL. However, the parameter tuning process is not trivial due to its data dependence and, in practice, the convergence of AL methods depends on the AL parameters for nonconvex CDL problems. To mitigate these problems, this paper proposes a new, practically feasible, and convergent Block Proximal Gradient method using a Majorizer (BPG-M) for CDL. The BPG-M-based CDL is investigated with different block updating schemes and majorization matrix designs, and is further accelerated by incorporating momentum coefficient formulas and restarting techniques. All of the methods investigated incorporate a boundary artifact removal (or, more generally, sampling) operator in the learning model. Numerical experiments show that, without needing any parameter tuning process, the proposed BPG-M approach converges more stably to solutions of lower objective values than the existing state-of-the-art ADMM algorithm and its memory-efficient variant do. Compared with the ADMM approaches, the BPG-M method using a multi-block updating scheme is particularly useful in single-threaded CDL algorithms handling large data sets, due to its lower memory requirement and the absence of polynomial-order computational complexity. Image denoising experiments show that, for relatively strong additive white Gaussian noise, the filters learned by BPG-M-based CDL outperform those trained by the ADMM approach.
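The core ingredients the abstract describes (a proximal gradient step scaled by a majorization matrix instead of a tuned step size, momentum extrapolation, and adaptive restarting) can be illustrated on a much simpler surrogate problem. The sketch below is not the paper's CDL algorithm: it applies a single-block, BPG-M-style update to a lasso problem min_x 0.5||Ax - b||² + λ||x||₁, using a diagonal majorizer M = diag(|Aᵀ||A|1) ⪰ AᵀA so that no Lipschitz constant or AL parameter needs tuning. The function name `bpgm_lasso` and the specific restart rule (drop momentum when the objective increases) are illustrative choices, not the authors' implementation.

```python
import numpy as np

def soft(x, t):
    """Elementwise soft-thresholding; `t` may be a vector of per-coordinate thresholds."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def bpgm_lasso(A, b, lam, iters=200):
    """Majorized proximal gradient with momentum and function-value restart (illustrative).

    Solves min_x 0.5*||Ax - b||^2 + lam*||x||_1 using a diagonal majorizer
    M = diag(|A|^T |A| 1) >= A^T A, so no step-size tuning is needed.
    """
    # Diagonal majorizer: M_i = sum_j |A|^T|A| entries, guaranteed >= A^T A.
    M = np.abs(A).T @ (np.abs(A) @ np.ones(A.shape[1]))
    M = np.maximum(M, 1e-12)                      # guard against zero columns

    x = np.zeros(A.shape[1])
    x_prev = x.copy()
    t = 1.0                                       # momentum coefficient state
    obj_prev = np.inf
    for _ in range(iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum extrapolation
        grad = A.T @ (A @ y - b)
        x_prev = x
        x = soft(y - grad / M, lam / M)               # majorizer-scaled prox step
        obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
        if obj > obj_prev:
            t_next = 1.0                              # restart: reset momentum
        obj_prev = obj
        t = t_next
    return x
```

The same pattern generalizes to multi-block updates (alternating between filters and sparse codes, each with its own majorizer), which is how the paper's BPG-M handles the nonconvex CDL objective without AL-parameter tuning.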

Year:  2017        PMID: 28991744     DOI: 10.1109/TIP.2017.2761545

Source DB:  PubMed          Journal:  IEEE Trans Image Process        ISSN: 1057-7149            Impact factor:   10.856


  5 in total

1.  Convolutional Analysis Operator Learning: Acceleration and Convergence.

Authors:  Il Yong Chun; Jeffrey A Fessler
Journal:  IEEE Trans Image Process       Date:  2019-09-02       Impact factor: 10.856

2.  Convolutional Analysis Operator Learning: Dependence on Training Data.

Authors:  Il Yong Chun; David Hong; Ben Adcock; Jeffrey A Fessler
Journal:  IEEE Signal Process Lett       Date:  2019-06-07       Impact factor: 3.109

3.  PALMNUT: An Enhanced Proximal Alternating Linearized Minimization Algorithm with Application to Separate Regularization of Magnitude and Phase.

Authors:  Yunsong Liu; Justin P Haldar
Journal:  IEEE Trans Comput Imaging       Date:  2021-05-06

4.  Image Reconstruction: From Sparsity to Data-adaptive Methods and Machine Learning.

Authors:  Saiprasad Ravishankar; Jong Chul Ye; Jeffrey A Fessler
Journal:  Proc IEEE Inst Electr Electron Eng       Date:  2019-09-19       Impact factor: 10.961

5.  Momentum-Net: Fast and convergent iterative neural network for inverse problems.

Authors:  Il Yong Chun; Zhengyu Huang; Hongki Lim; Jeffrey A Fessler
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2020-07-29       Impact factor: 6.226

