Saiprasad Ravishankar, Anna Ma, Deanna Needell.
Abstract
Sparsity-based models and techniques have been exploited in many signal processing and imaging applications. Data-driven methods based on dictionary and sparsifying transform learning enable learning rich image features from data and can outperform analytical models. In particular, alternating optimization algorithms have been popular for learning such models. In this work, we focus on alternating minimization for a specific structured unitary sparsifying operator learning problem and provide a convergence analysis. While the algorithm generally converges to the critical points of the problem, our analysis establishes, under mild assumptions, the local linear convergence of the algorithm to the underlying sparsifying model of the data. Analysis and numerical simulations show that our assumptions hold for standard probabilistic data models. In practice, the algorithm is robust to initialization.
Keywords: alternating minimization; convergence guarantees; dictionary learning; fast algorithms; generative models; sparse representations; transform learning
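The abstract describes alternating minimization for unitary sparsifying operator learning. A minimal sketch of the standard two-step alternation for this class of problems is below, assuming the common formulation min_{W, Z} ||W X - Z||_F^2 with W unitary and each column of Z s-sparse: a sparse-coding step by hard thresholding, and a closed-form operator update via the SVD (orthogonal Procrustes). Function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def learn_unitary_transform(X, s, n_iters=50, seed=0):
    """Alternating-minimization sketch for unitary sparsifying operator
    learning: minimize ||W X - Z||_F^2 over unitary W and columns of Z
    with at most s nonzeros each. Names and defaults are illustrative."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthogonal initialization (the paper notes robustness to init).
    W, _ = np.linalg.qr(rng.standard_normal((n, n)))
    Z = np.zeros_like(X)
    for _ in range(n_iters):
        # Sparse-coding step: keep the s largest-magnitude entries per column.
        Y = W @ X
        Z = np.zeros_like(Y)
        idx = np.argpartition(np.abs(Y), -s, axis=0)[-s:]
        np.put_along_axis(Z, idx, np.take_along_axis(Y, idx, axis=0), axis=0)
        # Operator update: orthogonal Procrustes solution via SVD of Z X^T.
        U, _, Vt = np.linalg.svd(Z @ X.T)
        W = U @ Vt
    return W, Z
```

Both subproblems have exact closed-form solutions, which is what makes the per-iteration behavior of this alternation amenable to the kind of local convergence analysis the abstract refers to.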
Year: 2019 PMID: 33343894 PMCID: PMC7737167 DOI: 10.1093/imaiai/iaz028
Source DB: PubMed Journal: Inf Inference ISSN: 2049-8764