| Literature DB >> 27765996 |
Donghwan Kim, Jeffrey A. Fessler.
Abstract
We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle [5] recently described a numerical method for computing the N-iteration optimal step coefficients in a class of first-order algorithms that includes gradient methods, heavy-ball methods [15], and Nesterov's fast gradient methods [10,12]. However, the numerical method in [5] is computationally expensive for large N, and the corresponding numerically optimized first-order algorithm in [5] requires impractical memory and computation for large-scale optimization problems. In this paper, we propose optimized first-order algorithms that achieve a convergence bound that is two times smaller than for Nesterov's fast gradient methods; our bound is found analytically and refines the numerical bound in [5]. Furthermore, the proposed optimized first-order methods have efficient forms that are remarkably similar to Nesterov's fast gradient methods.Entities:
Keywords: Convergence bound; Fast gradient methods; First-order algorithms; Smooth convex minimization
Year: 2015 PMID: 27765996 PMCID: PMC5067109 DOI: 10.1007/s10107-015-0949-3
Source DB: PubMed Journal: Math Program ISSN: 0025-5610 Impact factor: 3.995