Taming the Wild: A Unified Analysis of Hogwild!-Style Algorithms
Christopher De Sa, Ce Zhang, Kunle Olukotun, Christopher Ré.
Abstract
Stochastic gradient descent (SGD) is a ubiquitous algorithm for a variety of machine learning problems. Researchers and industry have developed several techniques to optimize SGD's runtime performance, including asynchronous execution and reduced precision. Our main result is a martingale-based analysis that enables us to capture the rich noise models that may arise from such techniques. Specifically, we use our new analysis in three ways: (1) we derive convergence rates for the convex case (Hogwild!) with relaxed assumptions on the sparsity of the problem; (2) we analyze asynchronous SGD algorithms for non-convex matrix problems including matrix completion; and (3) we design and analyze an asynchronous SGD algorithm, called Buckwild!, that uses lower-precision arithmetic. We show experimentally that our algorithms run efficiently for a variety of problems on modern hardware.
Year: 2015 | PMID: 27330264 | PMCID: PMC4907892
Source DB: PubMed | Journal: Adv Neural Inf Process Syst | ISSN: 1049-5258
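Below is a minimal sketch, not the paper's implementation, of the two mechanisms the abstract names: Hogwild!-style lock-free asynchronous SGD and Buckwild!-style lower-precision updates via unbiased stochastic rounding. The problem (toy sparse least squares) and all names and constants (NUM_WORKERS, STEP_SIZE, SCALE, stochastic_round, etc.) are illustrative assumptions; note also that CPython's GIL serializes most of this loop, so the sketch demonstrates the update pattern rather than real parallel speedups, which require native threads over shared memory.

```python
# Hypothetical sketch of Hogwild!-style lock-free SGD with Buckwild!-style
# low-precision updates; constants and helper names are illustrative only.
import threading
import numpy as np

NUM_WORKERS = 4           # hypothetical worker count
STEPS_PER_WORKER = 25_000
STEP_SIZE = 0.05
SCALE = 2.0 ** -10        # fixed-point grid spacing for low-precision updates

d = 20
rng = np.random.default_rng(0)
x_true = rng.normal(size=d)   # ground-truth model the workers try to recover
w = np.zeros(d)               # shared model, written without any locks

def stochastic_round(v, local_rng):
    # Unbiased rounding to the fixed-point grid: round v/SCALE down or up at
    # random so that E[result] == v, the key property low-precision SGD needs.
    q = v / SCALE
    lo = np.floor(q)
    return SCALE * (lo + (local_rng.random(v.shape) < (q - lo)))

def worker(seed):
    local_rng = np.random.default_rng(seed)
    for _ in range(STEPS_PER_WORKER):
        # Sample one sparse example touching only a few coordinates.
        idx = local_rng.choice(d, size=3, replace=False)
        a = local_rng.normal(size=3)
        y = a @ x_true[idx]
        # Stochastic gradient of 0.5*(a.w[idx] - y)^2 on the active coordinates.
        g = (a @ w[idx] - y) * a
        # Lock-free write: other workers may read or write w concurrently; this
        # race-induced noise is the kind a martingale-based analysis can model.
        # Since w starts at 0 and every increment lies on the SCALE grid, w
        # itself stays representable in low-precision fixed point.
        w[idx] -= stochastic_round(STEP_SIZE * g, local_rng)

threads = [threading.Thread(target=worker, args=(s,)) for s in range(NUM_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("distance to optimum:", np.linalg.norm(w - x_true))
```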