| Literature DB >> 35248809 |
Tao Sun, Dongsheng Li.
Abstract
Sign-based Stochastic Gradient Descent (sign-based SGD) methods use only the signs of the stochastic gradients to reduce communication costs. However, existing convergence results for sign-based SGDs applied to finite-sum optimization rely on a bounded-gradient assumption, which fails to hold in many cases. This paper presents a convergence framework for sign-based SGDs that eliminates the bounded-gradient assumption. Ergodic convergence rates are established under only a smoothness assumption on the objective functions. The Sign Stochastic Gradient Descent (signSGD) algorithm and two of its variants, a majority-vote version and a zeroth-order version, are developed for different application settings. Our framework also removes the bounded-gradient assumption used in previous analyses of these three algorithms.
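The sign-based update described in the abstract can be sketched as follows. This is a minimal illustration of the generic signSGD step (parameter minus learning rate times the sign of the stochastic gradient), not the paper's specific algorithm or its majority-vote or zeroth-order variants; the function name and the toy quadratic objective are illustrative assumptions.

```python
import numpy as np

def signsgd_step(x, grad, lr=0.01):
    # Generic signSGD update: move each coordinate by lr in the
    # direction opposite the gradient's sign. Because only the sign
    # of each gradient coordinate is used, a distributed worker needs
    # to communicate just one bit (or trit) per coordinate.
    return x - lr * np.sign(grad)

# Toy usage (illustrative): minimize f(x) = ||x||^2 / 2, whose
# gradient at x is simply x.
x = np.array([1.0, -2.0, 0.5])
for _ in range(250):
    x = signsgd_step(x, x, lr=0.01)
# Each coordinate walks toward 0 in fixed steps of lr, then
# oscillates within roughly lr of the minimizer.
```

With a constant step size the iterates do not converge exactly; they hover within about `lr` of the solution, which is why analyses such as this paper's state ergodic (averaged) rates under diminishing or carefully chosen step sizes.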
Keywords: Convergence; Majority vote; Sign Stochastic Gradient Descent; Unbounded gradient; Zeroth-order
Year: 2022 PMID: 35248809 DOI: 10.1016/j.neunet.2022.02.012
Source DB: PubMed Journal: Neural Netw ISSN: 0893-6080