
Sign Stochastic Gradient Descents without bounded gradient assumption for the finite sum minimization.

Tao Sun, Dongsheng Li.

Abstract

Sign-based Stochastic Gradient Descent methods (sign-based SGDs) transmit only the signs of the stochastic gradients in order to reduce communication costs. However, existing convergence results for sign-based SGDs applied to finite-sum optimization rely on the bounded gradient assumption, which fails to hold in many cases. This paper presents a convergence framework for sign-based SGDs that eliminates the bounded gradient assumption. Ergodic convergence rates are derived under only the smoothness assumption on the objective functions. The Sign Stochastic Gradient Descent (signSGD) algorithm and two of its variants, a majority-vote version and a zeroth-order version, are developed for different application settings. Our framework also removes the bounded gradient assumption used in previous analyses of these three algorithms.
Copyright © 2022 Elsevier Ltd. All rights reserved.
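
For reference, the update rule behind signSGD as described in the abstract is x_{k+1} = x_k - gamma * sign(g_k), where g_k is a stochastic gradient of the finite-sum objective. Below is a minimal illustrative sketch of that iteration; the function names, step size, and the least-squares toy objective are assumptions for demonstration only, not the paper's code or experiments.

    import numpy as np

    def sign_sgd(stoch_grad, x0, step_size=1e-3, num_iters=1000):
        # Minimal signSGD sketch (illustrative assumption, not the paper's implementation):
        # x_{k+1} = x_k - step_size * sign(g_k), where g_k is a stochastic gradient.
        # Only the sign of each coordinate is used, so one bit per coordinate
        # suffices for communication.
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(num_iters):
            g = stoch_grad(x)                  # stochastic gradient oracle
            x = x - step_size * np.sign(g)     # sign-based update
        return x

    # Toy usage on a finite-sum least-squares objective f(x) = (1/n) * sum_i (a_i^T x - b_i)^2:
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)

    def stoch_grad(x, batch=10):
        idx = rng.integers(0, 100, size=batch)             # sample a mini-batch of component functions
        return 2 * A[idx].T @ (A[idx] @ x - b[idx]) / batch

    x_est = sign_sgd(stoch_grad, np.zeros(5), step_size=1e-2, num_iters=2000)

The majority-vote and zeroth-order variants mentioned in the abstract modify only how g_k is formed (aggregating worker signs by majority, or estimating the gradient from function values), while keeping the same sign-based update.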

Keywords:  Convergence; Majority vote; Sign Stochastic Gradient Descent; Unbounded gradient; Zeroth-order

Year:  2022        PMID: 35248809     DOI: 10.1016/j.neunet.2022.02.012

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080

