
Deterministic convergence of an online gradient method for BP neural networks.

Wei Wu, Guorui Feng, Zhengxue Li, Yuesheng Xu.

Abstract

Online gradient methods are widely used for training feedforward neural networks. We prove in this paper a convergence theorem for an online gradient method with variable step size for backward propagation (BP) neural networks with a hidden layer. Unlike most of the convergence results that are of probabilistic and nonmonotone nature, the convergence result that we establish here has a deterministic and monotone nature.
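The setting described in the abstract can be illustrated with a short sketch: a one-hidden-layer sigmoid network trained by an online (per-sample) gradient method with a variable, diminishing step size. This is a minimal illustration under assumed choices, not the paper's construction: the schedule `eta_k = eta0 / (1 + k)`, the weight initialization, and the function names (`train_online`, `mse`) are all illustrative assumptions.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def train_online(X, y, n_hidden=5, eta0=0.1, epochs=50, seed=0):
    """Online gradient training of a one-hidden-layer sigmoid network.

    Weights are updated after each individual sample (online mode), with a
    variable step size eta_k = eta0 / (1 + k) that diminishes as the total
    update count k grows. The schedule is an illustrative assumption.
    """
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    V = rng.normal(scale=0.5, size=(n_hidden, n_in))  # input-to-hidden weights
    w = rng.normal(scale=0.5, size=n_hidden)          # hidden-to-output weights
    k = 0
    for _ in range(epochs):
        for x, t in zip(X, y):           # online: one update per sample
            eta = eta0 / (1.0 + k)       # variable (diminishing) step size
            h = sigmoid(V @ x)           # hidden-layer activations
            out = sigmoid(w @ h)         # scalar network output
            err = out - t
            # gradients of the squared error 0.5 * (out - t)^2
            delta_out = err * out * (1.0 - out)
            grad_w = delta_out * h
            grad_V = np.outer(delta_out * w * h * (1.0 - h), x)
            w -= eta * grad_w
            V -= eta * grad_V
            k += 1
    return V, w


def mse(V, w, X, y):
    """Mean squared error of the trained network over a dataset."""
    preds = sigmoid(sigmoid(X @ V.T) @ w)
    return float(np.mean((preds - y) ** 2))
```

With a sufficiently small step size the per-sample updates shrink over time, which is the mechanism behind the monotone, deterministic decrease of the error that the theorem establishes; the sketch above only demonstrates the update rule, not the proof's conditions on the step-size sequence.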


Year:  2005        PMID: 15940984     DOI: 10.1109/TNN.2005.844903

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  3 in total

1.  Deterministic convergence of chaos injection-based gradient method for training feedforward neural networks.

Authors:  Huisheng Zhang; Ying Zhang; Dongpo Xu; Xiaodong Liu
Journal:  Cogn Neurodyn       Date:  2015-01-01       Impact factor: 5.082

2.  Convergence analysis of fully complex backpropagation algorithm based on Wirtinger calculus.

Authors:  Huisheng Zhang; Xiaodong Liu; Dongpo Xu; Ying Zhang
Journal:  Cogn Neurodyn       Date:  2014-01-03       Impact factor: 5.082

3.  Fractional-Order Deep Backpropagation Neural Network.

Authors:  Chunhui Bao; Yifei Pu; Yi Zhang
Journal:  Comput Intell Neurosci       Date:  2018-07-03

Beijing Coyote Bioscience Co., Ltd. © 2022-2023.