
Deterministic convergence of chaos injection-based gradient method for training feedforward neural networks.

Huisheng Zhang, Ying Zhang, Dongpo Xu, Xiaodong Liu.

Abstract

It has been shown that adding a chaotic sequence to the weight updates during training makes the chaos injection-based gradient method (CIBGM) superior to the standard backpropagation algorithm. This paper presents a theoretical convergence analysis of CIBGM for training feedforward neural networks, covering both batch learning and online learning. Under mild conditions, we prove weak convergence, i.e., the training error tends to a constant and the gradient of the error function tends to zero. With one extra condition, we also obtain strong convergence of CIBGM. The theoretical results are substantiated by a simulation example.

Keywords:  Batch learning; Chaos injection-based gradient method; Convergence; Feedforward neural networks; Online learning

Year:  2015        PMID: 25972981      PMCID: PMC4427592          DOI: 10.1007/s11571-014-9323-z

Source DB:  PubMed          Journal:  Cogn Neurodyn        ISSN: 1871-4080            Impact factor:   5.082
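
The abstract describes CIBGM only at a high level: a gradient step with a chaotic sequence added to the weight update, whose influence must vanish for convergence. A minimal sketch on a toy linear model is given below; the logistic-map generator, the decaying amplitude schedule amp * beta**k, and all constants are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Hedged sketch of a chaos injection-based gradient method (CIBGM).
# Assumption: the chaotic sequence comes from the logistic map and its
# amplitude decays geometrically; the paper's abstract does not fix a
# specific generator or schedule.

rng = np.random.default_rng(0)

# Toy regression data: y = 2x - 1 plus small noise
X = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * X - 1.0 + 0.05 * rng.standard_normal(100)

w = np.zeros(2)       # [slope, intercept]
eta = 0.1             # learning rate
amp, beta = 0.5, 0.9  # injection amplitude and its decay factor
z = 0.3               # logistic-map state: z_{k+1} = 4 z_k (1 - z_k)

for k in range(300):
    err = w[0] * X + w[1] - y
    grad = np.array([np.mean(err * X), np.mean(err)])  # batch gradient
    z = 4.0 * z * (1.0 - z)  # next term of the chaotic sequence in (0, 1)
    # Gradient step plus a chaotic perturbation; the amplitude decays to
    # zero, which is what a convergence analysis of this kind relies on.
    w = w - eta * grad - amp * beta**k * (z - 0.5)

print(w)  # slope and intercept settle near the true values (2, -1)
```

Because the perturbation amplitude is summable, the iteration behaves like plain batch gradient descent in the limit; the online variant analyzed in the paper would instead use the gradient of a single sample per step.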


References (19 in total)

1.  Parameter convergence and learning curves for neural networks.

Authors:  T L Fine; S Mukherjee
Journal:  Neural Comput       Date:  1999-04-01       Impact factor: 2.026

2.  Convergence and objective functions of some fault/noise-injection-based online learning algorithms for RBF networks.

Authors:  Kevin I-J Ho; Chi-Sing Leung; John Sum
Journal:  IEEE Trans Neural Netw       Date:  2010-04-12

3.  Convergence of gradient method with momentum for two-layer feedforward neural networks.

Authors:  Naimin Zhang; Wei Wu; Gaofeng Zheng
Journal:  IEEE Trans Neural Netw       Date:  2006-03

4.  A simple procedure for pruning back-propagation trained neural networks.

Authors:  E D Karnin
Journal:  IEEE Trans Neural Netw       Date:  1990

5.  Novel tracking function of moving target using chaotic dynamics in a recurrent neural network model.

Authors:  Yongtao Li; Shigetoshi Nara
Journal:  Cogn Neurodyn       Date:  2007-10-09       Impact factor: 5.082

6.  Chaotic neural network applied to two-dimensional motion control.

Authors:  Hiroyuki Yoshida; Shuhei Kurata; Yongtao Li; Shigetoshi Nara
Journal:  Cogn Neurodyn       Date:  2009-12-11       Impact factor: 5.082

7.  Convergence analysis of fully complex backpropagation algorithm based on Wirtinger calculus.

Authors:  Huisheng Zhang; Xiaodong Liu; Dongpo Xu; Ying Zhang
Journal:  Cogn Neurodyn       Date:  2014-01-03       Impact factor: 5.082

8.  Convergence analyses on on-line weight noise injection-based training algorithms for MLPs.

Authors:  John Sum; Chi-Sing Leung; Kevin Ho
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2012-11       Impact factor: 10.451

9.  Deterministic convergence of an online gradient method for BP neural networks.

Authors:  Wei Wu; Guorui Feng; Zhengxue Li; Yuesheng Xu
Journal:  IEEE Trans Neural Netw       Date:  2005-05

10.  Noise-induced spatiotemporal patterns in Hodgkin-Huxley neuronal network.

Authors:  Ying Wu; Jiajia Li; Shaobao Liu; Jiazhi Pang; Mengmeng Du; Pan Lin
Journal:  Cogn Neurodyn       Date:  2013-02-05       Impact factor: 5.082

Cited by (2 in total)

1.  Towards a fourth spatial dimension of brain activity.

Authors:  Arturo Tozzi; James F Peters
Journal:  Cogn Neurodyn       Date:  2016-02-03       Impact factor: 5.082

2.  Convergence of batch gradient learning with smoothing regularization and adaptive momentum for neural networks.

Authors:  Qinwei Fan; Wei Wu; Jacek M Zurada
Journal:  Springerplus       Date:  2016-03-08
