
Two highly efficient second-order algorithms for training feedforward networks.

N Ampazis, S J Perantonis.

Abstract

We present two highly efficient second-order algorithms for the training of multilayer feedforward neural networks. The algorithms are based on iterations of the form employed in the Levenberg-Marquardt (LM) method for nonlinear least squares problems, with the inclusion of an additional adaptive momentum term arising from the formulation of the training task as a constrained optimization problem. Their implementation requires minimal additional computation compared to a standard LM iteration. Simulations of large-scale classical neural-network benchmarks are presented which reveal the power of the two methods to obtain solutions in difficult problems in which other standard second-order techniques (including LM) fail to converge.
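The iteration described in the abstract combines a standard LM step with a momentum term. As a rough sketch only: the code below shows a generic LM update (solving (JᵀJ + λI)Δw = −Jᵀr) with a fixed momentum coefficient added. The function name, the fixed `lam` and `beta` values, and the momentum form are illustrative assumptions; the paper's actual algorithms adapt the momentum term at each step via a constrained-optimization formulation not reproduced here.

```python
import numpy as np

def lm_momentum_step(J, r, w, dw_prev, lam=1e-2, beta=0.5):
    """One LM-style update with an added momentum term (illustrative sketch).

    J: Jacobian of the residuals w.r.t. the weights, shape (m, n)
    r: residual vector, shape (m,)
    w: current weight vector, shape (n,)
    dw_prev: previous weight update, shape (n,)
    lam: LM damping factor (fixed here; LM normally adapts it)
    beta: momentum coefficient (fixed here; the paper adapts it)
    """
    n = J.shape[1]
    # Standard LM direction: solve (J^T J + lam*I) dw = -J^T r
    A = J.T @ J + lam * np.eye(n)
    g = J.T @ r
    dw_lm = np.linalg.solve(A, -g)
    # Add a momentum contribution from the previous update
    dw = dw_lm + beta * dw_prev
    return w + dw, dw

# Toy usage: one step on a linear least-squares problem (fit y = 2x + 1)
X = np.column_stack([np.linspace(0.0, 1.0, 20), np.ones(20)])
y = 2.0 * X[:, 0] + 1.0
w = np.zeros(2)
dw = np.zeros(2)
r = X @ w - y            # residuals at the current weights
w_new, dw = lm_momentum_step(X, r, w, dw)
```

For this linear problem the Jacobian is constant, so a single damped step already brings the residual norm down sharply; in network training J and r would be recomputed from a forward/backward pass each iteration.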

Year:  2002        PMID: 18244504     DOI: 10.1109/TNN.2002.1031939

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  3 in total

1.  A novel single neuron perceptron with universal approximation and XOR computation properties.

Authors:  Ehsan Lotfi; M-R Akbarzadeh-T
Journal:  Comput Intell Neurosci       Date:  2014-04-28

2.  An NN-based SRD decomposition algorithm and its application in nonlinear compensation.

Authors:  Honghang Yan; Fang Deng; Jian Sun; Jie Chen
Journal:  Sensors (Basel)       Date:  2014-09-17       Impact factor: 3.576

3.  Noise Induces Biased Estimation of the Correction Gain.

Authors:  Jooeun Ahn; Zhaoran Zhang; Dagmar Sternad
Journal:  PLoS One       Date:  2016-07-27       Impact factor: 3.240

