Literature DB >> 18238010

A new class of quasi-Newtonian methods for optimal learning in MLP-networks.

A. Bortoletti, C. Di Fiore, S. Fanelli, P. Zellini.

Abstract

In this paper, we present a new class of quasi-Newton methods for effective learning in large multilayer perceptron (MLP) networks. The algorithms introduced in this work, named LQN, utilize an iterative scheme of a generalized BFGS-type method, involving a suitable family of matrix algebras L. The main advantage of these innovative methods is that they have an O(n log n) complexity per step and require only O(n) memory allocations. Numerical experiments, performed on a set of standard MLP-network benchmarks, show the competitiveness of the LQN methods, especially for large values of n.
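The LQN scheme described in the abstract generalizes the classical BFGS quasi-Newton method. As a hedged sketch, the snippet below shows only the generic dense BFGS scheme that LQN builds on; the paper's key idea, replacing the dense inverse-Hessian approximation H with an element of a structured matrix algebra L (diagonalized by a fast transform, giving O(n log n) cost per step and O(n) memory), is not reproduced here. The toy quadratic objective and all function names are illustrative assumptions.

```python
import numpy as np

def bfgs_update(H, s, y):
    """One standard BFGS update of the inverse-Hessian approximation H.
    s = x_{k+1} - x_k,  y = grad_{k+1} - grad_k.
    (LQN methods would constrain H to a structured algebra L instead
    of storing a dense n x n matrix; that variant is not shown.)"""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def bfgs_minimize(f, grad, x0, iters=50, tol=1e-10):
    """Dense BFGS with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))
    g = grad(x)
    for _ in range(iters):
        p = -H @ g                      # quasi-Newton search direction
        alpha = 1.0                     # Armijo backtracking
        while f(x + alpha * p) > f(x) + 1e-4 * alpha * (g @ p):
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if y @ s > 1e-12:               # curvature guard keeps H positive definite
            H = bfgs_update(H, s, y)
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

# Toy problem: f(x) = 0.5 x^T A x - b^T x, minimized where A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = bfgs_minimize(f, grad, np.zeros(2))
```

For a dense H each update costs O(n^2) time and memory; the structural point of LQN is precisely to avoid that by working inside a low-complexity matrix algebra.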

Year:  2003        PMID: 18238010     DOI: 10.1109/TNN.2003.809425

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 2 in total

1.  A novel single neuron perceptron with universal approximation and XOR computation properties.

Authors:  Ehsan Lotfi; M-R Akbarzadeh-T
Journal:  Comput Intell Neurosci       Date:  2014-04-28

2.  Fuzzy Counter Propagation Neural Network Control for a Class of Nonlinear Dynamical Systems.

Authors:  Vandana Sakhre; Sanjeev Jain; Vilas S Sapkal; Dev P Agarwal
Journal:  Comput Intell Neurosci       Date:  2015-08-20
