
Fast training of multilayer perceptrons.

B Verma

Abstract

Training a multilayer perceptron with the error backpropagation algorithm is slow and uncertain. This paper describes a new approach that is much faster and more certain than error backpropagation. The proposed approach combines iterative and direct solution methods: an inverse transformation linearizes the nonlinear output activation functions, direct matrix solution methods train the weights of the output layer, and gradient descent, the delta rule, and other proposed techniques train the weights of the hidden layers. The approach has been implemented and tested on many problems. Experimental results, including training times and recognition accuracy, are given. In general, the approach achieves accuracy as good as or better than that of perceptrons trained with error backpropagation, while training is much faster and avoids local minima and network paralysis.
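The core idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: all function names, hyperparameters, and the specific delta-rule update below are illustrative assumptions. The output sigmoid is linearized by applying its inverse (the logit) to the targets, so the output-layer weights can be solved directly by linear least squares, while the hidden-layer weights are refined iteratively by gradient descent.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def logit(y, eps=1e-6):
    # Inverse of the sigmoid; clipping keeps the transform finite at 0 and 1.
    y = np.clip(y, eps, 1.0 - eps)
    return np.log(y / (1.0 - y))

def train(X, T, n_hidden=8, epochs=50, lr=0.5, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    Z = logit(T)  # linearized targets: the output nonlinearity is "undone" up front
    for _ in range(epochs):
        H = sigmoid(X @ W1)                          # hidden activations
        W2, *_ = np.linalg.lstsq(H, Z, rcond=None)   # direct solve: output weights
        E = H @ W2 - Z                               # residual in the linearized space
        dH = (E @ W2.T) * H * (1.0 - H)              # delta rule through the hidden sigmoid
        W1 -= lr * X.T @ dH / len(X)                 # iterative update: hidden weights
    H = sigmoid(X @ W1)
    W2, *_ = np.linalg.lstsq(H, Z, rcond=None)       # final direct solve
    return W1, W2

def predict(X, W1, W2):
    return sigmoid(sigmoid(X @ W1) @ W2)

# XOR (inputs carry a bias column), a standard small benchmark
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, T)
pred = predict(X, W1, W2)
```

Because the output weights are found by a single least-squares solve rather than by gradient descent, the slow outer loop only has to adjust the hidden layer, which is one plausible reading of why the combined scheme trains faster than plain backpropagation.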


Year:  1997        PMID: 18255733     DOI: 10.1109/72.641454

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  1 in total

1.  A novel single neuron perceptron with universal approximation and XOR computation properties.

Authors:  Ehsan Lotfi; M-R Akbarzadeh-T
Journal:  Comput Intell Neurosci       Date:  2014-04-28
