
Neural network learning without backpropagation.

Bogdan M Wilamowski, Hao Yu.

Abstract

The method introduced in this paper allows for training arbitrarily connected neural networks; therefore, more powerful neural network architectures with connections across layers can be trained efficiently. The proposed method also simplifies neural network training by using forward-only computation instead of the traditionally used forward and backward computation.
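To illustrate the two ideas the abstract mentions, here is a minimal sketch (not the paper's actual algorithm): a small fully connected cascade network, in which every neuron receives the inputs and all preceding neurons' outputs (i.e., connections across layers), trained on XOR using only forward passes, with gradients estimated by finite differences rather than backpropagation. The architecture, hyperparameters, and finite-difference scheme are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only, NOT the paper's method: an arbitrarily
# connected cascade network (cross-layer connections) trained with
# forward-only finite-difference gradients instead of backpropagation.

N_NEURONS = 3  # cascade of 3 neurons; the last one is the output

def forward(w, x):
    """Forward pass; each neuron sees [bias, inputs, all prior outputs]."""
    signals = list(x)
    idx = 0
    for _ in range(N_NEURONS):
        n_in = 1 + len(signals)                      # bias + upstream signals
        net = w[idx] + np.dot(w[idx + 1:idx + n_in], signals)
        idx += n_in
        signals.append(np.tanh(net))
    return signals[-1]

def mse(w, X, Y):
    return np.mean([(forward(w, x) - y) ** 2 for x, y in zip(X, Y)])

# XOR training data
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]

rng = np.random.default_rng(0)
n_w = 3 + 4 + 5                                      # weights for the 3 cascade neurons
w = rng.normal(scale=0.5, size=n_w)

loss_before = mse(w, X, Y)
lr, eps = 0.2, 1e-5
for _ in range(3000):
    base = mse(w, X, Y)
    grad = np.zeros_like(w)
    for i in range(n_w):                             # forward-only gradient estimate
        w[i] += eps
        grad[i] = (mse(w, X, Y) - base) / eps
        w[i] -= eps
    w -= lr * grad

print("loss before:", loss_before, "after:", mse(w, X, Y))
```

Note that every quantity here is obtained from forward evaluations alone; the paper's contribution is an exact and far more efficient forward-only computation, whereas the finite-difference estimate above is merely the simplest way to demonstrate the idea.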


Year:  2010        PMID: 20858577     DOI: 10.1109/TNN.2010.2073482

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 2 in total

1.  A novel single neuron perceptron with universal approximation and XOR computation properties.

Authors:  Ehsan Lotfi; M-R Akbarzadeh-T
Journal:  Comput Intell Neurosci       Date:  2014-04-28

2.  A Novel Online Sequential Extreme Learning Machine for Gas Utilization Ratio Prediction in Blast Furnaces.

Authors:  Yanjiao Li; Sen Zhang; Yixin Yin; Wendong Xiao; Jie Zhang
Journal:  Sensors (Basel)       Date:  2017-08-10       Impact factor: 3.576

