
Complexity issues in natural gradient descent method for training multilayer perceptrons.

H H Yang, S Amari.

Abstract

The natural gradient descent method is applied to train an n-m-1 multilayer perceptron. Based on an efficient scheme to represent the Fisher information matrix for an n-m-1 stochastic multilayer perceptron, a new algorithm is proposed to calculate the natural gradient without inverting the Fisher information matrix explicitly. When the input dimension n is much larger than the number of hidden neurons m, the time complexity of computing the natural gradient is O(n).
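The abstract describes computing the natural-gradient update, which premultiplies the ordinary gradient by the inverse Fisher information matrix. As a minimal illustration (not the paper's O(n) scheme), the sketch below applies a natural-gradient step to a toy linear-Gaussian model, where the Fisher matrix is simply E[xxᵀ]; the model, sample sizes, and step size are hypothetical, and the explicit `solve` here is exactly the inversion cost the paper's algorithm avoids for the n-m-1 perceptron.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic model: y = w . x + unit-variance Gaussian noise.
# For this model the log-likelihood gradient w.r.t. w is (y - w.x) x,
# and the Fisher information matrix is F = E[x x^T].
n = 5                             # input dimension (hypothetical)
w = rng.normal(size=n)            # current parameters
w_true = np.ones(n)               # hypothetical generating parameters

X = rng.normal(size=(200, n))     # sample inputs
y = X @ w_true + rng.normal(size=200)

# Empirical Fisher matrix and ordinary gradient of the negative log-likelihood.
F = X.T @ X / len(X)
grad = -(y - X @ w) @ X / len(X)

# Natural-gradient direction F^{-1} grad, computed by an explicit linear
# solve here; the paper's contribution is obtaining this product in O(n)
# time for an n-m-1 perceptron when n >> m, without inverting F.
nat_grad = np.linalg.solve(F, grad)
eta = 0.5                         # step size (hypothetical)
w_new = w - eta * nat_grad
```

For this linear-Gaussian toy problem a full natural-gradient step (eta = 1) lands on the least-squares solution in one update, which is why the natural gradient is attractive when the Fisher-vector product can be computed cheaply.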

Year:  1998        PMID: 9804675     DOI: 10.1162/089976698300017007

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  3 in total

1.  Natural-gradient learning for spiking neurons.

Authors:  Elena Kreutzer; Walter Senn; Mihai A Petrovici
Journal:  Elife       Date:  2022-04-25       Impact factor: 8.140

2.  On the choice of metric in gradient-based theories of brain function.

Authors:  Simone Carlo Surace; Jean-Pascal Pfister; Wulfram Gerstner; Johanni Brea
Journal:  PLoS Comput Biol       Date:  2020-04-09       Impact factor: 4.475

3.  Parsimonious Optimization of Multitask Neural Network Hyperparameters.

Authors:  Cecile Valsecchi; Viviana Consonni; Roberto Todeschini; Marco Emilio Orlandi; Fabio Gosetti; Davide Ballabio
Journal:  Molecules       Date:  2021-11-30       Impact factor: 4.411

