
A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks.

Benoît Siri, Hugues Berry, Bruno Cessac, Bruno Delord, Mathias Quoy.

Abstract

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
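The setup described in the abstract can be illustrated with a minimal sketch: a discrete-time random recurrent network with a Hebbian update that includes passive forgetting, run on two timescales (fast activity, slow learning), with the largest Lyapunov exponent estimated from Jacobian products. The specific choices below (tanh activation, learning rate `alpha`, forgetting rate `lam`, network size) are illustrative assumptions, not the paper's exact model or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100        # number of neurons (assumed)
g = 3.0        # coupling gain; large g puts the untrained net in a chaotic regime
alpha = 0.01   # Hebbian learning rate (assumed value)
lam = 0.05     # passive-forgetting rate (assumed value)
T_learn = 200  # learning epochs (slow timescale)
T_act = 50     # activity iterations per epoch (fast timescale)

# random recurrent weight matrix, variance g^2 / N
W = rng.normal(0.0, g / np.sqrt(N), (N, N))
x = rng.uniform(-1.0, 1.0, N)

def largest_lyapunov(W, x0, steps=500):
    """Estimate the largest Lyapunov exponent of x_{t+1} = tanh(W x_t).

    The Jacobian of the map is diag(1 - tanh(W x)^2) @ W; the exponent is
    the average log growth rate of a tangent vector propagated by it.
    """
    x = x0.copy()
    v = rng.normal(size=len(x0))
    v /= np.linalg.norm(v)
    le = 0.0
    for _ in range(steps):
        u = W @ x
        J = (1.0 - np.tanh(u) ** 2)[:, None] * W  # Jacobian at current state
        x = np.tanh(u)
        v = J @ v
        nv = np.linalg.norm(v)
        le += np.log(nv)
        v /= nv                                    # renormalize tangent vector
    return le / steps

for epoch in range(T_learn):
    # fast dynamics: iterate the network toward its current attractor
    for _ in range(T_act):
        x = np.tanh(W @ x)
    # slow dynamics: Hebbian update with passive forgetting
    W = (1.0 - lam) * W + (alpha / N) * np.outer(x, x)

# Passive forgetting contracts W, driving the largest Lyapunov
# exponent down toward (and eventually below) zero.
print(largest_lyapunov(W, x))
```

With these parameters the untrained network is chaotic (positive exponent), and repeated Hebbian updates with forgetting shrink the weights until the exponent turns negative, consistent with the chaos-to-steady-state route the abstract describes.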


Year:  2008        PMID: 18624656     DOI: 10.1162/neco.2008.05-07-530

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Related articles: 7 in total

1.  Combined effects of LTP/LTD and synaptic scaling in formation of discrete and line attractors with persistent activity from non-trivial baseline.

Authors:  Timothee Leleu; Kazuyuki Aihara
Journal:  Cogn Neurodyn       Date:  2012-07-14       Impact factor: 5.082

2.  Effects of cellular homeostatic intrinsic plasticity on dynamical and computational properties of biological recurrent neural networks.

Authors:  Jérémie Naudé; Bruno Cessac; Hugues Berry; Bruno Delord
Journal:  J Neurosci       Date:  2013-09-18       Impact factor: 6.167

3.  On dynamics of integrate-and-fire neural networks with conductance based synapses.

Authors:  Bruno Cessac; Thierry Viéville
Journal:  Front Comput Neurosci       Date:  2008-07-04       Impact factor: 2.380

4.  Dynamic Organization of Hierarchical Memories.

Authors:  Tomoki Kurikawa; Kunihiko Kaneko
Journal:  PLoS One       Date:  2016-09-12       Impact factor: 3.240

5.  Self-Optimization in Continuous-Time Recurrent Neural Networks.

Authors:  Mario Zarco; Tom Froese
Journal:  Front Robot AI       Date:  2018-08-21

6. (Review) Criticality, Connectivity, and Neural Disorder: A Multifaceted Approach to Neural Computation.

Authors:  Kristine Heiney; Ola Huse Ramstad; Vegard Fiskum; Nicholas Christiansen; Axel Sandvig; Stefano Nichele; Ioanna Sandvig
Journal:  Front Comput Neurosci       Date:  2021-02-10       Impact factor: 2.380

7.  A spike-timing pattern based neural network model for the study of memory dynamics.

Authors:  Jian K Liu; Zhen-Su She
Journal:  PLoS One       Date:  2009-07-24       Impact factor: 3.240


Coyote Bioscience (Beijing) Co., Ltd. © 2022-2023.