
Local Dynamics in Trained Recurrent Neural Networks.

Alexander Rivkind, Omri Barak.

Abstract

Learning a task induces connectivity changes in neural circuits, thereby changing their dynamics. To elucidate task-related neural dynamics, we study trained recurrent neural networks. We develop a mean field theory for reservoir computing networks trained to have multiple fixed point attractors. Our main result is that the dynamics of the network's output in the vicinity of attractors is governed by a low-order linear ordinary differential equation. The stability of the resulting equation can be assessed, predicting training success or failure. As a consequence, networks of rectified linear units and of sigmoidal nonlinearities are shown to have diametrically different properties when it comes to learning attractors. Furthermore, a characteristic time constant, which remains finite at the edge of chaos, offers an explanation of the network's output robustness in the presence of variability of the internal neural dynamics. Finally, the proposed theory predicts state-dependent frequency selectivity in the network response.
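The abstract's central claim is that stability near a fixed point can be assessed from a linearized equation, predicting training success or failure. As a minimal illustration of that general principle (this is a sketch, not the paper's mean-field derivation), consider a rate network dx/dt = -x + tanh(Wx): the origin is a fixed point, its Jacobian is -I + W, and the sign of the largest real part of the Jacobian's spectrum predicts whether a small perturbation decays or grows. The network size and gain values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200  # network size (illustrative assumption)

def jacobian_max_real(g):
    """Random connectivity at gain g; largest real part of the
    Jacobian spectrum -I + W at the fixed point x = 0."""
    W = g * rng.standard_normal((N, N)) / np.sqrt(N)
    return W, np.linalg.eigvals(-np.eye(N) + W).real.max()

def simulate(W, x0, T=10.0, dt=0.05):
    """Euler-integrate dx/dt = -x + tanh(W x) from x0."""
    x = x0.copy()
    for _ in range(int(T / dt)):
        x += dt * (-x + np.tanh(W @ x))
    return x

x0 = 1e-3 * rng.standard_normal(N)  # small perturbation off the fixed point

# Subcritical gain: spectrum of W lies in a disk of radius ~0.5,
# so every Jacobian eigenvalue has negative real part.
W_stable, lam_s = jacobian_max_real(g=0.5)
# Supercritical gain: the rightmost eigenvalue of W exceeds 1,
# so the Jacobian has eigenvalues with positive real part.
W_unstable, lam_u = jacobian_max_real(g=1.5)

# The linear prediction matches the nonlinear simulation:
# the perturbation decays in the stable case and grows in the unstable one.
shrinks = np.linalg.norm(simulate(W_stable, x0)) < np.linalg.norm(x0)
grows = np.linalg.norm(simulate(W_unstable, x0)) > np.linalg.norm(x0)
```

The paper's contribution goes beyond this textbook check: for trained reservoir networks it reduces the output dynamics near an attractor to a low-order linear ODE, whose stability can be assessed analytically rather than by simulating the full network.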

Year:  2017        PMID: 28696758     DOI: 10.1103/PhysRevLett.118.258101

Source DB:  PubMed          Journal:  Phys Rev Lett        ISSN: 0031-9007            Impact factor:   9.161


Related articles: 14 in total

Review 1.  Building functional networks of spiking model neurons.

Authors:  L F Abbott; Brian DePasquale; Raoul-Martin Memmesheimer
Journal:  Nat Neurosci       Date:  2016-03       Impact factor: 24.884

Review 2.  Evolutionary aspects of reservoir computing.

Authors:  Luís F Seoane
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2019-06-10       Impact factor: 6.237

3.  Universality and individuality in neural dynamics across large populations of recurrent networks.

Authors:  Niru Maheswaranathan; Alex H Williams; Matthew D Golub; Surya Ganguli; David Sussillo
Journal:  Adv Neural Inf Process Syst       Date:  2019-12

4.  Unsupervised Discovery of Demixed, Low-Dimensional Neural Dynamics across Multiple Timescales through Tensor Component Analysis.

Authors:  Alex H Williams; Tony Hyun Kim; Forea Wang; Saurabh Vyas; Stephen I Ryu; Krishna V Shenoy; Mark Schnitzer; Tamara G Kolda; Surya Ganguli
Journal:  Neuron       Date:  2018-06-07       Impact factor: 17.173

5.  Supervised learning in spiking neural networks with FORCE training.

Authors:  Wilten Nicola; Claudia Clopath
Journal:  Nat Commun       Date:  2017-12-20       Impact factor: 14.919

6.  Scale free topology as an effective feedback system.

Authors:  Alexander Rivkind; Hallel Schreier; Naama Brenner; Omri Barak
Journal:  PLoS Comput Biol       Date:  2020-05-11       Impact factor: 4.475

7.  Learning recurrent dynamics in spiking networks.

Authors:  Christopher M Kim; Carson C Chow
Journal:  Elife       Date:  2018-09-20       Impact factor: 8.140

8.  Coherent chaos in a recurrent neural network with structured connectivity.

Authors:  Itamar Daniel Landau; Haim Sompolinsky
Journal:  PLoS Comput Biol       Date:  2018-12-13       Impact factor: 4.475

9.  Thalamic control of cortical dynamics in a model of flexible motor sequencing.

Authors:  Laureline Logiaco; L F Abbott; Sean Escola
Journal:  Cell Rep       Date:  2021-06-01       Impact factor: 9.423

10.  Interactive reservoir computing for chunking information streams.

Authors:  Toshitake Asabuki; Naoki Hiratani; Tomoki Fukai
Journal:  PLoS Comput Biol       Date:  2018-10-08       Impact factor: 4.475

