
Stimulus-Driven and Spontaneous Dynamics in Excitatory-Inhibitory Recurrent Neural Networks for Sequence Representation.

Alfred Rajakumar, John Rinzel, Zhe S. Chen

Abstract

Recurrent neural networks (RNNs) have been widely used to model the sequential neural dynamics ("neural sequences") of cortical circuits in cognitive and motor tasks. Incorporating biological constraints such as Dale's principle helps elucidate the neural representations and mechanisms of the underlying circuits. We trained an excitatory-inhibitory RNN to learn neural sequences in a supervised manner and studied the representations and dynamic attractors of the trained network. The trained RNN robustly triggered the learned sequence in response to a variety of input signals and interpolated time-warped inputs for sequence representation. Interestingly, a learned sequence could repeat periodically when the RNN evolved beyond the duration of a single sequence. The eigenspectrum of the learned recurrent connectivity matrix, with its growing or damped modes, together with the RNN's nonlinearity, was sufficient to generate a limit-cycle attractor. We further examined the stability of the dynamic attractors while training the RNN to learn two sequences. Together, our results provide a general framework for understanding neural sequence representation in excitatory-inhibitory RNNs.
© 2021 Massachusetts Institute of Technology.
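
A minimal sketch (not the authors' published code) of the two technical ingredients named in the abstract: a recurrent weight matrix constrained by Dale's principle, and an eigenspectrum check for growing modes that, clipped by a nonnegative rate nonlinearity, can support a limit-cycle attractor. The network size, the 80/20 excitatory/inhibitory split, and the softplus nonlinearity are illustrative assumptions.

import numpy as np

# Illustrative sizes (assumptions, not the paper's values):
# 80 excitatory and 20 inhibitory units.
rng = np.random.default_rng(0)
N, n_exc = 100, 80

# Dale's principle: every unit's outgoing weights share one sign. A common
# parameterization is W = M @ D, where M >= 0 holds trainable magnitudes and
# D is a fixed diagonal sign matrix (+1 for excitatory columns, -1 for
# inhibitory columns), so column j of W carries unit j's sign.
M = np.abs(rng.normal(scale=1.0 / np.sqrt(N), size=(N, N)))
D = np.diag([1.0] * n_exc + [-1.0] * (N - n_exc))
W = M @ D  # sign-constrained recurrent connectivity

def f(x):
    # Softplus firing-rate nonlinearity (nonnegative rates); one of several
    # reasonable choices for an excitatory-inhibitory rate network.
    return np.logaddexp(0.0, x)

def euler_step(x, u, B, tau=0.1, dt=0.01):
    # One Euler step of the rate dynamics tau * dx/dt = -x + W @ f(x) + B @ u.
    return x + (dt / tau) * (-x + W @ f(x) + B @ u)

# Eigenspectrum of W: eigenvalues with real part above the leak threshold
# (Re(lambda) > 1 when the local gain of f is near 1) mark growing modes of
# the linearized dynamics; complex pairs among them, bounded by the
# nonlinearity, can yield a limit cycle (a periodically repeating sequence)
# rather than unbounded growth.
eigvals = np.linalg.eigvals(W)
print("leading eigenvalue real part:", eigvals.real.max())

Under this parameterization, training would optimize only the nonnegative magnitudes M (e.g., by rectifying them during gradient descent, as in the Song et al. framework cited in reference 7), so each unit's excitatory or inhibitory identity is preserved throughout learning.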


Year:  2021        PMID: 34530451      PMCID: PMC8750453          DOI: 10.1162/neco_a_01418

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


References (53 in total)

1.  Overcoming catastrophic forgetting in neural networks.

Authors:  James Kirkpatrick; Razvan Pascanu; Neil Rabinowitz; Joel Veness; Guillaume Desjardins; Andrei A Rusu; Kieran Milan; John Quan; Tiago Ramalho; Agnieszka Grabska-Barwinska; Demis Hassabis; Claudia Clopath; Dharshan Kumaran; Raia Hadsell
Journal:  Proc Natl Acad Sci U S A       Date:  2017-03-14       Impact factor: 11.205

2. (Review) Recurrent neural networks as versatile tools of neuroscience research.

Authors:  Omri Barak
Journal:  Curr Opin Neurobiol       Date:  2017-06-29       Impact factor: 6.627

3.  Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks.

Authors:  David Sussillo; Omri Barak
Journal:  Neural Comput       Date:  2012-12-28       Impact factor: 2.026

4. (Review) Space and Time: The Hippocampus as a Sequence Generator.

Authors:  György Buzsáki; David Tingley
Journal:  Trends Cogn Sci       Date:  2018-10       Impact factor: 20.229

5.  Considerations in using recurrent neural networks to probe neural dynamics.

Authors:  Jonathan C Kao
Journal:  J Neurophysiol       Date:  2019-10-16       Impact factor: 2.714

6.  Memory without feedback in a neural network.

Authors:  Mark S Goldman
Journal:  Neuron       Date:  2009-02-26       Impact factor: 17.173

7.  Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.

Authors:  H Francis Song; Guangyu R Yang; Xiao-Jing Wang
Journal:  PLoS Comput Biol       Date:  2016-02-29       Impact factor: 4.475

8.  Stable memory with unstable synapses.

Authors:  Lee Susman; Naama Brenner; Omri Barak
Journal:  Nat Commun       Date:  2019-09-30       Impact factor: 14.919

9.  Intrinsically-generated fluctuating activity in excitatory-inhibitory networks.

Authors:  Francesca Mastrogiuseppe; Srdjan Ostojic
Journal:  PLoS Comput Biol       Date:  2017-04-24       Impact factor: 4.475

10.  Characteristics of sequential activity in networks with temporally asymmetric Hebbian learning.

Authors:  Maxwell Gillett; Ulises Pereira; Nicolas Brunel
Journal:  Proc Natl Acad Sci U S A       Date:  2020-11-11       Impact factor: 11.205

Cited by (1 in total)

1.  Retinal Processing: Insights from Mathematical Modelling.

Authors:  Bruno Cessac
Journal:  J Imaging       Date:  2022-01-17
