
Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks.

Francesca Mastrogiuseppe, Srdjan Ostojic.

Abstract

Large-scale neural recordings have established that the transformation of sensory stimuli into motor outputs relies on low-dimensional dynamics at the population level, while individual neurons exhibit complex selectivity. Understanding how low-dimensional computations on mixed, distributed representations emerge from the structure of the recurrent connectivity and inputs to cortical networks is a major challenge. Here, we study a class of recurrent network models in which the connectivity is a sum of a random part and a minimal, low-dimensional structure. We show that, in such networks, the dynamics are low dimensional and can be directly inferred from connectivity using a geometrical approach. We exploit this understanding to determine minimal connectivity required to implement specific computations and find that the dynamical range and computational capacity quickly increase with the dimensionality of the connectivity structure. This framework produces testable experimental predictions for the relationship between connectivity, low-dimensional dynamics, and computational features of recorded neurons.
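The model class described in the abstract (connectivity equal to a random matrix plus a minimal low-rank structure) can be illustrated with a short simulation. The sketch below is not the authors' code; the rank-one form J = g·χ/√N + m nᵀ/N, the parameter values, and the variable names (`m`, `n`, `kappa`) are illustrative assumptions, and the projection `kappa` stands in for the low-dimensional latent variable the paper analyzes.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a recurrent network whose
# connectivity is a random matrix plus a rank-one structure,
#   J = g * chi / sqrt(N) + outer(m, n) / N,
# simulated with simple Euler steps on  dx/dt = -x + J * tanh(x).
rng = np.random.default_rng(0)
N = 300                      # number of units (assumed)
g = 0.8                      # strength of the random part (assumed)
chi = rng.standard_normal((N, N))
m = rng.standard_normal(N)   # left structure vector (hypothetical)
n = rng.standard_normal(N)   # right structure vector (hypothetical)
J = g * chi / np.sqrt(N) + np.outer(m, n) / N

dt, T = 0.1, 200
x = rng.standard_normal(N)
kappa = []                   # overlap of population activity with m
for _ in range(T):
    x = x + dt * (-x + J @ np.tanh(x))
    kappa.append(m @ np.tanh(x) / N)
# kappa traces a one-dimensional latent variable: with a rank-one
# structure, the structured part of the dynamics lives on the line
# spanned by m, consistent with the low-dimensional dynamics the
# abstract describes.
```

Increasing the rank of the structured part (summing several outer products) would enlarge the dimensionality of the latent dynamics, which is the sense in which the abstract says computational capacity grows with the dimensionality of the connectivity structure.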
Copyright © 2018 Elsevier Inc. All rights reserved.

Keywords:  low dimensional dynamics; mixed selectivity; neural computations; recurrent neural networks


Year:  2018        PMID: 30057201     DOI: 10.1016/j.neuron.2018.07.003

Source DB:  PubMed          Journal:  Neuron        ISSN: 0896-6273            Impact factor:   17.173


Related articles: 50 in total

1.  Single-Cell Membrane Potential Fluctuations Evince Network Scale-Freeness and Quasicriticality.

Authors:  James K Johnson; Nathaniel C Wright; Ji Xia; Ralf Wessel
Journal:  J Neurosci       Date:  2019-04-05       Impact factor: 6.167

2.  Acetylcholine acts on songbird premotor circuitry to invigorate vocal output.

Authors:  Paul I Jaffe; Michael S Brainard
Journal:  Elife       Date:  2020-05-19       Impact factor: 8.140

3.  Temporal chunking as a mechanism for unsupervised learning of task-sets.

Authors:  Flora Bouchacourt; Stefano Palminteri; Etienne Koechlin; Srdjan Ostojic
Journal:  Elife       Date:  2020-03-09       Impact factor: 8.140

4.  Simple framework for constructing functional spiking recurrent neural networks.

Authors:  Robert Kim; Yinghao Li; Terrence J Sejnowski
Journal:  Proc Natl Acad Sci U S A       Date:  2019-10-21       Impact factor: 11.205

5.  The asynchronous state's relation to large-scale potentials in cortex.

Authors:  A Alishbayli; J G Tichelaar; U Gorska; M X Cohen; B Englitz
Journal:  J Neurophysiol       Date:  2019-10-23       Impact factor: 2.714

6.  Bayesian Computation through Cortical Latent Dynamics.

Authors:  Hansem Sohn; Devika Narain; Nicolas Meirhaeghe; Mehrdad Jazayeri
Journal:  Neuron       Date:  2019-07-15       Impact factor: 17.173

7.  A roadmap to integrate astrocytes into Systems Neuroscience. (Review)

Authors:  Ksenia V Kastanenka; Rubén Moreno-Bote; Maurizio De Pittà; Gertrudis Perea; Abel Eraso-Pichot; Roser Masgrau; Kira E Poskanzer; Elena Galea
Journal:  Glia       Date:  2019-05-06       Impact factor: 7.452

8.  Quantifying the impact of network structure on speed and accuracy in collective decision-making.

Authors:  Bryan C Daniels; Pawel Romanczuk
Journal:  Theory Biosci       Date:  2021-02-26       Impact factor: 1.919

9.  Optimal anticipatory control as a theory of motor preparation: A thalamo-cortical circuit model.

Authors:  Ta-Chu Kao; Mahdieh S Sadabadi; Guillaume Hennequin
Journal:  Neuron       Date:  2021-03-30       Impact factor: 17.173

10.  Global organization of neuronal activity only requires unstructured local connectivity.

Authors:  David Dahmen; Moritz Layer; Lukas Deutz; Paulina Anna Dąbrowska; Nicole Voges; Michael von Papen; Thomas Brochier; Alexa Riehle; Markus Diesmann; Sonja Grün; Moritz Helias
Journal:  Elife       Date:  2022-01-20       Impact factor: 8.140

