Neural Classifiers with Limited Connectivity and Recurrent Readouts.

Lyudmila Kushnir; Stefano Fusi

Abstract

For many neural network models in which neurons, like perceptrons, are trained to classify inputs, the number of inputs that can be classified is limited by the connectivity of each neuron, even when the total number of neurons is very large. This poses the problem of how the biological brain can take advantage of its huge number of neurons given that its connectivity is sparse. One solution is to combine multiple perceptrons together, as in committee machines: the number of classifiable random patterns then grows linearly with the number of perceptrons, even when each perceptron has limited connectivity. However, this merely moves the problem to the downstream readout neurons, which would need a number of connections as large as the number of perceptrons. Here we propose a different approach in which the readout is implemented by connecting multiple perceptrons in a recurrent attractor neural network. We prove analytically that the number of classifiable random patterns can grow unboundedly with the number of perceptrons, even when the connectivity of each perceptron remains finite. Most importantly, both the recurrent connectivity and the connectivity of the downstream readouts also remain finite. Our study shows that feedforward neural classifiers with numerous long-range afferent connections can be replaced by recurrent networks with sparse long-range connectivity without sacrificing classification performance. Our strategy could be used to design more general scalable network architectures with limited connectivity that resemble more closely the brain's neural circuits, which are dominated by recurrent connectivity.

Significance Statement

The mammalian brain has a huge number of neurons, but its connectivity is rather sparse. This observation seems to contrast with theoretical studies showing that, for many neural network models, performance scales with the number of connections per neuron rather than with the total number of neurons. To resolve this dilemma, we propose a model in which a recurrent network reads out multiple neural classifiers. Its performance scales with the total number of neurons even when each neuron of the network has limited connectivity. Our study reveals an important role for recurrent connections in neural systems such as the hippocampus, in which the computational limitations imposed by sparse long-range feedforward connectivity might be compensated by local recurrent connections.
Copyright © 2018 the authors 0270-6474/18/389900-25$15.00/0.
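
The architecture described in the abstract lends itself to a compact illustration. Below is a minimal, self-contained Python/NumPy sketch; it is not the authors' code or their analytical construction, and all sizes (N_INPUT, N_PERC, K_IN, K_REC, N_PAT), the training loop, and the tie-breaking self-coupling in the consensus dynamics are assumptions chosen for illustration. A committee of perceptrons, each wired to only K_IN input components, is trained on more random patterns than any single unit can classify; the committee's label is then recovered either by a dense majority readout (one connection per perceptron) or by sparse recurrent consensus dynamics, after which any single unit reports the class.

    import numpy as np

    rng = np.random.default_rng(0)

    # --- sizes are illustrative assumptions, not the paper's values ---
    N_INPUT = 200   # input dimensionality
    N_PERC = 101    # committee size (odd, so majority votes cannot tie)
    K_IN = 25       # afferent connections per perceptron (limited)
    K_REC = 10      # recurrent connections per unit (also limited)
    N_PAT = 60      # random patterns; > 2*K_IN, beyond one unit's capacity

    X = rng.choice([-1.0, 1.0], size=(N_PAT, N_INPUT))  # random patterns
    y = rng.choice([-1.0, 1.0], size=N_PAT)             # random labels

    # Each perceptron is wired to K_IN randomly chosen input components.
    aff = np.array([rng.choice(N_INPUT, K_IN, replace=False)
                    for _ in range(N_PERC)])
    W = np.zeros((N_PERC, K_IN))

    # Train every unit with the classic perceptron rule on its own
    # restricted view of the input.
    for _ in range(200):
        for mu in range(N_PAT):
            v = X[mu][aff]                         # (N_PERC, K_IN)
            fields = np.einsum('pk,pk->p', W, v)
            wrong = fields * y[mu] <= 0            # misclassified units
            W[wrong] += y[mu] * v[wrong]

    def committee(x):
        # +1/-1 outputs of all perceptrons on input x.
        f = np.einsum('pk,pk->p', W, x[aff])
        return np.where(f >= 0, 1.0, -1.0)

    # Readout 1: a single downstream neuron taking a majority vote.
    # This works, but that neuron needs N_PERC incoming connections.
    def majority(s):
        return 1.0 if s.sum() >= 0 else -1.0

    # Readout 2: sparse recurrent "consensus" dynamics. Each unit
    # repeatedly takes the majority of K_REC random neighbours (plus a
    # weak self-coupling to break ties), so the population relaxes
    # toward a uniform attractor and any ONE unit reports the class.
    rec = np.array([rng.choice(N_PERC, K_REC, replace=False)
                    for _ in range(N_PERC)])

    def consensus(s, steps=30):
        s = s.copy()
        for _ in range(steps):
            h = s[rec].sum(axis=1) + 0.5 * s
            s = np.where(h > 0, 1.0, -1.0)
        return s

    acc = lambda pred: np.mean(pred == y)
    single = np.array([committee(x)[0] for x in X])             # one lone unit
    maj    = np.array([majority(committee(x)) for x in X])      # dense readout
    recur  = np.array([consensus(committee(x))[0] for x in X])  # sparse readout
    print(f"single perceptron:        {acc(single):.2f}")
    print(f"dense majority readout:   {acc(maj):.2f}")
    print(f"sparse recurrent readout: {acc(recur):.2f}")

The point of the sketch is the connectivity budget: every unit, including whichever single unit one reads out after the recurrent dynamics settle, keeps a fixed number of connections (K_IN afferent, K_REC recurrent) no matter how large the committee grows, whereas the dense majority readout needs N_PERC incoming connections.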

Keywords:  attractor networks; classifier; committee machines; perceptron; sparse connectivity

Year:  2018        PMID: 30249794      PMCID: PMC6596245          DOI: 10.1523/JNEUROSCI.3506-17.2018

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


References:  19 in total

1.  Synaptic basis of cortical persistent activity: the importance of NMDA receptors to working memory.

Authors:  X J Wang
Journal:  J Neurosci       Date:  1999-11-01       Impact factor: 6.167

2.  (Review) The economy of brain network organization.

Authors:  Ed Bullmore; Olaf Sporns
Journal:  Nat Rev Neurosci       Date:  2012-04-13       Impact factor: 34.870

3.  (Review) Neurons, numbers and the hippocampal network.

Authors:  D G Amaral; N Ishizuka; B Claiborne
Journal:  Prog Brain Res       Date:  1990       Impact factor: 2.453

4.  Sparse, environmentally selective expression of Arc RNA in the upper blade of the rodent fascia dentata by brief spatial experience.

Authors:  M K Chawla; J F Guzowski; V Ramirez-Amaya; P Lipa; K L Hoffman; L K Marriott; P F Worley; B L McNaughton; C A Barnes
Journal:  Hippocampus       Date:  2005       Impact factor: 3.899

5.  Model of global spontaneous activity and local structured activity during delay periods in the cerebral cortex.

Authors:  D J Amit; N Brunel
Journal:  Cereb Cortex       Date:  1997 Apr-May       Impact factor: 5.357

6.  Optimal Degrees of Synaptic Connectivity.

Authors:  Ashok Litwin-Kumar; Kameron Decker Harris; Richard Axel; Haim Sompolinsky; L F Abbott
Journal:  Neuron       Date:  2017-02-16       Impact factor: 17.173

7.  Neural networks and physical systems with emergent collective computational abilities.

Authors:  J J Hopfield
Journal:  Proc Natl Acad Sci U S A       Date:  1982-04       Impact factor: 11.205

8.  Sparse synaptic connectivity is required for decorrelation and pattern separation in feedforward networks.

Authors:  N Alex Cayco-Gajic; Claudia Clopath; R Angus Silver
Journal:  Nat Commun       Date:  2017-10-24       Impact factor: 14.919

9.  Stimulus onset quenches neural variability: a widespread cortical phenomenon.

Authors:  Mark M Churchland; Byron M Yu; John P Cunningham; Leo P Sugrue; Marlene R Cohen; Greg S Corrado; William T Newsome; Andrew M Clark; Paymon Hosseini; Benjamin B Scott; David C Bradley; Matthew A Smith; Adam Kohn; J Anthony Movshon; Katherine M Armstrong; Tirin Moore; Steve W Chang; Lawrence H Snyder; Stephen G Lisberger; Nicholas J Priebe; Ian M Finn; David Ferster; Stephen I Ryu; Gopal Santhanam; Maneesh Sahani; Krishna V Shenoy
Journal:  Nat Neurosci       Date:  2010-02-21       Impact factor: 24.884

10.  A balanced memory network.

Authors:  Yasser Roudi; Peter E Latham
Journal:  PLoS Comput Biol       Date:  2007-06-05       Impact factor: 4.475

Cited by:  1 in total

1.  Towards a more general understanding of the algorithmic utility of recurrent connections.

Authors:  Brett W Larsen; Shaul Druckmann
Journal:  PLoS Comput Biol       Date:  2022-06-21       Impact factor: 4.779
