
The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules.

Carolin Scholl, Michael E Rule, Matthias H Hennig

Abstract

During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and the processes that determine which synapses and neurons are ultimately pruned, remain unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size, and (3) pruning based on a locally-available measure of importance derived from Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.
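The activity-driven pruning rule described above can be illustrated with a minimal sketch. This is not the authors' implementation: the simulated binary activity, the function names, and the use of coactivation variance as a diagonal Fisher-information proxy per synapse are assumptions made here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated binary pre- and postsynaptic activity over T samples.
T, n_pre, n_post = 500, 8, 6
pre = (rng.random((T, n_pre)) < 0.3).astype(float)
post = (rng.random((T, n_post)) < 0.3).astype(float)

def fisher_importance(pre, post):
    """Locally-available importance proxy per synapse: the variance of the
    presynaptic/postsynaptic coactivation across activity samples."""
    coact = pre[:, :, None] * post[:, None, :]   # shape (T, n_pre, n_post)
    return coact.var(axis=0)                     # shape (n_pre, n_post)

def prune_least_important(W, importance, frac):
    """Zero out (up to) the fraction `frac` of synapses with lowest importance."""
    k = int(frac * W.size)
    thresh = np.partition(importance.ravel(), k)[k]  # k-th smallest score
    mask = importance >= thresh                      # True = synapse kept
    return W * mask, mask

W = rng.normal(size=(n_pre, n_post))
W_pruned, mask = prune_least_important(W, fisher_importance(pre, post), frac=0.5)
```

Scoring synapses by coactivation statistics rather than by weight magnitude alone reflects the abstract's points (2) and (3): a synapse's weight and its computational importance need not coincide, while the coactivation-based score is available locally at each synapse.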


Year: 2021    PMID: 34634045    PMCID: PMC8584672    DOI: 10.1371/journal.pcbi.1009458

Source DB: PubMed    Journal: PLoS Comput Biol    ISSN: 1553-734X    Impact factor: 4.475


Cited by:  2 in total

1.  Pruning recurrent neural networks replicates adolescent changes in working memory and reinforcement learning.

Authors:  Bruno B Averbeck
Journal:  Proc Natl Acad Sci U S A       Date:  2022-05-27       Impact factor: 12.779

2.  Periodicity Pitch Perception Part III: Sensibility and Pachinko Volatility.

Authors:  Frank Feldhoff; Hannes Toepfer; Tamas Harczos; Frank Klefenz
Journal:  Front Neurosci       Date:  2022-03-08       Impact factor: 4.677

