
Neurons with dendrites can perform linearly separable computations with low resolution synaptic weights.

Romain D Cazé, Marcel Stimberg

Abstract

In theory, neurons modelled as single-layer perceptrons can implement all linearly separable computations. In practice, however, these computations may require arbitrarily precise synaptic weights. This is a strong constraint since both biological neurons and their artificial counterparts have to cope with limited precision. Here, we explore how non-linear processing in dendrites helps overcome this constraint. We start by finding a class of computations which requires increasing precision with the number of inputs in a perceptron and show that it can be implemented without this constraint in a neuron with sub-linear dendritic subunits. Then, we complement this analytical study by a simulation of a biophysical neuron model with two passive dendrites and a soma, and show that it can implement this computation. This work demonstrates a new role of dendrites in neural computation: by distributing the computation across independent subunits, the same computation can be performed more efficiently with less precise tuning of the synaptic weights. This work not only offers new insight into the importance of dendrites for biological neurons, but also paves the way for new, more efficient architectures of artificial neuromorphic chips.

Copyright: © 2021 Cazé RD and Stimberg M.
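The idea in the abstract can be sketched with a toy example. The function and construction below are illustrative assumptions, not the specific function class analysed in the paper: take the linearly separable function f(x1, x2, x3) = x1 OR (x2 AND x3). A single perceptron needs a weight on x1 twice as large as the others (and the analogous function x1 OR (x2 AND … AND xn) pushes that weight to n − 1, so precision grows with input count). By rewriting f as (x1 OR x2) AND (x1 OR x3) and duplicating x1 onto two saturating (sub-linear) subunits, the same function is computed with all synaptic weights equal to 1, assuming a simple min-saturation as the dendritic nonlinearity:

```python
# Toy sketch (an illustrative construction, not the paper's exact analysis):
# f(x1, x2, x3) = x1 OR (x2 AND x3) is linearly separable, but a single
# perceptron needs non-uniform weights, e.g. w = (2, 1, 1), theta = 2.
# Two sub-linear (saturating) dendritic subunits implement f with
# all-binary weights by duplicating x1 onto both dendrites.

from itertools import product

def perceptron(x, w=(2, 1, 1), theta=2):
    """Single-layer perceptron: the weight on x1 must be twice the others."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= theta)

def two_dendrite_neuron(x, theta_soma=2):
    """Soma thresholds the sum of two saturating subunits; all weights are 1."""
    x1, x2, x3 = x
    d1 = min(x1 + x2, 1)   # sub-linear subunit: dendritic sum saturates at 1
    d2 = min(x1 + x3, 1)   # x1 is duplicated onto the second dendrite
    return int(d1 + d2 >= theta_soma)

# Both models agree on every binary input pattern.
for x in product((0, 1), repeat=3):
    assert perceptron(x) == two_dendrite_neuron(x)
```

This only illustrates the flavour of the result: the weight precision needed by the single perceptron grows with the input count, while the subunit version keeps binary weights by distributing the computation across dendrites. The paper's analytical class of computations and its biophysical simulation with two passive dendrites require the full treatment in the article itself.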

Keywords:  Dendrites; computation; implementation; linearly separable

Year:  2020        PMID: 33564396      PMCID: PMC7848858          DOI: 10.12688/f1000research.26486.2

Source DB:  PubMed          Journal:  F1000Res        ISSN: 2046-1402


References (19 in total; first 10 shown)

1.  Somatic EPSP amplitude is independent of synapse location in hippocampal pyramidal neurons.

Authors:  J C Magee; E P Cook
Journal:  Nat Neurosci       Date:  2000-09       Impact factor: 24.884

2.  Pyramidal neuron as two-layer neural network.

Authors:  Panayiota Poirazi; Terrence Brannon; Bartlett W Mel
Journal:  Neuron       Date:  2003-03-27       Impact factor: 17.173

3.  On the capabilities of neural networks using limited precision weights.

Authors:  Sorin Draghici
Journal:  Neural Netw       Date:  2002-04

4.  Computational subunits in thin dendrites of pyramidal cells.

Authors:  Alon Polsky; Bartlett W Mel; Jackie Schiller
Journal:  Nat Neurosci       Date:  2004-05-23       Impact factor: 24.884

5.  On the actions that one nerve cell can have on another: distinguishing "drivers" from "modulators".

Authors:  S M Sherman; R W Guillery
Journal:  Proc Natl Acad Sci U S A       Date:  1998-06-09       Impact factor: 11.205

6.  A logical calculus of the ideas immanent in nervous activity. 1943.

Authors:  W S McCulloch; W Pitts
Journal:  Bull Math Biol       Date:  1990       Impact factor: 1.758

7.  Is a 4-bit synaptic weight resolution enough? - constraints on enabling spike-timing dependent plasticity in neuromorphic hardware.

Authors:  Thomas Pfeil; Tobias C Potjans; Sven Schrader; Wiebke Potjans; Johannes Schemmel; Markus Diesmann; Karlheinz Meier
Journal:  Front Neurosci       Date:  2012-07-17       Impact factor: 4.677

8.  Challenging the point neuron dogma: FS basket cells as 2-stage nonlinear integrators.

Authors:  Alexandra Tzilivaki; George Kastellakis; Panayiota Poirazi
Journal:  Nat Commun       Date:  2019-08-14       Impact factor: 14.919

9.  Neurons with dendrites can perform linearly separable computations with low resolution synaptic weights.

Authors:  Romain D Cazé; Marcel Stimberg
Journal:  F1000Res       Date:  2020-09-28

10.  Emergence of Stable Synaptic Clusters on Dendrites Through Synaptic Rewiring.

Authors:  Thomas Limbacher; Robert Legenstein
Journal:  Front Comput Neurosci       Date:  2020-08-06       Impact factor: 2.380

Cited by (3 in total)

1.  Neurons with dendrites can perform linearly separable computations with low resolution synaptic weights.

Authors:  Romain D Cazé; Marcel Stimberg
Journal:  F1000Res       Date:  2020-09-28

2.  Dynamical Characteristics of Recurrent Neuronal Networks Are Robust Against Low Synaptic Weight Resolution.

Authors:  Stefan Dasbach; Tom Tetzlaff; Markus Diesmann; Johanna Senk
Journal:  Front Neurosci       Date:  2021-12-24       Impact factor: 4.677

3.  SAM: A Unified Self-Adaptive Multicompartmental Spiking Neuron Model for Learning With Working Memory.

Authors:  Shuangming Yang; Tian Gao; Jiang Wang; Bin Deng; Mostafa Rahimi Azghadi; Tao Lei; Bernabe Linares-Barranco
Journal:  Front Neurosci       Date:  2022-04-18       Impact factor: 5.152

