
A spiking neuron as information bottleneck.

Lars Buesing, Wolfgang Maass.

Abstract

Neurons receive thousands of presynaptic input spike trains while emitting a single output spike train. This drastic dimensionality reduction suggests considering a neuron as a bottleneck for information transmission. Extending recent results, we propose a simple learning rule for the weights of spiking neurons derived from the information bottleneck (IB) framework that minimizes the loss of relevant information transmitted in the output spike train. In the IB framework, relevance of information is defined with respect to contextual information, the latter entering the proposed learning rule as a "third" factor besides pre- and postsynaptic activities. This renders the theoretically motivated learning rule a plausible model for experimentally observed synaptic plasticity phenomena involving three factors. Furthermore, we show that the proposed IB learning rule allows spiking neurons to learn a predictive code, that is, to extract those parts of their input that are predictive for future input.
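To make the quantities in the IB framework concrete, here is a minimal Python sketch (an illustration of the general discrete IB objective, not the paper's spiking-neuron learning rule): mutual information computed from a joint distribution table, and the IB Lagrangian L = I(X;Y) − β·I(R;Y), which trades off compressing the input X against preserving information about the relevance (contextual) variable R. All function and variable names are chosen for this example.

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in nats, from a joint distribution table p_xy (rows: X, cols: Y)."""
    px = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = p_xy > 0                        # 0 * log(0) terms contribute nothing
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (px @ py)[mask])))

def ib_objective(p_xy, p_ry, beta):
    """IB Lagrangian: compress X (minimize I(X;Y)) while keeping the output Y
    informative about the relevance variable R (maximize I(R;Y))."""
    return mutual_information(p_xy) - beta * mutual_information(p_ry)

# Two binary toy cases:
independent = np.full((2, 2), 0.25)                 # X and Y unrelated
correlated  = np.array([[0.5, 0.0], [0.0, 0.5]])    # Y copies X

print(mutual_information(independent))  # ≈ 0.0
print(mutual_information(correlated))   # ≈ 0.693 (log 2, i.e. 1 bit)
```

In the paper's setting, Y is the neuron's output spike train, X its presynaptic input, and R the contextual "third factor"; the proposed plasticity rule follows a stochastic gradient of such an objective rather than evaluating it from explicit distribution tables as above.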


Year:  2010        PMID: 20337537     DOI: 10.1162/neco.2010.08-09-1084

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Related articles:  3 in total

1.  Toward a unified theory of efficient, predictive, and sparse coding.

Authors:  Matthew Chalk; Olivier Marre; Gašper Tkačik
Journal:  Proc Natl Acad Sci U S A       Date:  2017-12-19       Impact factor: 11.205

2.  Pareto-Optimal Clustering with the Primal Deterministic Information Bottleneck.

Authors:  Andrew K Tan; Max Tegmark; Isaac L Chuang
Journal:  Entropy (Basel)       Date:  2022-05-30       Impact factor: 2.738

3.  Spatiotemporal computations of an excitable and plastic brain: neuronal plasticity leads to noise-robust and noise-constructive computations.

Authors:  Hazem Toutounji; Gordon Pipa
Journal:  PLoS Comput Biol       Date:  2014-03-20       Impact factor: 4.475

