
Bayesian filtering in spiking neural networks: noise, adaptation, and multisensory integration.

Omer Bobrowski, Ron Meir, Yonina C. Eldar.

Abstract

A key requirement facing organisms acting in uncertain dynamic environments is the real-time estimation and prediction of environmental states, based on which effective actions can be selected. While it is becoming evident that organisms employ exact or approximate Bayesian statistical calculations for these purposes, it is far less clear how these putative computations are implemented by neural networks in a strictly dynamic setting. In this work, we make use of rigorous mathematical results from the theory of continuous time point process filtering and show how optimal real-time state estimation and prediction may be implemented in a general setting using simple recurrent neural networks. The framework is applicable to many situations of common interest, including noisy observations, non-Poisson spike trains (incorporating adaptation), multisensory integration, and state prediction. The optimal network properties are shown to relate to the statistical structure of the environment, and the benefits of adaptation are studied and explicitly demonstrated. Finally, we recover several existing results as appropriate limits of our general setting.
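The abstract describes optimal real-time state estimation from spike trains via continuous-time point process filtering. As an illustrative aid, here is a minimal discrete-time sketch of the underlying idea (not the paper's network implementation): a hidden Markov state is tracked from Poisson spike counts by alternating a predict step (state transition) and a correct step (Poisson likelihood of the observed spikes). All names, rates, and dimensions below are hypothetical choices for the example.

```python
import numpy as np

def point_process_filter(spikes, rates, trans, prior, dt):
    """Discrete-time point-process Bayesian filter (illustrative sketch).

    spikes: (T, N) array of spike counts per time bin for N neurons
    rates:  (S, N) state-dependent firing rates lambda_i(x) for S states
    trans:  (S, S) Markov transition matrix, trans[i, j] = P(x' = j | x = i)
    prior:  (S,) initial state distribution
    dt:     bin width in seconds
    Returns (T, S) posterior p(x_t | spikes up to time t) per bin.
    """
    T, _ = spikes.shape
    post = np.zeros((T, len(prior)))
    p = prior.astype(float).copy()
    lam = rates * dt                                # expected counts per state
    for t in range(T):
        p = trans.T @ p                             # predict: apply state dynamics
        # correct: Poisson likelihood of this bin's counts under each state
        like = np.prod(lam ** spikes[t] * np.exp(-lam), axis=1)
        p = p * like
        p /= p.sum()                                # normalize to a posterior
        post[t] = p
    return post
```

Run with spikes simulated from one state, the posterior concentrates on that state as evidence accumulates; the continuous-time limit of this predict/correct recursion is the filtering equation the paper maps onto recurrent network dynamics.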


Year:  2009        PMID: 19018706     DOI: 10.1162/neco.2008.01-08-692

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Related articles: 12 in total

1.  Anticipatory Neural Activity Improves the Decoding Accuracy for Dynamic Head-Direction Signals.

Authors:  Johannes Zirkelbach; Martin Stemmler; Andreas V M Herz
Journal:  J Neurosci       Date:  2019-01-28       Impact factor: 6.167

2.  Error-based analysis of optimal tuning functions explains phenomena observed in sensory neurons.

Authors:  Steve Yaeli; Ron Meir
Journal:  Front Comput Neurosci       Date:  2010-10-14       Impact factor: 2.380

3.  Real-Time Point Process Filter for Multidimensional Decoding Problems Using Mixture Models.

Authors:  Mohammad Reza Rezaei; Kensuke Arai; Loren M Frank; Uri T Eden; Ali Yousefi
Journal:  J Neurosci Methods       Date:  2020-11-21       Impact factor: 2.390

4.  Synapses with short-term plasticity are optimal estimators of presynaptic membrane potentials.

Authors:  Jean-Pascal Pfister; Peter Dayan; Máté Lengyel
Journal:  Nat Neurosci       Date:  2010-09-19       Impact factor: 24.884

5.  Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons.

Authors:  Dejan Pecevski; Lars Buesing; Wolfgang Maass
Journal:  PLoS Comput Biol       Date:  2011-12-15       Impact factor: 4.475

6.  STDP installs in Winner-Take-All circuits an online approximation to hidden Markov model learning.

Authors:  David Kappel; Bernhard Nessler; Wolfgang Maass
Journal:  PLoS Comput Biol       Date:  2014-03-27       Impact factor: 4.475

7.  Ensembles of spiking neurons with noise support optimal probabilistic inference in a dynamically changing environment.

Authors:  Robert Legenstein; Wolfgang Maass
Journal:  PLoS Comput Biol       Date:  2014-10-23       Impact factor: 4.475

8.  Contributions to a neurophysiology of meaning: the interpretation of written messages could be an automatic stimulus-reaction mechanism before becoming conscious processing of information.

Authors:  Roberto Maffei; Livia S Convertini; Sabrina Quatraro; Stefania Ressa; Annalisa Velasco
Journal:  PeerJ       Date:  2015-10-27       Impact factor: 2.984

9.  Recurrent Spiking Networks Solve Planning Tasks.

Authors:  Elmar Rueckert; David Kappel; Daniel Tanneberg; Dejan Pecevski; Jan Peters
Journal:  Sci Rep       Date:  2016-02-18       Impact factor: 4.379

10.  Bayesian Estimation and Inference Using Stochastic Electronics.

Authors:  Chetan Singh Thakur; Saeed Afshar; Runchun M Wang; Tara J Hamilton; Jonathan Tapson; André van Schaik
Journal:  Front Neurosci       Date:  2016-03-18       Impact factor: 4.677

