
Neural representation of probabilistic information.

M J Barber, J W Clark, C H Anderson.

Abstract

It has been proposed that populations of neurons process information in terms of probability density functions (PDFs) of analog variables. Such analog variables range, for example, from target luminance and depth on the sensory interface to eye position and joint angles on the motor output side. The requirement that analog variables must be processed leads inevitably to a probabilistic description, while the limited precision and lifetime of the neuronal processing units lead naturally to a population representation of information. We show how a time-dependent probability density rho(x; t) over variable x, residing in a specified function space of dimension D, may be decoded from the neuronal activities in a population as a linear combination of certain decoding functions phi(i)(x), with coefficients given by the N firing rates a(i)(t) (generally with D << N). We show how the neuronal encoding process may be described by projecting a set of complementary encoding functions phi-hat(i)(x) on the probability density rho(x; t), and passing the result through a rectifying nonlinear activation function. We show how both encoders phi-hat(i)(x) and decoders phi(i)(x) may be determined by minimizing cost functions that quantify the inaccuracy of the representation. Expressing a given computation in terms of manipulation and transformation of probabilities, we show how this representation leads to a neural circuit that can carry out the required computation within a consistent Bayesian framework, with the synaptic weights being explicitly generated in terms of encoders, decoders, conditional probabilities, and priors.
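The encode/decode scheme the abstract describes can be sketched numerically. Everything concrete below is an illustrative assumption rather than the paper's actual construction: Gaussian bump encoders, a ReLU-style rectification, and a pseudoinverse decoder standing in for the cost-minimizing decoders.

```python
import numpy as np

# Discretize the analog variable x on a grid.
x = np.linspace(-1.0, 1.0, 200)
dx = x[1] - x[0]
N = 20  # number of neurons (D << N in the paper's notation)

# Hypothetical encoding functions phi-hat_i(x): Gaussian bumps tiled over x.
centers = np.linspace(-1.0, 1.0, N)
width = 0.2
phi_hat = np.exp(-((x[None, :] - centers[:, None]) ** 2) / (2.0 * width ** 2))

def encode(rho):
    """Firing rates a_i = rectify(<phi-hat_i, rho>)."""
    drive = phi_hat @ rho * dx           # projections of the density onto the encoders
    return np.maximum(drive, 0.0)        # rectifying nonlinear activation

# Decoding functions phi_i(x): here the (regularized) pseudoinverse of the
# encoders -- a stand-in for decoders obtained by minimizing a representation cost.
phi = np.linalg.pinv(phi_hat * dx)       # shape (len(x), N)

def decode(a):
    """Reconstruct rho_hat(x) = sum_i a_i * phi_i(x)."""
    return phi @ a

# Example density: a Gaussian over x, normalized on the grid.
rho = np.exp(-((x - 0.2) ** 2) / (2.0 * 0.25 ** 2))
rho /= rho.sum() * dx

a = encode(rho)          # nonnegative population firing rates
rho_hat = decode(a)      # decoded density approximates rho
```

With encoders broad enough to tile the range of x, the linear decode recovers the density closely; the rectification only matters once the projections can go negative (e.g. for signed or mean-subtracted densities).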

Year:  2003        PMID: 14511515     DOI: 10.1162/08997660360675062

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  9 in total

1.  Optimal sensorimotor integration in recurrent cortical networks: a neural implementation of Kalman filters.

Authors:  Sophie Denève; Jean-René Duhamel; Alexandre Pouget
Journal:  J Neurosci       Date:  2007-05-23       Impact factor: 6.167

2.  Neural representation of probabilities for Bayesian inference.

Authors:  Dylan Rich; Fanny Cazettes; Yunyan Wang; José Luis Peña; Brian J Fischer
Journal:  J Comput Neurosci       Date:  2015-01-06       Impact factor: 1.621

3.  Linearization of excitatory synaptic integration at no extra cost.

Authors:  Danielle Morel; Chandan Singh; William B Levy
Journal:  J Comput Neurosci       Date:  2018-01-25       Impact factor: 1.621

4. [Review] Probabilistic brains: knowns and unknowns.

Authors:  Alexandre Pouget; Jeffrey M Beck; Wei Ji Ma; Peter E Latham
Journal:  Nat Neurosci       Date:  2013-08-18       Impact factor: 24.884

5.  Probabilistic population codes for Bayesian decision making.

Authors:  Jeffrey M Beck; Wei Ji Ma; Roozbeh Kiani; Tim Hanks; Anne K Churchland; Jamie Roitman; Michael N Shadlen; Peter E Latham; Alexandre Pouget
Journal:  Neuron       Date:  2008-12-26       Impact factor: 17.173

6.  The visual system's internal model of the world.

Authors:  Tai Sing Lee
Journal:  Proc IEEE Inst Electr Electron Eng       Date:  2015-07-06       Impact factor: 10.961

7. [Review] A review of human sensory dynamics for application to models of driver steering and speed control.

Authors:  Christopher J Nash; David J Cole; Robert S Bigler
Journal:  Biol Cybern       Date:  2016-04-16       Impact factor: 2.086

8.  A consensus layer V pyramidal neuron can sustain interpulse-interval coding.

Authors:  Chandan Singh; William B Levy
Journal:  PLoS One       Date:  2017-07-13       Impact factor: 3.240

9.  Ecological expected utility and the mythical neural code.

Authors:  Jerome Feldman
Journal:  Cogn Neurodyn       Date:  2009-09-04       Impact factor: 5.082

