Estimating entropy rates with Bayesian confidence intervals.

Matthew B Kennel, Jonathon Shlens, Henry D I Abarbanel, E J Chichilnisky

Abstract

The entropy rate quantifies the amount of uncertainty or disorder produced by any dynamical system. In a spiking neuron, this uncertainty translates into the amount of information potentially encoded, and is thus the subject of intense theoretical and experimental investigation. Estimating this quantity from observed, experimental data is difficult and requires a judicious selection of probabilistic models, balancing between two opposing biases. We use a model-weighting principle originally developed for lossless data compression, following the minimum description length principle. This weighting yields a direct estimator of the entropy rate which, compared to existing methods, exhibits significantly less bias and converges faster in simulation. With Monte Carlo techniques, we estimate a Bayesian confidence interval for the entropy rate. In related work, we apply these ideas to estimate the information rates between sensory stimuli and neural responses in experimental data (Shlens, Kennel, Abarbanel, & Chichilnisky, in preparation).
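To make the estimation problem concrete: the simplest baseline the paper improves on is the naive "plug-in" estimator, which computes the empirical entropy of length-L blocks of the spike train and divides by L, then uses Monte Carlo resampling for an interval. The sketch below is only that naive baseline with a bootstrap confidence interval, not the paper's MDL-weighted estimator; function names and parameter defaults are illustrative choices.

```python
import math
import random
from collections import Counter


def plugin_entropy_rate(bits, block_len=4):
    """Naive plug-in entropy-rate estimate (bits/symbol):
    empirical entropy of length-`block_len` blocks, divided by block_len.
    This estimator is downward-biased for short recordings, which is the
    bias the MDL-weighted method is designed to reduce."""
    blocks = [tuple(bits[i:i + block_len])
              for i in range(len(bits) - block_len + 1)]
    n = len(blocks)
    counts = Counter(blocks)
    h_block = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_block / block_len


def bootstrap_ci(bits, block_len=4, n_boot=200, alpha=0.05, seed=0):
    """Monte Carlo (bootstrap) confidence interval for the plug-in rate:
    resample blocks with replacement and take empirical quantiles."""
    rng = random.Random(seed)
    blocks = [tuple(bits[i:i + block_len])
              for i in range(len(bits) - block_len + 1)]
    n = len(blocks)
    estimates = []
    for _ in range(n_boot):
        counts = Counter(rng.choices(blocks, k=n))
        h = -sum((c / n) * math.log2(c / n) for c in counts.values())
        estimates.append(h / block_len)
    estimates.sort()
    return (estimates[int(alpha / 2 * n_boot)],
            estimates[int((1 - alpha / 2) * n_boot) - 1])
```

For a fair-coin binary sequence the true rate is 1 bit/symbol; the plug-in estimate approaches this from below as the recording lengthens, while a constant sequence gives exactly 0. The bootstrap quantiles play the role that the Bayesian posterior interval plays in the paper's method.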

Year:  2005        PMID: 15901407     DOI: 10.1162/0899766053723050

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Related articles (20 in total):

Review 1.  Spike train metrics.

Authors:  Jonathan D Victor
Journal:  Curr Opin Neurobiol       Date:  2005-10       Impact factor: 6.627

2.  Dejittered spike-conditioned stimulus waveforms yield improved estimates of neuronal feature selectivity and spike-timing precision of sensory interneurons.

Authors:  Zane N Aldworth; John P Miller; Tomás Gedeon; Graham I Cummins; Alexander G Dimitrov
Journal:  J Neurosci       Date:  2005-06-01       Impact factor: 6.167

Review 3.  Synergy, redundancy, and multivariate information measures: an experimentalist's perspective.

Authors:  Nicholas Timme; Wesley Alford; Benjamin Flecker; John M Beggs
Journal:  J Comput Neurosci       Date:  2013-07-03       Impact factor: 1.621

4.  Approaches to Information-Theoretic Analysis of Neural Activity.

Authors:  Jonathan D Victor
Journal:  Biol Theory       Date:  2006

5.  Information theory in neuroscience.

Authors:  Alexander G Dimitrov; Aurel A Lazar; Jonathan D Victor
Journal:  J Comput Neurosci       Date:  2011-02       Impact factor: 1.621

6.  Dendritic excitability modulates dendritic information processing in a purkinje cell model.

Authors:  Allan D Coop; Hugo Cornelis; Fidel Santamaria
Journal:  Front Comput Neurosci       Date:  2010-03-30       Impact factor: 2.380

7.  Recurrent, robust and scalable patterns underlie human approach and avoidance.

Authors:  Byoung Woo Kim; David N Kennedy; Joseph Lehár; Myung Joo Lee; Anne J Blood; Sang Lee; Roy H Perlis; Jordan W Smoller; Robert Morris; Maurizio Fava; Hans C Breiter
Journal:  PLoS One       Date:  2010-05-26       Impact factor: 3.240

8.  Sparse coding and high-order correlations in fine-scale cortical networks.

Authors:  Ifije E Ohiorhenuan; Ferenc Mechler; Keith P Purpura; Anita M Schmid; Qin Hu; Jonathan D Victor
Journal:  Nature       Date:  2010-07-04       Impact factor: 49.962

9.  Indices for testing neural codes.

Authors:  Jonathan D Victor; Sheila Nirenberg
Journal:  Neural Comput       Date:  2008-12       Impact factor: 2.026

10.  Information in the nonstationary case.

Authors:  Vincent Q Vu; Bin Yu; Robert E Kass
Journal:  Neural Comput       Date:  2009-03       Impact factor: 2.026

