
Bivariate measure of redundant information.

Malte Harder, Christoph Salge, Daniel Polani.

Abstract

We define a measure of redundant information based on projections in the space of probability distributions. Redundant information between random variables is information that is shared between those variables. But, in contrast to mutual information, redundant information denotes information that is shared about the outcome of a third variable. Formalizing this concept, and being able to measure it, is required for the non-negative decomposition of mutual information into redundant and synergistic information. Previous attempts to formalize redundant or synergistic information struggle to capture some desired properties. We introduce a new formalism for redundant information and prove that it satisfies all the necessary properties outlined in earlier work, as well as an additional criterion that we propose to be necessary to capture redundancy. We also demonstrate the behavior of this new measure for several examples, compare it to previous measures, and apply it to the decomposition of transfer entropy.
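The decomposition the abstract refers to splits the joint mutual information I(X1,X2;Y) into redundant, unique, and synergistic parts. As a minimal illustration of the quantities involved (using the earlier Williams–Beer I_min redundancy measure for simplicity, not the projection-based measure this paper defines), the sketch below evaluates the decomposition on the classic XOR example, where all the information is synergistic:

```python
# Sketch: Williams–Beer I_min redundancy on the XOR example, illustrating
# I(X1,X2;Y) = redundancy + unique_1 + unique_2 + synergy.
# (Illustrative only; the paper's own measure is projection-based.)
from collections import defaultdict
from math import log2

# Joint distribution p(x1, x2, y) with X1, X2 uniform bits and Y = X1 XOR X2.
p = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

def marginal(p, idxs):
    """Marginalize the joint distribution onto the given outcome indices."""
    m = defaultdict(float)
    for outcome, pr in p.items():
        m[tuple(outcome[i] for i in idxs)] += pr
    return m

def mutual_info(p, a_idxs, b_idxs):
    """I(A;B) in bits between the variables at a_idxs and b_idxs."""
    pa, pb = marginal(p, a_idxs), marginal(p, b_idxs)
    pab = marginal(p, a_idxs + b_idxs)
    return sum(pr * log2(pr / (pa[ab[:len(a_idxs)]] * pb[ab[len(a_idxs):]]))
               for ab, pr in pab.items() if pr > 0)

def specific_info(p, src_idxs, y):
    """Specific information I(Y=y; X_src) = E_{p(x|y)}[log p(y|x)/p(y)]."""
    py = marginal(p, (2,))
    px = marginal(p, src_idxs)
    pxy = marginal(p, src_idxs + (2,))
    total = 0.0
    for xy, pr in pxy.items():
        if xy[-1] == y and pr > 0:
            x = xy[:-1]
            total += (pr / py[(y,)]) * log2((pr / px[x]) / py[(y,)])
    return total

def i_min(p):
    """Williams–Beer redundancy: expected minimum specific information."""
    py = marginal(p, (2,))
    return sum(pr * min(specific_info(p, (0,), y[0]),
                        specific_info(p, (1,), y[0]))
               for y, pr in py.items())

joint_mi = mutual_info(p, (0, 1), (2,))      # I(X1,X2;Y) = 1 bit
redundancy = i_min(p)                        # 0 bits for XOR
synergy = (joint_mi - mutual_info(p, (0,), (2,))
           - mutual_info(p, (1,), (2,)) + redundancy)   # 1 bit for XOR
```

For XOR, each source alone carries no information about Y, so redundancy and both unique terms are zero and the full 1 bit of joint mutual information is synergy; the measures compared in the paper differ on subtler examples than this one.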

Year:  2013        PMID: 23410306     DOI: 10.1103/PhysRevE.87.012130

Source DB:  PubMed          Journal:  Phys Rev E Stat Nonlin Soft Matter Phys        ISSN: 1539-3755


Citing articles:  25 in total

1.  Unique Information and Secret Key Agreement.

Authors:  Ryan G James; Jeffrey Emenheiser; James P Crutchfield
Journal:  Entropy (Basel)       Date:  2018-12-24       Impact factor: 2.524

2.  The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy.

Authors:  Daniel Chicharro; Giuseppe Pica; Stefano Panzeri
Journal:  Entropy (Basel)       Date:  2018-03-05       Impact factor: 2.524

3.  Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices.

Authors:  Conor Finn; Joseph T Lizier
Journal:  Entropy (Basel)       Date:  2018-04-18       Impact factor: 2.524

4.  A Tutorial for Information Theory in Neuroscience. (Review)

Authors:  Nicholas M Timme; Christopher Lapish
Journal:  eNeuro       Date:  2018-09-11

5.  BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition.

Authors:  Abdullah Makkeh; Dirk Oliver Theis; Raul Vicente
Journal:  Entropy (Basel)       Date:  2018-04-11       Impact factor: 2.524

6.  Probability Mass Exclusions and the Directed Components of Mutual Information.

Authors:  Conor Finn; Joseph T Lizier
Journal:  Entropy (Basel)       Date:  2018-10-28       Impact factor: 2.524

7.  A Graph Algorithmic Approach to Separate Direct from Indirect Neural Interactions.

Authors:  Patricia Wollstadt; Ulrich Meyer; Michael Wibral
Journal:  PLoS One       Date:  2015-10-19       Impact factor: 3.240

8.  Estimating the Unique Information of Continuous Variables.

Authors:  Ari Pakman; Amin Nejatbakhsh; Dar Gilboa; Abdullah Makkeh; Luca Mazzucato; Michael Wibral; Elad Schneidman
Journal:  Adv Neural Inf Process Syst       Date:  2021-12

9.  The information theory of individuality.

Authors:  David Krakauer; Nils Bertschinger; Eckehard Olbrich; Jessica C Flack; Nihat Ay
Journal:  Theory Biosci       Date:  2020-03-24       Impact factor: 1.919

10.  Efficient transfer entropy analysis of non-stationary neural time series.

Authors:  Patricia Wollstadt; Mario Martínez-Zarzuela; Raul Vicente; Francisco J Díaz-Pernas; Michael Wibral
Journal:  PLoS One       Date:  2014-07-28       Impact factor: 3.240


Coyote Bioscience (Beijing) Co., Ltd. © 2022-2023.