
Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems.

Adam B Barrett

Abstract

To fully characterize the information that two source variables carry about a third target variable, one must decompose the total information into redundant, unique, and synergistic components, i.e., obtain a partial information decomposition (PID). However, Shannon's theory of information does not provide formulas to fully determine these quantities. Several recent studies have begun addressing this. Some possible definitions for PID quantities have been proposed and some analyses have been carried out on systems composed of discrete variables. Here we present an in-depth analysis of PIDs on Gaussian systems, both static and dynamical. We show that, for a broad class of Gaussian systems, previously proposed PID formulas imply that (i) redundancy reduces to the minimum information provided by either source variable and hence is independent of correlation between sources, and (ii) synergy is the extra information contributed by the weaker source when the stronger source is known and can either increase or decrease with correlation between sources. We find that Gaussian systems frequently exhibit net synergy, i.e., the information carried jointly by both sources is greater than the sum of information carried by each source individually. Drawing from several explicit examples, we discuss the implications of these findings for measures of information transfer and information-based measures of complexity, both generally and within a neuroscience setting. Importantly, by providing independent formulas for synergy and redundancy applicable to continuous time-series data, we provide an approach to characterizing and quantifying information sharing amongst complex system variables.
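The decomposition described in the abstract can be illustrated numerically. The following is a minimal sketch (not the paper's own code) of the minimum-mutual-information style PID for jointly Gaussian variables: redundancy is the smaller of the two source-target informations, synergy is the extra information the joint sources carry beyond the stronger source, and net synergy is joint information minus the sum of individual informations. The function names `gaussian_mi` and `mmi_pid` are illustrative; the only assumed facts are the standard Gaussian mutual-information formula I(A;B) = ½ log(|Σ_A||Σ_B|/|Σ_AB|) and the PID definitions quoted from the abstract.

```python
import numpy as np

def gaussian_mi(cov, idx_a, idx_b):
    """Mutual information (nats) between subsets A and B of jointly
    Gaussian variables, given the full covariance matrix `cov`:
    I(A;B) = 0.5 * log(|Sigma_A| * |Sigma_B| / |Sigma_AB|)."""
    a = np.ix_(idx_a, idx_a)
    b = np.ix_(idx_b, idx_b)
    ab = np.ix_(idx_a + idx_b, idx_a + idx_b)
    return 0.5 * np.log(np.linalg.det(cov[a]) * np.linalg.det(cov[b])
                        / np.linalg.det(cov[ab]))

def mmi_pid(cov):
    """PID of I(X1,X2;Y) with minimum-mutual-information redundancy,
    for variables ordered (X1, X2, Y) in `cov`."""
    i1 = gaussian_mi(cov, [0], [2])        # I(X1;Y)
    i2 = gaussian_mi(cov, [1], [2])        # I(X2;Y)
    ij = gaussian_mi(cov, [0, 1], [2])     # I(X1,X2;Y)
    red = min(i1, i2)                      # redundancy: weaker source's info
    syn = ij - max(i1, i2)                 # synergy: extra info beyond stronger source
    return {"redundancy": red,
            "synergy": syn,
            "unique1": i1 - red,           # only the stronger source has unique info
            "unique2": i2 - red,
            "net_synergy": ij - (i1 + i2)}
```

As a usage example, take Y = X1 + X2 + N with independent unit-variance sources and unit-variance noise, so the covariance of (X1, X2, Y) is [[1,0,1],[0,1,1],[1,1,3]]. Here each source alone carries ½ log(3/2) nats, the pair jointly carries ½ log 3 nats, and `net_synergy` is positive (≈ 0.144 nats), illustrating the net-synergy phenomenon discussed in the abstract.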

Year:  2015        PMID: 26066207     DOI: 10.1103/PhysRevE.91.052802

Source DB:  PubMed          Journal:  Phys Rev E Stat Nonlin Soft Matter Phys        ISSN: 1539-3755


  23 in total

1.  The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy.

Authors:  Daniel Chicharro; Giuseppe Pica; Stefano Panzeri
Journal:  Entropy (Basel)       Date:  2018-03-05       Impact factor: 2.524

2.  Information Thermodynamics for Time Series of Signal-Response Models.

Authors:  Andrea Auconi; Andrea Giansanti; Edda Klipp
Journal:  Entropy (Basel)       Date:  2019-02-14       Impact factor: 2.524

3.  Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices.

Authors:  Conor Finn; Joseph T Lizier
Journal:  Entropy (Basel)       Date:  2018-04-18       Impact factor: 2.524

4.  Predictability decomposition detects the impairment of brain-heart dynamical networks during sleep disorders and their recovery with treatment.

Authors:  Luca Faes; Daniele Marinazzo; Sebastiano Stramaglia; Fabrice Jurysta; Alberto Porta; Giandomenico Nollo
Journal:  Philos Trans A Math Phys Eng Sci       Date:  2016-05-13       Impact factor: 4.226

5.  Information Transfer in Linear Multivariate Processes Assessed through Penalized Regression Techniques: Validation and Application to Physiological Networks.

Authors:  Yuri Antonacci; Laura Astolfi; Giandomenico Nollo; Luca Faes
Journal:  Entropy (Basel)       Date:  2020-07-01       Impact factor: 2.524

6.  Dynamic process connectivity explains ecohydrologic responses to rainfall pulses and drought.

Authors:  Allison E Goodwell; Praveen Kumar; Aaron W Fellows; Gerald N Flerchinger
Journal:  Proc Natl Acad Sci U S A       Date:  2018-08-27       Impact factor: 11.205

7.  Estimating the Unique Information of Continuous Variables.

Authors:  Ari Pakman; Amin Nejatbakhsh; Dar Gilboa; Abdullah Makkeh; Luca Mazzucato; Michael Wibral; Elad Schneidman
Journal:  Adv Neural Inf Process Syst       Date:  2021-12

8.  Disentangling cardiovascular control mechanisms during head-down tilt via joint transfer entropy and self-entropy decompositions.

Authors:  Alberto Porta; Luca Faes; Andrea Marchi; Vlasta Bari; Beatrice De Maria; Stefano Guzzetti; Riccardo Colombo; Ferdinando Raimondi
Journal:  Front Physiol       Date:  2015-10-27       Impact factor: 4.566

9.  Cross-Scale Causality and Information Transfer in Simulated Epileptic Seizures.

Authors:  Kajari Gupta; Milan Paluš
Journal:  Entropy (Basel)       Date:  2021-04-25       Impact factor: 2.524

10.  New Insights into Signed Path Coefficient Granger Causality Analysis.

Authors:  Jian Zhang; Chong Li; Tianzi Jiang
Journal:  Front Neuroinform       Date:  2016-10-27       Impact factor: 4.081

