
Partial information decomposition as a unified approach to the specification of neural goal functions.

Michael Wibral, Viola Priesemann, Jim W Kay, Joseph T Lizier, William A Phillips.

Abstract

In many neural systems, anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example of such a motif is the canonical microcircuit of six-layered neocortex, which is repeated across cortical areas and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a 'goal function', of information processing implemented in this structure. By definition, such a goal function, if universal, cannot be cast in processing-domain-specific language (e.g. 'edge filtering', 'working memory'). Thus, to formulate such a principle, we have to use a domain-independent framework, and information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon's mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon information theory, called partial information decomposition (PID). PID makes it possible to quantify the information that several inputs provide individually (unique information), redundantly (shared information), or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information-theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows these goal functions to be compared within a common framework, and that it also provides a versatile approach to designing new goal functions from first principles. Building on this, we design and analyze a novel goal function, called 'coding with synergy', which combines external input and prior knowledge in a synergistic manner. We suggest that this novel goal function may be highly useful in neural information processing.
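The decomposition the abstract describes can be illustrated on a toy example. The sketch below is not from the paper: the function names (`pid`, `i_min`, `mutual_info`) are our own, and it uses the Williams-Beer redundancy measure I_min, which is only one of several candidate PID measures discussed in this literature. It decomposes the information that two binary inputs carry about an XOR output, the textbook case of pure synergy.

```python
import math
from collections import defaultdict

def marginal(p, idx):
    """Marginalize a joint distribution p[(x1, x2, y)] onto the given indices."""
    m = defaultdict(float)
    for k, pr in p.items():
        m[tuple(k[i] for i in idx)] += pr
    return m

def mutual_info(p, a_idx, b_idx):
    """Shannon mutual information I(A; B) in bits."""
    pa, pb = marginal(p, a_idx), marginal(p, b_idx)
    pab = marginal(p, a_idx + b_idx)
    return sum(pr * math.log2(pr / (pa[k[:len(a_idx)]] * pb[k[len(a_idx):]]))
               for k, pr in pab.items() if pr > 0)

def i_min(p, sources, tgt_idx):
    """Williams-Beer redundancy: expected minimum specific information."""
    py = marginal(p, tgt_idx)
    red = 0.0
    for y, py_v in py.items():
        specs = []
        for s_idx in sources:
            ps = marginal(p, s_idx)
            psy = marginal(p, s_idx + tgt_idx)
            spec = 0.0
            for k, pr in psy.items():
                if k[len(s_idx):] != y or pr == 0.0:
                    continue
                p_s_given_y = pr / py_v                 # p(s|y)
                p_y_given_s = pr / ps[k[:len(s_idx)]]   # p(y|s)
                spec += p_s_given_y * math.log2(p_y_given_s / py_v)
            specs.append(spec)
        red += py_v * min(specs)
    return red

def pid(p):
    """Decompose I(X1, X2; Y) into shared, unique, and synergistic parts.
    p maps outcome triples (x1, x2, y) to probabilities."""
    red = i_min(p, [(0,), (1,)], (2,))
    uni1 = mutual_info(p, (0,), (2,)) - red
    uni2 = mutual_info(p, (1,), (2,)) - red
    syn = mutual_info(p, (0, 1), (2,)) - red - uni1 - uni2
    return {"shared": red, "unique1": uni1, "unique2": uni2, "synergy": syn}

# XOR target: neither input alone is informative, so the full 1 bit is synergy.
xor = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}
print(pid(xor))  # synergy = 1 bit; shared, unique1, unique2 = 0
```

Replacing XOR with an AND gate splits the joint information differently: about 0.31 bit of redundancy, 0.5 bit of synergy, and no unique information, which is the standard I_min result for this gate.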
Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

Keywords:  Coherent infomax; Information theory; Neural coding; Neural goal function; Predictive coding; Redundancy; Shared information; Synergy; Unique information

Year:  2015        PMID: 26475739     DOI: 10.1016/j.bandc.2015.09.004

Source DB:  PubMed          Journal:  Brain Cogn        ISSN: 0278-2626            Impact factor:   2.310


Related articles (20 in total):

1.  Control of criticality and computation in spiking neuromorphic networks with plasticity.

Authors:  Benjamin Cramer; David Stöckel; Markus Kreft; Michael Wibral; Johannes Schemmel; Karlheinz Meier; Viola Priesemann
Journal:  Nat Commun       Date:  2020-06-05       Impact factor: 14.919

2.  Electrophysiological Brain Connectivity: Theory and Implementation.

Authors:  Bin He; Laura Astolfi; Pedro A Valdes-Sosa; Daniele Marinazzo; Satu Palva; Christian G Benar; Christoph M Michel; Thomas Koenig
Journal:  IEEE Trans Biomed Eng       Date:  2019-05-07       Impact factor: 4.538

3.  Bits and pieces: understanding information decomposition from part-whole relationships and formal logic.

Authors:  A J Gutknecht; M Wibral; A Makkeh
Journal:  Proc Math Phys Eng Sci       Date:  2021-07-07       Impact factor: 2.704

4.  Estimating the Unique Information of Continuous Variables.

Authors:  Ari Pakman; Amin Nejatbakhsh; Dar Gilboa; Abdullah Makkeh; Luca Mazzucato; Michael Wibral; Elad Schneidman
Journal:  Adv Neural Inf Process Syst       Date:  2021-12

5.  A synergistic core for human brain evolution and cognition.

Authors:  Andrea I Luppi; Pedro A M Mediano; Fernando E Rosas; Negin Holland; Tim D Fryer; John T O'Brien; James B Rowe; David K Menon; Daniel Bor; Emmanuel A Stamatakis
Journal:  Nat Neurosci       Date:  2022-05-26       Impact factor: 28.771

Review 6.  Greater than the parts: a review of the information decomposition approach to causal emergence.

Authors:  Pedro A M Mediano; Fernando E Rosas; Andrea I Luppi; Henrik J Jensen; Anil K Seth; Adam B Barrett; Robin L Carhart-Harris; Daniel Bor
Journal:  Philos Trans A Math Phys Eng Sci       Date:  2022-05-23       Impact factor: 4.019

Review 7.  The effects of arousal on apical amplification and conscious state.

Authors:  W A Phillips; M E Larkum; C W Harley; S M Silverstein
Journal:  Neurosci Conscious       Date:  2016-09-11

8.  Implications of Information Theory for Computational Modeling of Schizophrenia.

Authors:  Steven M Silverstein; Michael Wibral; William A Phillips
Journal:  Comput Psychiatr       Date:  2017-10-01

9.  Representational interactions during audiovisual speech entrainment: Redundancy in left posterior superior temporal gyrus and synergy in left motor cortex.

Authors:  Hyojin Park; Robin A A Ince; Philippe G Schyns; Gregor Thut; Joachim Gross
Journal:  PLoS Biol       Date:  2018-08-06       Impact factor: 8.029

Review 10.  Apical Function in Neocortical Pyramidal Cells: A Common Pathway by Which General Anesthetics Can Affect Mental State.

Authors:  William A Phillips; Talis Bachmann; Johan F Storm
Journal:  Front Neural Circuits       Date:  2018-07-02       Impact factor: 3.492

