
Discovering Higher-Order Interactions Through Neural Information Decomposition.

Kyle Reing, Greg Ver Steeg, Aram Galstyan.

Abstract

If regularity in data takes the form of higher-order functions among groups of variables, models that are biased towards lower-order functions may easily mistake the data for noise. To distinguish whether this is the case, one must be able to quantify the contribution of different orders of dependence to the total information. Recent work in information theory attempts to do this through measures of multivariate mutual information (MMI) and information decomposition (ID). Despite substantial theoretical progress, practical issues related to tractability and learnability of higher-order functions are still largely unaddressed. In this work, we introduce a new approach to information decomposition, termed Neural Information Decomposition (NID), which is both theoretically grounded and can be efficiently estimated in practice using neural networks. We show on synthetic data that NID can learn to distinguish higher-order functions from noise, while many unsupervised probability models cannot. Additionally, we demonstrate the usefulness of this framework as a tool for exploring biological and artificial neural networks.
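The abstract's core claim, that lower-order models can mistake higher-order structure for noise, is illustrated by the classic XOR example from information theory (this sketch is not from the paper itself): each input variable alone carries zero mutual information about the output, yet the pair jointly determines it. A minimal exact computation over the four equally likely input states:

```python
import math
from collections import Counter

def entropy(counter):
    """Shannon entropy in bits of an empirical distribution given as counts."""
    n = sum(counter.values())
    return -sum((c / n) * math.log2(c / n) for c in counter.values() if c)

# Enumerate the uniform distribution over (x1, x2), with y = x1 XOR x2.
samples = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

# Pairwise mutual information I(X1; Y) = H(X1) + H(Y) - H(X1, Y).
H_x1  = entropy(Counter(x1 for x1, _, _ in samples))
H_y   = entropy(Counter(y for _, _, y in samples))
H_x1y = entropy(Counter((x1, y) for x1, _, y in samples))
I_x1_y = H_x1 + H_y - H_x1y

# Joint mutual information I(X1, X2; Y) = H(X1, X2) + H(Y) - H(X1, X2, Y).
H_x1x2 = entropy(Counter((x1, x2) for x1, x2, _ in samples))
H_all  = entropy(Counter(samples))
I_pair_y = H_x1x2 + H_y - H_all

print(I_x1_y, I_pair_y)  # prints 0.0 1.0
```

A model restricted to pairwise statistics sees I(X1; Y) = I(X2; Y) = 0 bits and would treat Y as noise, while the full joint carries 1 bit; this is the third-order (synergistic) dependence that measures like MMI and ID are designed to expose.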

Keywords:  information decomposition; information theory; neural coding

Year:  2021        PMID: 33430463      PMCID: PMC7827712          DOI: 10.3390/e23010079

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


References:  12 in total

1.  Synergy, redundancy, and independence in population codes.

Authors:  Elad Schneidman; William Bialek; Michael J Berry
Journal:  J Neurosci       Date:  2003-12-17       Impact factor: 6.167

2.  A geometric approach to complexity.

Authors:  Nihat Ay; Eckehard Olbrich; Nils Bertschinger; Jürgen Jost
Journal:  Chaos       Date:  2011-09       Impact factor: 3.642

3.  Synergy, redundancy, and multivariate information measures: an experimentalist's perspective. (Review)

Authors:  Nicholas Timme; Wesley Alford; Benjamin Flecker; John M Beggs
Journal:  J Comput Neurosci       Date:  2013-07-03       Impact factor: 1.621

4.  BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition.

Authors:  Abdullah Makkeh; Dirk Oliver Theis; Raul Vicente
Journal:  Entropy (Basel)       Date:  2018-04-11       Impact factor: 2.524

5.  Generalised Measures of Multivariate Information Content.

Authors:  Conor Finn; Joseph T Lizier
Journal:  Entropy (Basel)       Date:  2020-02-14       Impact factor: 2.524

6.  Searching for collective behavior in a small brain.

Authors:  Xiaowen Chen; Francesco Randi; Andrew M Leifer; William Bialek
Journal:  Phys Rev E       Date:  2019-05       Impact factor: 2.529

7.  Automatic discovery of cell types and microcircuitry from neural connectomics.

Authors:  Eric Jonas; Konrad Kording
Journal:  Elife       Date:  2015-04-30       Impact factor: 8.140

8.  Searching for collective behavior in a large network of sensory neurons.

Authors:  Gašper Tkačik; Olivier Marre; Dario Amodei; Elad Schneidman; William Bialek; Michael J Berry
Journal:  PLoS Comput Biol       Date:  2014-01-02       Impact factor: 4.475

9.  Could a Neuroscientist Understand a Microprocessor?

Authors:  Eric Jonas; Konrad Paul Kording
Journal:  PLoS Comput Biol       Date:  2017-01-12       Impact factor: 4.475

10.  Gene Regulatory Network Inference from Single-Cell Data Using Multivariate Information Measures.

Authors:  Thalia E Chan; Michael P H Stumpf; Ann C Babtie
Journal:  Cell Syst       Date:  2017-09-27       Impact factor: 10.304

Cited by:  1 in total

1.  Multivariate Gaussian Copula Mutual Information to Estimate Functional Connectivity with Less Random Architecture.

Authors:  Mahnaz Ashrafi; Hamid Soltanian-Zadeh
Journal:  Entropy (Basel)       Date:  2022-04-29       Impact factor: 2.738

