
Information estimation using nonparametric copulas.

Houman Safaai, Arno Onken, Christopher D Harvey, Stefano Panzeri

Abstract

Estimation of mutual information between random variables has become crucial in a range of fields, from physics to neuroscience to finance. Estimating information accurately over a wide range of conditions relies on flexible methods for describing statistical dependencies among variables without imposing potentially invalid assumptions on the data. Such methods are needed especially when prior knowledge of the data's statistical properties is lacking and sample sizes are limited. Here we propose a powerful and generally applicable information estimator based on nonparametric copulas. This estimator, called the nonparametric copula-based estimator (NPC), is tailored to capture detailed stochastic relationships in the data independently of the data's marginal distributions. The NPC estimator can be used for both continuous and discrete numerical variables and thus provides a single framework for mutual information estimation on both types of data. Through extensive validation on artificial samples drawn from various statistical distributions, we found that the NPC estimator compares well against commonly used alternatives. Unlike methods not based on copulas, its information estimates are robust to changes in the details of the marginal distributions. Unlike parametric copula methods, it remains accurate regardless of the precise form of the interactions between the variables. In addition, the NPC estimator yielded accurate information estimates even at low sample numbers, compared with alternative estimators. The NPC estimator thus balances general applicability to arbitrarily shaped statistical dependencies in the data with accurate and robust performance at small sample sizes. We anticipate that the nonparametric copula information estimator will be a powerful tool for estimating mutual information between a broad range of data.
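The copula decomposition the abstract relies on can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's NPC estimator: it uses a *parametric* Gaussian copula (the simple cousin of the nonparametric approach), and the function name `gaussian_copula_mi` is hypothetical. The point it demonstrates is the one the abstract makes: because only the ranks of the data enter the estimate, mutual information computed through a copula is unaffected by the shape of the marginal distributions.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_mi(x, y):
    """Mutual information (nats) estimated through a Gaussian copula.

    Hypothetical sketch: rank-transform each sample to pseudo-uniforms
    (the empirical copula), map them to normal scores, and plug the
    scores' correlation rho into the closed-form Gaussian-copula MI,
    -0.5 * ln(1 - rho**2). Only ranks are used, so the marginal
    distributions drop out entirely.
    """
    n = len(x)
    u = rankdata(x) / (n + 1)  # pseudo-observations in (0, 1)
    v = rankdata(y) / (n + 1)
    rho = np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]
    return -0.5 * np.log1p(-rho ** 2)

# Correlated Gaussian pair with true correlation 0.8, so the true MI
# is -0.5 * ln(1 - 0.8**2), about 0.51 nats.
rng = np.random.default_rng(0)
z = rng.normal(size=5000)
x = z + 0.5 * rng.normal(size=5000)
y = z + 0.5 * rng.normal(size=5000)
mi_raw = gaussian_copula_mi(x, y)
# Warping one marginal monotonically leaves the ranks, and hence the
# estimate, unchanged -- the robustness to marginals described above.
mi_warped = gaussian_copula_mi(x, np.exp(y))
```

Swapping the closed-form Gaussian step for a nonparametric density estimate of the copula is what the NPC estimator does to handle arbitrarily shaped dependencies.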


Year:  2018        PMID: 30984901      PMCID: PMC6458593          DOI: 10.1103/PhysRevE.98.053302

Source DB:  PubMed          Journal:  Phys Rev E        ISSN: 2470-0045            Impact factor:   2.529


Related articles: 7 in total

1.  (Review) The structures and functions of correlations in neural population codes.

Authors:  Stefano Panzeri; Monica Moroni; Houman Safaai; Christopher D Harvey
Journal:  Nat Rev Neurosci       Date:  2022-06-22       Impact factor: 38.755

2.  Multivariate Gaussian Copula Mutual Information to Estimate Functional Connectivity with Less Random Architecture.

Authors:  Mahnaz Ashrafi; Hamid Soltanian-Zadeh
Journal:  Entropy (Basel)       Date:  2022-04-29       Impact factor: 2.738

3.  Estimating the Mutual Information between Two Discrete, Asymmetric Variables with Limited Samples.

Authors:  Damián G Hernández; Inés Samengo
Journal:  Entropy (Basel)       Date:  2019-06-25       Impact factor: 2.524

4.  Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding.

Authors:  Wentao Huang; Kechen Zhang
Journal:  Entropy (Basel)       Date:  2019-03-04       Impact factor: 2.524

5.  Predicting synchronous firing of large neural populations from sequential recordings.

Authors:  Oleksandr Sorochynskyi; Stéphane Deny; Olivier Marre; Ulisse Ferrari
Journal:  PLoS Comput Biol       Date:  2021-01-28       Impact factor: 4.475

6.  Parametric Copula-GP model for analyzing multidimensional neuronal and behavioral relationships.

Authors:  Nina Kudryashova; Theoklitos Amvrosiadis; Nathalie Dupuy; Nathalie Rochefort; Arno Onken
Journal:  PLoS Comput Biol       Date:  2022-01-28       Impact factor: 4.475

7.  Inferring a Property of a Large System from a Small Number of Samples.

Authors:  Damián G Hernández; Inés Samengo
Journal:  Entropy (Basel)       Date:  2022-01-14       Impact factor: 2.524


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.