
Neural Estimator of Information for Time-Series Data with Dependency.

Sina Molavipour, Hamid Ghourchian, Germán Bassi, Mikael Skoglund.

Abstract

Novel approaches to estimating information measures using neural networks have attracted considerable attention in recent years in both the information theory and machine learning communities. These neural estimators have been shown to converge to the true values of mutual information and conditional mutual information when trained on independent samples. However, if the samples in the dataset are not independent, the consistency of these estimators requires further investigation. This is of particular interest for more complex measures such as the directed information, which is pivotal in characterizing causality and is meaningful over time-dependent variables. Extending the convergence proof to such cases is not trivial and demands further assumptions on the data. In this paper, we show that our neural estimator of conditional mutual information is consistent when the dataset consists of samples from a stationary and ergodic source; in other words, the estimator converges asymptotically to the true value with probability one. Besides the universal function approximation property of neural networks, a core ingredient of the proof is Birkhoff's ergodic theorem. We additionally apply the technique to estimate directed information and demonstrate the effectiveness of our approach in simulations.
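Neural estimators of the kind the abstract describes are built on variational lower bounds such as the Donsker-Varadhan (DV) representation, I(X;Y) = sup_T E_P[T] - log E_{P_X ⊗ P_Y}[exp T], where the supremum is in practice taken over a family of neural networks. As a minimal, hedged sketch (not the paper's construction), the code below evaluates the DV bound on a correlated Gaussian pair, substituting the analytically optimal critic for a trained network; all function names are hypothetical.

```python
import numpy as np

def dv_bound(t_joint, t_prod):
    """Donsker-Varadhan lower bound on MI: E_P[T] - log E_Q[exp(T)]."""
    return t_joint.mean() - np.log(np.exp(t_prod).mean())

rng = np.random.default_rng(0)
rho, n = 0.8, 200_000

# Samples from the joint distribution P: a correlated bivariate Gaussian.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def critic(x, y):
    # Optimal DV critic T*(x, y) = log p(x, y) - log p(x) - log p(y),
    # which has a closed form for this Gaussian pair.
    return (-0.5 * np.log(1 - rho**2)
            + rho * (2 * x * y - rho * x**2 - rho * y**2) / (2 * (1 - rho**2)))

# Samples from the product of marginals Q: shuffle y to break the dependency.
y_shuffled = rng.permutation(y)

estimate = dv_bound(critic(x, y), critic(x, y_shuffled))
true_mi = -0.5 * np.log(1 - rho**2)  # exact MI, about 0.51 nats for rho = 0.8
print(estimate, true_mi)
```

A neural estimator such as MINE replaces `critic` with a trained network and maximizes the same bound by stochastic gradient ascent; the consistency question studied in the paper concerns what happens when the sample pairs are drawn from a dependent, stationary ergodic process rather than i.i.d.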


Keywords:  Markov source; conditional mutual information; directed information; neural networks; variational bound
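Directed information, one of the keywords above, decomposes into a sum of conditional mutual information terms, I(X^n → Y^n) = Σ_i I(X^i; Y_i | Y^{i-1}), which is why a consistent CMI estimator over dependent samples is the key building block. As a hedged illustration (a classical plug-in estimate on a toy source, not the paper's neural method; names are hypothetical), the sketch below estimates the first-order term I(X_t; Y_t | Y_{t-1}) for a stationary ergodic binary Markov source X observed through a binary symmetric channel.

```python
import numpy as np
from collections import Counter

def plugin_cmi(triples):
    """Plug-in estimate of I(X; Y | Z) in nats from (x, y, z) samples."""
    n = len(triples)
    pxyz = Counter(triples)
    pxz = Counter((x, z) for x, y, z in triples)
    pyz = Counter((y, z) for x, y, z in triples)
    pz = Counter(z for x, y, z in triples)
    cmi = 0.0
    for (x, y, z), c in pxyz.items():
        # (c/n) * log[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ]; the 1/n factors cancel.
        cmi += (c / n) * np.log(c * pz[z] / (pxz[x, z] * pyz[y, z]))
    return cmi

rng = np.random.default_rng(1)
T = 100_000

# Stationary ergodic source: a sticky binary Markov chain (flip prob. 0.1) ...
x = np.empty(T, dtype=int)
x[0] = 0
flips = rng.random(T) < 0.1
for t in range(1, T):
    x[t] = x[t - 1] ^ int(flips[t])

# ... observed through a binary symmetric channel with crossover 0.1.
y = x ^ (rng.random(T) < 0.1).astype(int)

samples = list(zip(x[1:], y[1:], y[:-1]))  # (X_t, Y_t, Y_{t-1})
cmi = plugin_cmi(samples)
print(cmi)  # about 0.23 nats for this model
```

Plug-in counting works here only because the alphabet is tiny; for continuous or high-dimensional time series it breaks down, which is the regime where the neural estimators studied in the paper are needed.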

Year:  2021        PMID: 34064014     DOI: 10.3390/e23060641

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


References: 11 in total

1.  Measuring information transfer.

Authors: 
Journal:  Phys Rev Lett       Date:  2000-07-10       Impact factor: 9.161

2.  Estimating mutual information.

Authors:  Alexander Kraskov; Harald Stögbauer; Peter Grassberger
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2004-06-23

3.  Quantifying information transfer and mediation along causal pathways in complex systems.

Authors:  Jakob Runge
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2015-12-28

4.  Recurrent Neural Networks are universal approximators.

Authors:  Anton Maximilian Schäfer; Hans-Georg Zimmermann
Journal:  Int J Neural Syst       Date:  2007-08       Impact factor: 5.866

5.  Estimating the decomposition of predictive information in multivariate systems.

Authors:  Luca Faes; Dimitris Kugiumtzis; Giandomenico Nollo; Fabrice Jurysta; Daniele Marinazzo
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2015-03-06

6.  Inferring neuronal network functional connectivity with directed information.

Authors:  Zhiting Cai; Curtis L Neveu; Douglas A Baxter; John H Byrne; Behnaam Aazhang
Journal:  J Neurophysiol       Date:  2017-05-03       Impact factor: 2.714

7.  Transfer entropy in physical systems and the arrow of time.

Authors:  Richard E Spinney; Joseph T Lizier; Mikhail Prokopenko
Journal:  Phys Rev E       Date:  2016-08-24       Impact factor: 2.529

8.  Estimating the directed information to infer causal relationships in ensemble neural spike train recordings.

Authors:  Christopher J Quinn; Todd P Coleman; Negar Kiyavash; Nicholas G Hatsopoulos
Journal:  J Comput Neurosci       Date:  2010-06-26       Impact factor: 1.621

9.  Transfer entropy--a model-free measure of effective connectivity for the neurosciences.

Authors:  Raul Vicente; Michael Wibral; Michael Lindner; Gordon Pipa
Journal:  J Comput Neurosci       Date:  2010-08-13       Impact factor: 1.621

10.  Estimating Conditional Transfer Entropy in Time Series Using Mutual Information and Nonlinear Prediction.

Authors:  Payam Shahsavari Baboukani; Carina Graversen; Emina Alickovic; Jan Østergaard
Journal:  Entropy (Basel)       Date:  2020-10-03       Impact factor: 2.524

