
Entropy rate estimates from mutual information.

B D Wissman, L C McKay-Jones, P-M Binder.

Abstract

We show how to estimate the Kolmogorov-Sinai entropy rate for chaotic systems using the mutual information function, easily obtainable from experimental time series. We state the conditions under which the relationship is exact, and explore the usefulness of the approach for both maps and flows. We also explore refinements of the method, and study its convergence properties as a function of time series length.
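As context for the abstract, a minimal sketch of the kind of quantity involved: the lagged mutual information I(x_t; x_{t+k}) of a chaotic time series, here estimated with a simple histogram (bin-counting) estimator on the fully chaotic logistic map. The estimator, bin count, and series length are illustrative choices, not the authors' specific procedure; the paper's contribution is the precise relationship between the decay of this function and the Kolmogorov-Sinai entropy rate.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Histogram estimate of the mutual information I(X;Y), in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                       # joint probabilities
    px = pxy.sum(axis=1)                   # marginal of X
    py = pxy.sum(axis=0)                   # marginal of Y
    nz = pxy > 0                           # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

# Time series from the logistic map x -> 4x(1-x), a standard chaotic test case.
n = 10000
x = np.empty(n)
x[0] = 0.4
for i in range(1, n):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

# Lagged mutual information I(x_t; x_{t+k}) decays with the lag k for a
# chaotic system; the decay per step is the quantity related to the
# Kolmogorov-Sinai entropy rate in the abstract above.
for k in (1, 2, 3):
    print(k, mutual_information(x[:-k], x[k:]))
```

With a deterministic map the lag-1 mutual information is large (knowing x_t pins down x_{t+1} up to bin resolution) and falls off at longer lags; in practice the finite bin count and series length set a bias floor, which is one reason the paper studies convergence as a function of time series length.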

Year:  2011        PMID: 22181242     DOI: 10.1103/PhysRevE.84.046204

Source DB:  PubMed          Journal:  Phys Rev E Stat Nonlin Soft Matter Phys        ISSN: 1539-3755


  1 in total

1.  Mutual information rate and bounds for it.

Authors:  Murilo S Baptista; Rero M Rubinger; Emilson R Viana; José C Sartorelli; Ulrich Parlitz; Celso Grebogi
Journal:  PLoS One       Date:  2012-10-24       Impact factor: 3.240

