
Mutual information rate and bounds for it.

Murilo S Baptista, Rero M Rubinger, Emilson R Viana, José C Sartorelli, Ulrich Parlitz, Celso Grebogi

Abstract

The amount of information exchanged per unit of time between two nodes in a dynamical network, or between two data sets, is a powerful concept for analysing complex systems. This quantity, known as the mutual information rate (MIR), is calculated from the mutual information, which is rigorously defined only for random systems. Moreover, the definition of mutual information is based on probabilities of significant events. This work offers a simple alternative way to calculate the MIR in dynamical (deterministic) networks or between two time series (not fully deterministic), and to calculate its upper and lower bounds, without having to calculate probabilities, in terms of well-known and well-defined quantities in dynamical systems. As possible applications of our bounds, we study the relationship between synchronisation and the exchange of information in a system of two coupled maps and in experimental networks of coupled oscillators.
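The abstract's starting point, the mutual information between two time series estimated from probabilities of events, can be sketched with a standard histogram (plug-in) estimator. The coupled-map system below is only illustrative: the map, coupling strength `eps`, and bin count are assumptions for the sketch, not the parameters or the method used in the paper.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from a 2-D histogram of the samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                # joint probabilities
    px = pxy.sum(axis=1)                 # marginal of X
    py = pxy.sum(axis=0)                 # marginal of Y
    nz = pxy > 0                         # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (np.outer(px, py)[nz]))))

# Two diffusively coupled logistic maps (illustrative parameters)
rng = np.random.default_rng(0)
n, eps = 5000, 0.3
f = lambda u: 4.0 * u * (1.0 - u)
x = np.empty(n); y = np.empty(n)
x[0], y[0] = rng.random(), rng.random()
for t in range(n - 1):
    x[t + 1] = (1 - eps) * f(x[t]) + eps * f(y[t])
    y[t + 1] = (1 - eps) * f(y[t]) + eps * f(x[t])

print(mutual_information(x, y))
```

Dividing such estimates, computed over symbol blocks of growing length, by the block length is one common route to an MIR estimate; the paper's contribution is precisely to bound the MIR without computing these probabilities.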

Year:  2012        PMID: 23112809      PMCID: PMC3480398          DOI: 10.1371/journal.pone.0046745

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.240


References:  13 in total

1.  Synchronization and information flow in EEGs of epileptic patients.

Authors:  M Palus; V Komárek; T Procházka; Z Hrncír; K Sterbová
Journal:  IEEE Eng Med Biol Mag       Date:  2001 Sep-Oct

2.  The mutual information: detecting and evaluating dependencies between variables.

Authors:  R Steuer; J Kurths; C O Daub; J Weise; J Selbig
Journal:  Bioinformatics       Date:  2002       Impact factor: 6.937

3.  Estimating mutual information.

Authors:  Alexander Kraskov; Harald Stögbauer; Peter Grassberger
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2004-06-23

4.  (Review) Organization, development and function of complex brain networks.

Authors:  Olaf Sporns; Dante R Chialvo; Marcus Kaiser; Claus C Hilgetag
Journal:  Trends Cogn Sci       Date:  2004-09       Impact factor: 20.229

5.  Entropy rate estimates from mutual information.

Authors:  B D Wissman; L C McKay-Jones; P-M Binder
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2011-10-06

6.  Delay independence of mutual-information rate of two symbolic sequences.

Authors:  Jean-Luc Blanc; Laurent Pezard; Annick Lesne
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2011-09-23

7.  General framework for phase synchronization through localized sets.

Authors:  T Pereira; M S Baptista; J Kurths
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2007-02-28

8.  Reduction of noise of large amplitude through adaptive neighborhoods.

Authors:  M Eugenia Mera; Manuel Morán
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2009-07-09

9.  Independent coordinates for strange attractors from mutual information.

Authors: 
Journal:  Phys Rev A Gen Phys       Date:  1986-02

10.  Finding quasi-optimal network topologies for information transmission in active networks.

Authors:  Murilo S Baptista; Josué X de Carvalho; Mahir S Hussein
Journal:  PLoS One       Date:  2008-10-22       Impact factor: 3.240

Cited by:  7 in total

1.  Do Brain Networks Evolve by Maximizing Their Information Flow Capacity?

Authors:  Chris G Antonopoulos; Shambhavi Srivastava; Sandro E de S Pinto; Murilo S Baptista
Journal:  PLoS Comput Biol       Date:  2015-08-28       Impact factor: 4.475

2.  Production and transfer of energy and information in Hamiltonian systems.

Authors:  Chris G Antonopoulos; Ezequiel Bianco-Martinez; Murilo S Baptista
Journal:  PLoS One       Date:  2014-02-28       Impact factor: 3.240

3.  Universality in ant behaviour.

Authors:  Kim Christensen; Dario Papavassiliou; Alexandre de Figueiredo; Nigel R Franks; Ana B Sendova-Franks
Journal:  J R Soc Interface       Date:  2015-01-06       Impact factor: 4.118

4.  Chaotic, informational and synchronous behaviour of multiplex networks.

Authors:  M S Baptista; R M Szmoski; R F Pereira; S E de Souza Pinto
Journal:  Sci Rep       Date:  2016-03-04       Impact factor: 4.379

5.  Markovian language model of the DNA and its information content.

Authors:  S Srivastava; M S Baptista
Journal:  R Soc Open Sci       Date:  2016-01-06       Impact factor: 2.963

6.  Weak connections form an infinite number of patterns in the brain.

Authors:  Hai-Peng Ren; Chao Bai; Murilo S Baptista; Celso Grebogi
Journal:  Sci Rep       Date:  2017-04-21       Impact factor: 4.379

7.  Inference of financial networks using the normalised mutual information rate.

Authors:  Yong Kheng Goh; Haslifah M Hasim; Chris G Antonopoulos
Journal:  PLoS One       Date:  2018-02-08       Impact factor: 3.240

