
Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle.

Stefan Thurner, Bernat Corominas-Murtra, Rudolf Hanel

Abstract

There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_{EXT} for extensive entropy, S_{IT} for the source information rate in information theory, and S_{MEP} for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
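As a minimal numerical illustration of two objects named in the abstract, the sketch below computes the degenerate entropy form H(p) = -∑_i p_i log p_i and simulates a sample-space-reducing (SSR) process; the function names and parameters are illustrative choices, not from the paper. For an SSR process started at state N (from state i, jump uniformly to one of the states 1..i-1, restart at N upon reaching state 1), the visit probability of state i < N is known to follow Zipf's law, p(i) ∝ 1/i, which the simulated visit counts reproduce.

```python
import math
import random

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i in nats (terms with p_i = 0 contribute 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def ssr_visits(n_states, n_restarts, seed=0):
    """Simulate an SSR process: start at state n_states; from state i,
    jump uniformly to one of the states 1..i-1; restart when state 1
    is reached. Returns the visit count of each state 1..n_states."""
    rng = random.Random(seed)
    counts = [0] * (n_states + 1)  # index 0 unused
    for _ in range(n_restarts):
        i = n_states
        while i > 1:
            counts[i] += 1
            i = rng.randint(1, i - 1)  # sample space shrinks at every step
        counts[1] += 1
    return counts[1:]
```

For a uniform distribution over W states, shannon_entropy returns log W, the Boltzmann form; the visit counts of ssr_visits decay roughly as 1/i over the interior states, the power-law statistics the abstract associates with SSR processes.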

Year:  2017        PMID: 29346985     DOI: 10.1103/PhysRevE.96.032124

Source DB:  PubMed          Journal:  Phys Rev E        ISSN: 2470-0045            Impact factor:   2.529


  8 in total

1.  Decomposing information into copying versus transformation.

Authors:  Artemy Kolchinsky; Bernat Corominas-Murtra
Journal:  J R Soc Interface       Date:  2020-01-22       Impact factor: 4.118

2.  Gintropy: Gini Index Based Generalization of Entropy.

Authors:  Tamás S Biró; Zoltán Néda
Journal:  Entropy (Basel)       Date:  2020-08-10       Impact factor: 2.524

3.  Information Geometric Duality of ϕ-Deformed Exponential Families.

Authors:  Jan Korbel; Rudolf Hanel; Stefan Thurner
Journal:  Entropy (Basel)       Date:  2019-01-24       Impact factor: 2.524

4.  Maximum Configuration Principle for Driven Systems with Arbitrary Driving.

Authors:  Rudolf Hanel; Stefan Thurner
Journal:  Entropy (Basel)       Date:  2018-11-01       Impact factor: 2.524

5. (Review) Twenty Years of Entropy Research: A Bibliometric Overview.

Authors:  Weishu Li; Yuxiu Zhao; Qi Wang; Jian Zhou
Journal:  Entropy (Basel)       Date:  2019-07-15       Impact factor: 2.524

6.  A Mutation Threshold for Cooperative Takeover.

Authors:  Alexandre Champagne-Ruel; Paul Charbonneau
Journal:  Life (Basel)       Date:  2022-02-08

7.  Analysis of chimera states as drive-response systems.

Authors:  André E Botha; Mohammad R Kolahchi
Journal:  Sci Rep       Date:  2018-01-30       Impact factor: 4.379

8.  Coupled VAE: Improved Accuracy and Robustness of a Variational Autoencoder.

Authors:  Shichen Cao; Jingjing Li; Kenric P Nelson; Mark A Kon
Journal:  Entropy (Basel)       Date:  2022-03-18       Impact factor: 2.524

