| Literature DB >> 24953547 |
Mikhail Prokopenko, Joseph T. Lizier.
Abstract
Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.
Year: 2014 PMID: 24953547 PMCID: PMC4066251 DOI: 10.1038/srep05394
Source DB: PubMed Journal: Sci Rep ISSN: 2045-2322 Impact factor: 4.379
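The transfer entropy discussed in the abstract, T(Y→X), measures how much knowing the source Y's past reduces uncertainty about the destination X's next state beyond what X's own past provides. A minimal plug-in estimator for binary sequences with history length 1 can be sketched as follows (this is an illustrative sketch, not code from the paper; the function name and history-length choice are assumptions):

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy T(Y -> X) in bits, with
    history length 1:
        sum over (x_{n+1}, x_n, y_n) of
        p(x_{n+1}, x_n, y_n) * log2[ p(x_{n+1}|x_n, y_n) / p(x_{n+1}|x_n) ]
    x, y: equal-length sequences of hashable symbols (e.g. 0/1 bits).
    """
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{n+1}, x_n, y_n)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_n, y_n)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{n+1}, x_n)
    singles = Counter(x[:-1])                       # x_n
    te = 0.0
    for (x1, x0, y0), count in triples.items():
        p_joint = count / n                              # p(x_{n+1}, x_n, y_n)
        p_cond_full = count / pairs_xy[(x0, y0)]         # p(x_{n+1} | x_n, y_n)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{n+1} | x_n)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te
```

For example, if x simply copies y with a one-step lag, knowing y's past fully determines x's next bit and the estimate approaches 1 bit; for independent random sequences it stays near zero (up to finite-sample bias).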
Figure 1. The Joule expansion of a one-molecule gas.
The container X is in thermal isolation from the surrounding exterior Y. As the partition is removed, the particle may be on the left or on the right.
Figure 2. Resetting one bit by compression in the Szilárd engine-like device.
The container X is in thermal contact with the surrounding exterior Y. As the partition is re-inserted and moved back to the middle, the particle returns to the left side.
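The isothermal compression in Figure 2, which halves the accessible volume and thereby resets the bit, costs at least W = k_B·T·ln(V_i/V_f) = k_B·T·ln 2 of work, dissipated as heat into the exterior Y; this is Landauer's limit. A quick numerical check (the temperature value of 300 K is an assumption for illustration):

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature, K

# Minimum work to isothermally compress the one-molecule gas to half
# its volume, i.e. to erase one bit: W = k_B * T * ln(V_i / V_f) = k_B * T * ln 2
W_landauer = k_B * T * log(2)
print(f"Landauer bound at {T:.0f} K: {W_landauer:.3e} J")
```

At room temperature this comes to roughly 3e-21 J per bit, the heat that must flow into Y for each one-bit gain in predictability per the abstract's main result.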