Literature DB >> 36071118

Information flow in first-order Potts model phase transition.

Joshua M Brown1, Terry Bossomaier2, Lionel Barnett3.   

Abstract

Phase transitions abound in nature and society, and, from species extinction to stock market collapse, their prediction is of widespread importance. In earlier work we showed that global transfer entropy, a general measure of information flow, peaks away from the transition on the disordered side for the Ising model, a canonical second-order transition. Here we show that (a) global transfer entropy also peaks on the disordered side of the transition for finite first-order transitions, such as ecology dynamics on coral reefs, which have latent heat and no correlation length divergence, and (b) analysis of information flow across state boundaries unifies both transition orders. We obtain the first information-theoretic result for the high-order Potts model and the first demonstration of early warning of a first-order transition. The unexpected earlier finding that global transfer entropy peaks on the disordered side of a transition also holds for finite first-order systems, albeit not in the thermodynamic limit. By noting that the interface length of clusters in each phase is the dominant region of information flow, we unify the information-theoretic behaviour of first- and second-order transitions.
© 2022. The Author(s).

Year:  2022        PMID: 36071118      PMCID: PMC9452544          DOI: 10.1038/s41598-022-17359-w

Source DB:  PubMed          Journal:  Sci Rep        ISSN: 2045-2322            Impact factor:   4.996


Introduction

Numerous mechanisms for predicting phase transitions exist, applied, for example, from core science and engineering through biology, ecology, medicine and finance[1]: increased variance; critical slowing down[1]; flickering[2]; and a peak in the global transfer entropy (Eq. 2). Two canonical models of equilibrium transitions stand out: the Ising model[3], a binary spin system on a square lattice, where each point on the lattice has a spin, which may point up or down; and the Potts model, which generalises Ising to spins with an arbitrary number of states, q, and reduces to the Ising model for q = 2. In the Ising model[3], mutual information peaks at the transition between ordered and disordered phases[4,5], as does pairwise transfer entropy[6] (Eq. 4, Suppl. Material), while global transfer entropy (Eq. 2) peaks distinctly on the disordered side[7]. Here we extend this prior work on global transfer entropy to the q-state Potts model[8], which exhibits first-order phase transitions for q > 4[9], and provide a unifying framework for both transitions (see Suppl. Material for details regarding mutual information in the first-order Potts model). At q = 5 the transition is weakly first order[10], implying a long correlation length and low latent heat. As q increases, the correlation length decreases and the latent heat increases. We show that as the system becomes more strongly first order (i.e., as q increases[11,12]), the behaviour of global transfer entropy diverges from the second-order behaviour: in the thermodynamic limit, global transfer entropy becomes discontinuous at the transition temperature, T_c, peaking at T_c. The standard Potts model comprises a two-dimensional lattice of spins with periodic boundary conditions and size N = L x L, where the system state is x = (x_1, ..., x_N), with each x_i in {1, ..., q}. The interaction energy between two neighbouring sites i and j is -J delta(x_i, x_j), giving the Hamiltonian H(x) = -J sum_<i,j> delta(x_i, x_j), where the interaction strength J = 1, delta is the Kronecker delta function, which is one if x_i = x_j and zero otherwise, and <i,j> ranges over all interacting pairs of sites in the system.
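As a concrete illustration, the Hamiltonian above can be sketched in a few lines (a minimal sketch; the function name and lattice representation are ours, not from the paper):

```python
import numpy as np

def potts_energy(lattice, J=1.0):
    """Total Potts energy H = -J * sum_<i,j> delta(x_i, x_j) on an L x L
    lattice with periodic boundaries; each nearest-neighbour bond counted once."""
    right = np.roll(lattice, -1, axis=1)  # neighbour to the right of each site
    down = np.roll(lattice, -1, axis=0)   # neighbour below each site
    return -J * (np.count_nonzero(lattice == right)
                 + np.count_nonzero(lattice == down))

# A ground state (all sites aligned) satisfies all 2*L*L bonds:
L = 8
ground = np.zeros((L, L), dtype=int)
assert potts_energy(ground) == -2 * L * L
```

A fully disordered lattice with no agreeing neighbours would instead sit at energy zero, the upper end of the spectrum.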
Local site energy, E_i, is defined similarly, fixing i and summing over its four neighbours. The system is updated using Glauber dynamics[13]. Overall alignment of the lattice is measured by its magnetisation, M = (q c_max - 1)/(q - 1)[10], where c_max is the proportion of the dominant, mode, state over all sites for the given configuration of the lattice, with c_max ranging from 1/q when all q states are equally represented to 1 when only one state is present in the lattice, giving magnetisation in the range [0, 1]. M serves as the order parameter, and the order-disorder transition occurs at an intermediate temperature T_c[14], where the (thermodynamic) system is disordered (M = 0) at temperatures above T_c and M is non-zero below T_c. The behaviour of M at T_c defines the transition order: for q <= 4, M is continuous (with discontinuous dM/dT), giving a second-order phase transition. Transfer entropy measures (Eqs. 2, 4, Suppl. Material) information flow from one stochastic process, Y, to another, X, in this case the states of two neighbouring spins over time. Global transfer entropy measures the average information flow from the entire system to individual spin sites (Eq. 2). We note, however, that all information, no matter its origin in the lattice, must flow to a site x_i via its neighbours or its own past, and thus we consider only the immediate neighbourhood of each site (including x_i itself) rather than the full lattice in Eq. (2)[7]. As with transfer entropy, global transfer entropy is non-negative, and is zero iff each site x_i, conditioned on its past, is independent of its neighbours.
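The order parameter defined above can be sketched as follows (a minimal sketch under the definition in the text; the helper name is ours):

```python
import numpy as np

def magnetisation(lattice, q):
    """M = (q * c_max - 1) / (q - 1), where c_max is the fraction of sites in
    the dominant (mode) state: M = 0 when all q states are equally represented,
    M = 1 when a single state fills the lattice."""
    counts = np.bincount(lattice.ravel(), minlength=q)
    c_max = counts.max() / lattice.size
    return (q * c_max - 1.0) / (q - 1.0)
```

For example, a q = 4 lattice split evenly among all four states gives M = 0, while a single-state lattice gives M = 1.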

Methods

Direct simulation of the higher-order Potts models is not straightforward, since the first-order transition shows a void region of energy space around the phase transition, such that general-purpose update schemes, such as Glauber dynamics, are very unlikely to enter this region. In fact, for temperatures close to the critical temperature, the energy distribution P(E) is bimodal (see Suppl. Material, Fig. S1). Thus we estimate global transfer entropy via two methods. The first, the pure Glauber estimate, employs straightforward Glauber dynamics, where each update, or sweep, comprises N spin-flip attempts. The second uses the density of states, d(E), calculated with the Wang-Landau algorithm[15] (see Suppl. Material). P(E) may then be calculated from P(E) proportional to d(E) exp(-E/T) (with k_B = 1), where E is the lattice energy. Any thermodynamic observable, f(T), may now be determined from its value as a function of energy, f(E)[16], via f(T) = sum_E f(E) P(E), where P(E) is the distribution of energies at the given temperature, and d(E) has been rescaled for visualisation and computational reasons (since the rescaling term is constant over the summation, it cancels out in the normalisation of P(E), such that f(T) is unmodified). After determining d(E) we need to determine f(E). While a static observable depends on energy only, transfer entropy is a temporal quantity (Eq. 4, Suppl. Material), that is, a function of the change in state between t - 1 and t, and thus also depends on temperature; we therefore in fact need to determine f(E) separately for varying T. Additionally, as P(E) is negligible for many E values, f(E) can be measured more simply by culling energy values where P(E) is sufficiently low: that is, once d(E) has been determined and we move to determining f(E), reaching every E is unnecessary, and so f(E) can be calculated via a separate set of realisations utilising only Glauber dynamics rather than Wang-Landau updating. We thus collect ensemble statistics similar to the pure Glauber estimate (with fixed T per ensemble), collating statistics for f(E) using the energy E of the lattice before the Glauber sweep, where the future state is the post-sweep state. We denote this regime the per-sweep estimate.
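The per-sweep collation just described can be sketched as follows (a minimal sketch; all function arguments are hypothetical stand-ins for the simulation components, not the paper's code):

```python
from collections import defaultdict

def collate_per_sweep(n_realisations, n_sweeps, make_lattice,
                      glauber_sweep, energy, statistic):
    """Collate statistics for f(E), keyed by the lattice energy E taken
    *before* each Glauber sweep; the post-sweep state is the future state."""
    stats = defaultdict(list)
    for _ in range(n_realisations):
        lattice = make_lattice()
        for _ in range(n_sweeps):
            E = energy(lattice)
            past = [row[:] for row in lattice]  # snapshot of the pre-sweep state
            glauber_sweep(lattice)              # updates the lattice in place
            stats[E].append(statistic(past, lattice))
    return stats
```

Each ensemble fixes T inside `glauber_sweep`; pooling `stats[E]` across realisations yields the per-energy statistics from which f(E) is estimated.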
We note, however, that this may not be strictly correct, as E can change after each successful spin flip during the sweep, and thus statistics collated for E will include elements from neighbouring energies. To address this, we separately collate statistics on a per-flip basis, where a site and its neighbourhood are recorded at any attempt to flip it, giving the per-flip estimate. Straightforward Glauber approaches construct ensembles comprising realisations pooled together, with a settling time of 1000 time steps, followed by a measurement sequence of time steps as in the Ising model[7]. Standard error is calculated by generating 10 ensembles. We optimise simulation by modifying initialisation depending on T. In the ordered regime, T < T_c, we initialise all realisations to the same ground state, noting [p. 4][17] that only the ordered peak exists in P(E) for T < T_c. Otherwise, we evenly divide realisations into random ground states or random disordered states to sample both P(E) peaks. Density-of-states approaches are constructed likewise, minus the (in this regime only) superfluous settling time. Due to the data volume requirements involved in estimating the per-sweep and per-flip quantities, we employ a compression regime (explained below in the methodology). This regime is validated by applying it to the pure Glauber estimate, giving consistent results (shown in the Suppl. Materials).

Results and discussion

All variants exhibit a peak in global transfer entropy on the disordered side of the transition (Fig. 1), with per-sweep versus per-flip statistics differing by a roughly constant factor: the statistics collated per sweep can be considered equivalent to those collated per flip with a small amount of random noise applied, thus reducing the information flow and therefore the per-sweep estimate. The peak locations are mostly stable for the pure Glauber approach, with some exceptions, while the thermodynamic, density-of-states approaches exhibit a strong shift in peak as q and lattice size increase, rapidly approaching the critical temperature.
Figure 1

Global transfer entropy measured using three methods (top and bottom rows), with q varying across columns for the indicated lattice sizes. Ensembles are collated using time steps over 10 realisations. Vertical lines indicate T_c. Filled symbols indicate the "effective" T_c, the location where P(E) is precisely bimodal for given q, L, corresponding to values found in analytical methods[11] (see Suppl. Materials). Error bars are calculated from 10 ensembles and are smaller than symbols in some regions. The gap between per-sweep and per-flip curves is due to extraneous data included in the per-sweep estimate (see main text).

The behaviour of the peak in the pure Glauber estimate compared with the density-of-states approaches highlights the core issue with initialisation regimes and hysteresis. Specifically, the current initialisation regime initialises half of the realisations in an ensemble to disordered states and the other half to randomly chosen ground states, in an attempt to circumvent infeasibly long times to traverse the valley in P(E). This is done under the assumption that it achieves bimodal P(E) near T_c while rapidly collapsing to the correct unimodal P(E) on either side of T_c. However, this collapse is not quick enough. For example, the normalised ordered peak in P(E) should drop below threshold just above T_c (according to density-of-states estimation); in the Glauber simulation, however, it does not reach this threshold until a noticeably higher temperature. That is, we have a spurious ordered peak above T_c, resulting in an artificial reduction in global transfer entropy just above T_c (see Suppl. Materials). By using Eq. (4) directly, the density-of-states approaches circumvent this issue altogether. Finally, we look at a physical understanding of the behaviour of global transfer entropy. Intuitively, information flows when neighbour states differ, hence there is zero information flow in ground states. This behaviour necessarily extends to clusters of states, implying information flow occurs on the boundaries, or interfaces, between clusters (see Fig. 2). It seems reasonable then to assume that information flow scales with the number of interfaces. However, such a maximum coincides with the zero-energy, fully disordered regime, where quite clearly information flow vanishes. This assumption neglects the temporal nature of transfer entropy, which is disrupted at high temperature.
Figure 2

Interfaces for lattices sampled at temperature 0.8515 (top) and (bottom), where each square is a lattice site. Top: arrows show the counter-clockwise path the interface walker takes (for the large cluster) around complex junctions. Labelled clusters, while sharing the same state, are disjoint, and thus have separate interfaces. Average interface length is . Bottom: when one cluster dominates, it no longer has an "outer" perimeter. Average interface length is .

Recall that global transfer entropy is a measure of a site's dependence on neighbouring sites, conditioned on its own past. At high temperature, spin flips are essentially random, choosing new states with little influence from neighbours. As temperature decreases, neighbour influence increases, leading to clusters of similar sites. We can thus approximate average influence by the probability of cluster size, p(c). This influence is the manifestation of information flow in the system, but only along interfaces (since information flow is conditioned on a site's own past), leading to the conjecture (Eq. 5) that information flow scales as the sum over cluster sizes of p(c) times the average interfacial length of a cluster of size c. Note, however, that when clusters get sufficiently large, i.e., on the order of system size L, they no longer have an outer perimeter and are instead defined by the holes created by other clusters (Fig. 2, bottom). Thus, for this dominant cluster to increase in size, the internal holes must shrink, and its interfacial length actually falls. As temperature decreases, influence increases, but the number of available sites to transfer influence decreases, hence total information flow falls. We note that Eq. (5) is essentially the average interface length. There should thus be some relationship between average interfacial length and net information flow in the lattice. The average interface length (Eq. 6) is defined as the mean interfacial length over all clusters in the lattice, where interfacial lengths are found by performing a "turn-right walk" procedure, similar to Saberi[18], on every unmarked edge between adjoining lattice sites of differing states.
Edges are marked during the walk procedure in association with an adjoining site (such that each edge is ultimately left unmarked or marked twice). This prevents a cluster from counting its perimeter multiple times, but correctly accounts for interface boundaries between a cluster and all of its neighbouring clusters. This also addresses clusters with two or more disjoint interfaces, e.g., a 2D doughnut. Interface results, I(T), are calculated from I(E) and Eq. (4) with the weighted Wang-Landau update scheme[15]. Each E value is sampled at least 5000 times, up to a maximum of 10,000 samples. The intuitive interface model of Eqs. (5) and (6), shown in Fig. 3, gives a remarkably good match to the global transfer entropy trends, peaking in the disordered regime in all cases and converging to T_c only where systems become more strongly first order (increased q, and increased L). In the Ising case, the interface peak location remains stable at increasing lattice sizes, as does the global transfer entropy peak location[7].
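As a rough numerical check of the interface intuition, one can count unlike-neighbour bonds; note this is a simplified proxy of our own, not the per-cluster turn-right walk of Eq. (6):

```python
import numpy as np

def interface_bonds(lattice):
    """Number of nearest-neighbour bonds joining sites of differing state,
    with periodic boundaries. Zero in a ground state; maximal when neighbours
    rarely agree, as in the fully disordered regime."""
    right = np.roll(lattice, -1, axis=1)
    down = np.roll(lattice, -1, axis=0)
    return int(np.count_nonzero(lattice != right)
               + np.count_nonzero(lattice != down))
```

A ground state gives zero, while a two-state checkerboard on an even-sided lattice makes every one of the 2*L*L bonds an interface bond.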
Figure 3

Average interface length for systems with the indicated q and lattice sizes. The behaviour of the peak location mimics the behaviour of the global transfer entropy peak in all systems: the first-order cases (q > 4) converge to T_c as the system becomes more strongly first order (increased q, L), while the second-order peak remains stable above the phase transition. Note the factor-of-two difference in temperature between the Potts and Ising results is simply due to a slight difference in the definition of site energy, with no further side effects.

Given the excellent agreement between the average interface length and global transfer entropy, the average interface length offers a suitable theoretical justification for the behaviour of global transfer entropy, fitting the first- and second-order transitions into a single unified framework. It is known that in at least some class of second-order phase transitions, global transfer entropy can serve as a predictor for an impending phase transition in systems slowly approaching criticality from the disordered regime[7], and we show here that the same behaviour is exhibited in the first-order Potts model. Specifically, detection of a peak in global transfer entropy while other information-theoretic metrics[7] continue to rise could indicate an imminent transition. This behaviour of falling global transfer entropy while other information-theoretic quantities rise holds most strongly for finite-size systems, that is, those with most practical utility, while it is possible that, in at least the case above, the peak may converge to T_c in the thermodynamic limit.
Supplementary Information.
  7 in total

1.  Efficient, multiple-range random walk algorithm to calculate the density of states.

Authors:  F Wang; D P Landau
Journal:  Phys Rev Lett       Date:  2001-03-05       Impact factor: 9.161

2.  Determining the density of states for classical statistical models: a random walk algorithm to produce a flat histogram.

Authors:  F Wang; D P Landau
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2001-10-17

3.  Measuring information transfer.

Authors:  T Schreiber
Journal:  Phys Rev Lett       Date:  2000-07-10       Impact factor: 9.161

4.  Information flow in a kinetic Ising model peaks in the disordered phase.

Authors:  Lionel Barnett; Joseph T Lizier; Michael Harré; Anil K Seth; Terry Bossomaier
Journal:  Phys Rev Lett       Date:  2013-10-24       Impact factor: 9.161

5.  Anticipating critical transitions. (Review)

Authors:  Marten Scheffer; Stephen R Carpenter; Timothy M Lenton; Jordi Bascompte; William Brock; Vasilis Dakos; Johan van de Koppel; Ingrid A van de Leemput; Simon A Levin; Egbert H van Nes; Mercedes Pascual; John Vandermeer
Journal:  Science       Date:  2012-10-19       Impact factor: 47.728

6.  Information theoretic aspects of the two-dimensional Ising model.

Authors:  Hon Wai Lau; Peter Grassberger
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2013-02-20

7.  Flickering gives early warning signals of a critical transition to a eutrophic lake state.

Authors:  Rong Wang; John A Dearing; Peter G Langdon; Enlou Zhang; Xiangdong Yang; Vasilis Dakos; Marten Scheffer
Journal:  Nature       Date:  2012-11-18       Impact factor: 49.962
