
A Brief Review of Generalized Entropies.

José M Amigó1, Sámuel G Balogh2, Sergio Hernández3.   

Abstract

Entropy appears in many contexts (thermodynamics, statistical mechanics, information theory, measure-preserving dynamical systems, topological dynamics, etc.) as a measure of different properties (energy that cannot produce work, disorder, uncertainty, randomness, complexity, etc.). In this review, we focus on the so-called generalized entropies, which from a mathematical point of view are nonnegative functions defined on probability distributions that satisfy the first three Shannon–Khinchin axioms: continuity, maximality and expansibility. While these three axioms are expected to be satisfied by all macroscopic physical systems, the fourth axiom (separability or strong additivity) is in general violated by non-ergodic systems with long-range forces; this has been the main reason for exploring weaker axiomatic settings. Currently, non-additive generalized entropies are also being used to study new phenomena in complex dynamics (multifractality), quantum systems (entanglement), soft sciences, and more. Besides going through the axiomatic framework, we review the characterization of generalized entropies via two scaling exponents introduced by Hanel and Thurner. In turn, the first of these exponents is related to the diffusion scaling exponent of diffusion processes, as we also discuss. Applications are addressed as the description of the main generalized entropies advances.

Keywords:  Hanel–Thurner exponents; Rényi; Tsallis; generalized entropy; non-stationary regime

Year:  2018        PMID: 33266537      PMCID: PMC7512376          DOI: 10.3390/e20110813

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


1. Introduction

The concept of entropy was introduced by Clausius [1] in thermodynamics to measure the amount of energy in a system that cannot produce work, and was given an atomic interpretation in the foundational works of statistical mechanics and gas dynamics by Boltzmann [2,3], Gibbs [4], and others. Since then, entropy has played a central role in many-particle physics, notably in the description of non-equilibrium processes through the second principle of thermodynamics and the principle of maximum entropy production [5,6]. Moreover, Shannon made entropy the cornerstone on which he built his theory of information and communication [7]. Entropy and the associated entropic forces are also the main characters in recent innovative approaches to artificial intelligence and collective behavior [8,9]. Our formalism is information-theoretic (i.e., entropic forms are functions of probability distributions) owing to the mathematical properties that we discuss along the way, but it can be translated to a physical context through the concept of microstate. The prototype of entropy that we are going to consider below is the Boltzmann–Gibbs–Shannon (BGS) entropy,

$S_{BGS}(p_1,\ldots,p_W) = -k \sum_{i=1}^{W} p_i \ln p_i. \qquad (1)$

In its physical interpretation, $k \simeq 1.38 \times 10^{-23}$ J/K is the Boltzmann constant, W is the number of microstates consistent with the macroscopic constraints of a given thermodynamical system, and $p_i$ is the probability (i.e., the asymptotic fraction of time) that the system is in microstate i. In information theory, k is set equal to 1 for mathematical convenience, as we do hereafter, and $S_{BGS}$ measures the average information conveyed by the outcomes of a random variable with probability distribution $(p_1,\ldots,p_W)$. We use natural logarithms unless otherwise stated, although logarithms to base 2 are the natural choice in binary communications (the difference being the units, nats or bits, respectively).
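The convention k = 1 and the nats/bits distinction are easy to pin down in code. The following minimal Python sketch (the function name is ours) computes the BGS entropy of a discrete distribution, omitting zero-probability terms by the usual convention that $0 \ln 0 = 0$:

```python
import math

def shannon_entropy(p, base=math.e):
    """Boltzmann-Gibbs-Shannon entropy of a discrete distribution (k = 1).

    Terms with p_i = 0 are omitted, following the convention 0 * log(0) = 0.
    The default base e gives nats; base=2 gives bits.
    """
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

uniform = [0.25] * 4
print(shannon_entropy(uniform))          # ln 4, about 1.386 nats
print(shannon_entropy(uniform, base=2))  # 2 bits
```

Changing the base only rescales the result by a constant factor, which is why the choice of units is immaterial for the axiomatic discussion below.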
Remarkably enough, Shannon proved in Appendix B of his seminal paper [7] that Equation (1) follows necessarily from three properties or axioms (actually, four are needed; more on this below). The BGS entropy was later generalized by other “entropy-like” quantities in dynamical systems (Kolmogorov–Sinai entropy [10], etc.), information theory (Rényi entropy [11], etc.), and statistical physics (Tsallis entropy [12], etc.), to mention the most familiar ones (see, e.g., [13] for an account of some entropy-like quantities and their applications, especially in time series analysis). As with $S_{BGS}$, the essence of these new entropic forms was distilled into a small number of properties that allow sorting them out in a more systematic way [13,14]. Currently, the uniqueness of $S_{BGS}$ is derived from the four Shannon–Khinchin axioms (Section 2). However, the fourth axiom, called the separability or strong additivity axiom (which implies additivity, i.e., $S(A \times B) = S(A) + S(B)$, where $A \times B$ stands for a system composed of any two probabilistically independent subsystems A and B), is violated by physical systems with long-range interactions [15,16]. This poses the question of what mathematical properties the “generalized entropies” satisfying only the other three axioms have. These are the primary candidates for extensive entropic forms, i.e., functions S such that $S(A+B) = S(A) + S(B)$, the shorthand $A+B$ standing for the physical system composed of the (generally interacting) subsystems A and B. Note that in non-ergodic interacting systems $S(A+B) \neq S(A \times B)$ just because the number of states in $A+B$ is different from the number of states in $A \times B$. A related though different question is how to weaken the separability axiom so as to identify the extensive generalized entropies; we come back briefly to this point in Section 2 when speaking of the composability property. Along with $S_{BGS}$, typical examples of generalized entropies are the Tsallis entropy [12],

$S_q^T(p_1,\ldots,p_W) = \frac{1}{q-1}\left(1 - \sum_{i=1}^{W} p_i^q\right) \qquad (2)$

($q \in \mathbb{R}$, $q \neq 1$, with the proviso that for $q < 0$ terms with $p_i = 0$ are omitted), and the Rényi entropy [11],

$S_q^R(p_1,\ldots,p_W) = \frac{1}{1-q}\, \ln \sum_{i=1}^{W} p_i^q \qquad (3)$

($q > 0$, $q \neq 1$).
The Tsallis and Rényi entropies are related to the BGS entropy through the limits

$\lim_{q \to 1} S_q^T = \lim_{q \to 1} S_q^R = S_{BGS},$

this being one of the reasons they are considered generalizations of the BGS entropy. Both $S_q^T$ and $S_q^R$ have found interesting applications [15,17]; in particular, the parametric weighting of the probabilities in their definitions endows data analysis with additional flexibility. Other generalized entropies that we consider in this paper are related to ongoing work on graphs [18]. Further instances of generalized entropies are also referred to below. Let us remark at this point that $S_{BGS}$, $S_q^T$, $S_q^R$ and other generalized entropies considered in this review can be viewed as special cases of the $(h,\phi)$-entropies introduced in [19] for the study of asymptotic probability distributions. In turn, $(h,\phi)$-entropies were generalized to quantum information theory in [20]. Quantum $(h,\phi)$-entropies, which include von Neumann's entropy [21] as well as the quantum versions of Tsallis' and Rényi's entropies, have been applied, for example, to the detection of quantum entanglement (see [20] and references therein). In this review, we do not consider quantum entropies, which would require advanced mathematical concepts, but only entropies defined on classical, discrete and finite probability distributions. If necessary, the transition to continuous distributions is done by formally replacing probability mass functions by densities and sums by integrals. For other approaches to the concept of entropy in more general settings, see [22,23,24,25]. Generalized entropies can be characterized by two scaling exponents in the limit $W \to \infty$, which we call Hanel–Thurner exponents [16]. For the simplest generalized entropies, which include $S_q^T$ but not $S_q^R$ (see Section 2), these exponents allow establishing a relationship between the abstract concept of generalized entropy and the physical properties of the system they describe through their asymptotic scaling behavior in the thermodynamic limit.
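The convergence of the Tsallis and Rényi entropies to the BGS entropy as q approaches 1 can be checked numerically. A minimal sketch (function names are ours; zero-probability terms are omitted, in line with the proviso above):

```python
import math

def shannon(p):
    """BGS entropy in nats (k = 1)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def tsallis(p, q):
    """Tsallis entropy: (1 - sum p_i**q) / (q - 1), q != 1."""
    return (1.0 - sum(x**q for x in p if x > 0)) / (q - 1.0)

def renyi(p, q):
    """Renyi entropy: ln(sum p_i**q) / (1 - q), q > 0, q != 1."""
    return math.log(sum(x**q for x in p if x > 0)) / (1.0 - q)

# Both one-parameter families approach the BGS entropy as q -> 1:
p = [0.5, 0.3, 0.2]
for q in (0.9, 0.99, 0.999):
    print(q, tsallis(p, q), renyi(p, q))
print("BGS:", shannon(p))
```

For q ever closer to 1 the three printed values agree to more and more digits, which is the numerical counterpart of the limits displayed above.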
That is, the two exponents label equivalence classes of systems which are universal in the sense that the corresponding entropies have the same thermodynamic limit. In this regard, it is interesting to mention that, for any pair of Hanel–Thurner exponents (at least within certain ranges), there is a generalized entropy with those exponents, i.e., there are systems with the sought asymptotic behavior. Furthermore, the first Hanel–Thurner exponent also allows establishing a second relation with physical properties, namely, with the diffusion scaling exponents of diffusion processes, under some additional assumptions. The rest of this review is organized as follows. The concept of generalized entropy, along with some formal preliminaries and its basic properties, is discussed in Section 2. By way of illustration, we discuss in Section 3 the Tsallis and Rényi entropies, as well as more recent entropic forms. The choice of the former two is justified by their uniqueness properties under quite natural axiomatic formulations. The Hanel–Thurner exponents are introduced in Section 4, where their computation is also exemplified. Their aforementioned relation to diffusion scaling exponents is explained in Section 5. The main messages are recapped in Section 6. There is no section devoted to applications; rather, these are progressively addressed as the different generalized entropies are presented. The main text is supplemented with three appendices at the end of the paper.

2. Generalized Entropies

Let $\Delta_W = \{(p_1,\ldots,p_W) : p_i \geq 0,\ \sum_{i=1}^W p_i = 1\}$ be the set of probability mass distributions on W outcomes, for all $W \geq 1$. For any function $H : \bigcup_{W \geq 1} \Delta_W \to \mathbb{R}_{\geq 0}$ ($\mathbb{R}_{\geq 0}$ being the nonnegative real numbers), the Shannon–Khinchin axioms for an entropic form H are the following.

SK1 (Continuity). $H(p_1,\ldots,p_W)$ depends continuously on all its variables for each W.

SK2 (Maximality). For all W, $H(p_1,\ldots,p_W) \leq H(1/W,\ldots,1/W)$.

SK3 (Expansibility). For all W and $(p_1,\ldots,p_W) \in \Delta_W$, $H(p_1,\ldots,p_W,0) = H(p_1,\ldots,p_W)$.

SK4 (Separability, or strong additivity). For all $(p_{ij}) \in \Delta_{WM}$,

$H(p_{11},\ldots,p_{WM}) = H(p_1,\ldots,p_W) + \sum_{i=1}^W p_i\, H(q_{1|i},\ldots,q_{M|i}),$

where $p_i = \sum_{j=1}^M p_{ij}$ and $q_{j|i} = p_{ij}/p_i$. Let $(p_{ij})$ be the joint probability distribution of the random variables X and Y, with marginal distributions $(p_i)$ and $(q_j)$, respectively. Then, axiom SK4 can be written as

$H(X,Y) = H(X) + H(Y \mid X),$

where $H(Y \mid X)$ is the entropy of Y conditional on X. In particular, if X and Y are independent (i.e., $p_{ij} = p_i q_j$), then $H(Y \mid X) = H(Y)$ and

$H(X,Y) = H(X) + H(Y). \qquad (5)$

A function H such that Equation (5) holds (for independent random variables X and Y) is called additive. Physicists prefer writing $S(A \times B) = S(A) + S(B)$ for composed systems $A \times B$ with microstate probabilities $p_{ij} = p_i q_j$; this condition holds only approximately for weakly interacting systems A and B. With regard to Equation (5), let us recall that, for two general random variables X and Y, the difference $H(X) + H(Y) - H(X,Y)$ is the mutual information of X and Y; it is nonnegative, and it vanishes if and only if X and Y are independent [26]. More generally, a function H such that

$H(A \times B) = H(A) + H(B) + (1-\alpha)\, H(A)\, H(B) \qquad (6)$

($\alpha \neq 1$) is called $\alpha$-additive. With the same notation as above, we can write this property as

$H(X,Y) = H(X) + H(Y) + (1-\alpha)\, H(X)\, H(Y), \qquad (7)$

where, again, X and Y are independent random variables. In a statistical mechanical context, X and Y may also stand for two probabilistically independent (or weakly interacting) physical systems. If $\alpha \to 1$, we recover additivity (Equation (5)). In turn, additivity and $\alpha$-additivity are special cases of composability [15,27]:

$H(X,Y) = \Phi(H(X), H(Y)),$

with the same caveats for X and Y. Here, $\Phi$ is a symmetric function of two variables. Composability was proposed in [15] to replace axiom SK4. Interestingly, it has been proved in [27] that, under some technical assumptions, the only composable generalized entropy of the form in Equation (10) below is $S_q^T$, up to a multiplicative constant.
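Axiom SK4 is easy to verify numerically for the BGS entropy. The following sketch (function names and the small example distribution are ours) builds a joint distribution from a marginal and conditional distributions and checks strong additivity:

```python
import math

def H(p):
    """BGS entropy (k = 1, natural logarithm)."""
    return -sum(x * math.log(x) for x in p if x > 0)

# A joint distribution p_ij = p_i * q_(j|i) over a 2 x 2 alphabet.
p_x = [0.4, 0.6]
q_cond = [[0.7, 0.3],   # distribution of Y given X = 0
          [0.2, 0.8]]   # distribution of Y given X = 1
joint = [p_x[i] * q_cond[i][j] for i in range(2) for j in range(2)]

# SK4 (strong additivity): H(X, Y) = H(X) + sum_i p_i * H(Y | X = i)
lhs = H(joint)
rhs = H(p_x) + sum(p_x[i] * H(q_cond[i]) for i in range(2))
print(abs(lhs - rhs))  # agreement to machine precision
```

The same check with a generalized entropy in place of H would generically fail, which is exactly the point of weakening SK4.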
As mentioned in Section 1, a function satisfying axioms SK1–SK4 is necessarily of the form in Equation (1) with a positive constant k, for every W ([28], Theorem 1). The same conclusion can be derived using other equivalent axioms [14,29]. For instance, Shannon used continuity, the property that $H(1/n,\ldots,1/n)$ increases with n, and a property called grouping [29] or decomposability [30], which he defined graphically in Figure 6 of [7]:

$H(p_1, p_2, p_3, \ldots, p_W) = H(p_1+p_2, p_3, \ldots, p_W) + (p_1+p_2)\, H\!\left(\tfrac{p_1}{p_1+p_2}, \tfrac{p_2}{p_1+p_2}\right) \qquad (9)$

($p_1 + p_2 > 0$). This property allows reducing the computation of $H(p_1,\ldots,p_W)$ to the computation of entropies of dichotomic random variables. According to ([15], Section 2.1.2.7), Shannon missed formulating in his uniqueness theorem the condition in Equation (5), X and Y being independent random variables. Nonnegative functions defined on $\bigcup_{W \geq 1}\Delta_W$ that satisfy axioms SK1–SK3 are called generalized entropies [16]. In the simplest situation, a generalized entropy has the sum property [14], i.e., the algebraic form

$H(p_1,\ldots,p_W) = \sum_{i=1}^W g(p_i) \qquad (10)$

with $g : [0,1] \to \mathbb{R}_{\geq 0}$. The following propositions are immediate. (i) Symmetry: H is invariant under permutations of $p_1,\ldots,p_W$. (ii) H satisfies axiom SK1 if and only if g is continuous. (iii) If H satisfies axiom SK2, then $\sum_{i=1}^W g(p_i) \leq W g(1/W)$ for all W and $(p_1,\ldots,p_W) \in \Delta_W$. (iv) If g is concave (i.e., ∩-convex), then H satisfies axiom SK2. (v) H satisfies axiom SK3 if and only if $g(0) = 0$. Note that Proposition (iv) follows from the symmetry and concavity of H (since the unique maximum of a symmetric, concave function on $\Delta_W$ must occur at equal probabilities). We conclude from Propositions (ii), (iv) and (v) that, for H to be a generalized entropy, the following three conditions suffice: (C1) g is continuous; (C2) g is concave; (C3) $g(0) = 0$. As in [16], we say that a macroscopic statistical system is admissible if it is described by a generalized entropy of the form in Equation (10) such that g verifies Conditions (C1)–(C3). By extension, we also say that the generalized entropy is admissible. Admissible systems and generalized entropies are the central subject of this review. Clearly, $S_{BGS}$ is admissible because

$g(x) = -x \ln x \qquad (11)$

satisfies (C1)–(C3). On the other hand, $S_q^T$ corresponds to

$g(x) = \frac{x - x^q}{q-1}. \qquad (12)$

For $S_q^T$ to be admissible, Condition (C1) requires $q > 0$ (for $q < 0$, g is unbounded as $x \to 0^+$), and Condition (C3) then holds as well, since $g(0) = 0$ for $q > 0$.
An example of a function with the sum property that does not qualify as an admissible generalized entropy is the probability functional used in [31] to classify sleep stages: its generator g is not ∩-convex but ∪-convex, so Condition (C2) fails. Other generalized entropies that are considered below have the form

$H(p_1,\ldots,p_W) = G\!\left(\sum_{i=1}^W g(p_i)\right), \qquad (14)$

where G is a continuous monotonic function and g is continuous. By definition, H is also symmetric, and Proposition (iii) holds with the obvious changes. However, the concavity of g is no longer a sufficient condition for H to be a generalized entropy. Such is the case of the Rényi entropy (Equation (3)); here

$g(x) = x^q, \qquad G(y) = \frac{\ln y}{1-q}, \qquad (15)$

but $S_q^R$ is not ∩-convex for $q > 1$ (see Section 3.2). Furthermore, note that axiom SK3 requires $g(0) = 0$ for H to be a generalized entropy. Since Equation (10) is a special case of Equation (14) (set G to be the identity map), we can refer to both cases via Equation (14), as we do hereafter. We say that two probability distributions $(p_i)$ and $(p'_i)$, $1 \leq i \leq W$, are close if the one-norm

$\|p - p'\|_1 = \sum_{i=1}^W |p_i - p'_i|$

is small; other norms, such as the two-norm and the max-norm, will do as well since they are all equivalent in the metric sense. A function F is said to be Lesche-stable if for all W and $\epsilon > 0$ there exists $\delta > 0$ such that

$\|p - p'\|_1 < \delta \ \Rightarrow\ \left|\frac{F(p) - F(p')}{F_{max}}\right| < \epsilon,$

where $F_{max} = F(1/W,\ldots,1/W)$. Lesche stability is called experimental robustness in [15] because it guarantees that similar experiments performed on similar physical systems provide similar results for the function F. According to [16], all admissible systems are Lesche-stable.

3. Examples of Generalized Entropies

By way of illustration, we focus in this section on two classical generalized entropies as well as on some newer ones. The classical examples are the Tsallis entropy and the Rényi entropy, because they have been extensively studied in the literature, also from an axiomatic point of view. As it turns out, they are unique under some natural assumptions, such as additivity, q-additivity or composability (see below for details). The newer entropies are related to potential applications of the concept of entropy to graph theory [18]. Other examples of generalized entropies are listed in Appendix A for further reference.

3.1. Tsallis Entropy

A simple way to introduce Tsallis' entropy as a generalization of the BGS entropy is the following [15]. Given $q \neq 1$, define the q-logarithm of a real number $x > 0$ as

$\ln_q x = \frac{x^{1-q} - 1}{1-q}.$

Note that $\ln_1 x = \ln x$ is defined by continuity, since $\lim_{q \to 1} \ln_q x = \ln x$. If the logarithm in the definition of $S_{BGS}$, Equation (1), is replaced by the q-logarithm, then we obtain the Tsallis entropy:

$S_q^T(p_1,\ldots,p_W) = \sum_{i=1}^W p_i \ln_q \frac{1}{p_i} = \frac{1}{q-1}\left(1 - \sum_{i=1}^W p_i^q\right). \qquad (17)$

As noted before, $q > 0$ for $S_q^T$ to be an admissible generalized entropy. Alternatively, the definition of $S_{BGS}$ can also be generalized to provide the Tsallis entropy via the q-derivative

$D_q f(x) = \frac{f(qx) - f(x)}{qx - x} \qquad (q \neq 1).$

Set $f(x) = \sum_{i=1}^W p_i^x$, i.e., $S_{BGS} = -\frac{d}{dx} f(x)\big|_{x=1}$, and note that $S_q^T = -D_q f(x)\big|_{x=1}$; let $q \to 1$ to check that $S_q^T \to S_{BGS}$. Although Tsallis proposed his entropy (Equation (17)) in 1988 to go beyond standard statistical mechanics [12], basically the same formula had already been proposed in 1967 by Havrda and Charvát (with a different multiplying factor) in the realm of cybernetics and control theory [32]. Some basic properties of $S_q^T$ follow. First, $S_q^T \geq 0$ because $\ln_q(1/p_i) \geq 0$ for $0 < p_i \leq 1$. Second, $S_q^T$ is (strictly) ∩-convex for $q > 0$; Figure 1 plots $S_q^T(p, 1-p)$ for several values of q (including 1, 2 and 5). Let us mention in passing that $S_q^T$ is ∪-convex for $q < 0$.
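The q-logarithm route to the Tsallis entropy can be sketched in a few lines of Python (function names are ours); at q = 1 it reduces to the BGS entropy:

```python
import math

def qlog(x, q):
    """q-logarithm: (x**(1 - q) - 1) / (1 - q) for q != 1, ln x for q = 1."""
    if q == 1:
        return math.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def tsallis_via_qlog(p, q):
    """Tsallis entropy as the q-deformed average information sum_i p_i * ln_q(1/p_i)."""
    return sum(pi * qlog(1.0 / pi, q) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(tsallis_via_qlog(p, 1))  # BGS entropy: 1.5 * ln 2, about 1.0397
print(tsallis_via_qlog(p, 2))  # (1 - sum p_i**2) / (2 - 1) = 0.625
```

For q = 2 the deformed average collapses to the closed form $(1 - \sum_i p_i^2)/(q-1)$, matching Equation (17) term by term.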
Figure 1

Tsallis entropy $S_q^T(p, 1-p)$ for several values of q (including 1, 2 and 5).

$S_q^T$ is Lesche-stable for all $q > 0$ [33,34]; indeed, we stated at the end of Section 2 that all admissible systems are Lesche-stable. $S_q^T$ is not additive but q-additive (see Equation (6) or (7) with $\alpha$ replaced by q). This property follows from [15]

$S_q^T(A \times B) = S_q^T(A) + S_q^T(B) + (1-q)\, S_q^T(A)\, S_q^T(B),$

A and B being probabilistically independent systems. Similar to what happens with the BGS entropy, the Tsallis entropy can be uniquely determined (except for a multiplicative positive constant) by a small number of axioms. Thus, Abe [35] characterized the Tsallis entropy by: (i) continuity; (ii) the increasing monotonicity of $S_q^T(1/W,\ldots,1/W)$ with respect to W; (iii) expansibility; and (iv) a property involving conditional entropies. Dos Santos [36], on the other hand, used the previous Axioms (i) and (ii), q-additivity, and a generalization of the grouping axiom (Equation (9)). Suyari [37] derived $S_q^T$ from the first three Shannon–Khinchin axioms and a generalization of the fourth one. Perhaps the most economical characterization of $S_q^T$ was given by Furuichi [38]; it consists of continuity, symmetry under permutations of $p_1,\ldots,p_W$, and a property called q-recursivity. As mentioned in Section 2, Tsallis entropy was recently shown [27] to be the only composable generalized entropy of the form in Equation (10), under some technical assumptions. Further axiomatic characterizations of the Tsallis entropy can be found in [39]. An observable of a thermodynamical (i.e., many-particle) system, say its energy or entropy, is said to be extensive if (among other characterizations), for a large number N of particles, that observable is (asymptotically) proportional to N. For example, for a system whose particles are weakly interacting (think of a dilute gas), the additive $S_{BGS}$ is extensive, whereas the non-additive $S_q^T$ ($q \neq 1$) is non-extensive. The same happens with ergodic systems [40]. However, according to [15], for a non-ergodic system with strong correlations, $S_{BGS}$ can be non-extensive while $S_q^T$ can be extensive for a particular value of q; such is the case of a microcanonical spin system on a network with growing constant connectancy [40].
This is why $S_q^T$ represents a physically relevant generalization of the traditional $S_{BGS}$. Axioms SK1–SK3 are expected to hold true also in strongly interacting systems. Further applications of the Tsallis entropy include astrophysics [41], fractal random walks [42], anomalous diffusion [43,44], time series analysis [45], classification [46,47], and artificial neural networks [48].
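The q-additivity (pseudo-additivity) of the Tsallis entropy, as opposed to plain additivity, can be verified numerically for independent subsystems. A minimal sketch (the two subsystem distributions are arbitrary examples of ours):

```python
def tsallis(p, q):
    """Tsallis entropy: (1 - sum p_i**q) / (q - 1), q != 1."""
    return (1.0 - sum(x**q for x in p if x > 0)) / (q - 1.0)

# Two probabilistically independent subsystems: joint probabilities factorize.
pA = [0.7, 0.3]
pB = [0.5, 0.4, 0.1]
joint = [a * b for a in pA for b in pB]

q = 2.0
lhs = tsallis(joint, q)
rhs = (tsallis(pA, q) + tsallis(pB, q)
       + (1 - q) * tsallis(pA, q) * tsallis(pB, q))
print(abs(lhs - rhs))                              # ~ 0: q-additive
print(abs(lhs - tsallis(pA, q) - tsallis(pB, q)))  # > 0: not plainly additive
```

The identity holds exactly (not only asymptotically) because $1 - \sum_{ij}(p_i q_j)^q$ factorizes algebraically; the nonzero second printout quantifies the failure of ordinary additivity.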

3.2. Rényi Entropy

A simple way to introduce Rényi's entropy as a generalization of $S_{BGS}$ is the following [17]. By definition, the BGS entropy of the probability distribution $(p_1,\ldots,p_W)$ (or of a random variable X with that probability distribution) is the linear average of the information function $-\ln p_i$ or, equivalently, the expected value of the random variable $-\ln p_X$:

$S_{BGS} = \sum_{i=1}^W p_i\, (-\ln p_i) = E[-\ln p_X].$

In the general theory of expected values, for any invertible function $\varphi$ and realizations $x_i$ of X in the definition domain of $\varphi$, an expected value can be defined as

$E_\varphi[X] = \varphi^{-1}\!\left(\sum_{i=1}^W p_i\, \varphi(x_i)\right).$

Applying this definition to $-\ln p_X$, we obtain

$H_\varphi(p_1,\ldots,p_W) = \varphi^{-1}\!\left(\sum_{i=1}^W p_i\, \varphi(-\ln p_i)\right).$

If this generalized average has to be additive for independent events, i.e., if it has to satisfy Equation (5), then

$\varphi(x) = c_1 x + c_2 \quad \text{or} \quad \varphi(x) = c_1 e^{(1-q)x} + c_2$

must hold, where $c_1 \neq 0$ and $c_2$ are constants, and $q > 0$, $q \neq 1$. The first case leads to $S_{BGS}$, Equation (1). The second case leads to the Rényi entropy (actually, a one-parameter family of entropies) $S_q^R$, Equation (3). Next, we summarize some important properties of the Rényi entropy.

(R1) $S_q^R$ is additive by construction.

(R2) $\lim_{q \to 1} S_q^R = S_{BGS}$. Indeed, use L'Hôpital's Rule to derive

$\lim_{q \to 1} \frac{\ln \sum_{i=1}^W p_i^q}{1-q} = \lim_{q \to 1} \frac{-\sum_{i=1}^W p_i^q \ln p_i}{\sum_{i=1}^W p_i^q} = -\sum_{i=1}^W p_i \ln p_i.$

(R3) $S_q^R$ is ∩-convex for $0 < q \leq 1$, and it is neither ∩-convex nor ∪-convex for $q > 1$. Figure 2 plots $S_q^R(p, 1-p)$ for several values of q (including 1, 2 and 5).
Figure 2

Rényi entropy $S_q^R(p, 1-p)$ for several values of q (including 1, 2 and 5).

(R4) $S_q^R$ is Lesche-unstable for all $q > 0$, $q \neq 1$ [49].

(R5) The entropies $S_q^R$ are monotonically decreasing with respect to the parameter q for any distribution of probabilities, i.e., $S_q^R \geq S_{q'}^R$ for $q \leq q'$. This property follows from the formula

$\frac{dS_q^R}{dq} = -\frac{1}{(1-q)^2}\, D(z \| p),$

where $z_i = p_i^q / \sum_{j=1}^W p_j^q$, and $D(z \| p) = \sum_{i=1}^W z_i \ln(z_i/p_i)$ is the Kullback–Leibler divergence of the probability distributions $(z_i)$ and $(p_i)$. $D(z \| p)$ vanishes only in the event that both probability distributions coincide; otherwise it is positive [26].

A straightforward relation between Rényi's and Tsallis' entropies is the following [50]:

$S_q^R = \frac{1}{1-q}\, \ln\!\big[1 + (1-q)\, S_q^T\big].$

However, the axiomatic characterizations of the Rényi entropy are not as simple as those of the Tsallis entropy. See [27,51,52] for some contributions in this regard. For some values of q, $S_q^R$ has particular names. Thus, $S_0^R = \ln W$ is called the Hartley or max-entropy, which coincides numerically with $S_{BGS}$ for the uniform probability distribution. We saw in (R2) that $S_q^R$ converges to the BGS entropy in the limit $q \to 1$. $S_2^R = -\ln \sum_{i=1}^W p_i^2$ is called the collision entropy. In the limit $q \to \infty$, $S_q^R$ converges to the min-entropy

$S_\infty^R = -\ln \max_{1 \leq i \leq W} p_i.$

The name of $S_\infty^R$ is due to property (R5). Rényi entropy has found interesting applications in random search [53], information theory (especially in source coding [54,55]), cryptography [56], time series analysis [57], and classification [46,58], as well as in statistical signal processing and machine learning [17].
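Property (R5) and the limiting cases just discussed lend themselves to a quick numerical check (the function name is ours; q = 0 and q = ∞ are approached through small and large finite values of q, respectively):

```python
import math

def renyi(p, q):
    """Renyi entropy (natural logarithm); q = 1 handled as the BGS limit."""
    if q == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x**q for x in p if x > 0)) / (1.0 - q)

p = [0.5, 0.3, 0.2]

# (R5): S_q^R is non-increasing in q.
values = [renyi(p, q) for q in (0.25, 0.5, 1, 2, 4, 16)]
print(all(a >= b for a, b in zip(values, values[1:])))  # True

# Limiting cases: Hartley/max-entropy (q -> 0) and min-entropy (q -> infinity).
print(renyi(p, 1e-9), math.log(3))         # ~ ln W
print(renyi(p, 200.0), -math.log(max(p)))  # ~ -ln max p_i
```

The large-q estimate converges slowly (the error decays like 1/q), which is why a fairly large probe value is used.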

3.3. Graph Related Entropies

As part of ongoing work on graph entropy [18], three further generalized entropies, denoted here $H_1$, $H_2$ and $H_3$, are defined in Equations (18)–(20). Peculiarities of these entropies include the unusual terms in their definitions, as well as the presence of products instead of sums in one of them. First, $H_1$ is of the type in Equation (10), with a generator g that is continuous (even smooth), concave on the interval $[0,1]$, and such that $g(0) = 0$. Therefore (see Conditions (C1)–(C3) in Section 2), $H_1$ satisfies axioms SK1–SK3, hence it is a generalized entropy. As for $H_2$, this probability functional is of the type in Equation (14). To prove that $H_2$ is a generalized entropy, note that its inner sum satisfies axioms SK1–SK3 for the same reasons as $H_1$ does; therefore, the same happens with $H_2$, on account of the outer exponential function being continuous (SK1), increasingly monotonic (SK2), and univalued (SK3). Finally, $H_3$ is of the type in Equation (10), and it is a generalized entropy because its generator satisfies, as shown above for $H_1$, the conditions guaranteeing axioms SK1–SK3. Figure 3 depicts $H_1$, $H_2$ and $H_3$, along with two of the classical entropies for comparison. As a curiosity, let us point out that the scaled versions defined in Equation (24) (see Figure 4) approximate $S_{BGS}$ measured in bits very well. In particular, for one of them the relative error of the approximation is so small that the corresponding graphs overlap when plotted.
Figure 3

Entropies $H_1$, $H_2$ and $H_3$, along with two classical entropies for comparison.

Figure 4

Scaled entropies (see Equation (24)).

A further description of the entropies in Equations (18)–(20) is beyond the scope of this section. Let us only mention in this regard that these entropies can be extended into the realm of acyclic directed graphs.

4. Hanel–Thurner Exponents

All generalized entropies group into classes labeled by two exponents introduced by Hanel and Thurner [16], which are determined by the limits

$\lim_{W \to \infty} \frac{S(\lambda W)}{S(W)} = \lambda^{1-c} \qquad (25)$

(W being, as before, the cardinality of the probability distribution or the total number of microstates in the system, and $\lambda > 0$) and

$\lim_{W \to \infty} W^{a(c-1)}\, \frac{S(W^{1+a})}{S(W)} = (1+a)^d \qquad (26)$

($a > 0$). Note that, despite appearances, the limit in Equation (26) does not actually depend on c: the leading power-law behaviors cancel, and only the sub-leading corrections captured by d survive. The limits in Equations (25) and (26) can be computed via the asymptotic equipartition property [26]. Thus, $S(W) \simeq G\big(W g(1/W)\big)$ and $S(\lambda W) \simeq G\big(\lambda W\, g(1/\lambda W)\big)$ asymptotically with ever larger W (thermodynamic limit). Set now $x = 1/W$ to derive

$\lim_{x \to 0^+} \frac{G\big(\lambda\, g(x/\lambda)/x\big)}{G\big(g(x)/x\big)} = \lambda^{1-c} \qquad (27)$

and

$\lim_{x \to 0^+} x^{a(1-c)}\, \frac{G\big(g(x^{1+a})/x^{1+a}\big)}{G\big(g(x)/x\big)} = (1+a)^d. \qquad (28)$

Clearly, the scaling exponents c, d of a generalized entropy depend on the behavior of g in an infinitesimal neighborhood of 0 (i.e., on $g(x)$ as $x \to 0^+$), as well as on the properties of G if G is not the identity. We call $(c,d)$ the Hanel–Thurner (HT) exponents of the generalized entropy. When G is the identity, Equations (27) and (28) abridge to (after replacing $1/\lambda$ by z)

$\lim_{x \to 0^+} \frac{g(zx)}{g(x)} = z^c \qquad (29)$

and

$\lim_{x \to 0^+} \frac{g(x^{1+a})}{x^{ac}\, g(x)} = (1+a)^d, \qquad (30)$

respectively. In this case, $0 < c \leq 1$, while d can be any real number; indeed, the concavity of g implies $c \leq 1$ [16]. The physical properties of admissible systems are uniquely characterized by their HT exponents, i.e., by their asymptotic properties in the limit $W \to \infty$ [16]. In this sense, we can also speak of the universality class $(c,d)$. By way of illustration, we now derive the HT exponents of $S_{BGS}$, $S_q^T$ and $S_q^R$. For the BGS entropy, $g(x) = -x \ln x$ (see Equation (11)), so

$\frac{g(zx)}{g(x)} = z\, \frac{\ln z + \ln x}{\ln x} \to z \quad \text{as } x \to 0^+.$

Therefore, $c = 1$. Furthermore, $g(x^{1+a})/(x^a g(x)) = 1 + a$ for all x, so $d = 1$. For the Tsallis entropy, $g(x) = (x - x^q)/(q-1)$ (see Equation (12)), and

$\frac{g(zx)}{g(x)} = \frac{zx - (zx)^q}{x - x^q} \to z^{\min\{1,q\}} \quad \text{as } x \to 0^+.$

It follows readily that $(c,d) = (q,0)$ if $0 < q < 1$, and $(c,d) = (1,0)$ if $q > 1$. Hence, although $S_q^T \to S_{BGS}$ as $q \to 1$, there is no parallel convergence concerning the HT exponents. For the Rényi entropy, $g(x) = x^q$ and $G(y) = \ln y/(1-q)$ (see Equation (15)), so $S_q^R(1/W,\ldots,1/W) = \ln W$ for every W (both for $0 < q < 1$ and $q > 1$). Therefore, $c = 1$ and, moreover, $d = 1$, as for $S_{BGS}$. In sum, $(c,d) = (1,1)$ for all q. As for the generalized entropies $H_1$, $H_2$ and $H_3$ considered in Section 3.3, we compute their HT exponents in Appendix B; it turns out that $H_1$ and $H_3$ belong to the same universality class as $S_{BGS}$, while the HT exponents of $H_2$ and $S_q^R$ (both of the type in Equation (14)) are different.
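The small-argument scaling limits that define c and d can also be estimated numerically by probing the generator g near zero. A rough sketch (function names and the probe values z, a, x are ours; note that the c-estimate for the BGS generator converges only logarithmically, so it is crude):

```python
import math

def ht_c(g, z=2.0, x=1e-30):
    """First HT exponent from the small-x scaling g(z*x)/g(x) -> z**c."""
    return math.log(g(z * x) / g(x)) / math.log(z)

def ht_d(g, c, a=1.0, x=1e-30):
    """Second HT exponent from g(x**(1+a)) / (x**(a*c) * g(x)) -> (1+a)**d."""
    return math.log(g(x**(1 + a)) / (x**(a * c) * g(x))) / math.log(1 + a)

def g_bgs(x):
    """BGS generator; expected universality class (c, d) = (1, 1)."""
    return -x * math.log(x)

def g_tsallis(x, q=0.5):
    """Tsallis generator with q = 0.5; expected (c, d) = (0.5, 0)."""
    return (x - x**q) / (q - 1)

print(ht_c(g_bgs), ht_d(g_bgs, 1.0))          # near (1, 1)
print(ht_c(g_tsallis), ht_d(g_tsallis, 0.5))  # near (0.5, 0)
```

The Tsallis estimates are essentially exact because g is a pure power near zero, whereas the BGS generator carries a logarithmic factor that decays slowly.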
Moreover, the interested reader will find in Table 1 of [16] the HT exponents of the generalized entropies listed in Appendix A. An interesting issue that arises at this point is the inverse question: Given c and d, is there an admissible system whose HT exponents are precisely $(c,d)$? The answer is yes, at least under some restrictions on the values of c and d. Following [16], we show in Appendix C that, if $0 < c \leq 1$ and $d \geq 0$, then the “generalized $(c,d)$-entropy”

$S_{c,d}(p_1,\ldots,p_W) = \frac{e \sum_{i=1}^W \Gamma\big(1+d,\, 1 - c \ln p_i\big) - c}{1 - c + cd} \qquad (32)$

has HT exponents $(c,d)$. Here, e is Euler's number and $\Gamma(a,b)$ is the incomplete Gamma function (Section 6.5 of [59]), that is,

$\Gamma(a,b) = \int_b^{\infty} t^{a-1} e^{-t}\, dt.$

Several application cases where generalized $(c,d)$-entropies are relevant have been discussed by Hanel and Thurner in [40] (super-diffusion, spin systems, binary processes, and self-organized critical systems) and [60] (aging random walks, i.e., random walks whose transition rates between states are path- and time-dependent).
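Evaluating $(c,d)$-entropies numerically requires the upper incomplete Gamma function, which is not in the Python standard library; in practice one would use a library routine (e.g., SciPy's regularized `gammaincc` scaled by `gamma(a)`), but a self-contained quadrature sketch (names and tolerances are ours) suffices for moderate arguments:

```python
import math

def upper_incomplete_gamma(a, b, n=100000, t_max=50.0):
    """Gamma(a, b) = integral over [b, infinity) of t**(a-1) * exp(-t) dt,
    approximated by the trapezoidal rule on [b, t_max]; the truncated tail
    beyond t_max is negligible for moderate arguments."""
    h = (t_max - b) / n
    f = lambda t: t**(a - 1) * math.exp(-t)
    s = 0.5 * (f(b) + f(t_max))
    s += sum(f(b + i * h) for i in range(1, n))
    return s * h

# Sanity checks against closed forms:
print(upper_incomplete_gamma(1.0, 2.0), math.exp(-2.0))              # Gamma(1,b) = e**-b
print(upper_incomplete_gamma(2.0, 1.5), (1 + 1.5) * math.exp(-1.5))  # Gamma(2,b) = (1+b)e**-b
```

With such a routine in hand, each term of a $(c,d)$-entropy is just the incomplete Gamma function evaluated at $1 - c \ln p_i$.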

5. Asymptotic Relation between the HT Exponent c and the Diffusion Scaling Exponent

In contrast to “non-interacting” systems, where both the additivity and the extensivity of the BGS entropy hold, in the case of general interacting statistical systems these properties can no longer be simultaneously satisfied, requiring a more general concept of entropy [16,40]. Following [16] (Section 4), a possible generalization of $S_{BGS}$ for admissible systems is characterized via the two asymptotic scaling relations in Equations (29) and (30), i.e., the HT exponents c and d, respectively. These asymptotic exponents can be interpreted as a measure of deviation from the “non-interacting” case regarding the stationary behavior.

5.1. The Non-Stationary Regime

In this section, we describe a relation between the exponent c and a similar macroscopic measure that characterizes the system in the non-stationary regime, thus providing a meaningful interpretation of the exponent. The non-stationary behavior of a system can often be described by the Fokker–Planck (FP) equation governing the time evolution of a probability density function $P(x,t)$. In this continuous limit, the generalized entropy is assumed to be written as $S[P] = \int g(P(x,t))\, dx$ (the continuous counterpart of Equation (10)), where g is asymptotically characterized by Equation (29); below, $\phi(x)$ denotes a time-independent scalar function of the space coordinate x (for example, a potential) [61,62]. Going beyond the scope of the simplest FP equation, we consider systems for which the correlation among their (sub-)units can be taken into account by replacing the diffusive term with an effective term $\Sigma[P]$, where $\Sigma$ is a pre-defined functional of the probability density. $\Sigma[P]$ can either be derived directly from the microscopic transition rules or be defined based on macroscopic assumptions. The resulting FP equation can be written as

$\frac{\partial P(x,t)}{\partial t} = \frac{\partial}{\partial x}\left(A\, \frac{\partial \Sigma[P]}{\partial x} + B\, P\, \frac{\partial \phi(x)}{\partial x}\right), \qquad (34)$

where A, B are constants and $\phi(x)$ is a time-independent external potential. For simplicity, hereafter we focus exclusively on one-dimensional FP equations. In the special case $\Sigma[P] = P$ and no external forces, Equation (34) reduces to the well-known linear diffusion equation

$\frac{\partial P(x,t)}{\partial t} = A\, \frac{\partial^2 P(x,t)}{\partial x^2}. \qquad (35)$

The above equation is invariant under the space-time scaling transformation

$P(x,t) \to \lambda\, P(\lambda x, \lambda^{1/\gamma} t) \qquad (36)$

with $\gamma = 1/2$ [63,64]. This scaling property also opens up the possibility of a phenomenological and macroscopic characterization of anomalous diffusion processes [15,44], which correspond to more complicated non-stationary processes described by FP equations of the form of Equation (34) with a non-trivial value of $\gamma$.
With the help of the transformation in Equation (36), we can also classify correlated statistical systems according to the rate of the spread of their probability density functions over time in the asymptotic limit and, thus, quantitatively describe their behavior in the non-stationary regime.

5.2. Relation between the Stationary and Non-Stationary Regime

To reasonably and consistently relate the generalized entropies to the formalism of FP equations—corresponding to the stationary and non-stationary regimes, respectively—the functional $\Sigma[P]$ has to be chosen such that the stationary solution of the general FP equation becomes equivalent to the Maximum Entropy (MaxEnt) probability distribution calculated with the generalized entropies. These MaxEnt distributions can be obtained analogously to the results of Hanel and Thurner in [16,40], where standard constrained optimization was used to find the most general form of the MaxEnt distributions, Equation (37); the latter involves the kth branch $W_k$ of the Lambert-W function (branch $k = 0$ for $d \geq 0$ and branch $k = -1$ for $d < 0$) and constants depending only on the parameters c and d. The consistency criterion imposed above accords with the fact that many physical systems tend to converge towards the maximum entropy configuration over time; at the same time, it delimits the scope of our assumptions. Consider systems described by Equation (34) in the absence of external force, i.e.,

$\frac{\partial P(x,t)}{\partial t} = A\, \frac{\partial^2 \Sigma[P]}{\partial x^2}. \qquad (38)$

By assuming that the corresponding stationary solutions can be identified with the MaxEnt distributions in Equation (37), it can be shown that the functional form of the effective density $\Sigma[P]$ is fixed, Equation (39), up to additive and multiplicative constant factors, which we neglect for the sake of simplicity. Similar implicit equations have already been investigated in [61,62,65]. Once the asymptotic phase-space-volume scaling relation in Equation (29) holds, it can also be shown that the generalized FP equation (38) (with $\Sigma[P]$ as in Equation (39)) obeys the diffusion scaling property in Equation (36) with a non-trivial value of $\gamma$ in the asymptotic limit [66] (assuming additionally the existence of the solution of Equation (38), at least from an appropriate initial condition).
A simple algebraic relation between the diffusion scaling exponent $\gamma$ and the phase-space-volume scaling exponent c can be established [66], which can be written as

$\gamma = \frac{1}{1+c}.$

Therefore, this relation between c and $\gamma$ defines families of FP equations which show asymptotic invariance under the scaling relation in Equation (36).
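The ordinary-diffusion case offers a quick sanity check of this relation: for the BGS class c = 1, it predicts that the width of P(x,t) grows like t to the power 1/2. A plain random walk, standing in for the linear diffusion equation (all names and parameters below are ours), reproduces this scaling:

```python
import math
import random

random.seed(0)

def msd(n_steps, n_walkers=5000):
    """Mean squared displacement of simple +-1 random walks after n_steps."""
    total = 0.0
    for _ in range(n_walkers):
        x = sum(random.choice((-1, 1)) for _ in range(n_steps))
        total += x * x
    return total / n_walkers

# x_rms ~ t**gamma  =>  gamma = log(msd ratio) / (2 * log(time ratio))
gamma = math.log(msd(400) / msd(100)) / (2 * math.log(4))
print(gamma)  # close to 1/2, the value gamma = 1/(1 + c) gives for c = 1
```

Anomalous diffusion (c < 1) would require simulating a correlated process governed by the generalized FP equation, but the fitting procedure for the exponent would be the same.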

6. Conclusions

This review concentrates on the concept of generalized entropy (Section 2), which is relevant in the study of real thermodynamical systems and, more generally, in the theory of complex systems. Possibly the first example of a generalized entropy was introduced by Rényi (Section 3.2), who was interested in the most general information measure that is additive in the sense of Equation (5), the random variables X and Y being independent. Another very popular generalized entropy was introduced by Tsallis as a generalization of the Boltzmann–Gibbs entropy (Section 3.1) to describe the properties of physical systems with long-range forces and complex dynamics in equilibrium. Some more exotic generalized entropies are considered in Section 3.3, while other examples published in the last two decades are gathered in Appendix A. Our approach was to a great extent formal, with special emphasis in Section 2 and Section 3 on axiomatic formulations and mathematical properties. For expository reasons, applications were mentioned and the original references given as our description of the main generalized entropies progressed, rather than addressing them jointly in a separate section. An alternative approach to generalized entropies, other than the axiomatic one (Section 2), consists in characterizing their asymptotic behavior in the thermodynamic limit $W \to \infty$. Hanel and Thurner showed that two scaling exponents $(c,d)$ suffice for admissible generalized entropies, i.e., those entropies of the form in Equation (10) with g continuous, concave and such that $g(0) = 0$ (Section 4); it holds $0 < c \leq 1$, while d can be any real number. As a result, the admissible systems fall into equivalence classes labeled by the exponents $(c,d)$ of the corresponding entropies. Conversely, to each pair $(c,d)$ there is a generalized entropy with those Hanel–Thurner exponents (see Equation (32)), at least for the most interesting value ranges.
It is also remarkable that, at asymptotically large times and volumes, there is a 1-to-1 relation between the equivalence class of generalized entropies with a given (c, d) and the equivalence class of Fokker–Planck equations in which the invariance in Equation (36) holds with the corresponding exponents (Section 5). This means that the equivalence classes of admissible systems can generally be mapped into anomalous diffusion processes and vice versa, thus conveying the same information about the system in the asymptotic limit (i.e., when t, W → ∞) [66]. A schematic visualization of this relation is provided in Figure 5. Moreover, the above result can actually be understood as a possible generalization of the Tsallis–Bukman relation [44].
Figure 5

Visual summary of the main result presented in Section 5, schematically depicting the relation between the diffusion scaling exponent and c. Source: [66].
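The anomalous diffusion scaling itself is easy to observe numerically. The sketch below, our own illustration rather than the construction of [66], integrates the drift-free nonlinear Fokker–Planck (porous-medium) equation ∂p/∂t = D ∂²(p^(2−q))/∂x² with an explicit finite-difference scheme and measures the growth of the second moment, which for this equation follows the Tsallis–Bukman scaling ⟨x²⟩ ∼ t^(2/(3−q)) [44]; all parameter values are illustrative choices.

```python
import numpy as np

def simulate_pme(q=0.0, D=1.0, L=20.0, N=801, dt=5e-4, t_snapshots=(5.0, 20.0)):
    """Integrate dp/dt = D d2(p^(2-q))/dx2 explicitly; return <x^2> at the snapshot times."""
    nu = 2.0 - q                          # nonlinearity exponent of the Tsallis FP equation
    x = np.linspace(-L, L, N)
    dx = x[1] - x[0]
    p = np.exp(-x**2 / (2 * 0.5**2))      # narrow initial bump
    p /= p.sum() * dx                     # normalize to unit mass
    moments, t = [], 0.0
    for t_target in sorted(t_snapshots):
        while t < t_target:
            v = p**nu
            # explicit step; dt sits below the stability limit dx^2 / (2 nu D max(p)^(nu-1))
            p[1:-1] += D * dt / dx**2 * (v[2:] - 2.0 * v[1:-1] + v[:-2])
            t += dt
        moments.append((x**2 * p).sum() * dx)
    return moments

m1, m2 = simulate_pme(q=0.0)
mu = np.log(m2 / m1) / np.log(20.0 / 5.0)  # fitted exponent of <x^2> ~ t^mu
print(mu)  # approaches 2/(3-q) = 2/3; finite-time transients leave it a few percent low
```

For q = 1 the equation reduces to ordinary diffusion and the fitted exponent tends to 1; sub- and super-diffusive regimes are obtained by varying q on either side.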

References (9 in total)

1.  Stability of Tsallis entropy and instabilities of Rényi and normalized Tsallis entropies: a basis for q-exponential distributions.

Authors:  Sumiyoshi Abe
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2002-10-24

2.  Statistical mechanics in the context of special relativity.

Authors:  G Kaniadakis
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2002-11-25

3.  Generalized entropy arising from a distribution of q indices.

Authors:  G A Tsekouras; Constantino Tsallis
Journal:  Phys Rev E Stat Nonlin Soft Matter Phys       Date:  2005-04-29

4.  Anomalous diffusion in the presence of external forces: Exact time-dependent solutions and their thermostatistical basis.

Authors: 
Journal:  Phys Rev E Stat Phys Plasmas Fluids Relat Interdiscip Topics       Date:  1996-09

5.  Fractal random walks from a variational formalism for Tsallis entropies.

Authors: 
Journal:  Phys Rev E Stat Phys Plasmas Fluids Relat Interdiscip Topics       Date:  1994-02

6.  Causal entropic forces.

Authors:  A D Wissner-Gross; C E Freer
Journal:  Phys Rev Lett       Date:  2013-04-19       Impact factor: 9.161

7.  Entropies for detection of epilepsy in EEG.

Authors:  N Kannathal; Min Lim Choo; U Rajendra Acharya; P K Sadasivan
Journal:  Comput Methods Programs Biomed       Date:  2005-10-10       Impact factor: 5.428

8.  The entropic basis of collective behaviour.

Authors:  Richard P Mann; Roman Garnett
Journal:  J R Soc Interface       Date:  2015-05-06       Impact factor: 4.118

9.  Phase space volume scaling of generalized entropies and anomalous diffusion scaling governed by corresponding non-linear Fokker-Planck equations.

Authors:  Dániel Czégel; Sámuel G Balogh; Péter Pollner; Gergely Palla
Journal:  Sci Rep       Date:  2018-01-30       Impact factor: 4.379
