Nonadditive entropy: The concept and its use

Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Xavier Sigaud 150, 22290-180 Rio de Janeiro-RJ, Brazil; Santa Fe Institute, 1399 Hyde Park Road, 87501 Santa Fe, USA
European Physical Journal A (Impact Factor: 2.42). 12/2008; 40(3):257-266. DOI: 10.1140/epja/i2009-10799-0
Source: arXiv

ABSTRACT: The thermodynamical concept of entropy was introduced by Clausius in 1865 in order to construct the exact differential dS = δQ/T, where δQ is the heat transfer and the absolute temperature T its integrating factor. A few years later, in the period 1872-1877, it was shown by Boltzmann that this quantity can be expressed in terms of the probabilities associated with the microscopic configurations of the system. We refer to this fundamental connection as the Boltzmann-Gibbs (BG) entropy, namely (in its discrete form) S_BG = -k Σ_{i=1}^W p_i ln p_i, where k is the Boltzmann constant and {p_i} the probabilities corresponding to the W microscopic configurations (hence Σ_{i=1}^W p_i = 1). This entropic form, further discussed by Gibbs, von Neumann and Shannon, and constituting the basis of the celebrated BG statistical mechanics, is additive. Indeed, if we consider a system composed of any two probabilistically independent subsystems A and B (i.e., p_ij^{A+B} = p_i^A p_j^B, ∀(i,j)), we verify that S_BG(A+B) = S_BG(A) + S_BG(B). If a system is constituted by N equal elements which are either independent or quasi-independent (i.e., not too strongly correlated, in some specific nonlocal sense), this additivity guarantees S_BG to be extensive in the thermodynamical sense, i.e., S_BG(N) ∝ N in the N ≫ 1 limit. If, on the contrary, the correlations between the N elements are strong enough, then the extensivity of S_BG is lost, making it incompatible with classical thermodynamics. In such a case, the many and precious relations described in textbooks of thermodynamics become invalid. Along a line which will be shown to overcome this difficulty, and which consistently enables the generalization of BG statistical mechanics, the entropy S_q = k [1 - Σ_{i=1}^W p_i^q]/(q-1) (q ∈ ℝ; S_1 = S_BG) was proposed in 1988. In the context of cybernetics and information theory, this and similar forms had in fact been repeatedly introduced before 1988. The entropic form S_q is, for any q ≠ 1, nonadditive. Indeed, for two probabilistically independent subsystems, it satisfies S_q(A+B)/k = [S_q(A)/k] + [S_q(B)/k] + (1-q)[S_q(A)/k][S_q(B)/k] ≠ S_q(A)/k + S_q(B)/k. This form turns out to be extensive for an important class of nonlocal correlations, if q is set equal to a special value different from unity, denoted q_ent (where "ent" stands for entropy). In other words, for such systems, we verify that S_{q_ent}(N) ∝ N (N ≫ 1), thus legitimating the use of the classical thermodynamical relations. Standard systems, for which S_BG is extensive, obviously correspond to q_ent = 1. Quite complex systems exist for which no value of q makes S_q extensive. Such systems are out of the present scope: they might need entropic forms different from S_q, or perhaps, more plainly, they are simply not amenable to any thermostatistical approach. Consistently with the results associated with S_q, the q-generalizations of the Central Limit Theorem and of its extended Lévy-Gnedenko form have been achieved. These recent theorems may well be the cause of the ubiquity of q-exponentials, q-Gaussians and related mathematical forms in natural, artificial and social systems. All of the above, as well as presently available experimental, observational and computational confirmations, in high-energy physics and elsewhere, are briefly reviewed. Finally, we address a confusion which is quite common in the literature, namely the distinction between distinct physical mechanisms and distinct regimes of a single physical mechanism.
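The two properties stated in the abstract, additivity of S_BG and pseudo-additivity of S_q for independent subsystems, can be checked numerically. A minimal sketch (plain Python with k = 1; the probability vectors are arbitrary illustrative choices, not taken from the paper):

```python
import math

def s_bg(p, k=1.0):
    """Boltzmann-Gibbs entropy S_BG = -k * sum_i p_i ln p_i (zero-probability terms skipped)."""
    return -k * sum(pi * math.log(pi) for pi in p if pi > 0)

def s_q(p, q, k=1.0):
    """Tsallis entropy S_q = k * [1 - sum_i p_i^q] / (q - 1); S_1 is the BG limit."""
    if q == 1.0:
        return s_bg(p, k)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Two probabilistically independent subsystems: p_ij^{A+B} = p_i^A * p_j^B for all (i, j)
pA = [0.5, 0.3, 0.2]
pB = [0.6, 0.4]
pAB = [pa * pb for pa in pA for pb in pB]

# Additivity of S_BG: S_BG(A+B) = S_BG(A) + S_BG(B)
assert abs(s_bg(pAB) - (s_bg(pA) + s_bg(pB))) < 1e-12

# Nonadditivity (pseudo-additivity) of S_q, with k = 1:
# S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B)
q = 0.7
sA, sB = s_q(pA, q), s_q(pB, q)
assert abs(s_q(pAB, q) - (sA + sB + (1 - q) * sA * sB)) < 1e-12

# q -> 1 recovers the additive BG entropy
assert abs(s_q(pA, 1.0 + 1e-6) - s_bg(pA)) < 1e-4
```

The cross term (1-q) S_q(A) S_q(B) vanishes at q = 1, which is exactly how the BG additivity is recovered as a special case.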
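The q-exponentials and q-Gaussians mentioned above generalize the ordinary exponential and Gaussian. A sketch under the standard definition e_q(x) = [1 + (1-q)x]^{1/(1-q)}, with the usual cutoff where the bracket is non-positive (the function and parameter names are illustrative):

```python
import math

def q_exp(x, q):
    """q-exponential e_q(x) = [1 + (1-q) x]^{1/(1-q)}; e_1(x) = exp(x).
    Where 1 + (1-q)x <= 0 the usual cutoff e_q(x) = 0 is applied."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0
    return base ** (1.0 / (1.0 - q))

def q_gauss_unnorm(x, q, b=1.0):
    """Unnormalized q-Gaussian, proportional to e_q(-b x^2); for q > 1 its
    tails decay as a power law instead of exponentially."""
    return q_exp(-b * x * x, q)

# e_q(0) = 1 for every q, as for the ordinary exponential
assert q_exp(0.0, 1.5) == 1.0

# q -> 1 recovers exp(x)
assert abs(q_exp(0.5, 1.0 + 1e-6) - math.exp(0.5)) < 1e-4
```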

Related publications:
    ABSTRACT: We review the consequences of intrinsic, nonstatistical temperature fluctuations as seen in observables measured in high-energy collisions. We do this from the point of view of nonextensive statistics and Tsallis distributions. Particular attention is paid to multiplicity fluctuations as a first consequence of temperature fluctuations, to the equivalence of temperature and volume fluctuations, to the generalized thermodynamic fluctuations relations allowing us to compare fluctuations observed in different parts of the phase space, and to the problem of the relation between Tsallis entropy and Tsallis distributions. We also discuss the possible influence of conservation laws on these distributions and provide some examples of how one can get them without considering temperature fluctuations.
    European Physical Journal A 01/2012; 48(11). · 2.42 Impact Factor
    ABSTRACT: The structure entropy is one of the most important parameters to describe the structure property of complex networks. Most existing structure entropies are based on the degree or the betweenness centrality. In order to describe the structure property of complex networks more reasonably, a new structure entropy of complex networks based on the Tsallis nonextensive statistical mechanics is proposed in this paper. The influence of the degree and the betweenness centrality on the structure property is combined in the proposed structure entropy. Compared with the existing structure entropies, the proposed structure entropy is more reasonable for describing the structure property of complex networks in some situations.
    ABSTRACT: This paper proposes a new probabilistic non-extensive entropy feature for texture characterization, based on a Gaussian information measure. The highlights of the new entropy are that it is bounded by finite limits and that it is non additive in nature. The non additive property of the proposed entropy makes it useful for the representation of information content in the non extensive systems containing some degree of regularity or correlation. The effectiveness of the proposed entropy in representing the correlated random variables is demonstrated by applying it for the texture classification problem since textures found in nature are random and at the same time contain some degree of correlation or regularity at some scale. The gray level co-occurrence probabilities (GLCP) are used for computing the entropy function. The experimental results indicate high degree of the classification accuracy. The performance of the new entropy function is found superior to other forms of entropy such as Shannon, Renyi, Tsallis and Pal and Pal entropies on comparison. Using the feature based polar interaction maps (FBIM) the proposed entropy is shown to be the best measure among the entropies compared for representing the correlated textures.
    Neurocomputing 11/2013; 120(Image Feature Detection and Description):214-225. · 2.01 Impact Factor

