Article

Nonadditive entropy: The concept and its use

Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Xavier Sigaud 150, 22290-180 Rio de Janeiro-RJ, Brazil; Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA
European Physical Journal A 40(3):257-266. DOI: 10.1140/epja/i2009-10799-0
Source: arXiv

ABSTRACT The thermodynamical concept of entropy was introduced by Clausius in 1865 in order to construct the exact differential $dS = \delta Q/T$, where $\delta Q$ is the heat transfer and the absolute temperature $T$ its integrating factor. A few years later, in the period 1872-1877, it was shown by Boltzmann that this quantity can be expressed in terms of the probabilities associated with the microscopic configurations of the system. We refer to this fundamental connection as the Boltzmann-Gibbs (BG) entropy, namely (in its discrete form) $S_{BG} = -k \sum_{i=1}^W p_i \ln p_i$, where $k$ is the Boltzmann constant and $\{p_i\}$ are the probabilities corresponding to the $W$ microscopic configurations (hence $\sum_{i=1}^W p_i = 1$). This entropic form, further discussed by Gibbs, von Neumann and Shannon, and constituting the basis of the celebrated BG statistical mechanics, is additive. Indeed, if we consider a system composed of any two probabilistically independent subsystems $A$ and $B$ (i.e., $p_{ij}^{A+B} = p_i^A p_j^B$, $\forall (i,j)$), we verify that $S_{BG}(A+B) = S_{BG}(A) + S_{BG}(B)$. If a system is constituted by $N$ equal elements which are either independent or quasi-independent (i.e., not too strongly correlated, in some specific nonlocal sense), this additivity guarantees $S_{BG}$ to be extensive in the thermodynamical sense, i.e., that $S_{BG}(N) \propto N$ in the $N \gg 1$ limit. If, on the contrary, the correlations between the $N$ elements are strong enough, then the extensivity of $S_{BG}$ is lost, being therefore incompatible with classical thermodynamics. In such a case, the many and precious relations described in textbooks of thermodynamics become invalid. Along a line which will be shown to overcome this difficulty, and which consistently enables the generalization of BG statistical mechanics, the entropy $S_q = k\,[1 - \sum_{i=1}^W p_i^q]/(q-1)$ (with $q \in \mathbb{R}$ and $S_1 = S_{BG}$) was proposed in 1988. In the context of cybernetics and information theory, this and similar forms had in fact been repeatedly introduced before 1988. The entropic form $S_q$ is, for any $q \neq 1$, nonadditive. Indeed, for two probabilistically independent subsystems, it satisfies $S_q(A+B)/k = [S_q(A)/k] + [S_q(B)/k] + (1-q)[S_q(A)/k][S_q(B)/k] \neq S_q(A)/k + S_q(B)/k$. This form will turn out to be extensive for an important class of nonlocal correlations, if $q$ is set equal to a special value different from unity, noted $q_{ent}$ (where $ent$ stands for $entropy$). In other words, for such systems, we verify that $S_{q_{ent}}(N) \propto N$ ($N \gg 1$), thus legitimating the use of the classical thermodynamical relations. Standard systems, for which $S_{BG}$ is extensive, obviously correspond to $q_{ent} = 1$. Quite complex systems exist in the sense that, for them, no value of $q$ exists such that $S_q$ is extensive. Such systems are out of the present scope: they might need forms of entropy different from $S_q$, or perhaps, more plainly, they are just not amenable at all to any sort of thermostatistical approach. Consistently with the results associated with $S_q$, the $q$-generalizations of the Central Limit Theorem and of its extended Lévy-Gnedenko form have been achieved. These recent theorems could of course be the cause of the ubiquity of $q$-exponentials, $q$-Gaussians and related mathematical forms in natural, artificial and social systems. All of the above, as well as presently available experimental, observational and computational confirmations (in high-energy physics and elsewhere), are briefly reviewed. Finally, we address a confusion which is quite common in the literature, namely referring to distinct physical mechanisms versus distinct regimes of a single physical mechanism.
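
As a quick numerical illustration of the pseudo-additivity relation quoted above (not part of the original abstract), the following minimal Python sketch evaluates $S_q$ for a discrete distribution and checks, for two probabilistically independent subsystems, both the nonadditive composition rule for $q \neq 1$ and the ordinary additivity recovered in the $q \to 1$ (BG) limit; the function name and the example probabilities are illustrative choices.

    import numpy as np

    def tsallis_entropy(p, q, k=1.0):
        # S_q = k (1 - sum_i p_i^q) / (q - 1); the q -> 1 limit recovers S_BG = -k sum_i p_i ln p_i
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                                   # zero-probability states do not contribute
        if np.isclose(q, 1.0):
            return -k * np.sum(p * np.log(p))          # Boltzmann-Gibbs limit
        return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

    # Two probabilistically independent subsystems A and B: joint probabilities p_ij = p_i^A p_j^B
    pA = np.array([0.5, 0.3, 0.2])
    pB = np.array([0.6, 0.4])
    pAB = np.outer(pA, pB).ravel()

    q = 1.5
    SA, SB, SAB = (tsallis_entropy(x, q) for x in (pA, pB, pAB))

    # Nonadditive composition rule (k = 1): S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B)
    print(np.isclose(SAB, SA + SB + (1.0 - q) * SA * SB))                       # True
    # Ordinary additivity holds in the q = 1 (BG) case
    print(np.isclose(tsallis_entropy(pAB, 1.0),
                     tsallis_entropy(pA, 1.0) + tsallis_entropy(pB, 1.0)))      # True

The check succeeds for any normalized pA, pB and any real q, since for independent subsystems the composition rule is an algebraic identity of the form of $S_q$.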

  • ABSTRACT: This paper proposes automated detection of skin lesions by unsupervised feature-based clustering built on a new fuzzy entropy function for characterizing texture. The parameterized entropy function is optimized using the Bacterial Foraging algorithm, and the entropy image is clustered using the popular Fuzzy C-means (FCM) algorithm. The experimental results obtained after the clustering process indicate very good segregation of texture clusters with satisfactory visual results. The results also provide the normalized entropy values needed for texel identification.
    Proceedings of the 6th International Symposium on Image and Signal Processing and Analysis (ISPA 2009); 10/2009
  • ABSTRACT: In this work the non-additive entropy is examined. It appears in isolated particle systems composed of few components. Therefore, the mixing of isolated particle systems $S = S_1 + S_2$ has been studied. Two cases are considered, $T_1 = T_2$ and $T_1 \leq T_2$, where $T_1$ and $T_2$ are the initial temperatures of the systems $S_1$ and $S_2$, respectively. The concept of similar systems containing interacting particles is introduced. These systems are defined by a common temperature and an identical time-evolution process, i.e. the approach to the same thermodynamic equilibrium. The main results are: 1) The properties of the similar particle systems yield the non-additive entropy and free energy; the Gibbs paradox is not a paradox. 2) The relation between the initial temperatures $T_1$ and $T_2$ governs the mixing process. 3) In the two cases $T_1 = T_2$ and $T_1 \leq T_2$, mixing of the systems $S_1$, $S_2$ results in a uniform union system $S = S_1 + S_2$; the systems $S$, $S_1$, $S_2$ are similar to one another. 4) The mixing process is independent of the extensive quantities (volume, particle number, energy) and of the particle type; only the mean energy plays an important role in the mixing of the systems $S_1$, $S_2$. 5) Mixing in the case $T_1 \leq T_2$ is in essence a thermalization process, but mixing in the case $T_1 = T_2$ is not a thermodynamic process. 6) Mixing is an irreversible process. Keywords: Entropy; Similar systems of interacting particles; Mixing of systems; Thermal equilibrium
    06/2012;
  • ABSTRACT: This paper proposes a new probabilistic non-extensive entropy feature for texture characterization, based on a Gaussian information measure. The highlights of the new entropy are that it is bounded by finite limits and that it is non-additive in nature. The non-additive property of the proposed entropy makes it useful for representing the information content of non-extensive systems containing some degree of regularity or correlation. The effectiveness of the proposed entropy in representing correlated random variables is demonstrated by applying it to the texture classification problem, since textures found in nature are random and at the same time contain some degree of correlation or regularity at some scale. The gray-level co-occurrence probabilities (GLCP) are used for computing the entropy function. The experimental results indicate a high degree of classification accuracy. The performance of the new entropy function is found to be superior to other forms of entropy, such as the Shannon, Renyi, Tsallis, and Pal and Pal entropies. Using feature-based polar interaction maps (FBIM), the proposed entropy is shown to be the best measure among the entropies compared for representing correlated textures.
    Neurocomputing 11/2013; 120 (special issue on Image Feature Detection and Description): 214-225.
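
As a rough, hypothetical sketch of the general idea in the last item above (not the paper's bounded Gaussian-information entropy, whose exact form is not reproduced here), the snippet below builds gray-level co-occurrence probabilities (GLCP) for a single horizontal pixel offset and evaluates a Tsallis-form nonextensive entropy of them as a scalar texture feature; the function names, the number of gray levels, and the value q = 1.5 are illustrative choices.

    import numpy as np

    def glcp(image, levels=8):
        # Gray-level co-occurrence probabilities for the horizontal neighbor offset (0, 1)
        g = np.floor(image.astype(float) / (image.max() + 1e-12) * levels).astype(int)
        g = np.clip(g, 0, levels - 1)                  # quantize to 'levels' gray levels
        C = np.zeros((levels, levels))
        for i, j in zip(g[:, :-1].ravel(), g[:, 1:].ravel()):
            C[i, j] += 1                               # count co-occurring gray-level pairs
        return C / C.sum()                             # normalize counts to probabilities

    def nonextensive_entropy(P, q=1.5):
        # Tsallis-form entropy (1 - sum_ij P_ij^q) / (q - 1) of the co-occurrence probabilities
        p = P[P > 0]
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    # Illustrative use on a random "texture"; real features would be computed per image or per patch
    rng = np.random.default_rng(1)
    texture = rng.integers(0, 256, size=(64, 64))
    print(nonextensive_entropy(glcp(texture), q=1.5))

In practice one would compute such a feature for several offsets and scales and feed the resulting vectors to a classifier or clustering step, but those details depend on the specific pipeline of the cited paper.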
