Special Collection: Reliability Analysis and Design Optimization of Mechanical Systems under Various Uncertainties - Review
Advances in Mechanical Engineering
2019, Vol. 11(6) 1–14
© The Author(s) 2019
DOI: 10.1177/1687814019857350
journals.sagepub.com/home/ade
A review of entropy measures for
uncertainty quantification of stochastic
processes
Alireza Namdari and Zhaojun (Steven) Li
Department of Industrial Engineering and Engineering Management, Western New England University, Springfield, MA, USA
Corresponding author: Zhaojun (Steven) Li, Department of Industrial Engineering and Engineering Management, Western New England University, Springfield, MA 01119, USA. Email: zhaojun.li@wne.edu
Abstract
Entropy was originally introduced to explain the inclination of intensity of heat, pressure, and density to gradually disappear
over time. Based on the concept of entropy, the Second Law of Thermodynamics, which states that the entropy of an
isolated system is likely to increase until it attains its equilibrium state, is developed. More recently, the implication of
entropy has been extended beyond the field of thermodynamics, and entropy has been applied in many subjects with
probabilistic nature. The concept of entropy is applicable and useful in characterizing the behavior of stochastic pro-
cesses since it represents the uncertainty, ambiguity, and disorder of the processes without being restricted to the forms
of the theoretical probability distributions. In order to measure and quantify the entropy, the existing probability of every
event in the stochastic process must be determined. Different entropy measures have been studied and presented
including Shannon entropy, Renyi entropy, Tsallis entropy, Sample entropy, Permutation entropy, Approximate entropy,
and Transfer entropy. This review surveys the general formulations of the uncertainty quantification based on entropy as
well as their various applications. The results of the existing studies show that entropy measures are powerful predictors
for stochastic processes with uncertainties. In addition, we examine the stochastic process of lithium-ion battery capac-
ity data and attempt to determine the relation between the changes in battery capacity over different cycles and two
entropy measures: Sample entropy and Approximate entropy.
Keywords
Entropy, uncertainty, probability distribution, stochastic processes, characterization
Date received: 31 October 2018; accepted: 24 May 2019
Handling Editor: James Baldwin
Introduction
A stochastic process can be simply defined as a collection of random variables indexed by time. The continuous and discrete time cases can be separated to define stochastic processes more precisely. A discrete time stochastic process $X_{dis} = \{X_n, n = 0, 1, 2, \ldots\}$ is a countable collection of random variables indexed by non-negative integers, whereas a continuous time stochastic process $X_{con} = \{X_t, 0 \le t \le \infty\}$ is an uncountable collection of random variables indexed by non-negative real numbers.[1]
Real-world applications of most scientific subjects
including stochastic processes are fraught with
numerous uncertainties. Hence, it is widely believed that stochastic processes, which involve complicated uncertainties, are not predictable.[1–3]
The applica-
tions of concepts of physics and thermodynamics to
explain scientific phenomena have been studied in the
recent decade. The concept of entropy, which stems
from thermodynamics, has advanced our understand-
ing of the world.[3–5] Entropy is one of the concepts in physics that can be useful in rejecting the null hypothesis of unpredictability of stochastic processes.[6–8]
In
this regard, various metrics including Shannon entropy,
Renyi entropy, Tsallis entropy, Approximate entropy,
Sample entropy, Transfer entropy, and Permutation
entropy have been presented. The concept of entropy is
applicable and useful in characterizing the behavior of
stochastic processes since it represents the uncertainty,
ambiguity, and disorder of the process without causing
any restrictions on the theoretical probability distribu-
tion. In order to quantify the entropy, an associated
probability distribution is needed.[9]
The conditional heteroscedastic models such as
Generalized Autoregressive Conditional Heteroskedasticity
(GARCH) models have traditionally been studied for
measuring the variabilities in the stochastic processes
due to the inherent uncertainties. In this article, first,
we present traditional models as well as their assump-
tions and limitations. Next, we review the literature
on the relationship between entropy measures and
uncertainty inherent in stochastic processes. This
research area is characterized by numerous studies
reflecting different entropy measures, uncertainties,
methodologies, and applications. We discuss nearly
140 scholarly works on the subject and draw general
conclusions from previous studies. We attempt to
provide a cogent approach to the issue, identify the
key findings from the literature, and present the con-
cluding remarks as well as the future research direc-
tions. In addition, we present a case study of the lithium-ion battery capacity data, which is an example of a stochastic process. We examine the relation
between battery capacities over different cycles and
two entropy measures: Sample entropy (SampEn) and
Approximate entropy (ApEn).
Conditional heteroscedastic models
Before the idea of utilizing entropy measures for uncer-
tainty modeling and quantification was introduced,
long memory and volatility clustering[10,11] were traditionally studied for predictive modeling and uncertainty
quantification based on conditional heteroscedastic
models. Conditional heteroscedastic models are statisti-
cal models, which are usually used to model time series.
These models explain the variance of the current error
term, known as innovation, as a function of the previ-
ous periods’ error terms.[12] Examples of conditional het-
eroscedastic models are Autoregressive Conditional
Heteroskedasticity (ARCH) model, Generalized
ARCH (GARCH) model, GARCH in-Mean
(GARCH-M) model, and Exponential GARCH
(EGARCH) model. Conditional heteroscedastic models
are mostly used to model the stochastic behavior of a
time series, which is a special case of stochastic process
with discrete time data. This section reviews and sum-
marizes these models focusing on the main assumptions
and modeling principles. Conditional heteroscedastic
models for variances are based on Autoregressive (AR)
and Moving Averages (MA) as follows.
The general form of the GARCH(p, q) variance model[13] can be written as follows

$$\mathrm{Var}\left(Y_t \mid Y_{t-1}, \ldots, Y_{t-p}\right) = \sigma_t^2 = \omega + \alpha_1 Y_{t-1}^2 + \cdots + \alpha_p Y_{t-p}^2 + \beta_1 \sigma_{t-1}^2 + \cdots + \beta_q \sigma_{t-q}^2 = \omega + \sum_{i=1}^{p} \alpha_i Y_{t-i}^2 + \sum_{j=1}^{q} \beta_j \sigma_{t-j}^2 \quad (1)$$
where $\omega$, $\alpha_i$, and $\beta_j$ are model parameters with the following assumptions

$$\omega > 0, \quad \alpha_i \ge 0, \quad \beta_j \ge 0, \quad \sum_{i=1}^{p} \alpha_i + \sum_{j=1}^{q} \beta_j < 1 \quad (2)$$
It should be noted that large values of $Y_{t-1}^2$ result in a large value of the variance at time $t$, which indicates that $Y_t$ is less predictable than in the case of smaller values of $Y_{t-1}^2$. In other words, large values of $Y_{t-1}^2$ indicate that $Y_t$ has more volatility.
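To make the variance recursion in equations (1) and (2) concrete, the following is a minimal Python sketch (not taken from the studies reviewed here) that simulates a GARCH(1,1) process and tracks its conditional variance; the parameter values omega, alpha, and beta are illustrative assumptions that satisfy the constraints in equation (2).

```python
import numpy as np

def simulate_garch_1_1(n, omega=0.05, alpha=0.10, beta=0.85, seed=0):
    """Simulate Y_t with sigma_t^2 = omega + alpha * Y_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(1, n):
        sigma2[t] = omega + alpha * y[t - 1] ** 2 + beta * sigma2[t - 1]
        y[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return y, sigma2

y, sigma2 = simulate_garch_1_1(5000)
print(y.var(), sigma2.mean())  # both close to omega / (1 - alpha - beta) = 1.0
```

Large simulated shocks feed back into the next period's variance, which reproduces the volatility clustering described above.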
In the GARCH-M model,[14] the conditional variance, $h_t$, of the random variable $X_t$, given the random variable $Y_t$, is a linear function of the previous periods' conditional variances and the squares of the past periods' errors

$$X_t = \xi Y_t + \gamma h_t + \epsilon_t \quad (3)$$

$$h_t = \alpha_0 + \sum_{i=1}^{p} \alpha_i \epsilon_{t-i}^2 + \sum_{i=1}^{q} \beta_i h_{t-i} \quad (4)$$

In addition, the error terms, $\epsilon_t$, are distributed according to a normal distribution as follows

$$\epsilon_t \mid \Omega_{t-1} \sim N(0, h_t) \quad (5)$$
where $X_t$ is the risk premium, $Y_t$ is a predetermined vector of variables, $\Omega$ is the information set which explains the variability in the stochastic process $X_t$, and $\Omega_{t-1}$ indicates that the current information is significant in regard to forecasting the future random variable $X_t$. It should be noted that if information arrives in clusters, the random variable $X_t$ may exhibit clustering as well.
In the EGARCH model,[15] the logarithm of the conditional variance follows an AR process as shown below

$$X_t = a + \gamma e^{h_t} + b\epsilon_{t-1} + \epsilon_t \quad (6)$$

$$\ln(h_t) = \omega + \sum_{i=1}^{p} \beta_i \ln(h_{t-i}) + \sum_{i=1}^{q} \alpha_i\left[\xi z_{t-i} + c\left(|z_{t-i}| - E|z_{t-i}|\right)\right] \quad (7)$$

$$E|z_t| = \left(\frac{2}{\pi}\right)^{1/2} \quad (8)$$

$$\epsilon_t = e^{(1/2)h_t} z_t \quad (9)$$

$$z_t \sim N(0, 1) \quad (10)$$
where $\omega$, $\alpha_i$, $\beta_i$, $\xi$, and $c$ are parameters. It is noted that $\omega$, $\alpha_i$, and $\beta_i$ are not necessarily non-negative. As shown in equations (6)–(10), the variance depends on both the sign and the size of the lagged residuals.
In summary, GARCH family models can be used for
modeling the stochastic processes with complicated
endogenous and exogenous uncertainties such as the
financial time series. In order to build the appropriate
GARCH model that fits the data set, the GARCH
parameters must be estimated based on the historical
data. It is noted that the model order parameters $p$ and $q$ can be specified according to the autocorrelations of the standardized residuals. The Hurst exponent is another metric which can be used for measuring the long-term memory of a time series. The Hurst exponent can be estimated by fitting a power law to the data set in time-series form. It represents the autocorrelations[16] of the time series and how the autocorrelations decrease as the number of lags increases. Autocorrelation between the observations challenges the unpredictability of stochastic processes since future data can be forecasted based on previous data. However, Gunaratne et al.[17] stated that the Hurst parameter, in isolation, is not sufficient to conclude the presence of long-term memory. Kwon and Yang[18] showed that directionality between the different data points is not reflected in the cross-correlation.[19–21]
Another approach for studying the phenomena of long memory and volatility clustering is by using measures based on the concept of entropy, including Shannon entropy, Renyi entropy,[22,23] Tsallis entropy,[24–27] Transfer entropy, and Sample entropy, as explained in the following section.
Entropy measures for uncertainty
quantification
Shannon entropy
Entropy was first discussed by Clausius in 1865 to
explore the inclination of intensity of heat, pressure,
and density to gradually disappear over time. Based on
this concept, Clausius established the Second Law of
Thermodynamics stating that the entropy of an isolated
system is likely to increase until it attains its
equilibrium state. Schwill[28] and Shannon[29,30] extended the concept of entropy and claimed that entropy is not limited to thermodynamics and can be applied in any subject with a probabilistic nature.[31,32] Bentes et al.[33] argued that the physical concept of entropy is actually a special case of the Shannon entropy as it quantifies probabilities in the full state space.
Schwill[28] stated that entropy is a measure of uncertainty in random variables. For a random variable $X$ over a probability space $\Omega$ and probability distribution $p(x)$, Shannon entropy is defined as follows

$$H(X)_{cont} = -\int_{\Omega} p(x)\log_2\left(p(x)\right)dx \quad (11)$$

$$H(X)_{discr} = -\sum_{x\in\Omega} p(x)\log_2\left(p(x)\right) \quad (12)$$
where $H(X)_{cont}$ and $H(X)_{discr}$ are entropy measures for continuous and discrete random variables, respectively. Moreover, equations (11) and (12) can be expressed as follows

$$H(X) = E_p\left[\log \frac{1}{p(X)}\right] \quad (13)$$
As shown in equation (13), the expected value of $\log(1/p(X))$ is equivalent to Shannon entropy. It should be noted that the original measurement units for entropy are bits, which are units of information needed to store one symbol in a message. Entropy can also be used as one way of measuring the uncertainty of random processes. The state with $p(X) = 0$ can be justified by $\lim_{p\to 0} p\log_2(p) = 0$. It is also noted that, as Kullback and Leibler[34] showed, the definition of Shannon entropy lacks an invariant measure and, in the absolutely continuous case, depends on the parametrization.[35]
The concept of Shannon entropy has been applied to different domains including power systems,[36] physics,[29,30] finance,[3] biology,[4] physiology,[1] and cardiovascular time series.[37–41]
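As an illustration of equation (12), the following Python sketch estimates the discrete Shannon entropy, in bits, of a sample from its empirical histogram; the number of bins is an assumption of the example rather than part of the definition.

```python
import numpy as np

def shannon_entropy(samples, bins=10):
    """Discrete Shannon entropy (bits) estimated from a histogram of the sample."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()      # empirical probabilities p(x)
    p = p[p > 0]                   # lim_{p->0} p*log2(p) = 0, so empty bins contribute nothing
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
print(shannon_entropy(rng.uniform(size=5000)))   # near log2(10): mass spread over 10 similar bins
print(shannon_entropy(rng.normal(size=5000)))    # lower: probability mass is concentrated
```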
Maasoumi and Racine[42] studied the predictability of time series by utilizing a new measure of dependence, which performs similarly to the Shannon entropy, as follows

$$S = \frac{1}{2}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\left(f^{1/2} - f_1^{1/2} f_2^{1/2}\right)^2 dx\,dy \quad (14)$$

where $f = f(x, y)$ is the joint density and $f_1 = f(x)$ and $f_2 = f(y)$ are the marginal densities of the random variables $X$ and $Y$.
The interesting properties of this measure of depen-
dence are as follows:
1. It is applicable in both discrete and continuous
cases.
2. It lies within the range from 0 to unity. It is normalized to 0 in the case of independence, and the modulus of the measure is 1 in the case of a measurable exact relationship between the random variables.
3. In the case of a bivariate normal distribution,
this measure of dependence has a simple rela-
tionship with the correlation coefficient.
4. It measures not only the distance but also the
divergence.
Moreover, they compare their numerical results with
those of a set of traditional measures. They show that
their entropy measure[43–45] is capable of uncovering
non-linear dependence in the time series. To the best of
our knowledge, the work by Maasoumi and Racine is
one of the first studies related to the entropy measures,
uncertainty, and predictability of time series, at least, in
the finance area.
Renyi and Tsallis entropies
Shannon entropy performs well only if the storage capacity of a transmitting channel is finite. Renyi presented a new entropy measure, known as the Renyi entropy of order $\alpha$, $S_\alpha(X)$, for discrete variables,[46,47] which is defined as follows

$$S_\alpha(X) = \frac{1}{1-\alpha}\ln\left(\sum_{k=1}^{n} p_k^\alpha\right) \quad (15)$$

where $\alpha > 0$ and $\alpha \ne 1$. It is noted that in the limit $\alpha \to 1$, Renyi entropy and Shannon entropy are identical.
Tsallis entropy[24–27] for any non-negative real number $q$, $S_q(X)$, can be estimated as shown in the following formula

$$S_q(X) = \frac{1 - \sum_{i=1}^{n} p_i^q}{q - 1} \quad (16)$$

Tsallis entropy yields a power-law distribution, whereas Shannon entropy yields an exponential equilibrium distribution. It is noted that $p_k^\alpha$ and $p_i^q$ in equations (15) and (16) come from the probability distribution of a
given random variable X. Renner and Wolf
[48]
intro-
duced a new entropy measure called smooth Renyi
entropy for characterizing the fundamental properties
of a random variable such as the uniform randomness
that may be extracted from the random variable. Let $B^\epsilon(P) := \{Q : d(P, Q) \le \epsilon\}$ be the set of probability distributions which are $\epsilon$-close to $P$; the $\epsilon$-smooth Renyi entropy of order $\alpha$ can be defined as

$$H_\alpha^\epsilon(P) = \frac{1}{1-\alpha} \inf_{Q \in B^\epsilon(P)} \log_2\left(\sum_{z \in Z} Q(z)^\alpha\right) \quad (17)$$

where $P$ is the probability distribution with range $Z$, $d$ is the variational distance, $\alpha \in [0, \infty]$, and $\epsilon \ge 0$.
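A short Python sketch of equations (15) and (16): both entropies are computed from a discrete probability vector, and both approach the Shannon entropy (with natural logarithm) as their parameters approach 1. The example probabilities are arbitrary.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), equation (15)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy with index q (q != 1), equation (16)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.15, 0.10])
shannon = -np.sum(p * np.log(p))
print(renyi_entropy(p, 1.001), tsallis_entropy(p, 1.001), shannon)  # all nearly equal
```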
Sample entropy
Sample entropy is useful in quantifying the fluctuation degree of a time series.[49–56] The sample entropy SampEn$(m, r, N_c)$ can be defined as the negative natural logarithm of an estimate of the conditional probability that windows of length $m$ (subseries of the time series of length $N_c$) that remain similar within a tolerance $r$ also match at the next point.
The process of sample entropy measurement can be
summarized as follows.
For a data sequence $x(n) = x(1), x(2), x(3), \ldots, x(N)$, where $N$ is the total number of data points, a run length $m$ and a tolerance window $r$ must be specified to compute SampEn$(r, m, N)$.

Step 1. Form $m$-vectors, $X(1)$ to $X(N-m+1)$, defined by

$$X(i) = \left[x(i), x(i+1), \ldots, x(i+m-1)\right], \quad i = 1 \text{ to } N-m+1 \quad (18)$$

$$X(j) = \left[x(j), x(j+1), \ldots, x(j+m-1)\right], \quad j = 1 \text{ to } N-m+1 \quad (19)$$
Step 2. Define the distance $d\left[X(i), X(j)\right]$ between vectors $X(i)$ and $X(j)$ as the maximum absolute difference between their respective scalar components

$$d\left[X(i), X(j)\right] = \max_{k=0,\ldots,m-1}\left|x(i+k) - x(j+k)\right| \quad (20)$$
The relation matrix is shown in Figure 1.
Step 3. Define for each $i$

$$V^m(i) = \text{no. of } d^m\left[X(i), X(j)\right] \le r, \quad i \ne j \quad (21)$$

$$B_i^m(r) = \frac{1}{N-m+1} V^m(i) \quad (22)$$

$$B^m(r) = \frac{1}{N-m}\sum_{i=1}^{N-m} B_i^m(r) \quad (23)$$

Similarly

$$V^{m+1}(i) = \text{no. of } d^{m+1}\left[X(i), X(j)\right] \le r, \quad i \ne j \quad (24)$$

$$A_i^m(r) = \frac{1}{N-m+1} V^{m+1}(i) \quad (25)$$

$$A^m(r) = \frac{1}{N-m}\sum_{i=1}^{N-m} A_i^m(r) \quad (26)$$

where $i = 1$ to $N-m+1$.
Step 4. Sample entropy for a finite data length $N$ can be estimated as

$$\mathrm{SampEn}(r, m, N) = -\ln\left(\frac{A^m(r)}{B^m(r)}\right) \quad (27)$$
Mutual information and transfer entropy
Entropy measures, including the measures discussed in
the previous section, are useful in explaining the varia-
bility in univariate time series.[57–60] The concept of entropy is extended to the Transfer entropy[61] and Mutual information,[62,63] which are discussed in the following section. The concept of Transfer entropy is applicable to uncovering the information flows between systems, bivariate analysis of time series under uncertainties such as financial time series,[28] and multivariate analysis of time series in various disciplines such as physiology,[64] neuroscience,[65] ecology, bionomics, and neurology.[28]
Mutual information and Transfer entropy measure
the information flow between two time series and deter-
mine whether the variability in one random variable is
helpful in explaining the variability in a second variable,
whereas Shannon entropy is utilized for quantifying the
variability in an individual random variable. Mutual
information and Transfer entropy can be determined
based on the conditional probability distributions
between the two random variables and are discussed in
this section.
The joint entropy $H(X, Y)$ and conditional entropy $H(X \mid Y)$ of two variables $X$ and $Y$ with probability spaces $\mathcal{X}$ and $\mathcal{Y}$ are defined as

$$H(X, Y) = -\sum_{x\in \mathcal{X},\, y\in \mathcal{Y}} p(x, y)\log_2\left(p(x, y)\right) \quad (28)$$

$$H(X \mid Y) = -\sum_{x\in \mathcal{X},\, y\in \mathcal{Y}} p(x, y)\log_2\left(p(x \mid y)\right) \quad (29)$$

Moreover, the relationship between the conditional and joint entropies is as follows

$$H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y) \quad (30)$$
Mutual information of two random variables $X$ and $Y$, $I(X; Y)$,[66,67] and its general form, Transfer entropy, determine the dependency between two time series.[68,69] Mutual information quantifies the reduction of uncertainty regarding $X$ by observing $Y$,[70,71] as shown in the following formula

$$I(X; Y) = \sum_{x\in \mathcal{X},\, y\in \mathcal{Y}} p(x, y)\log_2\frac{p(x, y)}{p(x)\,p(y)} = H(X) - H(X \mid Y) \quad (31)$$
Figure 2 demonstrates the relationship between (1)
different entropy measures as well as the joint entropy
between the two random variables $X$ and $Y$, (2) the entropy of $X$ conditioned on $Y$, (3) the entropy of $Y$ conditioned on $X$, and (4) the mutual information between the random variables $X$ and $Y$.
Figure 1. Relation matrix used for defining the distance $d[X(i), X(j)]$ between vectors $X(i)$ and $X(j)$.

Figure 2. The entropies $H(X)$ and $H(Y)$, the joint entropy $H(X, Y)$, the conditional entropies $H(X \mid Y)$ and $H(Y \mid X)$, and the mutual information $I(X; Y)$.
Mutual information is symmetric and therefore can-
not be used for determining the direction of an informa-
tion flow[35,72]

$$I(X; Y) = I(Y; X) \quad (32)$$
Mutual information can be modified in order to include lead-lag relationships[73] as follows

$$I(X; Y)_\tau = \sum_{x_{n-\tau}\in \mathcal{X},\, y_n\in \mathcal{Y}} p(x_{n-\tau}, y_n)\log_2\frac{p(x_{n-\tau}, y_n)}{p(x_{n-\tau})\,p(y_n)} \quad (33)$$

where $\tau$ is the number of lags. The mutual information between the random variables $X$ and $Y$ without lag, and the mutual information between the random variable $X$ and the lagged version of the random variable $Y$ with $\tau = 1$, are depicted in Figure 3.
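Equation (31), and its lagged variant in equation (33), can be estimated from data by discretizing both series with a joint histogram, as in the Python sketch below; the number of bins and the single lag used at the end are assumptions of the example.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """I(X; Y) in bits, estimated from a joint histogram of the two series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask]))

rng = np.random.default_rng(3)
x = rng.normal(size=5000)
y = x + 0.5 * rng.normal(size=5000)                   # dependent on x
print(mutual_information(x, y))                       # clearly positive
print(mutual_information(x, rng.normal(size=5000)))   # near zero for independent series
# lagged version, I(X; Y)_tau with tau = 1: shift one series before estimating
print(mutual_information(x[:-1], y[1:]))
```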
Measuring the interactions between the time series is
critical in most disciplines. Correlation and cross-
correlation can be utilized to detect the linear depen-
dence between time series. However, these measures are
not appropriate for modeling the dependence between
the time series due to the following two reasons: time is
not equally spaced and there is a non-linear mechanism
in the time series. Hence, Transfer entropy, which per-
forms according to transition probabilities, can be uti-
lized to quantify the dependence between the time series.[74]
Let $p(x_1, \ldots, x_n)$ be the probability of observing the sequence $(x_1, \ldots, x_n)$. Transfer entropy,[75–78] which is a general form of mutual information, can be determined as

$$T_{Y\to X}(m, l) = \sum p\left(x_{t_{m+1}}, x_{t_1}, \ldots, x_{t_m}, y_{t_{m-l+1}}, \ldots, y_{t_m}\right)\log_2\frac{p\left(x_{t_{m+1}} \mid x_{t_1}, \ldots, x_{t_m}, y_{t_{m-l+1}}, \ldots, y_{t_m}\right)}{p\left(x_{t_{m+1}} \mid x_{t_1}, \ldots, x_{t_m}\right)} \quad (34)$$

where $x_t$ and $y_t$ are the discrete states of $X$ and $Y$ at time $t$, and the parameters $m$ and $l$ are the numbers of past observations included in $X$ and $Y$. Figure 4 illustrates a bivariate analysis between the two random variables $X$ and $Y$, and the Transfer entropy $T_{Y\to X}(1, 1)$.
The main difference between the Transfer entropy
and Mutual information is that Transfer entropy takes
into consideration the transition probabilities. Transfer
entropy can be defined as the difference in the gain of
information regarding $x_{t_{m+1}}$ conditioned on both its own history and the history of $y_t$, and the case where it is conditioned on its own history only. It is noted that Transfer entropy is not symmetric. Moreover, for a stationary Markov process, only the most recent history, $X_{n-1}$, is required for forecasting $X_n$.
Marschinski and Kantz[79] examined well-known physical concepts such as Spin systems,[80] Turbulence,[81] Universality,[82] Self-organized criticality,[83,84] Complexity,[85] and entropy, and concluded that among these physical concepts entropy may be useful in capturing the uncertainties inherent in stochastic processes. They define the Transfer entropy as follows

Transfer Entropy (TE) = {information about future data ($I_{t+1}$) gained from past joint observations of $I$ and $J$} - {information about future data ($I_{t+1}$) gained from past observations of $I$ only} = information flow from $J$ to $I$  (35)
Figure 3. The mutual information $I(X_t, Y_t)$ and $I(X_t, Y_{t-1})$.

Figure 4. The Transfer entropy $T_{Y\to X}(1, 1)$.
Kwon and Yang[86] argued that the first step in the study of Transfer entropy is to discretize the time series[87,88] by some coarse graining. They partition the variables $x_n$ into discretized counterparts $A_n$ by utilizing a constant $d$ as follows

$$A_n = 0 \quad \text{for } x_n \le -\frac{d}{2} \quad (\text{decrease}) \quad (36)$$

$$A_n = 1 \quad \text{for } -\frac{d}{2} \le x_n \le \frac{d}{2} \quad (\text{intermediate}) \quad (37)$$

$$A_n = 2 \quad \text{for } x_n \ge \frac{d}{2} \quad (\text{increase}) \quad (38)$$

It is crucial to determine an appropriate constant $d$ since the probabilities of the states vary with changes in $d$.
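Combining equation (34) with the coarse graining of equations (36)–(38), the following Python sketch estimates $T_{Y\to X}(1, 1)$ from empirical frequencies of the discretized states; the threshold $d$ and the one-step histories are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def discretize(x, d):
    """Map values to states 0 (decrease), 1 (intermediate), 2 (increase) using threshold d."""
    return np.digitize(x, [-d / 2.0, d / 2.0])

def transfer_entropy_yx(x_states, y_states):
    """T_{Y->X}(1, 1) in bits from discretized state sequences."""
    triples = Counter(zip(x_states[1:], x_states[:-1], y_states[:-1]))  # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x_states[:-1], y_states[:-1]))               # (x_t, y_t)
    pairs_xx = Counter(zip(x_states[1:], x_states[:-1]))                # (x_{t+1}, x_t)
    singles_x = Counter(x_states[:-1])                                  # x_t
    n = len(x_states) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles_x[x0]
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

rng = np.random.default_rng(4)
y = rng.normal(size=3000)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.normal(size=3000)   # x is driven by lagged y
xs, ys = discretize(x, d=0.5), discretize(y, d=0.5)
print(transfer_entropy_yx(xs, ys))   # noticeably positive: information flows from Y to X
print(transfer_entropy_yx(ys, xs))   # much smaller in the reverse direction
```

Because the estimate is built from transition probabilities, the two directions generally differ, unlike the symmetric mutual information.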
Effective transfer entropy
Using a longer history of the time series in the calculation of Transfer entropy may introduce noise rather than provide more information.[89–92] A limitation of the Transfer entropy is that it does not account for the effect of noise.[67] In order to take the noise into consideration, the Effective Transfer entropy[93] is introduced as

$$ET_{Y\to X}(m, l) = T_{Y\to X}(m, l) - \hat{T}_{sh} \quad (39)$$

where $\hat{T}_{sh}$ is the mean shuffled Transfer entropy,[28] $m$ is the length of the past history of $X$, and $l$ is the length of the past observations of $Y$.
Relative explanation added (REA) can be utilized as a secondary measure[28] to study the uncertainty based on the concept of entropy as follows

$$REA(m, l) = \frac{H\left(X_0 \mid X_{-1}, \ldots, X_{-m}, Y_{-1}, \ldots, Y_{-l}\right)}{H\left(X_0 \mid X_{-1}, \ldots, X_{-m}\right)} - 1 \quad (40)$$
REA quantifies the extra information obtained from
the history of the random variables Xand Y, when X
has already been observed.
Entropy rate
Entropy rate measures the average information required in order to predict a future observation given the previous observations[28] as follows

$$h_\infty = \lim_{n\to\infty} H\left(X_0 \mid X_{-1}, \ldots, X_{-n+1}\right) \quad (41)$$

where $H\left(X_0 \mid X_{-1}, \ldots, X_{-n+1}\right)$ represents the entropy measure conditioned on the history of the random variable $X$. The conditional entropy $H(X_0 \mid X_{-1})$ and the entropy rate $h_\infty$ are illustrated in Figure 5.
Normalized permutation entropy and number of
forbidden patterns
Zunino et al.[94] stated that the Random Walk model is based on the assumption that the dynamic changes in the data are uncorrelated, and therefore entropy measures can be used to capture the uncertainties inherent in the time series.[95] The authors discussed the Second Law of Thermodynamics, which states that entropy increases monotonically over time. In addition, they examined Renyi,[96,97] Tsallis,[98] Approximate,[99–101] and Transfer entropies, as well as a local approach to study Shannon entropy.[102]
Zunino et al.[103] presented two new quantifiers for predictive models: the number of Forbidden patterns and the normalized Permutation entropy. Both of these metrics are model independent and therefore have wider applicability. The number of Forbidden patterns is positively correlated with the inefficiency degree, and the normalized Permutation entropy is negatively correlated with the inefficiency degree. Forbidden patterns represent the patterns that cannot be uncovered because of the underlying deterministic structure of the time series. It is noted that Permutation entropy is the normalized form of Shannon entropy over the probability distribution $P = \{p(\pi_i),\ i = 1, \ldots, D!\}$ as follows

$$H_S[P] = \frac{S[P]}{S_{max}} = \frac{-\sum_{i=1}^{D!} p(\pi_i)\ln\left(p(\pi_i)\right)}{S_{max}} \quad (42)$$
where $S$ stands for Shannon entropy and $S_{max} = \ln D!$ $(0 \le H_S \le 1)$. It is noted that the greatest value of $H_S[P]$ is equal to unity in the case of a totally random sequence, and the smallest value of $H_S[P]$ is equal to zero in the case of an increasing/decreasing sequence.

Figure 5. The conditional entropy $H(X_0 \mid X_{-1})$ and the entropy rate $h_\infty$.
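The normalized Permutation entropy of equation (42) can be estimated by counting ordinal patterns of order D, as in this Python sketch; the embedding order D = 3 is an assumption of the example, and ordinal patterns that never occur correspond to the Forbidden patterns discussed above.

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, D=3):
    """Normalized permutation entropy H_S[P] = S[P] / ln(D!) over ordinal patterns of order D."""
    x = np.asarray(x, dtype=float)
    patterns = [tuple(np.argsort(x[i:i + D])) for i in range(len(x) - D + 1)]
    counts = Counter(patterns)
    n_forbidden = math.factorial(D) - len(counts)     # ordinal patterns that never appear
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    H = -np.sum(p * np.log(p))
    return H / math.log(math.factorial(D)), n_forbidden

rng = np.random.default_rng(5)
print(permutation_entropy(rng.normal(size=2000)))          # close to 1, no forbidden patterns
print(permutation_entropy(np.arange(2000, dtype=float)))   # 0.0, monotone series, 5 forbidden patterns
```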
Entropy measures based on singular value
decomposition
Gu et al.[104] investigated the predictive ability of the Singular Value Decomposition entropy. The multifractality of time series is due to local fluctuations of the index, long-term memory of the volatility,[105,106] heavy-tailed dis-
tributions, herding behavior, outside information, the
intrinsic properties, non-linear incremental changes in the
data, and so on. According to their study, the tendency
of the entropy series is ahead of the component index,
which indicates that the Singular Value Decomposition
entropy has predictive power for the variable.
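A minimal Python sketch of a Singular Value Decomposition entropy in the spirit of these studies: the singular values of a matrix built from the series are normalized and passed through the Shannon formula. The delay-embedding construction and the window parameters are assumptions made for illustration; Gu et al. and Caraiani instead build the matrix from the correlation structure of index components.

```python
import numpy as np

def svd_entropy(x, order=10, delay=1):
    """Shannon entropy of the normalized singular-value spectrum of a delay-embedded series."""
    x = np.asarray(x, dtype=float)
    n_rows = len(x) - (order - 1) * delay
    # Each row is a delayed copy of the series (a simple embedding matrix)
    emb = np.array([x[i * delay: i * delay + n_rows] for i in range(order)])
    s = np.linalg.svd(emb, compute_uv=False)
    s_norm = s / s.sum()
    s_norm = s_norm[s_norm > 0]
    return -np.sum(s_norm * np.log2(s_norm))

rng = np.random.default_rng(6)
print(svd_entropy(rng.normal(size=2000)))                        # higher: energy spread over all singular values
print(svd_entropy(np.sin(np.linspace(0, 40 * np.pi, 2000))))     # lower: a few singular values dominate
```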
Caraiani[107] studied the entropy measures based on the Singular Value Decomposition of the correlation matrix for the components of the time series and defined the correlation matrix of the data as follows

$$R_{i,j} = \frac{\left(y_i - \langle y_i \rangle\right)\left(y_j - \langle y_j \rangle\right)}{\sigma_i \sigma_j} \quad (43)$$

where $\langle y_i \rangle$ is the mean, and $\sigma_i$ and $\sigma_j$ are the standard deviations of $y_i$ and $y_j$, respectively. He used Granger
causality tests[108–110] to determine whether a certain time series is helpful in predicting the future dynamics of the second variable, as shown in the following formula

$$y_t = \beta_0 + \sum_{k=1}^{N} \beta_k y_{t-k} + \sum_{l=1}^{N} \alpha_l x_{t-l} + u_t \quad (44)$$

where $y_t$ and $x_t$ are the variables of interest and $u_t$ are the uncorrelated disturbances. The parameters $k$ and $l$ are the orders of the lags. The null and alternative hypotheses in his statistical analysis are as follows

$$H_0: \alpha_l = 0 \quad \text{for any } l \quad (45)$$

$$H_1: \alpha_l \ne 0 \quad \text{for at least some } l \quad (46)$$
The residual sum of squares for the initial model, $RSS_u$, and for the restricted version without the $x_{t-l}$ terms, $RSS_v$, can be defined as follows

$$RSS_u = \sum_{t=1}^{N} u_t^2 \quad (47)$$

$$RSS_v = \sum_{t=1}^{N} v_t^2 \quad (48)$$

The appropriate statistic, $F_0$, can be determined as follows

$$F_0 = \frac{\left(RSS_v - RSS_u\right)/l}{RSS_u/\left(N - 2l - 1\right)} \quad (49)$$

The $F_0$-statistic has $l$ and $(N - 2l - 1)$ degrees of freedom.
Caraiani[107] used the $F_0$-statistic and attempted to address the following question: do the entropy indices Granger-cause the time series? In other words, are the past values of a random variable, $x_t$, helpful in predicting a second variable, $y_t$, beyond the information contained in the past values of $y_t$?
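The test in equations (44)–(49) can be carried out with ordinary least squares, as in the following Python sketch; the lag order and the simulated series are assumptions chosen only so that the restricted and unrestricted fits can be compared.

```python
import numpy as np

def granger_f_statistic(y, x, lags=2):
    """F_0 = ((RSS_v - RSS_u) / l) / (RSS_u / (N - 2l - 1)) for 'x Granger-causes y'."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    Y = y[lags:]
    N = len(Y)
    y_lags = np.column_stack([y[lags - k: len(y) - k] for k in range(1, lags + 1)])
    x_lags = np.column_stack([x[lags - k: len(x) - k] for k in range(1, lags + 1)])
    ones = np.ones((N, 1))

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
        resid = Y - design @ beta
        return resid @ resid

    rss_u = rss(np.hstack([ones, y_lags, x_lags]))   # unrestricted: own lags plus lags of x
    rss_v = rss(np.hstack([ones, y_lags]))           # restricted: own lags only
    return ((rss_v - rss_u) / lags) / (rss_u / (N - 2 * lags - 1))

rng = np.random.default_rng(7)
x = rng.normal(size=1000)
y = 0.6 * np.roll(x, 1) + 0.3 * rng.normal(size=1000)   # y depends on lagged x
print(granger_f_statistic(y, x))      # large F: reject H0 that all alpha_l are zero
print(granger_f_statistic(x, y))      # small F in the reverse direction
```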
Approximate entropy
In this section, other measures of dependencies of ran-
dom variables are presented. Different methods for
quantifying the uncertainties of stochastic processes
have been introduced, including methods based on Fourier analysis,[111,112] Symbolic analysis,[113] Amplitude statistics,[113–117] and Wavelet transform.[118,119]
Huang et al.[120] reported that real-world complex dynamics reflect anomalous, chaotic, irregular, and non-stationary characterizations due to various factors, including internal drive and external disturbances. In particular, the methods of analyzing the stochastic process of financial time series include wavelet analysis,[121,122] detrended fluctuation analysis,[123,124] and diffusion entropy analysis (DEA).[125–127]
Shi and Shang[128]
studied two entropy-related metrics, Cross-
sample entropy and Transfer entropy, as well as a
Cross-correlation analysis to examine the relationship
between time series among different data sets. Cross-
sample entropy, Transfer entropy, and Cross-correla-
tion measures are utilized to obtain the asynchrony,
information flow, and cross-correlation between the
time series, respectively. Other statistical tools, which
have been introduced to examine the stochastic pro-
cesses, are Correlation function, Multi-fractal, Spin-
glass models, and Complex networks.
129–137
The
Kolmogorov–Sinai entropy for real-world time series
cannot be precisely estimated. Therefore, the analysis
of the finite and noisy time series can be performed by
using Approximate entropy.
137
In addition, Cross-sam-
ple entropy is based on Approximate entropy,
138–141
which captures changes in a system complexity.
Approximate entropy can be utilized to measure pat-
terns in time series and uncover the regularities by
extracting the noises from the original data. Kristoufek and Vosvrda[135] studied various variables including the Efficiency indices, Long-term memory, Fractal dimension,[142] and Approximate entropy. According to their
study, the efficiency index can be defined as a distance
from the unpredictable process specification based on
various measures including Long-term memory,
Fractal dimension, and Approximate entropy.
Consider a series of numbers of length $N$, $u(1), \ldots, u(N)$, and form $m$-blocks of the subseries $x(i) \equiv \left(u(i), u(i+1), \ldots, u(i+m-1)\right)$, where $m$ is a non-negative number less than or equal to the length of the series. We then determine $C_i^m(r)$ as the fraction of blocks $x(j)$ which have a distance of less than a given positive real number $r$. Approximate entropy can be calculated as[140]

$$\mathrm{ApEn}(m, r, N)(u) = \Phi^m(r) - \Phi^{m+1}(r), \quad m \ge 0 \quad (50)$$

where $\Phi^m(r)$ is defined as

$$\Phi^m(r) = \frac{1}{N-m+1}\sum_{i=1}^{N-m+1}\log C_i^m(r) \quad (51)$$

and $C_i^m(r)$ can be calculated as

$$C_i^m(r) = \frac{1}{N-m+1}\sum_{j=1}^{N-m+1}\Theta\left(r - d\left(x(i), x(j)\right)\right) \quad (52)$$
where $\Theta$ is the Heaviside function, which counts the instances where the distance $d$ is below the threshold $r$. Approximate entropy quantifies the logarithmic likelihood that sequences of patterns that are close for $m$ observations remain close on the next comparisons as well.[28]
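Equations (50)–(52) can be implemented as in the Python sketch below, for comparison with the Sample entropy sketch given earlier; unlike Sample entropy, self-matches are included, which keeps every $C_i^m(r)$ strictly positive and the logarithms finite.

```python
import numpy as np

def approximate_entropy(u, m=2, r_frac=0.2):
    """ApEn(m, r, N) = Phi^m(r) - Phi^{m+1}(r) with Chebyshev distance and self-matches included."""
    u = np.asarray(u, dtype=float)
    N = len(u)
    r = r_frac * np.std(u)

    def phi(m_len):
        blocks = np.array([u[i:i + m_len] for i in range(N - m_len + 1)])
        # C_i^m(r): fraction of blocks within distance r of block i (self-match counts)
        C = [np.mean(np.max(np.abs(blocks - blocks[i]), axis=1) <= r) for i in range(len(blocks))]
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(8)
print(approximate_entropy(rng.normal(size=500)))                         # irregular: larger ApEn
print(approximate_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))      # regular: smaller ApEn
```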
Case study
Battery health monitoring is important since the failure
of the battery may result in the failure of the system.
The proper and efficient battery health management
can prevent the disastrous hazards and premature fail-
ures. This section aims at determining the correspon-
dence between the loss in the capacity of the battery
and the Sample entropy (SampEn) and Approximate
entropy (ApEn) of voltages. Many approaches for esti-
mating the battery capacity are based on a direct analy-
sis of battery capacity with respect to aging cycle.
Approximate entropy is capable of quantifying the regularity of time series and therefore can be used as an estimator of the degradation of the battery capacity.
In this section, we use the definitions of Sample
entropy and Approximate entropy and determine these
two entropy measures for the changes in the battery
voltages over time. We examine five lithium-ion bat-
teries as follows: PL19, PL17, PL11, PL10, and PL09.
The battery voltages over time are illustrated in Figure
6. Figure 7 displays the changes in the battery capaci-
ties over cycle. The Sample entropy (SampEn) and
Approximate entropy (ApEn) are calculated and
shown in Figures 8 and 9, respectively.
It is noted that the degradation of batteries may be
due to the internal shorts, opening of the short, or cell
undergoing reversal. Thus, detecting these causes of
degradation is very significant for battery diagnosis in
real-world applications.
Discussion and conclusion
Entropy was originally defined as the average number
of bits needed to store or communicate one symbol in a
message. It can also quantify the uncertainty involved
in predicting the value of a random variable. In this
article, we review various entropy measures to examine
Figure 6. Battery voltages versus cycles.
Figure 7. Battery capacities versus cycles.
Figure 8. Sample entropy (SampEn) versus cycles.
Figure 9. Approximate entropy (ApEn) versus cycles.
the uncertainties inherent in stochastic processes. It is
shown that different predictive models have been built
by using various entropy measures. The concept of
entropy has not been limited to physics and thermody-
namics. The entropy measures have been used for
developing predictive models in various disciplines
including mechanics, finance, physiology, neuroscience,
ecology, bionomics, and neurology. Moreover, various
entropy measures have been introduced depending on
the process and time series studied. These metrics
include, but are not limited to Shannon entropy,
Approximate entropy, Sample entropy, Transfer
entropy, and Permutation entropy. The concept of
entropy is helpful in characterizing stochastic processes
since it represents the uncertainty and disorder of the
process, without causing any restrictions on the theore-
tical probability distribution. The contribution of this
article lies in its description of information, uncertainty,
entropy, and ignorance in stochastic processes. More
precisely, information is the decline in disorder and ambiguity, uncertainty refers to the unlikelihood of logical reasoning, entropy is the expected information, and ignorance is the lack of knowledge regarding
the uncertainty. The key findings of this review and the
future research directions are as follows:
1. Entropy is useful in explaining the uncertainties
inherent in stochastic processes.
2. Entropy measures have predictive power, and
the type of time series studied may have an
impact on the predictive ability of the entropy
measures.
3. Entropy measures play an important role in var-
ious branches of science and disciplines.
4. Entropy is an indicator representing the complexity of signals. The greater the entropy measures are, the more complex the signals are, and the more information the signals contain.
5. The results of the existing studies in this research area show that the predictive ability of the entropy measures is related to the degree of complexity of the available information.
6. Future work should examine the predictive
power of the entropy after taking into account
the effect of other variables such as periodicity.
Variables such as periodicity may make some of
the uncertainties inherent in stochastic processes
more explicit.
7. The entropy-based proposed method can be
applied to battery monitoring and prognostics,
and it is found that Sample entropy and
Approximate entropy are useful in quantifying
the fluctuation degree of a stochastic process.
These two entropy measures can quantify the
regularity of time series and assess the predict-
ability of a data sequence. Thus, they can be
applied to battery capacity data as an indicator
for battery health.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with
respect to the research, authorship, and/or publication of this
article.
Funding
The author(s) received no financial support for the research,
authorship, and/or publication of this article.
ORCID iDs
Alireza Namdari https://orcid.org/0000-0002-1776-5179
Zhaojun (Steven) Li https://orcid.org/0000-0002-2673-
9909
References
1. Ross SM, Kelly JJ, Sullivan RJ, et al. Stochastic pro-
cesses. New York: Wiley, 1996.
2. Bag BC, Banik SK and Ray DS. Noise properties of sto-
chastic processes and entropy production. Phys Rev E
2001; 64: 026110.
3. Robinson PM. Consistent nonparametric entropy-based
testing. Rev Econ Stud 1991; 58: 437–453.
4. Sabatini AM. Analysis of postural sway using entropy
measures of signal complexity. Med Biol Eng Comput
2000; 38: 617–624.
5. Schürmann T. Bias analysis in entropy estimation. J
Phys A Math Gen 2004; 37: L295.
6. Billingsley P. Ergodic theory and information. New York:
Wiley, 1965.
7. Ebrahimi N, Maasoumi E and Soofi E. Comparison of
entropy and variance orderings. J Econometrics 1999; 90:
317–336.
8. Faddeev DK. On the concept of entropy of a finite prob-
abilistic scheme. Uspekhi Mate Nauk 1956; 11: 227–231.
9. Zurek WH. Complexity, entropy and the physics of infor-
mation. Boca Raton. FL: CRC Press, 2018.
10. Kullmann L, Kertesz J and Mantegna RN. Identification
of clusters of companies in stock indices via Potts super-
paramagnetic transitions. Phys A Stat Mech Appl 2000;
287: 412–419.
11. Nanda SR, Mahanty B and Tiwari MK. Clustering
Indian stock market data for portfolio management.
Expert Syst Appl 2010; 37: 8793–8798.
12. Engle RF, Focardi SM and Fabozzi FJ. ARCH/GARCH
models in applied financial econometrics. Encyclopedia
of Financial Models, 22 October 2012, http://pages.stern.
nyu.edu/;rengle/ARCHGARCH.pdf
13. Katsiampa P. Volatility estimation for Bitcoin: a compar-
ison of GARCH models. Econ Lett 2017; 158: 3–6.
14. Christensen BJ, Dahl CM and Iglesias EM. Semipara-
metric inference in a GARCH-in-mean model. J Econo-
metrics 2012; 167: 458–472.
15. Hansen PR and Huang Z. Exponential GARCH model-
ing with realized measures of volatility. J Bus Econ Stat
2016; 34: 269–287.
16. Mizuno T, Takayasu H and Takayasu M. Correlation
networks among currencies. Phys A Stat Mech Appl 2006;
364: 336–342.
17. Gunaratne GH, McCauley JL and Bassler KE. Markov
processes, Hurst exponents, and nonlinear diffusion equa-
tions: with application to finance. Phys A Stat Mech Appl
2006; 369: 343–353.
18. Kwon O and Yang JS. Information flow between stock
indices. Europhys Lett 2008; 82: 68003.
19. Podobnik B and Stanley HE. Detrended cross-correlation
analysis: a new method for analyzing two nonstationary
time series. Phys Rev Lett 2008; 100: 084102.
20. Podobnik B, Jiang ZQ, Zhou WX, et al. Statistical tests
for power-law cross-correlated processes. Phys Rev E
2011; 84: 066118.
21. Shadkhoo S and Jafari GR. Multifractal detrended cross-
correlation analysis of temporal and spatial seismic data.
Eur Phys J B 2009; 72: 679–683.
22. Csiszár I. Generalized cutoff rates and Rényi's information measures. IEEE T Inform Theory 1995; 41: 26–34.
23. Jizba P, Kleinert H and Shefaat M. Rényi's information transfer between financial time series. Phys A Stat Mech
transfer between financial time series. Phys A Stat Mech
Appl 2012; 391: 2971–2989.
24. Tsallis C. Entropic nonextensivity: a possible measure of
complexity. Chaos Soliton Fract 2002; 13: 371–391.
25. Tsallis C. Nonadditive entropy: the concept and its use.
Eur Phys J A 2009; 40: 257–266.
26. Tsallis C. Possible generalization of Boltzmann-Gibbs
statistics. J Stat Phys 1988; 52: 479–487.
27. Tsallis C. Introduction to nonextensive statistical
mechanics: approaching a complex world. Berlin: Springer
Science + Business Media, 2009.
28. Schwill S. Entropy analysis of financial time series. 2018,
https://arxiv.org/pdf/1807.09423.pdf
29. Shannon CE. Communication theory of secrecy systems.
Bell Syst Tech J 1949; 28: 656–715.
30. Shannon CE. A mathematical theory of communication.
ACM SIGMOBILE Mobile Comput Commun Rev 2001;
5: 3–55.
31. Efron B and Tibshirani RJ. An introduction to the boot-
strap. Boca Raton, FL: CRC Press, 1994.
32. Fast JD. Entropy: the significance of the concept of
entropy and its applications in science and technology.
London: Macmillan, 1982.
33. Bentes SR, Menezes R and Mendes DA. Long memory
and volatility clustering: is the empirical evidence consis-
tent across stock markets? Phys A: Stat Mech Appl 2008;
387: 3826–3830.
34. Kullback S and Leibler RA. On information and suffi-
ciency. Ann Math Stat 1951; 122: 79–86.
35. Jaynes ET. Information theory and statistical mechanics.
Phys Rev 1957; 106: 620.
36. He Z, Cai Y and Qian Q. A study of wavelet entropy the-
ory and its application in power system. In: Proceedings
of the international conference on intelligent mechatronics
and automation, Chengdu, China, 26–31 August 2004,
pp.847–851. New York: IEEE.
37. Dawes GS, Moulden M, Sheil O, et al. Approximate
entropy, a statistic of regularity, applied to fetal heart
rate data before and during labor. Obstet Gynecol 1992;
80: 763–768.
38. Richman JS and Moorman JR. Physiological time-series
analysis using approximate entropy and sample entropy.
Am J Physiol Heart Circ Physiol 2000; 278:
H2039–H2049.
39. Fleisher LA, Pincus SM and Rosenbaum SH. Approxi-
mate entropy of heart rate as a correlate of postoperative
ventricular dysfunction. Anesthesiology 1993; 78:
683–692.
40. Palazzolo JA, Estafanous FG and Murray PA. Entropy
measures of heart rate variation in conscious dogs. Am J
Physiol Heart Circy Physiol 1998; 274: H1099–H1105.
41. Pincus SM and Viscarello RR. Approximate entropy: a
regularity measure for fetal heart rate analysis. Obstet
Gynecol 1992; 79: 249–255.
42. Maasoumi E and Racine J. Entropy and predictability of
stock market returns. J Econometrics 2002; 107: 291–312.
43. Li SY, Yang M, Li CC, et al. Analysis of heart rate varia-
bility based on singular value decomposition entropy. J
Shanghai Univ Eng Ed 2008; 12: 433.
44. Lian JJ, Li HK and Zhang JW. Vibration of hydraulic
structure modal identification method ERA based on
order reduction of singular entropy. Sci China Ser E Sci
Technol 2008; 38: 1398–1413.
45. Ljung GM and Box GE. On a measure of lack of fit in
time series models. Biometrika 1978; 65: 297–303.
46. Jizba P and Arimitsu T. The world according to Rényi: thermodynamics of multifractal systems. Ann Phys 2004;
thermodynamics of multifractal systems. Ann Phys 2004;
312: 17–59.
47. Rényi A. On measures of entropy and information. Buda-
pest: Hungarian Academy of Sciences, 1961.
48. Renner R and Wolf S. Smooth Rényi entropy and appli-
cations. In: Proceedings of the international symposium on
information theory, Chicago, IL, 27 June–2 July 2004,
p.233. New York: IEEE.
49. Guo J, Shi W and Li J. A sampled entropy-based method
for estimating lithium ion battery capacity under non-
standard operation condition. In: Proceedings of the IIE
annual conference, Pittsburgh, PA, 20–23 May 2017,
pp.43–48. Norcross, GA: Institute of Industrial and Sys-
tems Engineers (IISE).
50. Li J and Guo J. A new feature extraction algorithm based
on entropy cloud characteristics of communication sig-
nals. Math Prob Eng 2015; 2015: 1–8.
51. Sun YH, Jou HL and Wu JC. Auxiliary diagnosis method
for lead–acid battery health based on sample entropy.
Energ Convers Manage 2009; 50: 2250–2256.
52. Widodo A, Shim MC, Caesarendra W, et al. Intelligent
prognostics for battery health monitoring based on sam-
ple entropy. Expert Syst Appl 2011; 38: 11763–11769.
53. Zheng Y, Han X, Lu L, et al. Lithium ion battery pack
power fade fault identification based on Shannon entropy
in electric vehicles. J Power Sources 2013; 223: 136–146.
54. Hu X, Li SE, Jia Z, et al. Enhanced sample entropy-based
health management of Li-ion battery for electrified vehi-
cles. Energy 2014; 64: 953–960.
55. Li H, Pan D and Chen CP. Intelligent prognostics for
battery health monitoring using the mean entropy and
relevance vector machine. IEEE T Syst Man Cy Syst
2014; 44: 851–862.
56. Hu X, Jiang J, Cao D, et al. Battery health prognosis for
electric vehicles using sample entropy and sparse Baye-
sian predictive modeling. IEEE T Ind Electron 2016; 63:
2645–2656.
57. Rethinam S, Rajagopalan S, Janakiraman S, et al. Jitters
through dual clocks: an effective entropy source for true
random number generation. In: Proceedings of the inter-
national conference on computer communication and infor-
matics (ICCCI), Coimbatore, India, 4–6 January 2018,
pp.1–5. New York: IEEE.
58. Tsur D. The effective entropy of next/previous larger/
smaller value queries. Inform Process Lett 2019; 145:
39–43.
59. Costa MD and Goldberger AL. Generalized multiscale
entropy analysis: application to quantifying the complex
volatility of human heartbeat time series. Entropy 2015;
17: 1197–1203.
60. Wu SD, Wu CW, Lin SG, et al. Analysis of complex time
series using refined composite multiscale entropy. Phys
Lett A 2014; 378: 1369–1374.
61. James RG, Barnett N and Crutchfield JP. Information
flows? A critique of transfer entropies. Phys Rev Lett
2016; 116: 238701.
62. Zheng L, Pan W, Li Y, et al. Use of mutual information
and transfer entropy to assess interaction between para-
sympathetic and sympathetic activities of nervous system
from HRV. Entropy 2017; 19: 489.
63. Villaverde AF, Ross J, Morán F, et al. MIDER: network
inference with mutual information distance and entropy
reduction. PLoS ONE 2014; 9: e96732.
64. Montalto A, Faes L and Marinazzo D. MuTE: a new
Matlab toolbox for estimating the multivariate transfer
entropy in physiological variability series. In: Proceedings
of the 8th conference of the European Study Group on car-
diovascular oscillations, Trento, 25–28 May 2014, pp.59–
60. New York: IEEE.
65. Wibral M, Vicente R and Lindner M. Transfer entropy
in neuroscience. In: Wibral M, Vicente R and Lizier JT
(eds) Directed information measures in neuroscience. Ber-
lin: Springer, 2014, pp.3–36.
66. Blumentritt T and Schmid F. Mutual information as a
measure of multivariate association: analytical properties
and statistical estimation. J Stat Comput Simul 2012; 82:
1257–1274.
67. Schreiber T. Measuring information transfer. Phys Rev
Lett 2000; 85: 461–464.
68. Dimpfl T and Peter FJ. Using transfer entropy to mea-
sure information flows between financial markets. Stud
Nonlin Dynam Econ 2013; 17: 85–102.
69. Pincus S and Singer BH. Randomness and degrees of irre-
gularity. Proc Natl Acad Sci U S A 1996; 93: 2083–2088.
70. Dionisio A, Menezes R and Mendes DA. Mutual infor-
mation: a measure of dependency for nonlinear time
series. Phys A Stat Mech Appl 2004; 344: 326–329.
71. Tumminello M, Aste T, Di Matteo T, et al. A tool for fil-
tering information in complex systems. Proc Natl Acad
Sci U S A 2005; 102: 10421–10426.
72. Golan A. Information and entropy econometrics—a
review and synthesis. Found Trends Econ 2008; 2: 1–45.
73. Cover TM and Thomas JA. Elements of information the-
ory. Hoboken, NJ: John Wiley & Sons, 2012.
74. Geweke J. Measurement of linear dependence and feed-
back between multiple time series. J Am Stat Assoc 1982;
77: 304–313.
75. Vicente R, Wibral M, Lindner M, et al. Transfer
entropy–a model-free measure of effective connectivity
for the neurosciences. J Comput Neurosci 2011; 30:
45–67.
76. Staniek M and Lehnertz K. Symbolic transfer entropy.
Phys Rev Lett 2008; 100: 158101.
77. Bauer M, Cox JW, Caveness MH, et al. Finding the
direction of disturbance propagation in a chemical pro-
cess using transfer entropy. IEEE T Control Syst Technol
2007; 15: 12–21.
78. Ito S, Hansen ME, Heiland R, et al. Extending transfer
entropy improves identification of effective connectivity
in a spiking cortical network model. PLoS ONE 2011; 6:
e27431.
79. Marschinski R and Kantz H. Analysing the information
flow between financial time series. Eur Phys J B Condense
Matter Complex Syst 2002; 30: 275–281.
80. Chowdhury D and Stauffer D. A generalized spin model
of financial markets. Eur Phys J B Condense Matter Com-
plex Syst 1999; 8: 477–482.
81. Ghashghaie S, Breymann W, Peinke J, et al. Turbulent
cascades in foreign exchange markets. Nature 1996; 381:
767.
82. Plerou V, Gopikrishnan P, Rosenow B, et al. Universal
and nonuniversal properties of cross correlations in finan-
cial time series. Phys Rev Lett 1999; 83: 1471.
83. Lux T and Marchesi M. Scaling and criticality in a sto-
chastic multi-agent model of a financial market. Nature
1999; 397: 498.
84. Sornette D, Johansen A and Bouchaud JP. Stock market
crashes, precursors and replicas. JPhysI1996; 6:
167–175.
85. Mantegna RN and Stanley HE. Introduction to econophy-
sics: correlations and complexity in finance. Cambridge:
Cambridge University Press, 1999.
86. Kwon O and Yang JS. Information flow between compo-
site stock index and individual stocks. Phys A Stat Mech
Appl 2008; 387: 2851–2856.
87. Hong Y and White H. Asymptotic distribution theory for
nonparametric entropy measures of serial dependence.
Econometrica 2005; 73: 837–901.
88. Joe H. Relative entropy measures of multivariate depen-
dence. J Am Stat Assoc 1989; 84: 157–164.
89. Dimitriadis S, Sun Y, Laskaris N, et al. Revealing cross-
frequency causal interactions during a mental arithmetic
task through symbolic transfer entropy: a novel vector-
quantization approach. IEEE T Neural Syst Rehabil Eng
2016; 24: 1017–1028.
90. Porta A, Faes L, Nollo G, et al. Conditional self-entropy
and conditional joint transfer entropy in heart period
variability during graded postural challenge. PLoS ONE
2015; 10: e0132851.
91. Murari A, Peluso E, Gelfusa M, et al. Application of
transfer entropy to causality detection and synchroniza-
tion experiments in tokamaks. Nucl Fusion 2015; 56:
026006.
92. Dickten H and Lehnertz K. Identifying delayed direc-
tional couplings with symbolic transfer entropy. Phys Rev
E2014; 90: 062706.
93. Sensoy A, Sobaci C, Sensoy S, et al. Effective transfer
entropy approach to information flow between exchange
rates and stock markets. Chaos Soliton Fract 2014; 68:
180–185.
94. Zunino L, Zanin M, Tabak BM, et al. Complexity-
entropy causality plane: a useful approach to quantify
the stock market inefficiency. Phys A Stat Mech Appl
2010; 389: 1891–1901.
95. Bachelier L. Théorie de la spéculation. Annales scientifiques de l'École normale supérieure. Vol. 17, 1900.
96. De Gregorio A and Iacus SM. On Rényi information for
ergodic diffusion processes. Inform Sci 2009; 179:
279–291.
97. Dionisio A, Menezes R and Mendes DA. An econophy-
sics approach to analyse uncertainty in financial markets:
an application to the Portuguese stock market. Eur Phys
J B Condense Matter Complex Syst 2006; 50: 161–164.
98. Matesanz D and Ortega GJ. A (econophysics) note on
volatility in exchange rate time series: entropy as a rank-
ing criterion. Int J Modern Phys C 2008; 19: 1095–1103.
99. Oh G, Kim S and Eom C. Market efficiency in foreign
exchange markets. Phys A: Stat Mech Appl 2007; 382:
209–212.
100. Pincus S and Kalman RE. Irregularity, volatility, risk,
and financial market time series. Proc Natl Acad Sci U
SA2004; 101: 13709–13714.
101. Rukhin AL. Approximate entropy for testing random-
ness. J Appl Prob 2000; 37: 88–100.
102. Risso WA. The informational efficiency and the finan-
cial crashes. Res Int Bus Finance 2008; 22: 396–408.
103. Zunino L, Zanin M, Tabak BM, et al. Forbidden pat-
terns, permutation entropy and stock market ineffi-
ciency. Phys A Stat Mech Appl 2009; 388: 2854–2864.
104. Gu R, Xiong W and Li X. Does the singular value
decomposition entropy have predictive power for stock
market?—evidence from the Shenzhen stock market.
Phys A Stat Mech Appl 2015; 439: 103–113.
105. Brooks C and Persand G. Volatility forecasting for risk
management. J Forecast 2003; 22: 1–22.
106. Giot P and Laurent S. Modelling daily value-at-risk
using realized volatility and ARCH type models. J
Empir Finance 2004; 11: 379–398.
107. Caraiani P. The predictive power of singular value
decomposition entropy for stock market dynamics. Phys
A Stat Mech Appl 2014; 393: 571–578.
108. Barnett L, Barrett AB and Seth AK. Granger causality
and transfer entropy are equivalent for Gaussian vari-
ables. Phys Rev Lett 2009; 103: 238701.
109. Gourieroux C, Monfort A and Renault E. Kullback
causality measures. Ann Econ Stat 1987; 6–7: 369–410.
110. Granger CW. Testing for causality: a personal view-
point. J Econ Dynam Control 1980; 2: 329–352.
111. Darbellay GA and Wuertz D. The entropy as a tool for
analysing statistical dependences in financial time series.
Phys A Stat Mech Appl 2000; 287: 429–439.
112. Powell GE and Percival IC. A spectral entropy method
for distinguishing regular and irregular motion of
Hamiltonian systems. J Phys A Math Gen 1979; 12:
2053.
113. Daw CS, Finney CE and Tracy ER. A review of sym-
bolic analysis of experimental data. Rev Sci Instrum
2003; 74: 915–930.
114. Broock WA, Scheinkman JA, Dechert WD, et al. A test
for independence based on the correlation dimension.
Econ Rev 1996; 15: 197–235.
115. Cameron AC and Windmeijer FA. An R-squared
measure of goodness of fit for some common
nonlinear regression models. J Econometrics 1997; 77:
329–342.
116. De Micco L, González CM, Larrondo HA, et al. Ran-
domizing nonlinear maps via symbolic dynamics. Phys
A Stat Mech Appl 2008; 387: 3373–3383.
117. Shu G, Zeng B, Chen YP, et al. Performance assessment
of kernel density clustering for gene expression profile
data. Int J Genomics 2003; 4: 287–299.
118. Rosso OA and Mairal ML. Characterization of time
dynamical evolution of electroencephalographic epilep-
tic records. Phys A Stat Mech Appl 2002; 312: 469–504.
119. Rosso OA, Blanco S, Yordanova J, et al. Wavelet
entropy: a new tool for analysis of short duration brain
electrical signals. J Neurosci Methods 2001; 105: 65–75.
120. Huang J, Shang P and Zhao X. Multifractal diffusion
entropy analysis on stock volatility in financial markets.
Phys A Stat Mech Appl 2012; 391: 5739–5745.
121. Muzy JF, Bacry E and Arneodo A. The multifractal
formalism revisited with wavelets. Int J Bifu Chaos 1994;
4: 245–302.
122. Muzy JF, Bacry E and Arneodo A. Wavelets and multi-
fractal formalism for singular signals: application to tur-
bulence data. Phys Rev Lett 1991; 67: 3515.
123. Peng CK, Buldyrev SV, Havlin S, et al. Mosaic
organization of DNA nucleotides. Phys Rev E 1994; 49:
1685.
124. Kantelhardt JW, Zschiegner SA, Koscielny-Bunde E, et
al. Multifractal detrended fluctuation analysis of nonsta-
tionary time series. Phys A Stat Mech Appl 2002; 316:
87–114.
125. Scafetta N and Grigolini P. Scaling detection in time
series: diffusion entropy analysis. Phys Rev E 2002; 66:
036130.
126. Scafetta N, Latora V and Grigolini P. Lévy scaling: the
diffusion entropy analysis applied to DNA sequences.
Phys Rev E 2002; 66: 031906.
127. Cai SM, Zhou PL, Yang HJ, et al. Diffusion entropy
analysis on the scaling behavior of financial markets.
Phys A Stat Mech Appl 2006; 367: 337–344.
128. Shi W and Shang P. Cross-sample entropy statistic as a
measure of synchronism and cross-correlation of stock
markets. Nonlin Dynam 2013; 71: 539–554.
129. Anderson PW. The economy as an evolving complex sys-
tem. Boca Raton, FL: CRC Press, 2018.
130. Laloux L, Cizeau P, Bouchaud JP, et al. Noise dressing
of financial correlation matrices. Phys Rev Lett 1999; 83:
1467.
131. Ma WJ, Hu CK and Amritkar RE. Stochastic dynami-
cal model for stock-stock correlations. Phys Rev E 2004;
70: 026101.
132. Stanley HE and Mantegna RN. An introduction to econ-
ophysics. Cambridge: Cambridge University Press, 2000.
133. Bouchaud JP. Elements for a theory of financial risks.
Phys A Stat Mech Appl 1999; 263: 415–426.
134. Mandelbrot BB. Scaling in financial prices: II. Multi-
fractals and the star equation. Quant Finance 2001; 1:
124–130.
135. Kristoufek L and Vosvrda M. Measuring capital market
efficiency: long-term memory, fractal dimension and
approximate entropy. Eur Phys J B 2014; 87: 162.
136. Giada L and Marsili M. Algorithms of maximum likeli-
hood data clustering with applications. Phys A Stat
Mech Appl 2002; 315: 650–664.
137. Liu LZ, Qian XY and Lu HY. Cross-sample entropy of
foreign exchange time series. Phys A Stat Mech Appl
2010; 389: 4785–4792.
138. Pincus S. Approximate entropy (ApEn) as a complexity
measure. Chaos 1995; 5: 110–117.
139. Pincus S. Approximate entropy as an irregularity mea-
sure for financial data. Econ Rev 2008; 27: 329–362.
140. Pincus SM. Approximate entropy as a measure of sys-
tem complexity. Proc Natl Acad Sci U S A 1991; 88:
2297–2301.
141. Pincus SM and Huang WM. Approximate entropy: sta-
tistical properties and applications. Commun Stat The-
ory Methods 1992; 21: 3061–3077.
142. Schwarz G. Estimating the dimension of a model. Ann
Stat 1978; 6: 461–464.