Article · PDF Available

Abstract

It is demonstrated that extremization of entropy production of stochastic representations of natural systems, performed at asymptotic times (zero or infinity), results in a constant derivative of entropy in logarithmic time and, in turn, in Hurst-Kolmogorov processes. The constraints used include preservation of the mean, variance and lag-1 autocovariance at the observation time step, and an inequality relationship between conditional and unconditional entropy production, which is necessary to enable physical consistency. An example with real-world data illustrates the plausibility of the findings.
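The abstract's central claim admits a quick numerical illustration: for a Gaussian process, the entropy of the cumulative process X(t) is (1/2) ln(2πe Var[X(t)]), so a constant entropy derivative in logarithmic time is equivalent to the power-law variance Var[X(t)] ∝ t^(2H). The sketch below (our illustration, not the paper's derivation; it assumes a Gaussian HK process with H = 0.8) verifies the constant slope:

```python
import numpy as np

# Sketch: for fractional Gaussian noise with Hurst parameter H, the cumulative
# process has Var[X(t)] = t^(2H), so the Gaussian entropy
# 0.5*ln(2*pi*e*Var[X(t)]) grows linearly in ln t with slope H.
H, n, runs = 0.8, 512, 500
rng = np.random.default_rng(1)

# Exact fGn autocovariance: g(k) = 0.5*(|k-1|^(2H) - 2|k|^(2H) + |k+1|^(2H))
k = np.arange(n, dtype=float)
g = 0.5 * (np.abs(k - 1) ** (2 * H) - 2 * k ** (2 * H) + (k + 1) ** (2 * H))
cov = g[np.abs(np.subtract.outer(np.arange(n), np.arange(n)))]

L = np.linalg.cholesky(cov)                 # exact Gaussian sampling
x = rng.standard_normal((runs, n)) @ L.T    # fGn increments, shape (runs, n)
s = np.cumsum(x, axis=1)                    # cumulative process X(t)

t = np.arange(1, n + 1)
phi = 0.5 * np.log(2 * np.pi * np.e * s.var(axis=0))  # entropy at each t
print("d(entropy)/d(ln t) =", round(np.polyfit(np.log(t), phi, 1)[0], 3),
      "(expected: H =", H, ")")
```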
... Hence, in complex systems the quantity that gets extremized is the entropy, or the entropy production when time is involved (Koutsoyiannis, 2011b). As will be clarified in section 2.9, entropy is a fully stochastic concept. ...
... The next important step was made by Shannon (1948), who used a definition essentially similar to Planck's to describe the information content, which he also called entropy at von Neumann's suggestion (Robertson, 1993; Brissaud, 2005; Koutsoyiannis, 2011b). According to the latter definition, entropy is a probabilistic concept, a measure of information or, equivalently, uncertainty. ...
... In a stochastic process the change of uncertainty in time can be quantified by the entropy production, i.e. the time derivative of the entropy Φ[X(t)] of the cumulative process X(t) (Koutsoyiannis, 2011b): ...
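Reconstructing the symbols the excerpt lost (notation assumed to follow Koutsoyiannis, 2011b; the exact symbols are ours):

```latex
\Phi'[X(t)] := \frac{\mathrm{d}\,\Phi[X(t)]}{\mathrm{d}t},
\qquad
\varphi(t) := \frac{\mathrm{d}\,\Phi[X(t)]}{\mathrm{d}(\ln t)} = t\,\Phi'[X(t)],
```

where Φ[X(t)] is the entropy of the cumulative process X(t), Φ' is the entropy production, and φ(t) is the entropy production in logarithmic time (EPLT), the quantity whose constancy the abstract above links to Hurst-Kolmogorov behaviour.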
Preprint
Full-text available
This is a working draft of a book in preparation. Current version 0.4 – uploaded on ResearchGate on 25 January 2022. (Earlier versions: 0.3 – uploaded on ResearchGate on 17 January 2022. 0.2 – uploaded on ResearchGate on 3 January 2022. 0.1 (initial) – uploaded on ResearchGate on 1 January 2022.) Some stuff is copied from Koutsoyiannis (2021, https://www.researchgate.net/publication/351081149). Comments and suggestions will be greatly appreciated and acknowledged.
... The HK dynamics are also linked to the entropy maximization principle, and thus, to robust physical justification [8]. It is worth noting that the stochastic simulation of the HK dynamics is still a mathematical challenge [9] since it requires the explicit preservation of high-order moments in a vast range of scales [10,11], affecting both the intermittent (fractal) behavior in small scales [12] and the dependence in extremes [13]. ...
... depicts the scaled images at scales k = 2, 4, 8, 16, 20, 25, 40, 50, 80, 100 and 200, which are used to calculate the 2D climacogram. Benchmark of image analysis, the evolution of the universe [68]: (a) 500 million years after the Big Bang, image with faint clustering, average brightness 0.45; (b) 5000 million years after the Big Bang, image with clustering, average brightness 0.37; (c) 10,000 million years after the Big Bang, image with intense clustering, average brightness 0.33. ...
Article
Full-text available
The stochastic analysis in the scale domain (instead of the traditional lag or frequency domains) is introduced as a robust means to identify, model and simulate the Hurst–Kolmogorov (HK) dynamics, ranging from small (fractal) scales to large scales exhibiting the clustering behavior (also known as the Hurst phenomenon or long-range dependence). The HK clustering is an attribute of a multidimensional (1D, 2D, etc.) spatio-temporal stationary stochastic process with an arbitrary marginal distribution function, whose dependence structure exhibits a fractal behavior on small spatio-temporal scales and a power-type behavior on large scales, yielding a high probability of low- or high-magnitude events grouping together in space and time. This behavior is preferably analyzed through the second-order statistics, and in the scale domain, by the stochastic metric of the climacogram, i.e., the variance of the averaged spatio-temporal process vs. spatio-temporal scale.
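Since the climacogram is the central metric here, a minimal one-dimensional estimator is worth spelling out: average the process over non-overlapping blocks of length k and take the variance of the block means. The sketch below (function name and scales are ours) also recovers a Hurst parameter from the log-log slope via the HK relation γ(k) ∝ k^(2H - 2):

```python
import numpy as np

def climacogram(x, scales):
    """Empirical climacogram: variance of the scale-k averaged process."""
    x = np.asarray(x, dtype=float)
    gamma = []
    for k in scales:
        m = len(x) // k
        means = x[: m * k].reshape(m, k).mean(axis=1)  # non-overlapping block means
        gamma.append(means.var())
    return np.array(gamma)

# Usage: for an HK process, log(gamma) vs log(k) is linear with slope 2H - 2.
x = np.random.default_rng(0).standard_normal(10_000)
scales = np.array([1, 2, 4, 8, 16, 32, 64])
g = climacogram(x, scales)
H = 1 + np.polyfit(np.log(scales), np.log(g), 1)[0] / 2
print(f"Estimated Hurst parameter: {H:.2f}  (white noise -> ~0.5)")
```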
... Data from [5,6]. Figure 2. Relative frequency of appearances of the indicated key phrases in the article title, abstract and keywords of about 70 million articles written in English, which are contained in the Scopus database [7] up to year 2020. Out of its physical and stochastic context, the term "entropy" is typically used metaphorically and hence its meaning becomes ambiguous or diverse. ...
... The next important step was made by Shannon in 1948 [58]. Shannon used an essentially similar, albeit more general, entropy definition to describe the information content, which he also called entropy at von Neumann's suggestion [59-61]. According to the latter definition, entropy is a probabilistic concept, a measure of information or, equivalently, uncertainty. ...
Preprint
Full-text available
While entropy was introduced in the second half of the 19th century in the international vocabulary as a scientific term, in the 20th century it became common in colloquial use. Popular imagination has loaded “entropy” with almost every negative quality in the universe, in life and in society, with a dominant meaning of disorder and disorganization. Exploring the history of the term and many different approaches to it, we show that entropy has a universal stochastic definition which is not disorder. The accompanying principle of maximum entropy, which lies behind the Second Law, gives explanatory and inferential power to the concept and promotes entropy as the mother of creativity and evolution. As social sciences are often contaminated by subjectivity and ideological influences, we try to explore whether maximum entropy, applied to the distribution of wealth quantified by annual income, can give an objective description. Using publicly available income data, we show that the income distribution is consistent with the principle of maximum entropy. The increase of entropy is associated with an increase of society’s wealth, yet a standardized form of entropy can be used to quantify inequality. Historically, technology has played a major role in the development and increase of the entropy of income. Such findings are contrary to the theories of ecological economics and other theories which use the term entropy in a Malthusian perspective.
... Whether or not long-range dependence occurs in hydrological time series, whether it is pre-asymptotic behaviour only, and whether any physical processes exist that can explain such a statistical model remains a frequently discussed topic (e.g. Klemes 1974; Salas et al. 1979; Mesa and Poveda 1993; Beran 1994; Koutsoyiannis 2011). ...
Article
Full-text available
Previous studies suggest that flood-rich and flood-poor periods are present in many flood peak discharge series around the globe. Understanding the occurrence of these periods and their driving mechanisms is important for reliably estimating future flood probabilities. We propose a method for detecting flood-rich and flood-poor periods in peak-over-threshold series based on scan statistics and combine it with a flood typology in order to attribute the periods to their flood-generating mechanisms. The method is applied to 164 observed flood series in southern Germany from 1930 to 2018. The results reveal significant flood-rich periods of heavy-rainfall floods, especially in the Danube river basin in the most recent decades. These are consistent with trend analyses from the literature. Additionally, significant flood-poor periods of snowmelt floods in the immediate past were detected, especially for low-elevation catchments in the alpine foreland and the uplands. The occurrence of flood-rich and flood-poor periods is interpreted in terms of increases in the frequency of heavy rainfall in the alpine foreland and decreases of both soil moisture and snow cover in the midlands.
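As a rough sketch of the detection idea described above (not the paper's exact scan-statistics test, and with synthetic annual counts standing in for the German flood series), one can slide a fixed window over peak-over-threshold event counts and flag windows that are improbable under a homogeneous Poisson model:

```python
import numpy as np
from scipy.stats import poisson

# Naive scan-statistics-style screen over annual POT event counts.
# Simplified: ignores multiple testing across overlapping windows.
rng = np.random.default_rng(6)
years = np.arange(1930, 2019)
events = rng.poisson(1.5, size=len(years))      # synthetic annual POT counts
w, rate = 15, events.mean()                     # window length, background rate
counts = np.convolve(events, np.ones(w, int), mode="valid")
hi = poisson.ppf(0.975, rate * w)
lo = poisson.ppf(0.025, rate * w)
for i, c in enumerate(counts):
    if c > hi or c < lo:
        tag = "flood-rich" if c > hi else "flood-poor"
        print(f"{years[i]}-{years[i] + w - 1}: {c} events ({tag})")
```

A real analysis would control for the multiple, overlapping windows tested, which this naive screen does not.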
... Natural processes are expected to exhibit specific stochastic similarities because they are all bounded by the law of entropy extremization (e.g., see the physical justification of the LRD in [9]). This concept has been confirmed in recent studies, such as the one by Dimitriadis et al. [6], where stochastic similarities (including LRD) are traced in all key hourly and daily-scale hydrological-cycle processes (i.e., surface air temperature, relative humidity, dew-point, close-to-surface wind speed, atmospheric pressure, streamflow, and precipitation) by analysing the most comprehensive available databases (i.e., including tens of billions of values from several thousands of globally distributed stations). ...
Article
Full-text available
The identification of the second-order dependence structure of streamflow has been one of the oldest challenges in hydrological sciences, dating back to the pioneering work of H. E. Hurst on the Nile River. Since then, several large-scale studies have investigated the temporal structure of streamflow spanning from the hourly to the climatic scale, covering multiple orders of magnitude. In this study, we expanded this range to almost eight orders of magnitude by analysing small-scale streamflow time series (in the order of minutes) from ground stations and large-scale streamflow time series (in the order of hundreds of years) acquired from paleoclimatic reconstructions. We aimed to determine the fractal behaviour and the long-range dependence behaviour of the streamflow. Additionally, we assessed the behaviour of the first four marginal moments of each time series to test whether they follow similar behaviours as suggested in other studies in the literature. The results provide evidence in identifying a common stochastic structure for the streamflow process, based on the Pareto-Burr-Feller marginal distribution and a generalized Hurst-Kolmogorov (HK) dependence structure.
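For readers unfamiliar with the Pareto-Burr-Feller marginal mentioned above: assuming the common Burr type XII parameterization with survival function S(x) = (1 + (x/λ)^c)^(-k) (the study's exact parameterization may differ), sampling reduces to an inverse-survival transform:

```python
import numpy as np

# Hedged sketch: sampling a Pareto-Burr-Feller (Burr XII) marginal, assuming
# survival function S(x) = (1 + (x/lam)**c)**(-k); parameterization assumed.
def sample_pbf(size, lam=1.0, c=2.0, k=1.5, rng=np.random.default_rng(0)):
    u = rng.uniform(size=size)                      # U ~ Uniform(0, 1)
    return lam * (u ** (-1.0 / k) - 1.0) ** (1.0 / c)  # inverse survival transform

x = sample_pbf(100_000)
print(f"sample mean = {x.mean():.3f} (power-law tail index c*k = {2.0 * 1.5})")
```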
... This finding suggests how complexity, which captures the balance between emergence and self-organization, can explain the dynamics of Hu. The association can be attributed to the extremization of entropy production which results in persistence of the process (Koutsoyiannis 2011). It is observed that for low-snow-dominated catchments, the correlation between Hu and C is significant. ...
Article
In this study, catchments are considered as complex systems, and information-theoretic measures are used to capture temporal streamflow characteristics. Emergence and self-organization are used to quantify information production and order in streamflow time series, respectively. The complexity measure is used to quantify the balance between emergence and self-organization in streamflow variability. The complexity measure is found to be effective in distinguishing streamflow variability for high and low snow-dominated catchments. The state of persistence, reflecting the memory of streamflow time series, is shown to be related to the complexity of streamflow. Moreover, it is observed that conventional causal detection methods are constrained by the state of persistence, and more robust methods are needed in hydrological applications considering persistence.
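One common information-theoretic formulation of these measures (due to Gershenson and colleagues; whether this study uses exactly these definitions is an assumption) takes emergence E as normalized Shannon entropy, self-organization S = 1 - E, and complexity C = 4ES:

```python
import numpy as np

# Hedged sketch of the emergence / self-organization / complexity measures;
# the study's exact definitions and binning may differ.
def complexity_measures(x, bins=16):
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    E = -(p * np.log2(p)).sum() / np.log2(bins)  # emergence in [0, 1]
    S = 1.0 - E                                   # self-organization
    C = 4.0 * E * S                               # complexity, maximal at E = 0.5
    return E, S, C

flow = np.random.default_rng(2).lognormal(size=5_000)  # stand-in for streamflow
print("E=%.2f S=%.2f C=%.2f" % complexity_measures(flow))
```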
... The estimators and models applied for both the marginal and the second-order dependence structures are part of the stochastic framework of the HK dynamics, with a focus on the LRD behavior [30][31][32][33][34][35], and they have been applied to turbulent and key hydrological-cycle processes of global networks with resolutions spanning from small scales (relevant to the fractal behavior) to climatic scales (for a review, see [26]). ...
Article
Full-text available
The stochastic structures of potential evaporation and evapotranspiration (PEV and PET or ETo) are analyzed using the ERA5 hourly reanalysis data and the Penman–Monteith model applied to the well-known CIMIS network. The latter includes high-quality ground meteorological samples with long lengths and simultaneous measurements of monthly incoming shortwave radiation, temperature, relative humidity, and wind speed. It is found that both the PEV and PET processes exhibit a moderate long-range dependence structure with a Hurst parameter of 0.64 and 0.69, respectively. Additionally, it is noted that their marginal structures are found to be light-tailed when estimated through the Pareto–Burr–Feller distribution function. Both results are consistent with the global-scale hydrological-cycle path, determined by all the above variables and rainfall, in terms of the marginal and dependence structures. Finally, it is discussed how the existence of even moderate long-range dependence can increase the variability and uncertainty of both processes and, thus, limit their predictability.
... This approach is further enhanced to assess the uncertainty in the nonstationary parameters through Bayesian Inference (Cheng and AghaKouchak, 2015; Sarhadi and Soulis, 2017). Some studies argue that the incorporation of non-stationarity in extreme event modeling (Koutsoyiannis, 2011; Koutsoyiannis and Montanari, 2014) adds to the uncertainty, since the nonstationary modeling includes the changes merely based on the trend in the hydrologic variable. These changes may also be due to physical processes (Ragno et al., 2019; Serinaldi and Kilsby, 2015), and hence, incorporating the relevant physical processes in extreme event modeling is beneficial (Montanari and Koutsoyiannis, 2014; Ragno et al., 2019). ...
Article
The overwhelming increase and variations in extreme rainfall events demand the use of a nonstationary Intensity-Duration-Frequency (IDF) curve for the design and management of water resource infrastructure. Generally, nonstationary IDF curves were developed by incorporating the trend in the distribution parameter using the Generalized Extreme Value (GEV) distribution with time as a covariate. The physical processes influencing the variations in a hydrologic variable can be captured by utilizing relevant climatic variables (climate-informed) as covariates, since time alone cannot be the best covariate. Hence, this study investigates the potential climate-informed covariates influencing extreme rainfall and incorporates the best covariates to develop a realistic nonstationary IDF relationship. Unlike previous studies, a Time Sliding Window (TSW) approach is employed to detect the changing distribution parameters before performing Nonstationary Modeling (NSM). The proposed covariate-based TSW-NSM is effectively used to construct IDF curves for seven major metropolitan cities of India. Several models are generated based on the detected changing parameters and combinations of covariates. Then, the Bayesian Differential Evolutionary Monte Carlo (DE-MC) algorithm is employed to estimate the uncertainty bound of the nonstationary parameters, and the best model is chosen using the Deviance Information Criterion (DIC). The results reveal that the best covariate combinations for short-duration events are dominated by local processes (i.e., local temperature changes and diurnal temperature changes), whereas those for longer-duration events are dominated by global processes (global warming, the ENSO Modoki cycle, and the IOD). However, the acceptable nonstationary models reveal that all the temperature-based covariates are capable of capturing the dynamic behavior. It is also observed that the local processes carry the signature of global processes. Finally, the return levels computed through the best nonstationary model show that the return periods are decreasing, and the short-duration events have undergone more drastic changes than the longer-duration events. Therefore, employing climate-informed covariate-based nonstationary IDF curves is indispensable for devising long-term strategies to address the changing climate.
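To make the covariate idea concrete, here is a hedged sketch of a climate-informed nonstationary GEV fit in which the location parameter varies linearly with a covariate z (say, a standardized temperature index). The data are synthetic, and this is an illustration of the general approach, not the paper's TSW-NSM or Bayesian DE-MC procedure:

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

# Nonstationary GEV: location mu(z) = mu0 + mu1*z; scale and shape constant.
# Note scipy's shape parameter c equals -xi in the usual GEV convention.
rng = np.random.default_rng(3)
z = np.linspace(0, 1, 60)                      # standardized covariate, 60 "years"
x = genextreme.rvs(c=-0.1, loc=10 + 4 * z, scale=2.0, random_state=rng)

def nll(theta):
    mu0, mu1, log_sig, c = theta
    # -inf logpdf outside the support yields +inf nll; Nelder-Mead copes.
    return -genextreme.logpdf(x, c=c, loc=mu0 + mu1 * z,
                              scale=np.exp(log_sig)).sum()

fit = minimize(nll, x0=[10.0, 0.0, 0.5, -0.1], method="Nelder-Mead")
mu0, mu1, log_sig, c = fit.x
print(f"mu(z) = {mu0:.2f} + {mu1:.2f} z, sigma = {np.exp(log_sig):.2f}, "
      f"shape c = {c:.2f}")
```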
... The next important step was made by Shannon in 1948 [69]. Shannon used an essentially similar, albeit more general, definition describing information content, which he also called entropy, at von Neumann's suggestion [70][71][72]. According to the latter definition, entropy is a probabilistic concept, a measure of information or, equivalently, uncertainty. ...
Article
Full-text available
While entropy was introduced in the second half of the 19th century in the international vocabulary as a scientific term, in the 20th century it became common in colloquial use. Popular imagination has loaded “entropy” with almost every negative quality in the universe, in life and in society, with a dominant meaning of disorder and disorganization. Exploring the history of the term and many different approaches to it, we show that entropy has a universal stochastic definition, which is not disorder. Hence, we contend that entropy should be used as a mathematical (stochastic) concept as rigorously as possible, free of metaphoric meanings. The accompanying principle of maximum entropy, which lies behind the Second Law, gives explanatory and inferential power to the concept, and promotes entropy as the mother of creativity and evolution. As the social sciences are often contaminated by subjectivity and ideological influences, we try to explore whether maximum entropy, applied to the distribution of a wealth-related variable, namely annual income, can give an objective description. Using publicly available income data, we show that income distribution is consistent with the principle of maximum entropy. The increase in entropy is associated with increases in society’s wealth, yet a standardized form of entropy can be used to quantify inequality. Historically, technology has played a major role in the development of, and increase in, the entropy of income. Such findings are contrary to the theory of ecological economics and other theories that use the term entropy in a Malthusian perspective.
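A standard worked case makes the maximum-entropy claim concrete: on [0, ∞) with only the mean income μ fixed, entropy maximization yields the exponential distribution (whether the paper imposes exactly this constraint set is an assumption here):

```latex
\max_{f}\; -\int_0^\infty f(x)\ln f(x)\,\mathrm{d}x
\quad\text{s.t.}\quad
\int_0^\infty f(x)\,\mathrm{d}x = 1,\;\;
\int_0^\infty x\,f(x)\,\mathrm{d}x = \mu
\;\;\Longrightarrow\;\;
f(x) = \frac{1}{\mu}\,e^{-x/\mu}.
```

The maximized entropy is 1 + ln μ, increasing with the mean income μ, which is consistent with the association stated above between entropy increase and society's wealth.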
... The statistical bias in the climacogram estimator can be calculated as follows. As shown in Koutsoyiannis's study [15], assuming that we have n = T/∆ observations of the averaged process x_i^(∆), where the observation period T is an integer multiple of the time resolution ∆, the expected value of the empirical (sample) climacogram γ̂(∆) is: ...
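The expression the excerpt truncates is, to our reading of the cited study, the standard climacogram bias formula; we reproduce it here as an assumption:

```latex
\mathrm{E}\!\left[\hat{\gamma}(\Delta)\right]
= \frac{1 - \gamma(T)/\gamma(\Delta)}{1 - \Delta/T}\;\gamma(\Delta).
```

For positively dependent processes this factor is below one, so the empirical climacogram underestimates the true γ(Δ), increasingly so as the scale Δ approaches the record length T.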
Article
Full-text available
We investigate the impact of time's arrow on the hourly streamflow process. Although time asymmetry, i.e., temporal irreversibility, has been previously implemented in stochastics, it has only recently attracted attention in the hydrological literature. Relevant studies have shown that the time asymmetry of the streamflow process is manifested at scales up to several days and vanishes at larger scales. The latter highlights the need to reproduce it in flood simulations of fine-scale resolution. To this aim, we develop an enhancement of a recently proposed simulation algorithm for irreversible processes, based on an asymmetric moving average (AMA) scheme that allows for the explicit preservation of time asymmetry at two or more timescales. The method is successfully applied to a large hourly streamflow time series from the United States Geological Survey (USGS) database, with time asymmetry prominent at time scales up to four days.
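A quick diagnostic for the property under discussion (a check, not the paper's AMA simulation scheme): a time-reversible stationary process has symmetrically distributed increments at every scale, so the scale-wise skewness of increments is a simple irreversibility indicator:

```python
import numpy as np

# Increment skewness at scale s as a time-asymmetry diagnostic; a reversible
# process has zero increment skewness at every scale.
def increment_skewness(x, s):
    d = x[s:] - x[:-s]
    return ((d - d.mean()) ** 3).mean() / d.std() ** 3

rng = np.random.default_rng(5)
gauss = np.cumsum(rng.standard_normal(50_000))   # reversible benchmark
for s in (1, 4, 16, 64):
    print(s, round(increment_skewness(gauss, s), 3))   # all ~0 here
```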
Article
Full-text available
According to the traditional notion of randomness and uncertainty, natural phenomena are separated into two mutually exclusive components, random (or stochastic) and deterministic. Within this dichotomous logic, the deterministic part supposedly represents cause-effect relationships and, thus, is physics and science (the "good"), whereas randomness has little relationship with science and no relationship with understanding (the "evil"). We argue that such views should be reconsidered by admitting that uncertainty is an intrinsic property of nature, that causality implies dependence of natural processes in time, thus suggesting predictability, but even the tiniest uncertainty (e.g., in initial conditions) may result in unpredictability after a certain time horizon. On these premises it is possible to shape a consistent stochastic representation of natural processes, in which predictability (suggested by deterministic laws) and unpredictability (randomness) coexist and are not separable or additive components. Deciding which of the two dominates is simply a matter of specifying the time horizon of the prediction. Long horizons of prediction are inevitably associated with high uncertainty, whose quantification relies on understanding the long-term stochastic properties of the processes.
Article
Full-text available
We study the temporal correlations in the atmospheric variability of 14 meteorological stations around the globe, analysing the variations of the daily maximum temperatures from their average values. We apply several methods that can systematically overcome possible nonstationarities in the data. We find that the persistence, characterized by the correlation C(s) of temperature variations separated by s days, decays approximately as C(s) ~ s^(−γ), with roughly the same exponent γ ≅ 0.7 for all stations considered. The range of this universal persistence law seems to exceed one decade, and is possibly even larger than the range of the temperature series considered.
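Detrended fluctuation analysis is one of the standard methods alluded to, since subtracting local trends suppresses many nonstationarities; for C(s) ~ s^(-γ) the fluctuation exponent is α = 1 - γ/2, i.e. α ≈ 0.65 for γ ≈ 0.7. A minimal DFA-1 sketch (our implementation):

```python
import numpy as np

def dfa(x, scales):
    """Minimal DFA-1: RMS fluctuation of the locally detrended profile."""
    y = np.cumsum(x - x.mean())                    # integrated profile
    F = []
    for s in scales:
        segs = y[: len(y) // s * s].reshape(-1, s)
        k = np.arange(s)
        res = [seg - np.polyval(np.polyfit(k, seg, 1), k) for seg in segs]
        F.append(np.sqrt(np.mean(np.square(res))))
    return np.array(F)

# Usage: white noise gives alpha ~ 0.5; the persistence law above gives ~0.65.
x = np.random.default_rng(7).standard_normal(20_000)
scales = np.array([8, 16, 32, 64, 128, 256])
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(f"DFA exponent alpha = {alpha:.2f}")
```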
Book
This book focuses on nonextensive statistical mechanics, a current generalization of Boltzmann-Gibbs (BG) statistical mechanics, one of the greatest monuments of contemporary physics. Conceived more than 130 years ago by Maxwell, Boltzmann and Gibbs, the BG theory exhibits many impressive successes in physics, chemistry, mathematics, and computational sciences. Presently, several thousands of publications by scientists around the world have been dedicated to its nonextensive generalization. A variety of applications have emerged in complex systems and its mathematical grounding is by now well advanced. A pedagogical introduction to its concepts - nonlinear dynamics, extensivity of the nonadditive entropy, global correlations, and extensions of the standard central limit theorems, among others - is presented in this book, as well as a selection of paradigmatic applications in various sciences and diversified experimental verifications of some of its predictions. Introduction to Nonextensive Statistical Mechanics is suitable for students and researchers with an interest in complex systems and statistical physics.
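For reference, the nonadditive (Tsallis) entropy at the heart of this generalization is, in its standard form:

```latex
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i,
```

and for independent systems A and B it is nonadditive, S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B)/k, recovering Boltzmann-Gibbs additivity in the limit q → 1.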
Article
By design, fast fractional Gaussian noises (ffGn) have the following characteristics: The number of operations needed to generate them is relatively small. The long-run statistical dependence that they exhibit is strong and has the form required for self-similar hydrology. Their short-run properties, as expressed by the correlations between successive or nearly successive yearly averages, are adjustable (within bounds) and can be fitted to the corresponding short-run properties of records. Extension to the multidimensional (multisite) case should raise no essential difficulty. Finally, their definition, as sums of Markov-Gauss and other simple processes, fits the intuitive idea that climate can be visualized as either unpredictably inhomogeneous or ruled by a hierarchy of variable regimes.
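The "sums of Markov-Gauss processes" construction can be sketched directly: superpose AR(1) components with log-spread time constants and power-law amplitude weights, so that the summed autocovariance approximates the k^(2H - 2) decay. The specific rates and weights below are our illustration, not Mandelbrot's published coefficients:

```python
import numpy as np

# Illustrative ffGn-style construction: weighted sum of AR(1) components.
H, n = 0.8, 20_000
rng = np.random.default_rng(4)
lams = 1.0 / np.geomspace(2, 2000, 8)   # decay rates; log-spaced time constants
rhos = np.exp(-lams)                    # AR(1) lag-1 correlations
w = lams ** (1 - H)                     # amplitude weights (heuristic assumption)

x = np.zeros(n)
for rho, wi in zip(rhos, w):
    eps = rng.standard_normal(n) * np.sqrt(1 - rho ** 2)  # unit-variance AR(1)
    ar = np.zeros(n)
    for t in range(1, n):
        ar[t] = rho * ar[t - 1] + eps[t]
    x += wi * ar

# Check the long-run dependence via block variances: slope ~ 2H - 2.
scales = np.array([10, 20, 50, 100, 200, 500])
g = [x[: n // s * s].reshape(-1, s).mean(axis=1).var() for s in scales]
print("estimated H ~", round(1 + np.polyfit(np.log(scales), np.log(g), 1)[0] / 2, 2))
```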
Article
Prologue
1. The Zeroth Law: the concept of temperature
2. The First Law: the conservation of energy
3. The Second Law: the increase in entropy
4. Free Energy: the availability of work
5. The Third Law: the unattainability of zero
Conclusion