Article

Measuring Information Transfer

Authors:
Thomas Schreiber

Abstract

An information theoretic measure is derived that quantifies the statistical coherence between systems evolving in time. The standard time delayed mutual information fails to distinguish information that is actually exchanged from shared information due to common history and input signals. In our new approach, these influences are excluded by appropriate conditioning of transition probabilities. The resulting transfer entropy is able to distinguish effectively driving and responding elements and to detect asymmetry in the interaction of subsystems.
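For reference, writing x_n^{(k)} and y_n^{(l)} for blocks of k and l past values, the transfer entropy from Y to X introduced in this work is conventionally expressed as the deviation between the transition probabilities of X that do and do not include the past of Y:

```latex
T_{Y \to X} \;=\; \sum p\!\left(x_{n+1},\, x_n^{(k)},\, y_n^{(l)}\right)
\log \frac{p\!\left(x_{n+1} \mid x_n^{(k)},\, y_n^{(l)}\right)}
          {p\!\left(x_{n+1} \mid x_n^{(k)}\right)}
```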


... In this work, we used transfer entropy (TE) (Schreiber, 2000) to map the inter-regional, inter-sessional, or inter-subject information flow. We implemented several TE calculation algorithms in C++ with parallel threading and then applied them to data from the Human Connectome Project (HCP). ...
... Denote two time series, for example, the time series of two brain voxels, by X = [x_1, x_2, …, x_N] and Y = [y_1, y_2, …, y_N], where N is the total number of timepoints; N can differ between X and Y, although the same length was used here for simplicity of description. TE (Schreiber, 2000) quantifies the influence of the past of one variable on the future state of the other; equivalently, it asks whether adding the past of one variable predicts the current or future state of the other variable better than relying on that variable's own past alone, i.e., the dependence of the future state of one variable on the past states of both itself and the other. Mathematically, TE is measured through the deviation from the generalized Markov property, p(x_{t+1} | x_t, y_t) = p(x_{t+1} | x_t), for TE from y to x (TE(Y->X), or simply TE(Y,X)). ...
... Similar to Shannon entropy, TE depends on an accurate estimation of the probability density function (PDF). Since its inception, TE has been estimated with several methods that differ in how the PDF is estimated (Kaiser and Schreiber, 2002; Schreiber, 2000; Staniek and Lehnertz, 2008). We have implemented the standard histogram-based PDF estimation approach with or without symbolization (Staniek and Lehnertz, 2008), the Darbellay-Vajda partitioning (Darbellay and Vajda, 1999) PDF estimation approach, the kernel-density-estimation-based TE calculation (Lee et al., 2012), and the simplified format (Kaiser and Schreiber, 2002). ...
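As a concrete illustration of the histogram (plug-in) estimation route mentioned in the excerpts above, the following minimal Python sketch computes TE(Y→X) with a history length of one; the bin count, equal-width binning, and function name are assumptions made for this example and are not taken from the cited C++ implementation.

```python
import numpy as np

def transfer_entropy(y, x, bins=8):
    """Plug-in (histogram) estimate of TE(Y -> X) in bits, history length 1.

    Quantifies how much the past of y reduces the uncertainty about the
    next value of x beyond what x's own past already provides.
    """
    # Discretize both series into equal-width bins.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins=bins))
    yd = np.digitize(y, np.histogram_bin_edges(y, bins=bins))

    # Joint samples (x_{t+1}, x_t, y_t).
    x_next, x_past, y_past = xd[1:], xd[:-1], yd[:-1]

    def joint_prob(*cols):
        counts = {}
        for row in zip(*cols):
            counts[row] = counts.get(row, 0) + 1
        n = len(cols[0])
        return {key: c / n for key, c in counts.items()}

    p_xyz = joint_prob(x_next, x_past, y_past)  # p(x_{t+1}, x_t, y_t)
    p_xx = joint_prob(x_next, x_past)           # p(x_{t+1}, x_t)
    p_xy = joint_prob(x_past, y_past)           # p(x_t, y_t)
    p_x = joint_prob(x_past)                    # p(x_t)

    # TE = sum p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1}|x_t, y_t) / p(x_{t+1}|x_t) ]
    te = 0.0
    for (xn, xp, yp), p in p_xyz.items():
        te += p * np.log2((p / p_xy[(xp, yp)]) / (p_xx[(xn, xp)] / p_x[(xp,)]))
    return te
```

For two series in which y drives x with a one-step delay, transfer_entropy(y, x) should exceed transfer_entropy(x, y); for independent series, both estimates shrink toward zero as the sample size grows.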
Article
Full-text available
The human brain is a massive information generation and processing machine. Studying the information flow may provide unique insight into brain function and brain diseases. We present here a tool for mapping the regional information flow in the entire brain using fMRI. Using the tool, we can estimate the information flow from a single region to the rest of the brain, between different regions, between different days, or between different individuals' brains.
... A way of estimating connectivity and disambiguating spurious correlations from actual connections is by inferring the information that flows from one neuronal spike train to another. For this, we can use information-theoretical measures [20-22], which are functions of these transition probabilities. Thus, the estimation of the transition probabilities is essential. ...
... Transfer entropy (TE), an information-theoretic measure for quantifying time-directed information transfer between joint processes, was proposed by [20] and independently by [23] as an effective measure of causality. A closely related concept that measures information transport is the transfer entropy rate (TER). ...
... This is particularly the case when recording neuronal and network signals, without or with equal external stimulation to the neurons, so that their activity is stationary, and inferring causal relationships, especially when dealing with different data formats. For a thorough review, we refer the reader to [20,46,47]. Thus, in this paper, we consider a plug-in estimator for the transfer entropy rate T(X → Y) between the jointly stationary ergodic chains X and Y (see Section 2.2). ...
Article
Full-text available
Information theory explains how systems encode and transmit information. This article examines the neuronal system, which processes information via neurons that react to stimuli and transmit electrical signals. Specifically, we focus on transfer entropy to measure the flow of information between sequences and explore its use in determining effective neuronal connectivity. We analyze the causal relationships between two discrete time series, X := {X_t : t ∈ Z} and Y := {Y_t : t ∈ Z}, which take values in binary alphabets. When the bivariate process (X,Y) is a jointly stationary ergodic variable-length Markov chain with memory no larger than k, we demonstrate that the null hypothesis of the test—no causal influence—requires a zero transfer entropy rate. The plug-in estimator for this function is identified with the test statistic of the log-likelihood ratios. Since under the null hypothesis, this estimator follows an asymptotic chi-squared distribution, it facilitates the calculation of p-values when applied to empirical data. The efficacy of the hypothesis test is illustrated with data simulated from a neuronal network model, characterized by stochastic neurons with variable-length memory. The test results identify biologically relevant information, validating the underlying theory and highlighting the applicability of the method in understanding effective connectivity between neurons.
... In this study, we introduce a workflow to benchmark algorithms that have been applied to infer neuronal connectivity from large-scale extracellular recordings [36,40,53-56]. We, therefore, standardize the output of algorithms and compare statistically inferred connectivity estimates on synthetic ground-truth data sets and experimentally obtained connectivity labels. ...
... Several studies have applied information-theoretic measures to estimate neuronal connectivity and information flow between brain regions and individual cells [51,52,54,63,64]. Here, we utilized an efficient algorithmic implementation of transfer entropy (TE) [65] to infer connectivity from discretized spike trains. ...
... Measures how knowledge about the spike history of one neuron reduces the uncertainty about the future spiking of another neuron [54,65]. ...
Article
Full-text available
Probing the architecture of neuronal circuits and the principles that underlie their functional organization remains an important challenge of modern neurosciences. This holds true, in particular, for the inference of neuronal connectivity from large-scale extracellular recordings. Despite the popularity of this approach and a number of elaborate methods to reconstruct networks, the degree to which synaptic connections can be reconstructed from spike-train recordings alone remains controversial. Here, we provide a framework to probe and compare connectivity inference algorithms, using a combination of synthetic ground-truth and in vitro data sets, where the connectivity labels were obtained from simultaneous high-density microelectrode array (HD-MEA) and patch-clamp recordings. We find that reconstruction performance critically depends on the regularity of the recorded spontaneous activity, i.e., their dynamical regime, the type of connectivity, and the amount of available spike-train data. We therefore introduce an ensemble artificial neural network (eANN) to improve connectivity inference. We train the eANN on the validated outputs of six established inference algorithms and show how it improves network reconstruction accuracy and robustness. Overall, the eANN demonstrated strong performance across different dynamical regimes, worked well on smaller datasets, and improved the detection of synaptic connectivity, especially inhibitory connections. Results indicated that the eANN also improved the topological characterization of neuronal networks. The presented methodology contributes to advancing the performance of inference algorithms and facilitates our understanding of how neuronal activity relates to synaptic connectivity.
... We first introduce the studies of information flow analysis between financial markets, and between financial markets and sentiment, summarising the use of Granger Causality [17] and Shannon Transfer Entropy [18] in this context. We then review recent research on stock movement prediction. ...
... It is generally acknowledged that news events can substantially impact the short-term direction of stock prices. Granger Causality [17] and Shannon Transfer Entropy [18] have been widely used in the finance field for causal analysis, to quantify the extent of information transmission between markets, and between market and news-sentiment time series. In the following, we review some of these works. ...
... Transfer Entropy [18] is based on Shannon's entropy [51], which measures the amount of information in random processes. TE estimates the reduction of uncertainty of the observations of Y (target process) accounted for by both the past observations of X (driver process) and the past observations of Y, compared to the reduction of uncertainty of the observations of Y accounted for only by its own past [18]. ...
Article
Full-text available
Our study aims to investigate the interdependence between international stock markets and sentiments from financial news in stock forecasting. We adopt the Temporal Fusion Transformers (TFT) to incorporate intra and inter-market correlations and the interaction between the information flow, i.e. causality, of financial news sentiment and the dynamics of the stock market. The current study distinguishes itself from existing research by adopting Dynamic Transfer Entropy (DTE) to establish an accurate information flow propagation between stock and sentiments. DTE has the advantage of providing time series that mine information flow propagation paths between certain parts of the time series, highlighting marginal events such as spikes or sudden jumps, which are crucial in financial time series. The proposed methodological approach involves the following elements: a FinBERT-based textual analysis of financial news articles to extract sentiment time series, the use of the Transfer Entropy and corresponding heat maps to analyze the net information flows, the calculation of the DTE time series, which are considered as co-occurring covariates of stock Price, and TFT-based stock forecasting. The Dow Jones Industrial Average index of 13 countries, along with daily financial news data obtained through the New York Times API, are used to demonstrate the validity and superiority of the proposed DTE-based causality method along with TFT for accurate stock Price and Return forecasting compared to state-of-the-art time series forecasting methods.
... The idea was further developed in the field of finance by Granger and is called Granger causality (GC) [3]. GC was initially applied to linear financial models and later extended to nonlinear systems [1,4-10]. These extensions have various branches, including parametric approaches like extended Granger causality (EGC) [4], nonparametric methods like predictability improvement (PI) [5,6], and information-based measures like transfer entropy (TE) [7,8]. ...
... GC was initially applied to linear financial models and later extended to nonlinear systems [1,4-10]. These extensions have various branches, including parametric approaches like extended Granger causality (EGC) [4], nonparametric methods like predictability improvement (PI) [5,6], and information-based measures like transfer entropy (TE) [7,8]. Additionally, there is another branch of GC based on the frequency domain, namely partial directed coherence [9]. ...
... It is intimately associated with GC [3], i.e., predictability causality. With regard to the information-based measures [1] such as TE [7], they are also conceptually related to predictability [8], thus linked to continuity causality. Both of them measure the increase of future uncertainty in the effect by removing past information (more strictly, the measurements) of the cause. ...
Article
Full-text available
Inferring causal relationships between time series variables unveils the interaction pattern in dynamical systems and thus contributes to our better understanding of nature's laws. An accepted notion is that causality can be measured from the uncertainty reduction of the effect obtained by considering measurements of the cause. This thought originally makes use of predictability and recently has been considered problematic when separability of subsystems does not hold. But there is no clear mathematical criterion for interpreting the specific drawbacks of this concept. In this paper, to explore the criterion, a framework similar to the accepted notion but based on mapping continuity is introduced. Under this framework, we conclude a partially testable premise that ensures the validity of inferred results. Furthermore, to put this concept into practice, a state space reconstruction technique that conforms to this framework is introduced to model unknown mappings to estimate the continuity. Finally, an approach is naturally developed to detect causality from mapping continuity changes. We validate our approach on discovering causal networks from multiple time series. In addition, we demonstrate the application in studying the effects of alcoholism on brain functional links.
... We captured the statistical precedence between HR and RR to quantify the functional effect of each process on the other. For this purpose, we used the original univariate HR and RR time series (both mean-centered, as in the case of the HR state-space variables) to compute their transfer entropy (TE) [96]: ...
... We repeated this procedure for the opposite direction (Appendix B) to recompute the corresponding final TE. We used the JIDT [98] (version v1.5 for Python) implementations based on the Kraskov-Stoegbauer-Grassberger (KSG) algorithm [99], its implementation based on the Tononi-Sporns-Edelman (TSE) algorithm [95], and its KSG-based implementation of Schreiber's [96] TE computation. We carried out all the computations and analyses in Python 3.10.4. ...
... From a broader perspective, this dependence provided further evidence for the critical role of inhibition in the generation of (cardiorespiratory) rhythms [133]. In this respect, the absence of any significant contribution in this direction in our results echoed its association with cardiorespiratory pathology (i.e., sleep apnea [134]) [96][135, p. 178]. ...
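For readers who want to reproduce a KSG-based TE estimate such as the JIDT computation referenced above, the sketch below follows the call pattern used in JIDT's published Python demos; the jar location, the JVM options, and the parameter choices (4 nearest neighbours, destination history length 1) are assumptions for illustration.

```python
import numpy as np
from jpype import startJVM, getDefaultJVMPath, JClass, JArray, JDouble

# Start the JVM with the JIDT jar on the classpath (the path is an assumption).
startJVM(getDefaultJVMPath(), "-Djava.class.path=infodynamics.jar")

# Kraskov-Stoegbauer-Grassberger (KSG) transfer entropy calculator from JIDT.
TeCalc = JClass("infodynamics.measures.continuous.kraskov.TransferEntropyCalculatorKraskov")
te_calc = TeCalc()
te_calc.setProperty("k", "4")   # number of nearest neighbours for the KSG estimator
te_calc.initialise(1)           # destination (target) history length of 1

# Toy data: y drives x with a one-step delay plus observation noise.
rng = np.random.default_rng(0)
y = rng.normal(size=2000)
x = np.roll(y, 1) + 0.5 * rng.normal(size=2000)

te_calc.setObservations(JArray(JDouble, 1)(y.tolist()), JArray(JDouble, 1)(x.tolist()))
print("TE(Y -> X) in nats:", te_calc.computeAverageLocalOfObservations())
```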
... information between two time series [10]. It has been useful for identifying the causal inferences of correlated thermal fluctuations in proteins from the trajectories of molecular dynamics simulations [11-15] or those produced by coarse-grained approaches such as elastic network models, specifically the Gaussian network model (GNM) [13,16-19]. ...
... These sequence motifs are used to generate contact maps. The contact maps are then used to apply the entropy transfer concept of Schreiber [10] in combination with dGNM [18] to identify sources and sinks of information, as previously described. This methodology offers a computationally efficient means to pinpoint potential allosteric regulation sites, opening new avenues for understanding the conserved allosteric regulation mechanisms in protein domains and relating them to the proteins' functions. ...
Preprint
Full-text available
The amino acid sequence determines the structure, function, and dynamics of a protein. In recent years, enormous progress has been made in translating sequence information into 3D structural information using artificial intelligence. However, because of the underlying methodology, it is an immense computational challenge to extract this information from the ever-increasing number of sequences. In the present study, we show that it is possible to create 2D contact maps from sequences, for which only a few exemplary structures are available on a laptop without the need for GPUs. This is achieved by using a pattern-matching approach. The resulting contact maps largely reflect the interactions in the 3D structures. This approach was used to explore the evolutionarily conserved allosteric mechanisms and identify the source–sink (driver-driven) relationships by using an established method that combines Schreiber’s concept of entropy transfer with a simple Gaussian network model. The validity of our method was tested on the DHFR, PDZ, SH3, and S100 domains, with our predictions consistently aligning with the experimental findings.
... They are often employed in the field of neuroscience to quantify temporal correlations in activity between different regions of the brain, as in the case of correlated neural firing between regions of the brain, which can reveal correlated behavior even if the regions are spatially separated [25]. Information theory provides a set of mathematical tools to quantify such correlations, where measures such as mutual information [26] and transfer entropy [27] applied across temporally sampled data can be used to construct FC networks that capture nonlinearities and structure not apparent in static images. A predecessor and contemporary approach to constructing FC networks is anatomical connectivity maps, which focus on physical tracts that can reveal direct anatomical links between different physical regions in a tissue. ...
... Aside from increasing the quality and quantity of data, future work will explore the use of other measures of dependency between cell activity: the FC approach is undirected and does not account for time-directed effective connections (where the past state of one cell influences the future state of another). Measures of effective connectivity such as the transfer entropy may provide a more refined perspective on information "flow" by considering temporal directionality of signals [27]. Furthermore, there has recently been an explosion of interest in the phenomena of higher-order/beyond-pairwise interactions in complex systems [66,67], and many of the tools that have been developed could be easily slotted into the general framework we present here [68-70]. ...
Preprint
Full-text available
A central challenge in the progression of a variety of open questions in biology, such as morphogenesis, wound healing, and development, is learning from empirical data how information is integrated to support tissue-level function and behavior. Information-theoretic approaches provide a quantitative framework for extracting patterns from data, but so far have been predominantly applied to neuronal systems at the tissue-level. Here, we demonstrate how time series of Ca²⁺ dynamics can be used to identify the structure and information dynamics of other biological tissues. To this end, we expressed the calcium reporter GCaMP6s in an organoid system of explanted amphibian epidermis derived from the African clawed frog Xenopus laevis, and imaged calcium activity pre- and post-puncture injury, for six replicate organoids. We constructed functional connectivity networks by computing mutual information between cells from time series derived using medical imaging techniques to track intracellular Ca²⁺. We analyzed network properties including degree distribution, spatial embedding, and modular structure. We find organoid networks exhibit more connectivity than null models, with high degree hubs and mesoscale community structure with spatial clustering. Utilizing functional connectivity networks, we show the tissue retains non-random features after injury, displays long range correlations and structure, and non-trivial clustering that is not necessarily spatially dependent. Our results suggest increased integration after injury, possible cellular coordination in response to injury, and some type of generative structure of the anatomy. While we study Ca²⁺ in Xenopus epidermal cells, our computational approach and analyses highlight how methods developed to analyze functional connectivity in neuronal tissues can be generalized to any tissue and fluorescent signal type. Our framework therefore provides a bridge between neuroscience and more basal modes of information processing. Author summary A central challenge in understanding several diverse processes in biology, including morphogenesis, wound healing, and development, is learning from empirical data how information is integrated to support tissue-level function and behavior. Significant progress in understanding information integration has occurred in neuroscience via the use of observable live calcium reporters throughout neural tissues. However, these same techniques have seen limited use in the non-neural tissues of multicellular organisms despite similarities in tissue communication. Here we utilize methods designed for neural tissues and modify them to work on any tissue type, demonstrating how non-neural tissues also contain non-random and potentially meaningful structures to be gleaned from information theoretic approaches. In the case of epidermal tissue derived from developing amphibians, we find non-trivial informational structure over greater spatial and temporal scales than those found in neural tissue. This hints at how more exploration into information structures within these tissue types could provide a deeper understanding into information processing within living systems beyond the nervous system.
... While significant progress has been made, the rigorous inference of causal relationships between bacterial species using information theory remains an underexplored area. In this study, we address this gap by constructing a causal interaction network employing transfer entropy (TE) [21], an information-theoretic measure for determining causality between variables. ...
... The TENET algorithm [26] computes transfer entropy for a given pair of components in an interaction network when time series for these components are given. The transfer entropy is defined as [21] TE(Y→X) = Σ p(x_{t+1}, x_t, y_t) log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ], where the probability density function of a random variable is estimated with a non-parametric approach [26]. For simplicity, we will consider only the effect of the immediate past of the current time-point and use L = 1. ...
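The excerpt above pairs the TE definition with a non-parametric density estimate and the lag choice L = 1. The sketch below shows one way to combine the two, using Gaussian kernel density estimates and a resubstitution average; it is an illustration of the idea under these assumptions, not the TENET implementation, and the function name is hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_transfer_entropy(y, x):
    """Resubstitution estimate of TE(Y -> X) in nats with lag L = 1,
    using Gaussian kernel density estimates of the required joint densities."""
    x_next, x_past, y_past = x[1:], x[:-1], y[:-1]

    # gaussian_kde expects samples with shape (dimension, n_samples).
    p_xxy = gaussian_kde(np.vstack([x_next, x_past, y_past]))  # p(x_{t+1}, x_t, y_t)
    p_xx = gaussian_kde(np.vstack([x_next, x_past]))           # p(x_{t+1}, x_t)
    p_xy = gaussian_kde(np.vstack([x_past, y_past]))           # p(x_t, y_t)
    p_x = gaussian_kde(x_past)                                 # p(x_t)

    # TE = < log[ p(x_{t+1}, x_t, y_t) p(x_t) / ( p(x_t, y_t) p(x_{t+1}, x_t) ) ] >
    num = p_xxy(np.vstack([x_next, x_past, y_past])) * p_x(x_past)
    den = p_xy(np.vstack([x_past, y_past])) * p_xx(np.vstack([x_next, x_past]))
    return float(np.mean(np.log(num / den)))
```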
Preprint
Full-text available
Understanding the complex dynamics of gut microbiota interactions is essential for unraveling their influence on human health. In this study, we employed transfer entropy analysis to construct a causal interaction network among gut microbiota genera from the time-series data of bacterial abundances. Based on the longitudinal microbiome data from two subjects, we found that the constructed gut microbiota regulatory networks exhibited power-law degree distribution, intermediate modularity, and enrichment of feedback loops. Interestingly, the networks of the two subjects displayed differential enrichment of feedback loops, which may be associated with the differential recovery dynamics of the two subjects. In summary, the transfer entropy-based network construction provides us with valuable insights into the ecosystem of gut microbiota and allows us to identify key microbial hubs that play pivotal roles in shaping the microbial balances.
... An open question is what the causal links of these turbulent motions are, especially in the context of inner and outer motions: either bottom-up (Adrian et al., 2000), top-down (Hunt & Morrison, 2000), co-supporting (Toh & Itano, 2005; Zhou et al., 2022) or independently self-sustained at all scales (Cossu & Hwang, 2017). For this purpose, we resort to two widely adopted metrics that have recently been applied in physics, atmospheric sciences and fluid dynamics research, i.e. the transfer entropy (Schreiber, 2000) and the Liang-Kleeman information flow (Liang & Kleeman, 2005; Liang, 2013, 2014, 2016). Here we define the inner motions as the universal near-wall cycle (Hamilton et al., 1995; Waleffe, 1997), and the outer motions as those living in the logarithmic layer exhibiting a footprint on the near-wall region (Hutchins & Marusic, 2007; Marusic et al., 2010). ...
... The framework of information theory (Shannon, 1948) can be employed to quantify causality among time signals of different variables. A key metric is the transfer entropy (Schreiber, 2000), which has been applied in turbulent-flow problems recently (Lozano-Durán et al., 2020; Wang et al., 2021b, 2022; Martínez-Sánchez et al., 2023). ...
Preprint
Full-text available
In this work, we study the causality of near-wall inner and outer turbulent motions. Here we define the inner motions as the self-sustained near-wall cycle and the outer motions as those living in the logarithmic layer exhibiting a footprint on the near-wall region. We perform causal analysis using two different methods: one is the transfer entropy, based on the information theory, and the other one is the Liang-Kleeman information-flow theory. The causal-analysis methods are applied to several scenarios, including a linear and a non-linear problem, a low-dimensional model of the near-wall cycle of turbulence, as well as the interaction between inner and outer turbulent motions in a channel at a friction Reynolds number of Re τ = 1000. We find that both methods can well predict the causal links in the linear problem, and the information flow can identify more of the nonlinear problem. Despite richer causalities revealed by the transfer entropy for turbulent-flow problems, both methods can successfully identify the streak-vortex regeneration mechanism that majorly sustains the near-wall turbulence. It is also indicated that both bottom-up and top-down influences of inner and outer motions may coexist in addition to the multiscale self-sustaining mechanism. Lastly, we mention that the computation of the information flow is much more efficient than the transfer entropy. The present study suggests that the information flow can have great potential in causal inference for turbulent-flow problems besides the transfer entropy.
... Simulation results demonstrate that EMI offers higher accuracy. Second, we use effective transfer entropy (ETE) to quantify information transfer [16,17]. Compared with traditional Granger causality, which is only appropriate for linear relations, ETE can be used in both linear and nonlinear systems. ...
... Thus, the transfer entropy (TE) from Y to X is [17,19,20]
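To make the bias-correction step behind effective transfer entropy concrete, the sketch below subtracts the mean TE obtained after shuffling the source series, which destroys its temporal structure while preserving its marginal distribution; the estimator argument, shuffle count, and seed are illustrative assumptions.

```python
import numpy as np

def effective_transfer_entropy(y, x, te_fn, n_shuffles=100, seed=0):
    """Effective TE(Y -> X): the raw TE minus the mean TE over source-shuffled
    surrogates, so that the finite-sample bias of the estimator is removed."""
    rng = np.random.default_rng(seed)
    te_raw = te_fn(y, x)
    te_surrogates = [te_fn(rng.permutation(y), x) for _ in range(n_shuffles)]
    return te_raw - float(np.mean(te_surrogates))
```

Any estimator with the signature te_fn(source, target), such as the histogram sketch given earlier among these examples, can be plugged in.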
Article
Full-text available
Bitcoin futures exchange-traded funds (ETFs) are recent innovations in cryptocurrency investment. This article studies the price-volume relationship in this market from an information perspective. We first propose effective mutual information which has better estimation accuracy to analyze the contemporaneous relationship. Using half-hourly trading data of the world’s largest Bitcoin futures ETF, we find that trading volume changes and returns contain information about each other and are contemporaneously dependent. Then, we employ effective transfer entropy to examine the intertemporal relationship. The results show that there exists information transfer from volume changes to returns in most of our sample period, suggesting the presence of return predictability and market inefficiency. However, information transfer in the opposite direction occurs much less frequently, and the amount is typically smaller.
... With its roots in communication systems [12], it provides an invaluable tool for quantifying the uncertainty of variables and the amount of information transmitted between them. The seminal work by Schreiber [13] popularized transfer entropy to quantify the causality between variables in a dynamical system. A comprehensive review of existing methods based on transfer entropy can be found in Bossomaier et al. [14]. ...
Article
Full-text available
We use an information-theoretic method, referred to as information flux, to quantify the causal relationships between streaks and bursts in a non-intrusive manner. Within this framework, causality is quantified as the flux of Shannon information from the present of the quantities of interest to their future. We also use the so-called information leak to measure the information that is not accounted for due to unobserved variables. We investigate data from a direct numerical simulation of turbulent channel flow at the friction Reynolds number Re τ ≈ 180. The spatial distribution of causality is investigated at two time scales, based on the 50% value of the information leak and the maximum of the normalized cross-induced information flux. It is found that the most causal spatial configuration for streaks and bursts is always streamwise-aligned. Furthermore, four dominant causal spatial configurations between streaks and bursts are identified. At the short time scale around 10 viscous units (when information flux is 50%), streaks and bursts have comparable causality to one another. At the longer time scale (corresponding to the maximum cross-induced causality), there is greater causality from streaks to bursts.
... A major drawback of Granger causality is its linearity, which may render it insufficient to expose causal behavior for nonlinear systems. On the other hand, transfer entropy causality analyses [10] have successfully identified relevant dynamics in the field of fluids [11-13]. One of the main strengths of transfer entropy is that it is agnostic to the form of the system dynamics, although a significantly large amount of data and a high computational cost are required to achieve converged results [14-16]. ...
Article
Full-text available
This research focuses on the identification and causality analysis of coherent structures that arise in turbulent flows in square and rectangular ducts. Coherent structures are first identified from direct numerical simulation data via proper orthogonal decomposition (POD), both by using all velocity components, and after separating the streamwise and secondary components of the flow. The causal relations between the mode coefficients are analysed using pairwise-conditional Granger causality analysis. We also formulate a nonlinear Granger causality analysis that can account for nonlinear interactions between modes. Focusing on streamwise-constant structures within a duct of short streamwise extent, we show that the causal relationships are highly sensitive to whether the mode coefficients or their squared values are considered, whether nonlinear effects are explicitly accounted for, and whether streamwise and secondary flow structures are separated prior to causality analyses. We leverage these sensitivities to determine that linear mechanisms underpin causal relationships between modes that share the same symmetry or anti-symmetry properties about the corner bisector, while nonlinear effects govern the causal interactions between symmetric and antisymmetric modes. In all cases, we find that the secondary flow fluctuations (manifesting as streamwise vorticial structures) are the primary cause of both the presence and movement of near-wall streaks towards and away from the duct corners.
... work using sequential thresholded least-squares; and (v) Granger Causality (GC) [41], which is implemented using the multivariate Granger causality toolbox [42] in MATLAB. We do not include inference techniques relying on correlations [43], partial correlation [7], and transfer entropy [10] for the comparative analysis, as it has been shown that ARNI performs better than these approaches [18]. Furthermore, to ensure a fair comparison, we use an identical set of basis functions for the proposed method (SINC), ARNI, ICON, SINDY, and LASSO. ...
Article
Full-text available
The first step towards advancing our understanding of complex networks involves determining their connectivity structures from the time series data. These networks are often high-dimensional, and in practice, only a limited amount of data can be collected. In this work, we formulate the network inference task as a bilinear optimization problem and propose an iterative algorithm with sequential initialization to solve this bilinear program. We demonstrate the scalability of our approach to network size and its robustness against measurement noise, hyper-parameter variation, and deviations from the network model. Results across experimental and simulated datasets, comprising oscillatory, non-oscillatory, and chaotic dynamics, showcase the superior inference accuracy of our technique compared to existing methods. Determining the connectivity structure of a system of interconnected units from measurement data is a ubiquitous problem with applications spanning diverse domains, from engineering to biology. In practice, these systems are often high-dimensional, and only a limited amount of sampled data is available due to cost constraints and experimental limitations, creating a fundamental bottleneck for data-driven inference. Current methodologies address this by either constraining the network's topology, such as by enforcing sparsity, or by assuming the availability of network equations. Here, we introduce a framework, sequential inference of network connectivity (SINC), to effectively determine network connectivity without assumptions on the network topology or prior knowledge of network dynamics. Specifically, we consider networks having similar coupling dynamics and present a tractable iterative procedure for inference. We find that the proposed approach enhances the inference accuracy of the recovered networks compared to existing methods over a wide range of network models.
... In a financial context, according to Dimpfl and Peter (2014), specific time-series properties and an asymmetric measure are required to quantify the information flow. Thus, Schreiber (2000) proposes a measure of the information flow based on the Shannon entropy and more specifically, on mutual information, TE: ...
... Once the time series for each of the 122 regions had been extracted for each time window, they were correlated according to normalized transfer entropy (TE) (see Fig 1-B). The TE is described in [48] and given by Equation 1. The normalized TE metric was selected as our study's primary analytical tool, building upon prior research findings [34,35]. ...
Preprint
Neurodevelopmental conditions, such as Autism Spectrum Disorder (ASD) and Attention Deficit Hyperactivity Disorder (ADHD), present unique challenges due to overlapping symptoms, making an accurate diagnosis and targeted intervention difficult. Our study employs advanced machine learning techniques to analyze functional magnetic resonance imaging (fMRI) data from individuals with ASD, ADHD, and typically developed (TD) controls, totaling 120 subjects in the study. Leveraging multiclass classification (ML) algorithms, we achieve superior accuracy in distinguishing between ASD, ADHD, and TD groups, surpassing existing benchmarks with an area under the ROC curve near 98%. Our analysis reveals distinct neural signatures associated with ASD and ADHD: individuals with ADHD exhibit altered connectivity patterns of regions involved in attention and impulse control, whereas those with ASD show disruptions in brain regions critical for social and cognitive functions. The observed connectivity patterns, on which the ML classification rests, agree with established diagnostic approaches based on clinical symptoms. Furthermore, complex network analyses highlight differences in brain network integration and segregation among the three groups. Our findings pave the way for refined, ML-enhanced diagnostics in accordance with established practices, offering a promising avenue for developing trustworthy clinical decision-support systems.
... For constructing a brain functional network, many algorithms have been proposed by researchers in recent years (Schreiber 2000;Baccalá and Sameshima 2001). However, most of these methods are unable to contend with volume conduction. ...
Article
Full-text available
Graph-theory-based topological impairment of the whole-brain network has been verified to be one of the characteristics of mild cognitive impairment (MCI). However, two major challenges impede the further understanding of topological features for the personalized functional connectivity network of early Parkinson’s disease (ePD) with MCI. The uncertain of characteristic frequency band reflecting the abnormality of ePD-MCI and the setting of fixed length of sliding window at a second level in the construction of conventional brain network both limit a deeper exploration of network characteristics for ePD-MCI. Thus, a convolutional neural network is constructed first and the gradient-weighted class activation mapping method is used to determine the characteristic frequency band of the ePD-MCI. It is found that 1–4 Hz is a characteristic frequency band for recognizing MCI in ePD. Then, we propose a microstate window construction method based on electroencephalography microstate sequences to build brain functional network. By exploring the graph-theory-based topological features and their clinical correlations with cognitive impairment, it is shown that the clustering coefficient, global efficiency, and local efficiency of the occipital lobe significantly decrease in ePD-MCI, which reflects the low degree of nodes interconnection, low efficiency of parallel information transmission and low communication efficiency among the nodes in the brain network of the occipital lobe may be the neural marker of ePD-MCI. The finding of personalized topological impairments of the brain network may be a potential characteristic of early PD-MCI.
... Estimating information flow across two processes, however, requires combining Shannon entropy with the Kullback-Leibler distance concept introduced by Kullback and Leibler [78], based on the assumption that the evolution of the underlying processes follows a Markov process [79]. The approach states that given two discrete random variables I and J that are stationary Markov processes of orders k and l, with marginal distributions p(i) and p(j) and joint distribution p(i, j), the probability that process I would be observed at time t + 1 in state i, conditional upon its k past observations, is given as p(i_{t+1} | i_t, …, i_{t−k+1}). ...
Article
The urgent need to achieve the COP-27 targets has become evident due to the more frequent and severe climate-induced disasters and their socioeconomic consequences. The adverse effects of climate change highlight the need for an immediate shift away from fossil fuels without compromising economic development. Internalization of the negative externality in market transactions through the imposition of carbon pricing is widely touted as the most economically efficient means of solving this problem. This study, however, argues that crude oil price volatility could be an unintended consequence of carbon pricing. To this end, information flows between the European Union Emissions Trading Scheme and crude oil price volatility are examined through transfer entropy and wavelet-partial wavelet coherence analyses. Daily data from January 1, 2014 to July 1, 2023 are analyzed. The transfer entropy results show that information on carbon pricing reduces uncertainty about crude oil price volatility and vice versa, indicating that carbon pricing would be quite informative in building models to predict crude oil price volatility. The wavelet-partial coherence analyses reveal that the surge in carbon prices experienced in the late 2010s induced crude oil volatility, whereas the crude oil price volatility triggered by the COVID-19 pandemic forced carbon prices down. This study therefore identifies carbon price movements as a legitimate fear for policymakers, as it is a new source of volatility in conventional energy markets. Caution should thus be the watchword regarding optimal carbon pricing. Aiming to rapidly attain the full optimal carbon price is not recommended. Rapid changes in carbon prices will have strong redistributive implications across economies.
... Pairwise TE and CTE or causation entropy are extensions of the definition of entropy described by Shannon in 1948 [56]. TE was formalized concurrently by Schreiber [57] and Palus et al. [58] to assess the information exchange between two variables (X and Y) over time. It is a metric commonly used to detect the coupling strength and direction between time series variables. ...
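As a hedged sketch of how the pairwise measure generalizes to the conditional variant named in this excerpt, conditioning on a further process (or set of processes) Z, with lag-1 histories for brevity, gives

```latex
C_{X \to Y \mid Z} \;=\; I\!\left(Y_{t+1};\, X_t \,\middle|\, Y_t,\, Z_t\right)
\;=\; H\!\left(Y_{t+1} \mid Y_t,\, Z_t\right) - H\!\left(Y_{t+1} \mid Y_t,\, Z_t,\, X_t\right)
```

so that the contribution of X is assessed only after the pasts of both the target and the conditioning processes have been accounted for.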
Article
Full-text available
This study presents a data-driven framework for modeling complex systems, with a specific emphasis on traffic modeling. Traditional methods in traffic modeling often rely on assumptions regarding vehicle interactions. Our approach comprises two steps: first, utilizing information-theoretic (IT) tools to identify interaction directions and candidate variables, thus eliminating assumptions, and second, employing the sparse identification of nonlinear systems (SINDy) tool to establish functional relationships. We validate the framework's efficacy using synthetic data from two distinct traffic models, while considering measurement noise. Results show that IT tools can reliably detect directions of interaction as well as instances of no interaction. SINDy proves instrumental in creating precise functional relationships and determining coefficients in tested models. The innovation of our framework lies in its ability to use a data-driven approach to model traffic dynamics without relying on assumptions, thus offering applications in various complex systems beyond traffic.
... To better understand their causal dynamics, employing a non-linear causal inference method such as transfer entropy (TE) is essential [43,44]. TE quantifies the reduction in uncertainty when predicting the future state of one process based on the knowledge of the current and past states of another process. As a deliberately asymmetric measure, TE is frequently employed to deduce the directionality of information flow and, consequently, the causal relationship between the two processes. ...
Article
Full-text available
Background Sleep and circadian rhythm disruptions are common in patients with mood disorders. The intricate relationship between these disruptions and mood has been investigated, but their causal dynamics remain unknown. Methods We analysed data from 139 patients (76 female, mean age = 23.5 ± 3.64 years) with mood disorders who participated in a prospective observational study in South Korea. The patients wore wearable devices to monitor sleep and engaged in smartphone-delivered ecological momentary assessment of mood symptoms. Using a mathematical model, we estimated their daily circadian phase based on sleep data. Subsequently, we obtained daily time series for sleep/circadian phase estimates and mood symptoms spanning >40,000 days. We analysed the causal relationship between the time series using transfer entropy, a non-linear causal inference method. Findings The transfer entropy analysis suggested causality from circadian phase disturbance to mood symptoms in both patients with MDD (n = 45) and BD type I (n = 35), as 66.7% and 85.7% of the patients with a large dataset (>600 days) showed causality, but not in patients with BD type II (n = 59). Surprisingly, no causal relationship was suggested between sleep phase disturbances and mood symptoms. Interpretation Our findings suggest that in patients with mood disorders, circadian phase disturbances directly precede mood symptoms. This underscores the potential of targeting circadian rhythms in digital medicine, such as sleep or light exposure interventions, to restore circadian phase and thereby manage mood disorders effectively. Funding Institute for Basic Science, the Human Frontiers Science Program Organization, the National Research Foundation of Korea, and the Ministry of Health & Welfare of South Korea.
... The main idea of Granger causality was to draw a causal link between two variables x j and x k if the past of x j would enhance the predictability of the future of x k . Another attempt, coming from the dynamical system literature, was based on the concept of transfer entropy [33,34]. Crucially, as noted in Ref. [21], Granger causality and transfer entropy give similar information and are equivalent for Gaussian variables [35]. ...
Article
Full-text available
We propose a data-driven framework to describe spatiotemporal climate variability in terms of a few entities and their causal linkages. Given a high-dimensional climate field, the methodology first reduces its dimensionality into a set of regionally constrained patterns. Causal relations among such patterns are then inferred in the interventional sense through the fluctuation-response formalism. To distinguish between true and spurious responses, we propose an analytical null model for the fluctuation-dissipation relation, therefore allowing for uncertainty estimation at a given confidence level. We showcase the methodology on the sea surface temperature field from a state-of-the-art climate model. The usefulness of the proposed framework for spatiotemporal climate data is demonstrated in several ways. First, we focus on the correct identification of known causal relations across tropical basins. Second, we show how the methodology allows us to visualize the cumulative response of the whole system to climate variability in a few selected regions. Finally, each pattern is ranked in terms of its causal strength, quantifying its relative ability to influence the system’s dynamics. We argue that the methodology allows us to explore and characterize causal relations in spatiotemporal fields in a rigorous and interpretable way.
... On a theoretical side, the developments include techniques using the improvement of forecasting skill (e.g. Granger, 1969;Papagiannopoulou et al., 2017b;Papana et al., 2021), analogs (Sugihara et al., 2012), networks (Runge, 2018), or information entropy dynamics (Schreiber, 2000;Paluš et al., 2001;Liang and Kleeman, 2005;Hlaváčková-Schindler et al., 2007;Liang, 2021;Silini and Masoller, 2021). Some of these approaches have been compared in recent works (Krakovská et al., 2018;Runge et al., 2019a,b;Docquier et al., 2024), which provide useful information on which method to use and for what purpose. ...
Preprint
Full-text available
The information entropy budget and the rate of information transfer between variables are studied in the context of a nonlinear reduced-order atmospheric model. The key ingredients of the dynamics are present in this model, namely the baroclinic and barotropic instabilities, the instability related to the presence of an orography, the dissipation related to the surface friction, and the large-scale meridional imbalance of energy. For the parameters chosen, the solutions of this system display a chaotic dynamics reminiscent of the large-scale atmospheric dynamics in the extra-tropics. The detailed information entropy budget analysis of this system reveals that the linear rotation terms play a minor role in the generation of uncertainties as compared to the orography and the surface friction. Additionally, the dominant contribution comes from the nonlinear advection terms, and their decomposition into synergetic (co-variability) and single (impact of each single variable on the target one) components reveals that for some variables the co-variability dominates the information transfer. The estimation of the rate of information transfer based on time series is also discussed, and an extension of Liang's approach to nonlinear observables is proposed.
... To evaluate observed semantic information, interventions are applied to the joint dynamics of the autonomous agent and of the environment with the objective of perturbing the flow of information between them and then evaluating the part of such a flow that affects the autonomous agent's existence. The temporal flow of syntactic information is evaluated by computing the transfer entropy from the state description of the environment Y_t at time t to the state of the system X_{t+1} at time t + 1 as [25]
Chapter
Full-text available
The concept of semantic information refers to the type of information that has some “significance” or “meaning” for a given system. Its use to describe how precisely the desired meaning is conveyed makes it possible to characterize systems in terms of autonomous agents that are able to achieve an intrinsic goal or to accomplish a specific task. Two different types of semantic information are well recognized and used in the literature: i. ‘stored’ semantic information, which refers to information exchanged between a system and its environment in its initial distribution, and ii. ‘observed’ semantic information, which denotes the information that is dynamically acquired by a system to maintain its own existence. Both the concepts of stored and observed semantic information were first introduced by Kolchinsky and Wolpert in 2018. In this paper we present an approach to measure observed semantic information. Its quantitative measure is obtained for a smart drug delivery scenario where synthetic cells sense an environment made up of cancerous cells. These release a signal molecule that triggers the production of a cytotoxic drug by the synthetic cell. For the same scenario, the stored semantic information has already been computed. The main novel contribution compared to the evaluation of stored semantic information consists in a measure of the minimal perception of the environment [in bits] that allows a system to maintain its own functionality (as a proxy of its own existence) during its joint dynamic evolution with the environment, i.e. not decreasing its viability compared to full environment perception. Moreover, we provide a preliminary discussion about how the quantification of semantic information can contribute to better define what is meaningful to an agent. With this result we emphasize once again the role that “synthetic cells” have as a new (bio)technological platform for theoretical and applied investigations of semantic information in biological systems.
... Relational Inference. Classical approaches to relational inference (RI) measure correlations (Peng et al. 2009), mutual information (Wu et al. 2020), transfer entropy (Schreiber 2000) or causality (Quinn et al. 2011) of system trajectories. These approaches do not perform future state predictions. ...
Article
Relational inference aims to identify interactions between parts of a dynamical system from the observed dynamics. Current state-of-the-art methods fit the dynamics with a graph neural network (GNN) on a learnable graph. They use one-step message-passing GNNs, intuitively the right choice, since non-locality of multi-step or spectral GNNs may confuse direct and indirect interactions. But the effective interaction graph depends on the sampling rate and it is rarely localized to direct neighbors, leading to poor local optima for the one-step model. In this work, we propose a graph dynamics prior (GDP) for relational inference. GDP constructively uses error amplification in non-local polynomial filters to steer the solution to the ground-truth graph. To deal with non-uniqueness, GDP simultaneously fits a "shallow" one-step model and a polynomial multi-step model with shared graph topology. Experiments show that GDP reconstructs graphs far more accurately than earlier methods, with remarkable robustness to under-sampling. Since appropriate sampling rates for unknown dynamical systems are not known a priori, this robustness makes GDP suitable for real applications in scientific machine learning. Reproducible code is available at https://github.com/DaDaCheng/GDP.
... Consequently, this oversight can potentially result in misleading conclusions regarding directed information transfer. To unveil the authentic influence of the source's past on the present of the target, Schreiber [9] highlighted the importance of conditioning on the target's own past state(s) within Equation 1 using conditional mutual information. This leads to the transfer entropy introduced below, a concept designed to overcome the constraints of time-delayed mutual information. ...
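In the spirit of this excerpt, conditioning the time-delayed mutual information on the target's own past yields the conditional-mutual-information form of transfer entropy; the history lengths k and l below are generic parameters used for illustration:

```latex
T_{X \to Y} \;=\; I\!\left(Y_{t+1};\, X_t^{(l)} \,\middle|\, Y_t^{(k)}\right)
\;=\; H\!\left(Y_{t+1} \mid Y_t^{(k)}\right) - H\!\left(Y_{t+1} \mid Y_t^{(k)},\, X_t^{(l)}\right)
```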
Article
Full-text available
In collective systems, the influence of individuals can permeate an entire group through indirect interactions—complicating any scheme to understand individual roles from observations. A typical approach to understand an individual’s influence on another involves consideration of confounding factors, for example, by conditioning on other individuals outside of the pair. This becomes unfeasible in many cases as the number of individuals increases. In this article, we review some of the unforeseen problems that arise in understanding individual influence in a collective such as single cells, as well as some of the recent works which address these issues using tools from information theory.
... However, the GC test usually only applies to causality detection of separable variables [6,7]. Additionally, transfer entropy (namely, TE), based on the concept of information entropy, is another proposed causality detection method [8]. If one variable can reduce the uncertainty of another variable, causality between the two variables is affirmed. ...
Preprint
Full-text available
In numerous scientific domains, a significant challenge lies in identifying causal relationships among the various components of a system solely based on observational data. Recently, the convergent cross mapping (namely, CCM) method proposed by Sugihara et al. has demonstrated substantial potential for causal inference in the absence of models. By varying the coupling strength of the coupled Logistic model, we found that for asymmetrically strongly coupled time series, the CCM method fails in inferring both the direction and strength of causal relationships. In this paper, we propose an improved version of the CCM method, termed effective mutual information convergent cross mapping (namely, EMICCM) method, by considering inherent characteristics of time series such as nonlinearity and non-normal distribution features. As most real systems exhibit nonlinear and non-normally distributed characteristics, our method offers a more accurate measure for inferring causal relationships within them.
Article
Causal inference is the idea of cause-and-effect; this fundamental area of sciences can be applied to problem space associated with Newton’s laws or the devastating COVID-19 pandemic. The cause explains the “why” whereas the effect describes the “what”. The domain itself encompasses a plethora of disciplines from statistics and computer science to economics and philosophy. Recent advancements in machine learning (ML) and artificial intelligence (AI) systems, have nourished a renewed interest in identifying and estimating the cause-and-effect relationship from the substantial amount of available observational data. This has resulted in various new studies aimed at providing novel methods for identifying and estimating causal inference. We include a detailed taxonomy of causal inference frameworks, methods, and evaluation. An overview of causality for security is also provided. Open challenges are detailed, and approaches for evaluating the robustness of causal inference methods are described. This paper aims to provide a comprehensive survey on such studies of causality. We provide an in-depth review of causality frameworks, and describe the different methods.
Article
Full-text available
In a rapidly urbanizing world, heavy air pollution and increasing surface temperature pose significant threats to human health and lives, especially in densely populated cities. In this study, we took an information theory perspective to investigate the causal relationship between diel land surface temperature (LST) and transboundary air pollution (TAP) from 2003 to 2020 in the Bangkok Metropolitan Region (BMR), which includes Bangkok Metropolis and its five adjacent provinces. We found an overall increasing trend of LST over the study region, with the mean daytime LST rising faster than nighttime LST. Evident seasonal variations showed high aerosol optical depth (AOD) loadings during the dry period and low loadings at the beginning of the rainy season. Our study revealed that TAP affected diel surface temperature in Bangkok Metropolis significantly. Causality tests show that air pollutants of two adjacent provinces west of Bangkok, i.e., Nakhon Pathom and Samut Sakhon, have a greater influence on the LST of Bangkok than other provinces. Also, the bidirectional relationship indicates that air pollution has a greater impact on daytime LST than nighttime LST. While LST has an insignificant influence on AOD during the daytime, it influences AOD significantly at night. Our study offers a new approach to understanding the causal impact of TAP and can help policymakers to identify the most relevant locations that cause pollution, leading to appropriate planning and management.
Article
Full-text available
Understanding how different physical and chemical atmospheric processes affect the formation of fine particles has been a persistent challenge. Inferring causal relations between the various measured features affecting the formation of secondary organic aerosol (SOA) particles is complicated since correlations between variables do not necessarily imply causality. Here, we apply a state-of-the-art information transfer measure coupled with the Koopman operator framework to infer causal relations between isoprene epoxydiol SOA (IEPOX-SOA) and different chemistry and meteorological variables derived from detailed regional model predictions over the Amazon rainforest. IEPOX-SOA represents one of the most complex SOA formation pathways and is formed by the interactions between natural biogenic isoprene emissions and anthropogenic emissions affecting sulfate, acidity and particle water. Since the regional model captures the known relations of IEPOX-SOA with different chemistry and meteorological features, their simulated time series implicitly include their causal relations. We show that our causal model successfully infers the known major causal relations between total particle phase 2-methyl tetrols (the dominant component of IEPOX-SOA over the Amazon) and input features. We provide the first proof of concept that the application of our causal model better identifies causal relations compared to correlation and random forest analyses performed over the same dataset. Our work has tremendous implications, as our methodology of causal discovery could be used to identify unknown processes and features affecting fine particles and atmospheric chemistry in the Earth’s atmosphere.
Article
We introduce an approach which allows detecting causal relationships between variables for which the time evolution is available. Causality is assessed by a variational scheme based on the Information Imbalance of distance ranks, a statistical test capable of inferring the relative information content of different distance measures. We test whether the predictability of a putative driven system Y can be improved by incorporating information from a potential driver system X, without explicitly modeling the underlying dynamics and without the need to compute probability densities of the dynamic variables. This framework makes causality detection possible even between high-dimensional systems where only few of the variables are known or measured. Benchmark tests on coupled chaotic dynamical systems demonstrate that our approach outperforms other model-free causality detection methods, successfully handling both unidirectional and bidirectional couplings. We also show that the method can be used to robustly detect causality in human electroencephalography data.
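A rough sketch of the idea is given below, under the assumption that the Information Imbalance Δ(A → B) is taken as 2/N times the average rank, under the distances of space B, of each point's nearest neighbour under the distances of space A (values near 1: A carries no information about B; near 0: full information). The causality test then scans a weight α on the present state of the putative driver and asks whether α > 0 lowers the imbalance towards the future of the putative driven variable. The parameters, the logistic-map benchmark, and the scanning grid are illustrative assumptions, not the authors' exact protocol.

```python
import numpy as np

def information_imbalance(A, B):
    """Delta(d_A -> d_B): 2/N times the mean rank, under distances in space B,
    of each point's nearest neighbour under distances in space A (assumed definition)."""
    N = len(A)
    ranks = np.empty(N)
    for i in range(N):
        dA = np.linalg.norm(A - A[i], axis=1)
        dB = np.linalg.norm(B - B[i], axis=1)
        dA[i] = dB[i] = np.inf                 # never count the point itself
        j = np.argmin(dA)                      # nearest neighbour under d_A
        ranks[i] = np.sum(dB < dB[j]) + 1      # its rank under d_B
    return 2.0 * ranks.mean() / N

def causality_scan(x, y, tau=1, alphas=(0.0, 0.25, 0.5, 1.0)):
    """Imbalance from the space (alpha*x_t, y_t) towards the future y_{t+tau},
    for several weights alpha on the putative driver x."""
    x0, y0, y_tau = x[:-tau, None], y[:-tau, None], y[tau:, None]
    return {a: round(information_imbalance(np.hstack([a * x0, y0]), y_tau), 3)
            for a in alphas}

# Unidirectionally coupled logistic maps: x drives y (illustrative toy system).
x, y, c = 0.4, 0.3, 0.4
X, Y = [], []
for _ in range(1500):
    x, y = 3.9 * x * (1 - x), 3.7 * y * (1 - y) * (1 - c * x)
    X.append(x)
    Y.append(y)
X, Y = np.array(X), np.array(Y)
print("scan testing X drives Y:", causality_scan(X, Y))   # imbalance should drop as alpha grows
print("scan testing Y drives X:", causality_scan(Y, X))   # alpha = 0 should already be near optimal
```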
Preprint
Full-text available
We introduce a novel specialized transformer architecture for analyzing single-neuron spiking activity. We test our model on multi-electrode recordings from the dorsal premotor cortex (PMd) of non-human primates performing a motor inhibition task. The proposed architecture provides a very early prediction of the correct movement direction (no later than 230 ms after Go signal presentation across animals) and can accurately forecast whether the movement will be generated or withheld before a possible Stop signal is actually presented.
Article
Functional networks have emerged as powerful instruments to characterize the propagation of information in complex systems, with applications ranging from neuroscience to climate and air transport. In spite of their success, reliable methods for validating the resulting structures are still missing, forcing the community to resort to expert knowledge or to simplified models of the system's dynamics. We here propose the use of a real-world problem, involving the reconstruction of the structure of flights in the US air transport system from the activity of individual airports, as a way to explore the limits of such an approach. While the true connectivity is known and it is therefore possible to provide a quantitative benchmark, this problem presents challenges commonly found in other fields, including the presence of non-stationarities and observational noise and the limited length of the available time series. We explore the impact of elements like the specific functional metric employed, the way the time series are detrended, and the size of the reconstructed system, and discuss how the conclusions drawn here could have implications for similar analyses in neuroscience.
Article
Salinity dynamics in the Delaware Bay estuary are a critical water quality concern, as elevated salinity can damage infrastructure and threaten drinking water supplies. Current state-of-the-art modeling approaches use hydrodynamic models, which can produce accurate results but are limited by significant computational costs. We developed a machine learning (ML) model to predict the 250 mg L⁻¹ Cl⁻ isochlor, also known as the "salt front," using daily river discharge, meteorological drivers, and tidal water level data. We use the ML model to predict the location of the salt front, measured in river miles (RM) along the Delaware River, during the period 2001–2020, and we compare the predictions of the ML model to those of the hydrodynamic Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) model. The ML model predicts the location of the salt front with greater accuracy (root mean squared error [RMSE] = 2.52 RM) than the COAWST model does (RMSE = 5.36 RM); however, the ML model struggles to predict extreme events. Furthermore, we use functional performance and expected gradients, tools from information theory and explainable artificial intelligence, to show that the ML model learns physically realistic relationships between the salt front location and its drivers (particularly discharge and tidal water level). These results demonstrate how an ML modeling approach can provide predictive and functional accuracy at a significantly reduced computational cost compared to process-based models. In addition, these results support the use of ML models in operational forecasting, scenario testing, management decisions, and hindcasting, with the resulting opportunities to understand past behavior and develop hypotheses.
Article
The influence of resonant magnetic perturbations (RMPs) on the dynamics of turbulence and flows at the edge of the HL-2A tokamak is analyzed using the transfer entropy technique. The results show that the RMP damps the poloidal flows as well as the E × B shearing rate, whereas it enhances the toroidal flows and leads to a broadened particle spectrum with increased small-scale turbulent transport. The causality analysis indicates that the regulating impact of the poloidal flow on turbulent fluctuations and particle flux is weakened, while that of the toroidal rotation on the latter is strengthened by the RMP field. The impact of the changes in the poloidal flow dominates over that of the modified toroidal flow on turbulent transport at the edge. The magnetic perturbation and the flows generally show predator–prey oscillations, where the causal effect between the former and the toroidal flow transitions to a synchronization relation in the presence of the RMP. In addition, the RMP field weakens the causal effect on the poloidal Reynolds stress while simultaneously strengthening the parallel-radial component. The present findings provide a possible explanation of the effects of external fields on edge transport, which is suggested to be dominated by the complex interactions among external perturbations, flows, and ambient microturbulence.
Article
The causal connectivity of a network is often inferred to understand network function. It is generally acknowledged that the inferred causal connectivity depends on the causality measure one applies and that it may differ from the network's underlying structural connectivity. However, the interpretation of causal connectivity remains to be fully clarified, in particular how causal connectivity depends on causality measures and how causal connectivity relates to structural connectivity. Here, we focus on nonlinear networks with pulse signals as measured output, e.g., neural networks with spike output, and address the above issues based on four commonly utilized causality measures, i.e., the time-delayed correlation coefficient, time-delayed mutual information, Granger causality, and transfer entropy. We theoretically show how these causality measures are related to one another when applied to pulse signals. Taking a simulated Hodgkin–Huxley network and a real mouse brain network as two illustrative examples, we further verify the quantitative relations among the four causality measures and demonstrate that the causal connectivity inferred by any of the four coincides well with the underlying network structural connectivity, therefore illustrating a direct link between causal and structural connectivity. We stress that the structural connectivity of pulse-output networks can be reconstructed pairwise without conditioning on the global information of all other nodes in the network, thus circumventing the curse of dimensionality. Our framework provides a practical and effective approach for pulse-output network reconstruction.
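As a minimal pairwise illustration on pulse signals, the sketch below computes the time-delayed mutual information and the transfer entropy between two binary spike trains with plug-in (histogram) estimators; the 0/1 binning, single-bin histories, and the toy excitatory coupling are simplifying assumptions rather than the paper's Hodgkin–Huxley setup.

```python
import numpy as np
from collections import Counter

def plugin_entropy(symbols):
    """Plug-in Shannon entropy (in bits) of a sequence of hashable symbols."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def delayed_mutual_information(x, y, lag=1):
    """I(x_{t+lag}; y_t) for binary (or other small-alphabet) sequences."""
    a, b = x[lag:], y[:-lag]
    return plugin_entropy(a) + plugin_entropy(b) - plugin_entropy(list(zip(a, b)))

def transfer_entropy(y, x, lag=1):
    """TE(Y -> X) = I(x_{t+lag}; y_t | x_t) with one-bin histories (plug-in estimate)."""
    x_future, x_past, y_past = x[lag:], x[:-lag], y[:-lag]
    h_joint = plugin_entropy(list(zip(x_future, x_past, y_past)))
    h_xp_yp = plugin_entropy(list(zip(x_past, y_past)))
    h_xf_xp = plugin_entropy(list(zip(x_future, x_past)))
    h_xp = plugin_entropy(list(x_past))
    return h_xf_xp + h_xp_yp - h_joint - h_xp

# Toy pulse-output pair: spikes of y excite x one time bin later (assumed model).
rng = np.random.default_rng(1)
T = 20000
y = (rng.random(T) < 0.1).astype(int)
x = np.zeros(T, dtype=int)
x[1:] = ((rng.random(T - 1) < 0.05) | ((y[:-1] == 1) & (rng.random(T - 1) < 0.6))).astype(int)
print("TE(Y->X):", transfer_entropy(y, x))   # expected clearly positive
print("TE(X->Y):", transfer_entropy(x, y))   # expected near zero
print("delayed MI:", delayed_mutual_information(x, y))
```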
Article
Dust storms are typical dispersed two-phase atmospheric turbulence, where dust particles are highly charged. Although very-large-scale motions (VLSMs) have been recognized in single-phase high-Reynolds-number turbulence, they remain largely unexplored in dust storms, especially regarding turbulent electric fields. Here, using datasets from field measurements in the atmospheric surface layer, we demonstrate that VLSMs of approximately the same size exist in wind velocity, the concentration of dust particles less than 10 micrometers in diameter (PM10 dust concentration), and electric fields in dust storms. Furthermore, these multiple fields are found to be maximally linearly coupled at the scale of the VLSMs, with linear coherence spectra in the range of 0.5–0.8. By performing transfer entropy analysis, we further demonstrate that wind velocity and PM10 dust concentration are maximally nonlinearly coupled at wavenumber k₁ = 0.002 m⁻¹, whereas PM10 dust concentration and the electric field are maximally nonlinearly coupled at k₁ = 0.15 m⁻¹, suggesting different nonlinear coupling behaviours between the wind–dust and dust–electrostatics interactions. Our results shed light on the structural and coupling features of the multiple fields in dust storms and thus deliver key information for understanding complex dispersed two-phase turbulent flows.
Article
A logical sequence of information-theoretic quantifiers of directional (causal) couplings in Markov chains is generated within the framework of dynamical causal effects (DCEs), starting from the simplest DCEs (in terms of localization of their functional elements) and proceeding step-by-step to more complex ones. Thereby, a system of 11 quantifiers is readily obtained, some of them coinciding with previously known causality measures widely used in time series analysis and often called "information transfers" or "flows" (transfer entropy, Ay–Polani information flow, Liang–Kleeman information flow, information response, etc.). By construction, this step-by-step generation reveals logical relationships between all these quantifiers as specific DCEs. As a further concretization, diverse quantitative relationships between the transfer entropy and the Liang–Kleeman information flow are found both rigorously and numerically for coupled two-state Markov chains.
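For reference, the transfer entropy that several of the entries above build on can be written, for stationary processes with one-step (Markov) histories, in the following standard form; this is the textbook expression rather than anything specific to the DCE construction described here:

$$
T_{Y \to X} \;=\; \sum_{x_{n+1},\,x_n,\,y_n} p(x_{n+1}, x_n, y_n)\,
\log \frac{p(x_{n+1} \mid x_n, y_n)}{p(x_{n+1} \mid x_n)} .
$$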
Article
This article presents a novel approach for reconstructing an equivalent underlying model and deriving a precise equivalent expression through the use of direct causality topology. Central to this methodology is the transfer entropy method, which is instrumental in revealing the causality topology. The polynomial fitting method is then applied to determine the coefficients and intrinsic order of the causality structure, leveraging the foundational elements extracted from the direct causality topology. Notably, this approach efficiently discovers the core topology from the data, reducing redundancy without requiring prior domain-specific knowledge. Furthermore, it yields a precise equivalent model expression, offering a robust foundation for further analysis and exploration in various fields. Additionally, the proposed model for reconstructing an equivalent underlying framework demonstrates strong forecasting capabilities in multivariate time series scenarios.
Article
Full-text available
We review several aspects of the analysis of time sequences, and concentrate on recent methods using concepts from the theory of nonlinear dynamical systems. In particular, we discuss problems in estimating attractor dimensions, entropies, and Lyapunov exponents, in reducing noise and in forecasting. For completeness and since we want to stress connections to more traditional (mostly spectrum-based) methods, we also give a short review of spectral methods.
Article
Full-text available
We present the new effect of phase synchronization of weakly coupled self-sustained chaotic oscillators. To characterize this phenomenon, we use the analytic signal approach based on the Hilbert transform and partial Poincaré maps. For coupled Rössler attractors, in the synchronous regime the phases are locked, while the amplitudes vary chaotically and are practically uncorrelated. Coupling a chaotic oscillator with a hyperchaotic one, we observe another new type of synchronization, where the frequencies are entrained, while the phase difference is unbounded. A relation between the phase synchronization and the properties of the Lyapunov spectrum is studied.
Chapter
This introductory chapter gives a brief description of the main ideas that motivated this book.
Article
A one-dimensional lattice of logistic maps is investigated in the case of strong nonlinearity and strong coupling. Although the dynamics may be classified as fully developed turbulence, spatio-temporal structure can be detected by computing time-delayed mutual information and two-point correlations. The correlation is found to be superior in detecting weak structure. An improved algorithm for mutual information is described.
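To make this kind of analysis concrete, here is a small sketch that iterates a diffusively coupled logistic-map lattice and estimates the time-delayed mutual information between two sites with a plain histogram estimator; the lattice size, coupling strength, bin count, and chosen site pair are illustrative assumptions, not the parameters of the cited study.

```python
import numpy as np

def coupled_logistic_lattice(n_sites=32, n_steps=4000, r=4.0, eps=0.3, seed=0):
    """Iterate a diffusively coupled logistic-map lattice and return the site time series."""
    rng = np.random.default_rng(seed)
    x = rng.random(n_sites)
    out = np.empty((n_steps, n_sites))
    f = lambda u: r * u * (1.0 - u)
    for t in range(n_steps):
        fx = f(x)
        x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
        out[t] = x
    return out

def delayed_mutual_information(a, b, delay, bins=16):
    """Histogram estimate (in bits) of I(a_t ; b_{t+delay})."""
    a, b = a[:-delay], b[delay:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / (px[:, None] * py[None, :])[nz]))

lattice = coupled_logistic_lattice()
site_a, site_b = lattice[:, 10], lattice[:, 12]   # two sites, two cells apart
for delay in (1, 2, 4, 8):
    print(f"delay {delay}: MI = {delayed_mutual_information(site_a, site_b, delay):.3f} bits")
```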
Article
We propose two methods to measure all (linear and nonlinear) statistical dependences in a stationary time series. Presuming ergodicity, the measures can be obtained from efficient numerical algorithms.
Article
Extensions to various information theoretic quantities used for nonlinear time series analysis are discussed, as well as their relationship to the generalized correlation integral. It is shown that calculating redundancies from the correlation integral can be more accurate and more efficient than direct box counting methods. It is also demonstrated that many commonly used nonlinear statistics have information theory based analogues. Furthermore, the relationship between the correlation integral and information theoretic statistics allows us to define "local" versions of many information theory based statistics, including a local version of the Kolmogorov-Sinai entropy, which gives an estimate of the local predictability.
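As a concrete instance of this connection (stated here for the order-2 case as an assumed, standard form rather than a quotation from the paper), the correlation sum $C_2(\epsilon)$ at scale $\epsilon$ provides the estimates

$$
H_2(\epsilon) \;\approx\; -\log C_2(\epsilon),
\qquad
I_2(X;Y;\epsilon) \;\approx\; \log \frac{C_2\big([X,Y],\epsilon\big)}{C_2(X,\epsilon)\,C_2(Y,\epsilon)},
$$

so order-2 entropies and redundancies can be read off from correlation sums of the individual and joint spaces instead of box counts.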
Article
We propose an extension to time series with several simultaneously measured variables of the nonlinearity test, which combines the redundancy-linear-redundancy approach with the surrogate data technique. For several variables various types of redundancies can be defined, in order to test specific dependence structures between/among (groups of) variables. The null hypothesis of a multivariate linear stochastic process is tested using the multivariate surrogate data. The linear redundancies are used in order to avoid spurious results due to imperfect surrogates. The method is demonstrated using two types of numerically generated multivariate series (linear and nonlinear) and experimental multivariate data from meteorology and physiology.
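A minimal univariate sketch of the surrogate-data side of such a test is given below: it builds FFT phase-randomized surrogates (preserving the amplitude spectrum, destroying nonlinear structure) and compares a lagged mutual-information statistic of the data against the surrogate distribution. The cited work uses multivariate surrogates and redundancies that also preserve the linear cross-dependences; the independent univariate surrogates, the toy quadratic coupling, and all parameters here are simplifying assumptions.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """FFT-based surrogate: keep the amplitude spectrum, randomize the phases,
    so linear (auto)correlations are preserved while nonlinear structure is destroyed."""
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(spec))
    phases[0] = 0.0                      # keep the mean
    if n % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist component real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=n)

def mutual_information(a, b, bins=16):
    """Histogram estimate (in bits) of I(a; b)."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / (px[:, None] * py[None, :])[nz]))

def nonlinearity_test(x, y, lag=1, n_surr=99, seed=0):
    """Compare the lagged MI of the data with that of surrogates sharing the linear
    properties; a data value deep in the upper tail rejects the linear null."""
    rng = np.random.default_rng(seed)
    stat = mutual_information(x[:-lag], y[lag:])
    surr = [mutual_information(phase_randomized_surrogate(x, rng)[:-lag],
                               phase_randomized_surrogate(y, rng)[lag:])
            for _ in range(n_surr)]
    rank = np.sum(stat > np.array(surr))
    return stat, rank / (n_surr + 1)     # statistic and its empirical exceedance quantile

# Toy nonlinear coupling: y depends quadratically on the previous x plus noise.
rng = np.random.default_rng(2)
x = rng.standard_normal(4096)
y = np.zeros_like(x)
y[1:] = 0.8 * x[:-1] ** 2 + 0.5 * rng.standard_normal(4095)
stat, quantile = nonlinearity_test(x, y)
print(f"lagged MI = {stat:.3f} bits, exceeds {quantile:.2%} of surrogates")
```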
Article
A Lyapunov analysis for coupled map lattices is presented. The co-moving Lyapunov exponent is calculated, which is related to the propagation of disturbances in space. The propagation speed agrees with the zero-crossing point of the co-moving Lyapunov exponent. The co-moving mutual information flow is introduced, which shows the selective transmission of information at a certain speed. Lyapunov spectra and vectors are calculated, and the spatial structures of the vectors are investigated. A possible analogy with the Anderson localization problem is discussed.
Article
We present a measure for characterizing statistical relationships between two time sequences. In contrast to commonly used measures like cross-correlation, coherence, and mutual information, the proposed measure is non-symmetric and provides information about the direction of interdependence. It is closely related to recent attempts to detect generalized synchronization. However, we do not assume a strict functional relationship between the two time sequences, and we define the measure so that it is robust against noise and also detects weak interdependences. We apply our measure to intracranially recorded electroencephalograms of patients suffering from severe epilepsies.
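A rough sketch of a measure in this spirit (patterned on the S-type index of Arnhold and co-workers, with all parameters and the toy coupled maps as illustrative assumptions) is shown below: for each delay vector of X, the mean squared distance to its own nearest neighbours is compared with the mean squared distance to the points singled out by the simultaneous nearest neighbours of Y, and the asymmetry between S(X|Y) and S(Y|X) is what carries the directional information.

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def interdependence_S(x, y, dim=3, tau=1, k=5, theiler=10):
    """S(X|Y): for each delay vector X[n], compare the mean squared distance to its own
    k nearest neighbours with that to the points picked out by the k nearest neighbours
    of the simultaneous Y[n] (a sketch of an asymmetric, Arnhold-et-al.-style index)."""
    X, Y = delay_embed(x, dim, tau), delay_embed(y, dim, tau)
    N = len(X)
    total = 0.0
    for n in range(N):
        dx = np.sum((X - X[n]) ** 2, axis=1)
        dy = np.sum((Y - Y[n]) ** 2, axis=1)
        excl = np.abs(np.arange(N) - n) <= theiler      # Theiler window incl. the point itself
        dx_m = np.where(excl, np.inf, dx)
        dy_m = np.where(excl, np.inf, dy)
        nn_x = np.argsort(dx_m)[:k]                     # own neighbours of X[n]
        nn_y = np.argsort(dy_m)[:k]                     # neighbours of Y[n], used to condition X
        total += dx[nn_x].mean() / dx[nn_y].mean()
    return total / N

# Unidirectionally coupled logistic maps: x drives y (illustrative toy data).
x, y, c = 0.4, 0.2, 0.5
X, Y = [], []
for _ in range(2000):
    x, y = 3.9 * x * (1 - x), 3.6 * y * (1 - y) * (1 - c * x)
    X.append(x)
    Y.append(y)
X, Y = np.array(X), np.array(Y)
print("S(X|Y):", interdependence_S(X, Y))
print("S(Y|X):", interdependence_S(Y, X))
# The asymmetry between the two values reflects the direction of interdependence.
```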
Article
The degree of interdependence between intracranial EEG channels was investigated in four epileptic patients with complex partial seizures of mesial temporal lobe origin. With a new method to characterize nonlinear dynamical interdependence, mutual nonlinear prediction, we demonstrate the possibility of quantifying, during epileptic seizures, the relationships between EEG signals from electrode contacts in the epileptogenic area. During the interictal period, the degree of nonlinear interdependence was very low or absent. In contrast, transient patterns of nonlinear interdependence were found to emerge at the initial spread of the seizure, during essential parts of its development, and at seizure end, but these interactions were not maintained throughout the seizure activity. These results suggest that nonlinear associations play an important role in epileptogenesis, and that the process of neuronal entrainment during seizure onset involves a transient interaction between a distributed network of neuronal aggregates, but that the maintenance of this interaction is not required for sustained seizure activity. Furthermore, this technique can properly describe the spatio-temporal organisation of seizures of medio-temporal lobe origin and could become a very useful tool to aid the localization of the epileptogenic regions at the origin of epileptic seizures and of their pathways of propagation.
Article
Spatiotemporal chaos can be produced by complicated local dynamics in a small spatial region and observed globally through a process we call information transport. Information transport can be detected by computation of an information-theoretic quantity, the time-delayed mutual information, between measurements of the system at separate spatial points.
K. Kaneko, Physica D 23, 436 (1986).
P. Grassberger, T. Schreiber, and C. Schaffrath, Int. J. Bifurcat. Chaos 1, 521 (1991).
B. Pompe, J. Stat. Phys. 73, 587 (1993).
D. Prichard and J. Theiler, Physica D 84, 476 (1995).
J. Arnhold, P. Grassberger, K. Lehnertz, and C. E. Elger, Physica D 134, 419 (1999).
M. G. Rosenblum, A. S. Pikovsky, and J. Kurths, Phys. Rev. Lett. 76, 1804 (1996).
M. Le Van Quyen, C. Adam, M. Baulac, J. Martinerie, and F. J. Varela, Brain Research 792, 24 (1998).
J. A. Vastano and H. L. Swinney, Phys. Rev. Lett. 60, 1773 (1988).
D. R. Rigney, A. L. Goldberger, W. Ocasio, Y. Ichimaru, G. B. Moody, and R. Mark, Multi-channel physiological data: Description and analysis, in A. S. Weigend and N. A. Gershenfeld, eds., "Time Series Prediction: Forecasting the Future and Understanding the Past", Santa Fe Institute Studies in the Sciences of Complexity, Proc. Vol. XV, Addison-Wesley, Reading, MA (1993).