Article
PDF available

Abstract

The Lagrangians of physics arise out of a mathematical game between a "smart" measurer and nature (personified by a demon). Each contestant wants to maximize its level of Fisher information I. The game is zero sum, by conservation of information in the closed system. The payoff of the game introduces a variational principle, extreme physical information (EPI), which fixes both the Lagrangian and the physical constant of each scenario. The EPI approach provides an understanding of the relationship between measurement and physical law. EPI also defines a prescription for constructing Lagrangians. The prior knowledge required for this purpose is a rule of symmetry or conservation that implies a unitary transformation under which I remains invariant. As an example, when applied to the smart measurement of the space-time coordinate of a particle, the symmetry used is that between position-time space and momentum-energy space. The unitary transformation is then the Fourier one, and EPI derives the following: the equivalence of energy, momentum, and mass; the constancy of Planck's parameter h; and the Lagrangian that implies both the Klein-Gordon equation and the Dirac equation of quantum mechanics.
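As a concrete anchor for the quantity I that the abstract refers to, here is a minimal numeric check (an illustration of the standard definition, not taken from the article): for a location family p(x - theta), the Fisher information reduces to I = ∫ (p')²/p dx, which for a Gaussian of width sigma equals 1/sigma².

```python
import numpy as np

# Fisher information of a Gaussian location family, computed from the
# definition I = integral of (p')^2 / p, and compared with the exact
# value 1/sigma^2. Grid limits and sigma are illustrative choices.
sigma = 2.0
x = np.linspace(-20.0, 20.0, 200001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
dp = np.gradient(p, x)          # second-order finite-difference derivative
I = np.sum(dp**2 / p) * dx      # Riemann sum of the Fisher integrand
print(I, 1 / sigma**2)          # both approximately 0.25
```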
... We claim that this is due to quantum gravitational effects. To quantify this claim, we use Fisher information, which effectively quantifies information about a parameter that can be obtained from a given distribution [15,16]. We thus directly use Fisher information to analyze the information we obtain about quantum gravitational corrections. ...
... This behavior can be precisely quantified using Fisher information. Thus, we will use Fisher information [15,16] to analyze explicitly the scale beyond which even a quantum-corrected metric cannot yield new Fisher information about quantum gravitational corrections, as spacetime breaks down around that scale. ...
... In the previous section, we analyzed the Kullback-Leibler divergence between the original and corrected probability distributions and observed that its behavior changed near a critical value close to the Planck scale. In this section, we analyze this properly using the concept of Fisher information, which quantifies the information that can be obtained from a probability distribution [15,16]. To analyze how much information can be obtained about quantum gravitational corrections from the probability distribution of particles emitted from a black hole during its evaporation, we obtain the Fisher information of the parameter η. ...
Article
Full-text available
Abstract In this paper, we investigate the scales at which quantum gravitational corrections can be detected in a black hole using information theory. This is done by calculating the Kullback-Leibler divergence for the probability distributions obtained from the Parikh-Wilczek formalism. We observe that as quantum gravitational corrections grow with decreasing scale, the Kullback-Leibler divergence between the original and the quantum gravitationally corrected probability distributions also increases. To understand the impact of such quantum gravitational corrections, we use Fisher information. We observe that it again increases as we decrease the scale. We obtain these results for higher-dimensional black holes and observe that the behavior of both the Kullback-Leibler divergence and the Fisher information also depends on the dimension of the black hole. Furthermore, we observe that the Fisher information is bounded and approaches a fixed value. Thus, information about the nature of quantum gravitational corrections is itself intrinsically restricted by quantum gravity, and this work establishes an intrinsic epistemic boundary within quantum gravity.
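The link between the two quantities used in the paper can be illustrated with a standard toy calculation (a generic Gaussian example, not the paper's Parikh-Wilczek distributions): for a small parameter shift d, the Kullback-Leibler divergence between p(x; θ) and p(x; θ + d) is approximately (1/2) I(θ) d², where I is the Fisher information.

```python
import numpy as np

# KL divergence between a unit-width Gaussian and a slightly shifted
# copy, compared against the second-order expansion 0.5 * I * d^2.
# For this family the Fisher information of the location parameter is I = 1.
def kl(p, q, dx):
    return np.sum(p * np.log(p / q)) * dx

x = np.linspace(-15.0, 15.0, 100001)
dx = x[1] - x[0]
gauss = lambda mu: np.exp(-(x - mu)**2 / 2) / np.sqrt(2 * np.pi)

d = 0.01
D = kl(gauss(0.0), gauss(d), dx)
I = 1.0                       # exact Fisher information of the family
print(D, 0.5 * I * d**2)      # both approximately 5e-5
```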
... Several applications of the approach to stylized models of economics have been recently put forward [1,2]. The aim of this paper is to show how the extreme physical information (EPI) method of Frieden and Soffer [3,4] can be used to develop a generalization of the Aoki-Yoshikawa sectoral productivity model (AYM) [5,1]. Below, its modified version [6,7], which abandons the earlier, arbitrary metrical form, is presented, and the solution to the entailed equations of the fully analytical formulation of the information-principles problem [6] is given. ...
... Thus, both the sample space B and the base space of events are, in the AY case, equivalent to Y_a [14]. In [3,4] the condition of the minimal value of the information (kinematical) channel capacity, I → min, is postulated as the one that fixes the value of N in a unique way. However, sometimes the non-minimal values of I are also discussed, as they lead to EPI-method models that are of physical significance [3,4]. Some discussion of this topic can also be found in [14]. ...
Preprint
This paper presents a continuous variable generalization of the Aoki-Yoshikawa sectoral productivity model. Information theoretical methods from the Frieden-Soffer extreme physical information statistical estimation methodology were used to construct exact solutions. Both approaches coincide in first order approximation. The approach proposed here can be successfully applied in other fields of research.
... Subsequently, I study a ring with periodic boundary conditions, enabling an analytical solution and eliminating the need for numerical integration on the infinite lattice. A unique aspect of this work is the application of classical information theory to describe the system, including Fisher information, Shannon entropy, and the Cramér-Rao bound [35][36][37][38][39][40][41]. The results reveal intriguing patterns in these measures, demonstrating power-law behavior in the Fisher information and complexity. ...
... Another way to characterize the systems under study is through the Fisher information I [36][37][38][39]. This measure quantifies the level of disorder in the systems, where strong disorder indicates a lack of predictability in the spatial variable across its range. ...
Article
Full-text available
I study a lattice with periodic boundary conditions using a non-local master equation that evolves over time. I investigate different system regimes using classical theories like Fisher information, Shannon entropy, complexity, and the Cramér–Rao bound. To simulate spatial continuity, I employ a large number of sites in the ring and compare the results with continuous spatial systems like the Telegrapher’s equations. The Fisher information revealed a power-law decay of t^(-ν), with ν=2 for short times and ν=1 for long times, across all jump models. Similar power-law trends were also observed for complexity and the Fisher information related to Shannon entropy over time. Furthermore, I analyze toy models with only two ring sites to understand the behavior of the Fisher information and Shannon entropy. As expected, a ring with a small number of sites quickly converges to a uniform distribution for long times. I also examine the Shannon entropy for short and long times.
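The two-site toy model mentioned at the end of the abstract can be sketched as follows (the hopping rate and step sizes are illustrative choices, not the paper's): a symmetric master equation on two ring sites relaxes to the uniform distribution, with the Shannon entropy approaching log 2.

```python
import numpy as np

# Two-site ring with symmetric hopping rate k: dp/dt = W p.
# Forward-Euler integration; the distribution relaxes from [1, 0]
# to the uniform [0.5, 0.5], and the entropy to log(2).
k, dt, steps = 1.0, 0.001, 10000
p = np.array([1.0, 0.0])
W = np.array([[-k, k],
              [k, -k]])
for _ in range(steps):
    p = p + dt * (W @ p)
entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
print(p, entropy)   # near [0.5, 0.5] and log(2) ~ 0.693
```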
... The key to understanding this lies in measurement theory, specifically Fisher information [20][21][22][23][24], which bounds the variance of any estimate of a parameter. It was shown that the information quantifier defined by Fisher substitutes for the internal energy of an electron, provided that the electron is both non-relativistic and spinless. ...
Article
Full-text available
In earlier works, it was demonstrated that Schrödinger’s equation, which includes interactions with electromagnetic fields, can be derived from a fluid dynamic Lagrangian framework. This approach treats the system as a charged potential flow interacting with an electromagnetic field. The emergence of quantum behavior was attributed to the inclusion of Fisher information terms in the classical Lagrangian. This insight suggests that quantum mechanical systems are influenced not just by electromagnetic fields but also by information, which plays a fundamental role in driving quantum dynamics. This methodology was extended to Pauli’s equations by relaxing the constraint of potential flow and employing the Clebsch formalism. Although this approach yielded significant insights, certain terms remained unexplained. Some of these unresolved terms appear to be directly related to aspects of the relativistic Dirac theory. In a recent work, the analysis was revisited within the context of relativistic flows, introducing a novel perspective for deriving the relativistic quantum theory but neglecting the interaction with electromagnetic fields for simplicity. This is rectified in the current work, which shows the implications of the field in the current context.
... [2,3]. In Ref. [4], the concept of Fisher information was employed to present a systematic approach to deriving Lagrangians of relevance in physics. Of particular interest are the applications of the notion of Fisher information in quantum theory. ...
Preprint
The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.
... This is demonstrated in Fig. 5. The previous work suggested the following sustainability hypothesis: "sustainable systems do not lose or gain Fisher information over time" [16,22,23]. One use of Fisher's information measure has been in the development of the basic theory of sustainability, for instance to determine whether a system is sustainable or not [23,24], in diverse physical systems (see [25] and references therein). ...
Preprint
Information theory provides a useful tool to understand the evolution of complex nonlinear systems and their sustainability. In particular, Fisher Information (FI) has been evoked as a useful measure of sustainability and the variability of dynamical systems including self-organising systems. By utilising FI, we investigate the sustainability of the logistic model for different perturbations in the positive and/or negative feedback. Specifically, we consider different oscillatory modulations in the parameters for positive and negative feedbacks and investigate their effect on the evolution of the system and Probability Density Functions (PDFs). Depending on the relative time scale of the perturbation to the response time of the system (the linear growth rate), we demonstrate the maintenance of the initial condition for a long time, manifested by a broad bimodal PDF. We present the analysis of FI in different cases and elucidate its implications for the sustainability of population dynamics. We also show that a purely oscillatory growth rate can lead to a finite amplitude solution while self-organisation of these systems can break down with an exponentially growing solution due to the fluctuation in negative feedback.
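A minimal sketch of the scenario described above (the growth-rate amplitude, frequency, and initial condition are illustrative, not the paper's): a logistic model with a purely oscillatory growth rate integrates to a finite-amplitude, bounded solution.

```python
import numpy as np

# Logistic model with a purely oscillatory growth rate:
# dN/dt = r(t) * N * (1 - N), with r(t) = sin(t).
# Forward-Euler integration; the population stays bounded in (0, 1).
dt, T = 0.001, 50.0
t = np.arange(0.0, T, dt)
N = np.empty_like(t)
N[0] = 0.1
for i in range(1, len(t)):
    r = np.sin(t[i - 1])
    N[i] = N[i - 1] + dt * r * N[i - 1] * (1.0 - N[i - 1])
print(N.min(), N.max())   # a finite-amplitude oscillation inside (0, 1)
```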
... According to Frank (2009), natural selection acts maximizing the Fisher information within a Darwinian system. As a consequence, assuming that the flow of information between a system and its surroundings can be modeled as a zero-sum game (Frieden and Soffer, 1995), Darwinian systems would follow the Principle 2. ...
Preprint
The scheme of a unified Darwinian evolutionary theory for physical and biological systems is described. Every physical system is methodologically endowed with a classical information processor, which turns every system into an agent that is also susceptible to evolution. Biological systems retain this structure as natural extensions of the physical systems from which they are built up. Optimization of information flows turns out to be the key element in studying the possible emergence of quantum behavior and the unified Darwinian description of physical and biological systems. The Darwinian natural selection scheme is completed by a Lamarckian component in the form of the anticipation of states of surrounding bio-physical systems.
... (The Jeffreys' prior has been "shown to be a minimax solution in a two-person zero-sum game, where the statistician chooses the 'non-informative' prior and nature chooses the 'true' prior" [9,31]. Quantum mechanics itself has been asserted to arise from a Fisher-information-transfer zero-sum game [23].) To examine this possibility, (1.9) was embedded as a specific member (u = .5) of a one-parameter family of spherically-symmetric/unitarily-invariant probability densities, ...
Preprint
Clarke and Barron have recently shown that the Jeffreys' invariant prior of Bayesian theory yields the common asymptotic (minimax and maximin) redundancy of universal data compression in a parametric setting. We seek a possible analogue of this result for the two-level {\it quantum} systems. We restrict our considerations to prior probability distributions belonging to a certain one-parameter family, $q(u)$, $-\infty < u < 1$. Within this setting, we are able to compute exact redundancy formulas, for which we find the asymptotic limits. We compare our quantum asymptotic redundancy formulas to those derived by naively applying the classical counterparts of Clarke and Barron, and find certain common features. Our results are based on formulas we obtain for the eigenvalues and eigenvectors of $2^n \times 2^n$ (Bayesian density) matrices, $\zeta_n(u)$. These matrices are the weighted averages (with respect to $q(u)$) of all possible tensor products of n identical $2 \times 2$ density matrices, representing the two-level quantum systems. We propose a form of {\it universal} coding for the situation in which the density matrix describing an ensemble of quantum signal states is unknown. A sequence of n signals would be projected onto the dominant eigenspaces of $\zeta_n(u)$.
Preprint
Full-text available
In this paper we use statistical complexity and information theory metrics to study structure within solar wind time series. We explore this using entropy-complexity and information planes, where the measure for entropy is formed using either permutation entropy or the degree distribution of a horizontal visibility graph (HVG). The entropy is then compared to the Jensen complexity (Jensen-Shannon complexity plane) and Fisher information measure (Fisher-Shannon information plane), formed both from permutations and the HVG approach. Additionally we characterise the solar wind time series by studying the properties of the HVG degree distribution. Four types of solar wind intervals have been analysed, namely fast streams, slow streams, magnetic clouds and sheath regions, all of which have distinct origins and interplanetary characteristics. Our results show that, overall, different metrics give similar results but Fisher-Shannon, which gives a more local measure of complexity, leads to a larger spread of values in the entropy-complexity plane. Magnetic cloud intervals stood out in all approaches, in particular when analysing the magnetic field magnitude. Differences between solar wind types (except for magnetic clouds) were typically more distinct for larger time lags, suggesting universality in fluctuations for small scales. The fluctuations within the solar wind time series were generally found to be stochastic, in agreement with previous studies. The use of information theory tools in the analysis of solar wind time series can help to identify structures and provide insight into their origin and formation.
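One of the entropy measures named above, Bandt-Pompe permutation entropy, can be sketched as follows (order d = 3; the test series are generic illustrations, not solar wind data): white noise gives a normalized entropy near 1, while a slowly varying signal gives a much lower value.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(series, d=3):
    """Normalized permutation entropy of ordinal patterns of length d."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(series) - d + 1):
        pattern = tuple(np.argsort(series[i:i + d]))  # ordinal pattern
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    H = -sum(p * log(p) for p in probs)
    return H / log(factorial(d))    # normalized to [0, 1]

rng = np.random.default_rng(0)
pe_noise = permutation_entropy(rng.standard_normal(10000))   # near 1
pe_sine = permutation_entropy(np.sin(np.linspace(0, 20, 10000)))  # much lower
print(pe_noise, pe_sine)
```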
Chapter
This chapter discusses physical information and the derivation of electron physics. Fisher information I is one-half of a two-term information concept called “physical information,” whose extremization (or zero-root) derives most of known physical theory. The chapter presents a table of the Lagrangians for the various fields of physics. These disparate Lagrangians share a common term that is quadratic in the field function of interest, in the form of a dot or inner product. This term provides the link to information theory: it is basically Fisher's information I. Hence, Fisher information I occurs not only in the Lagrangian for the Schrödinger wave equation (SWE) but in most Lagrangians of physics, and it provides the unifying concept. The Lagrangians and laws of physics arise out of a parameter measurement-estimation effect, in particular a gedanken experiment whereby the mean or ideal value of a parameter is to be estimated from experimental data. This naturally brings in Fisher information I. A second property of the Lagrangians of physics is that most have value zero when evaluated at their extremum solutions. For fields of physics that do not obey this property, the physical information (PI) theory defines alternative Lagrangians that do.
Article
Heisenberg-like inequalities are derived from commutator identities, linking the quadratic dispersion in momentum to several functions of the position variable (or conversely). They have various physical and mathematical applications, and provide new links between the quadratic widths and the overall and mean peak widths of wave-functions.
Article
A new method of estimating physical probability laws is given. This arises from a new form of information, which in turn follows from four physically based axioms. Extremization of this “physical information” gives rise to the required probability laws. As verifications of the approach, the following statistical (and non-statistical) laws of physics are derived: the complex Schrödinger and Helmholtz wave equations, relativistic quantum mechanics in the Klein-Gordon, Dirac, and Weyl-Pauli formulations, the Boltzmann energy- and Maxwell-Boltzmann velocity distributions, the Lorentz transformation group of special relativity, and Maxwell's equations. Also derived are inequalities defining a class of uncertainty principles (e.g., Heisenberg's). Finally, the Einstein equations of motion (but not the fields) of general relativity are derived, along with an equivalence dI/dτ ∝ mc² between information flow rate and matter energy.
Article
Bell System Technical Journal, also pp. 623-656 (October)
Book
The 2nd edition adds material on the role of errors in scientific observation and a critical discussion of determinism from the standpoint of information theory to the material of the 1st edition, which applied information theory to a great number of problems of physics, including: the analysis of signals; thermodynamics; Brownian movement; thermal agitation in electronic tubes, rectifiers, etc.; entropy; Maxwell's demon; Szilard's well-informed heat engine; observations and error; communication; and computing. The new material on determinism leads to Brillouin's "matter of fact" point of view that strict determinism is impossible in scientific prediction because the high cost at some point makes increasing accuracy unattainable. The limit of accuracy is a practical rather than an inevitable limitation in the logical sense. The limitations can be formulated in precise ways by quantum conditions and information theory and should be included in the physical theory.
Article
A procedure to determine the surface temperature from images taken by AVHRR (advanced very high resolution radiometer) on board NOAA11 satellite is described in this paper. The importance of the emissivity parameter to estimate the surface temperature and the possibility to compute both temperature and emissivity maps from the two thermal infrared channels of the AVHRR is shown.
Article
The Cramer-Rao inequality (CRI) is a general uncertainty relation expressing reciprocity between the mean-square error in an estimate and the Fisher information I present in the data. The CRI is shown to derive strong uncertainty principles in quantum mechanics and optics. It also derives the Maxwell-Boltzmann law of gas theory as a distribution achieving the minimum CR product.
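The Cramer-Rao product described above can be illustrated with a standard toy example (a generic Gaussian estimation problem, not drawn from the article): for n samples from N(mu, sigma²), the Fisher information about mu is n/sigma², so any unbiased estimator has variance at least sigma²/n, and the sample mean attains this bound.

```python
import numpy as np

# Empirical check of the Cramer-Rao bound for estimating a Gaussian
# mean: the sample mean's variance should match sigma^2 / n.
rng = np.random.default_rng(1)
sigma, n, trials = 2.0, 50, 20000
samples = rng.normal(0.0, sigma, size=(trials, n))
est = samples.mean(axis=1)              # sample-mean estimator per trial
print(est.var(), sigma**2 / n)          # empirical variance vs. CR bound 0.08
```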
Article
A certain analogy is found to exist between a special case of Fisher's quantity of information I and the inverse of the “entropy power” of Shannon (1949, p. 60). This can be inferred from two facts: (1) Both quantities satisfy inequalities that bear a certain resemblance to each other. (2) There is an inequality connecting the two quantities. This last result constitutes a sharpening of the uncertainty relation of quantum mechanics for canonically conjugate variables. Two of these relations are used to give a direct proof of an inequality of Shannon (1949, p. 63, Theorem 15). Proofs are not elaborated fully. Details will be given in a doctoral thesis that is in preparation.
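The inequality connecting the two quantities mentioned in point (2) is, in modern notation, Stam's inequality (a standard statement supplied here for reference, not quoted from the abstract):

```latex
% Entropy power N(X) and Fisher information I(X) of a random
% variable X with density p and differential entropy h(X):
N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}, \qquad
I(X) = \int \frac{p'(x)^2}{p(x)}\, dx,
% Stam's inequality, with equality iff X is Gaussian:
N(X)\, I(X) \ge 1 .
```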