Article
PDF Available

Assessing Uncertainty in Physical Constants

Authors: Max Henrion and Baruch Fischhoff

Abstract

Assessing the uncertainty due to possible systematic errors in a physical measurement unavoidably involves an element of subjective judgment. Examination of historical measurements and recommended values for the fundamental physical constants shows that the reported uncertainties have a consistent bias towards underestimating the actual errors. These findings are comparable to findings of persistent overconfidence in psychological research on the assessment of subjective probability distributions. Awareness of these biases could help in interpreting the precision of measurements, as well as provide a basis for improving the assessment of uncertainty in measurements.
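The paper's calibration question can be made concrete with a small check: normalize each historical error by its reported uncertainty and see how often the result exceeds what a Gaussian error model would allow. The sketch below is illustrative only; the measurement list is hypothetical placeholder data (only the accepted value is the modern, exactly defined speed of light in km/s), not the paper's dataset.

```python
# Illustrative sketch (not from the paper): check whether reported uncertainties
# for a constant were well calibrated by normalizing each historical error by
# its reported standard uncertainty.

# Hypothetical (value, reported uncertainty) pairs in km/s -- placeholder data,
# not real historical measurements.
measurements = [(299774.0, 2.0), (299776.0, 4.0), (299789.0, 3.0), (299793.0, 1.5)]
accepted = 299792.458  # the modern, exactly defined speed of light in km/s

z_scores = [(x - accepted) / u for x, u in measurements]
frac_beyond_2 = sum(abs(z) > 2 for z in z_scores) / len(z_scores)

# With well-calibrated Gaussian uncertainties, |z| > 2 should occur about 5% of
# the time; a much larger observed fraction signals overconfidence.
print([round(z, 1) for z in z_scores], frac_beyond_2)
```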
... In the case of the experiment, to test the hypothesis of the deflection of light predicted by Einstein's theory of general relativity, one must take into account, in addition to the main hypothesis of the gravitational deflection of light near enormous masses, a series of auxiliary hypotheses: the Earth's orbit around the Sun; the motion of the Sun and the Earth relative to the stars; the thermal expansion of the materials that make up the instruments used (metals, plastics, lenses, mirrors and photographic plates); the effect of the refraction of light as it passes through the Earth's atmosphere and through the solar "atmosphere" or corona; and even the effects of psychological biases (EARMAN; GLYMOUR, 1980; HENRION; FISCHHOFF, 1986; COLLINS; PINCH, 2010). ...
Article
Rejecting a scientific theory or hypothesis is not as simple as one might think. Conducting an experimental test of a theory requires the use of several auxiliary hypotheses to fit the theory or hypothesis to the evidence. One can never be certain whether it is really the theory, and not an auxiliary hypothesis, that is responsible for the falsification of the experiment. This is the so-called Duhem-Quine problem or thesis, a topic little known among students interested in the experimental sciences. This article aims to examine the role of auxiliary hypotheses by analyzing how the Duhem-Quine problem affects the validity of scientific theories. To illustrate this concept, the article critically analyzes the experiment on memory transfer in planarians conducted by the American biologist James McConnell. Despite McConnell's care in creating a wide variety of control groups and procedures in his experiments, we identify that many auxiliary hypotheses were possibly not considered, leaving it unclear whether the failure of the results to hold up was due to the falsity of the memory-transfer hypothesis or to one of the auxiliary hypotheses being incorrect or inadequately controlled. Finally, we show how the Duhem-Quine problem can broaden people's view of science and of the role of experimental tests of new hypotheses and theories.
... Heavy tails have many potential causes, including bias [7], overconfident uncertainty underestimates [75], and uncertainty in the uncertainties [17], but it is not immediately obvious how these would produce the observed t distributions with so few degrees of freedom. ...
Preprint
Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41,000 measurements of 3,200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student-t distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
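The headline figure, 5σ disagreements up to five orders of magnitude more frequent than naively expected, can be sanity-checked by comparing Normal and Student-t tail probabilities. The snippet below assumes scipy is available; the degrees-of-freedom values are illustrative choices, not fits from the preprint.

```python
# Compare two-sided tail probabilities of a standard Normal with heavy-tailed
# Student-t distributions; df=1 is the Cauchy distribution.
from scipy import stats

p_normal = 2 * stats.norm.sf(5)            # roughly 5.7e-7
for df in (1, 2, 5, 10):
    p_t = 2 * stats.t.sf(5, df)
    print(f"df={df:2d}  P(|z|>5)={p_t:.2e}  ratio to Normal={p_t / p_normal:.1e}")
```

For df=1 the ratio comes out near 2e5, i.e. about five orders of magnitude, consistent with the claim above.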
... For example, the velocity of light in a vacuum has been measured in different ways since the late 19th century. Some of the reported values, especially during the 1920s and 1930s, were significantly lower than the currently accepted value, even after taking into account their reported uncertainty margins (Henrion and Fischhoff 1986). This suggests that the uncertainties of those measurements were underestimated. ...
Chapter
Full-text available
... A stated motivation for science is the organized identification and dissemination of scientific facts. Nevertheless, from historical data on shifts in scientific methodology [HF86,SS11] and epistemological standards [Ste23,KC13,Car83,Str20], we know that the correlation between facts and published claims lies far from identity. The reproducibility of a research finding refers to the scientific community's ability to obtain the same results using the same methods and data. ...
Preprint
We explore a paradox of collective action and certainty in science wherein the more scientists research together, the less that work contributes to the value of their collective certainty. When scientists address similar problems and share data, methods, and collaborators, their understanding of and trust in their colleagues' research rises, a quality required for scientific advance. This increases the positive reinforcement scientists receive for shared beliefs as they become more dependent on their colleagues' knowledge, interests, and findings. This collective action increases the potential for scientists to reside in epistemic "bubbles" that limit their capacity to make new discoveries or have their discoveries generalize. In short, as scientists grow closer, their experience of scientific validity rises as the likelihood of genuine replication falls, creating a trade-off between certainty and truth.
... This chapter proposes the FIQ-based method as an alternative for uncertainty analysis. This method avoids subjectivity by focusing on the concept of "optimal uncertainty" (ε_opt) and "experimental comparative uncertainty" (ε_exp) [108]. (Table 2). ...
Article
Full-text available
Traditional methods for selecting models in experimental data analysis are susceptible to researcher bias, hindering exploration of alternative explanations and potentially leading to overfitting. The Finite Information Quantity (FIQ) approach offers a novel solution by acknowledging the inherent limitations in information processing capacity of physical systems. This framework facilitates the development of objective criteria for model selection (comparative uncertainty) and paves the way for a more comprehensive understanding of phenomena through exploring diverse explanations. This work presents a detailed comparison of the FIQ approach with ten established model selection methods, highlighting the advantages and limitations of each. We demonstrate the potential of FIQ to enhance the objectivity and robustness of scientific inquiry through three practical examples: selecting appropriate models for measuring fundamental constants, sound velocity, and underwater electrical discharges. Further research is warranted to explore the full applicability of FIQ across various scientific disciplines.
... However, some scholars believe that the availability of information can influence the adoption of an objective approach to risk assessment. For example, Kahneman et al. (1982) and Henrion and Fischhoff (1986) argue that experts often rely on intuition and extrapolation when they are required to go beyond the bounds of available information. As a result, they are susceptible to the same biases and heuristics as the general public. ...
Article
Full-text available
This study aims to investigate the dynamics of contagion and its impact on firms, specifically focusing on how a rival’s failure to control an event can have adverse consequences for other firms. Through a comprehensive analysis of relevant theories, literature, and real-world cases, the study identifies key factors that contribute to the contagion process and proposes a framework for assessing the associated risk. The research highlights the crucial role of stakeholders in mediating the effects of rivals’ misfortunes on other firms and emphasizes how stakeholders’ identities shape their risk evaluations, thereby affecting the occurrence of contagion. This study contributes to the existing literature by providing a conceptualization of the contagion process and introducing the concept of “stakeholder identity” within the context of organizational and operational risk management. The findings offer practical insights to firms by emphasizing the significance of contagion risk, which is often overlooked in operational risk management strategies. Additionally, the study provides valuable guidance on how firms can effectively assess their vulnerability to contagion, enabling them to proactively manage and mitigate their risk.
... The results may increase uncertainty, as noted by Pavese [28]. Additionally, statistical expert bias driven by personal beliefs or preferences [29] and the presence of subjective judgment [30] cannot be ignored. ...
Article
Full-text available
This article contends that understanding measurement accuracy requires considering the thinker's consciousness in the process. In contrast to modern statistical methods, which analyze sources of uncertainty related to variables and measurements, this study proposes an approach that also accounts for the thinker's role in storing, transmitting, processing, and utilizing information to formulate the model. By incorporating the finite amount of information in the model, the study proposes a method for estimating the limit of measurement accuracy and provides examples of its application in experimental physics and technological processes.
... However, note that r is calculated already in the measurement process by the researcher, based on his subjective experience and knowledge. This situation leads to the idea that relative uncertainty includes an element of subjective judgement [22]. The results of the theoretical conclusions of the informational method have been applied to several practical problems, including the measurement of a physical constant [13], the determination of the required simplicity of a physical law [9], and the evaluation of the efficiency of a technological process (thermal energy accumulation and ice maker performance) [ ...
Article
Full-text available
When building a model of a physical phenomenon or process, scientists face an inevitable compromise between the simplicity of the model (qualitative-quantitative set of variables) and its accuracy. For hundreds of years, the visual simplicity of a law testified to the genius and depth of the physical thinking of the scientist who proposed it. Currently, the desire for a deeper physical understanding of the surrounding world and newly discovered physical phenomena motivates researchers to increase the number of variables considered in a model. This direction leads to an increased probability of choosing an inaccurate or even erroneous model. This study describes a method for estimating the limit of measurement accuracy, taking into account the stage of model building in terms of storage, transmission, processing and use of information by the observer. This limit, due to the finite amount of information stored in the model, allows you to select the optimal number of variables for the best reproduction of the observed object and calculate the exact values of the threshold discrepancy between the model and the phenomenon under study in measurement theory. We consider two examples: measurement of the speed of sound and measurement of physical constants.
Article
Full-text available
Over the past few decades, the world has become an increasingly dangerous and complex place, and expectations of spatial planning have changed accordingly. The study defines the concept of uncertainty as an important problem area of spatial planning. Given the lack of national studies on this subject, it aims to reveal how uncertainties in the spatial planning process are handled in the international literature. It consists of two basic steps. In the first step, a three-stage model, "Uncertainty Components of Spatial Planning", is proposed. These stages involve (i) the conceptualization, (ii) the classification and (iii) the evaluation of uncertainty. In the second step, a triangular framework was formed for the conceptualization stage of this model, with components of (1) identification and modelling, (2) theories and processes, and (3) legal regulations. The theoretical treatment suggested that the concept of uncertainty is used synonymously with the concepts of vagueness and ambiguity in everyday life despite their differences. It was also found that uncertainty is the subject of many international studies, whose common point is presenting either a model or a method to evaluate uncertainty. These studies were categorized into three groups according to how they handle uncertainty: (1) in a multidisciplinary context within a general framework, (2) in the field of planning under two subcategories (2a and 2b), and (3) in the field of environment. The studies in the second category showed regular conceptual patterns of their own, but they were shallower and more inward-oriented than those in the 1st and 3rd groups, and there is an apparent interaction between the 1st and 3rd groups. In the model proposed, the focus was only on (i) the conceptualization. However, as the origin, definition and basis of the concept of uncertainty were revealed, it may provide an important starting point for future studies. The study is original in introducing the concept of uncertainty to the national literature by elaborating on how it is handled in international studies. Proposals were offered on how to place this concept on a theoretical basis before establishing an evaluation framework for uncertainties within the spatial planning process in Türkiye.
Article
Full-text available
Studies of the psychology of hindsight have shown that reporting the outcome of a historical event increases the perceived likelihood of that outcome. Three experiments with a total of 463 paid volunteers show that similar hindsight effects occur when people evaluate the predictability of scientific results—they tend to believe they "knew all along" what the experiments would find. The hindsight effect was reduced, however, by forcing subjects to consider how the research could otherwise have turned out. Implications for the evaluation of scientific research by lay observers are discussed.
Article
Clear statements of the uncertainties of reported values are needed for their critical evaluation.
Article
The 1952 data used by DuMond and Cohen in an evaluation of the atomic constants are analyzed for the presence of systematic errors by a variance analysis performed by an electronic digital computer. For simplicity the velocity of light is treated as a fixed constant of known value, and there remain then eleven linear equations in four unknowns subject to least-squares adjustment. Least-squares adjustments of 219 over-determined subsets of these equations have been made and χ² has been evaluated for each such subset. An analysis of these data indicates that small systematic errors are most likely to exist in the following input data: (1) the determination of the Faraday by the silver voltameter; (2) the determination of the cyclotron resonance frequency of the proton by the inverse cyclotron method of Bloch and Jeffreys; (3) certain of the higher-voltage determinations of h/e by the continuous x-ray quantum limit. In descending order of magnitude of discrepancy from the remaining data on the constants are the determinations of (a) Felt, Harris, and DuMond made at 24 500 volts, (b) Bearden and Schwarz at 19 600 volts, (c) Bearden and Schwarz and also Bearden, Johnson, and Watts in the region between about 10 kV and about 6 kV. An analysis of the various observations taken by these observers at different voltages reveals a possible systematic trend when discrepancy is plotted against either voltage or window width in volts. Conjectures to account for the effect are discussed.
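A schematic of the kind of weighted least-squares adjustment and χ² consistency test described above, with a small hypothetical system standing in for the actual eleven observational equations in four unknowns; the numbers are placeholders, not the 1952 input data.

```python
# Weighted least-squares adjustment of an overdetermined linear system
# (m observational equations in n unknowns) and a chi-squared consistency check.
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]])  # design matrix (placeholder)
b = np.array([1.02, 1.98, 3.05, -0.97])                          # observed values (placeholder)
sigma = np.array([0.02, 0.02, 0.03, 0.03])                       # reported uncertainties

# Weight each equation by 1/sigma and solve in the least-squares sense.
Aw, bw = A / sigma[:, None], b / sigma
x, *_ = np.linalg.lstsq(Aw, bw, rcond=None)

chi2 = float(np.sum(((A @ x - b) / sigma) ** 2))
dof = A.shape[0] - A.shape[1]

# chi2 much larger than dof suggests underestimated or systematic errors
# in at least one input datum.
print(x, chi2, dof)
```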
Article
DOI: https://doi.org/10.1103/RevModPhys.1.1
Article
Present status of least-squares calculations. There are three possible stages in any least-squares calculation, involving respectively the evaluation of (1) the most probable values of certain quantities from a set of experimental data, (2) the reliability or probable error of each quantity so calculated, and (3) the reliability or probable error of the probable errors so calculated. Stages (2) and (3) are not adequately treated in most texts, and are frequently omitted or misused in actual work. The present article is concerned mainly with these two stages.
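Stage (3), the reliability of the probable errors themselves, is the step most often skipped. A minimal sketch, using placeholder data and the standard Gaussian approximation that the fractional uncertainty of an estimated standard deviation is about 1/√(2(N−1)):

```python
# Sketch of stage (3): how reliable is an estimated uncertainty itself?
# The data below are hypothetical placeholders.
import math
import statistics

data = [9.81, 9.79, 9.83, 9.80, 9.82]       # hypothetical repeated measurements
n = len(data)

mean = statistics.mean(data)                 # stage (1): most probable value
s = statistics.stdev(data)                   # stage (2): scatter of the data
sem = s / math.sqrt(n)                       # standard error of the mean
sem_rel_err = 1 / math.sqrt(2 * (n - 1))     # stage (3): approx. fractional error of s (and of sem)

print(f"mean = {mean:.3f} +/- {sem:.3f}, with that uncertainty itself "
      f"uncertain by roughly {100 * sem_rel_err:.0f}%")
```

With only five points, the quoted uncertainty is itself uncertain by about a third, which is why small samples support only rough error statements.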
Article
Recent accusations of scientific fraud have raised serious questions both for science policy and science itself. If experimental results cannot be trusted then science becomes virtually impossible. Four cases from twentieth-century physics are examined to see if the normal procedures of science provide adequate safeguards against fraud. I conclude that repetition of experiments, particularly for those of theoretical importance, does provide a sufficient safeguard.
Article
Experimental data bearing on the precision determination of the numerical values of the fundamental physical constants are reviewed, with particular emphasis being placed on the identification and isolation of discrepancies and inconsistencies. The purpose of the analysis is to present a consistent set of values of the fundamental constants and to present a careful and complete description of the steps taken to reach this end. The Introduction discusses the significance of such an analysis and indicates the general method of approach. The indispensability of local unit systems and conversion factors connecting them, in order to avoid a sacrifice of precision peculiar to different metrological techniques, is emphasized. The point is stressed that conversion constants introduce the danger of ignoring error-statistical correlations between physically measured quantities, and the effects of such correlations on the assignment of errors is discussed. All available sources of experimental information relative to the necessary input data are presented, and changes in definitions of units since our last review are discussed. After the available stochastic input data have been reviewed and the less reliable items eliminated, the third section examines the remainder for mutual compatibility by means of an analysis of variance in which special criteria for recognizing the incompatibility of a datum are developed, using the analogy of the energy of internal strain introduced in overdetermined mechanical structures. Tables of least-squares adjusted values of fundamental constants and conversion factors of physics and chemistry based on the 1963 adjustment are given. Research pertinent to the constants which has been completed or published subsequent to the 1963 "recommended" adjustment is discussed, and the effect of these on our knowledge of the numerical values of the fundamental constants is presented.
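The warning about error-statistical correlations introduced by shared conversion constants amounts to the usual covariance term in error propagation. A toy illustration with arbitrary placeholder numbers:

```python
# When two inputs share a common conversion factor, their covariance must be
# included when propagating uncertainties, or the error of a derived quantity
# is misassigned. Values are arbitrary placeholders.
import numpy as np

# Covariance matrix of two measured inputs x1, x2 (variances on the diagonal,
# a positive covariance from the shared conversion constant off the diagonal).
cov = np.array([[0.04, 0.03],
                [0.03, 0.09]])
grad = np.array([1.0, -1.0])   # gradient of the derived quantity f = x1 - x2

var_with_corr = float(grad @ cov @ grad)                    # correct propagation
var_ignoring_corr = float(np.sum(grad**2 * np.diag(cov)))   # correlations ignored

# For a difference of positively correlated inputs, ignoring the covariance
# overstates the variance (0.13 here versus the correct 0.07).
print(var_with_corr, var_ignoring_corr)
```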
Article
DOI: https://doi.org/10.1103/RevModPhys.25.691
Article
The Environmental Protection Agency (EPA) has responsibility to set uniform national air quality standards at the highest level which protects public health with an adequate safety margin. The precise point at which a standard is judgmentally set depends upon the scientific estimate of the exposure-response relationship for the pollutant of interest as well as value judgments regarding the point at which health effects are adverse, the meaning of adequate safety margins, and the size of a sensitive group within the general population. Because risk data is generally unavailable for the exposure levels of interest, subjective estimates based on related experience (e.g., animal tests, human exposures at higher levels) are needed for these risk estimates. This report describes many factors which have been observed to be sources of error or overconfidence when such subjective estimates are made. The implications of these factors are explored for each participant group and for alternative approaches.
Article
This review of the properties of leptons, mesons, and baryons is an updating of Review of Particle Properties, Particle Data Group [Rev. Mod. Phys. 52 (1980) No. 2, Part II]. Data are evaluated, listed, averaged, and summarized in tables. Numerous tables, figures, and formulae of interest to particle physicists are also included. A data booklet is available.
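Reviews of this kind combine multiple measurements of one quantity by inverse-variance weighting and, when the inputs are mutually inconsistent, inflate the quoted error by a χ²-based scale factor. The sketch below uses placeholder inputs and only gestures at that procedure; the Particle Data Group's actual method involves additional steps and selection rules.

```python
# Inverse-variance weighted average of several measurements of one quantity,
# with the error inflated by a scale factor when chi-squared per degree of
# freedom exceeds 1. Inputs are placeholders.
import math

values = [(1.115, 0.004), (1.121, 0.005), (1.098, 0.006)]   # (value, uncertainty)

weights = [1 / u**2 for _, u in values]
mean = sum(w * x for (x, _), w in zip(values, weights)) / sum(weights)
err = math.sqrt(1 / sum(weights))

chi2 = sum(((x - mean) / u) ** 2 for x, u in values)
scale = max(1.0, math.sqrt(chi2 / (len(values) - 1)))

print(f"average = {mean:.4f} +/- {err * scale:.4f} (scale factor {scale:.2f})")
```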