Article

The New SI and fundamental constants: different meanings assigned to the same data, and how to proceed from recommended numerical values to their stipulation and beyond


Abstract

This note is a revised and augmented version of the note arXiv:1601.00857v1 (physics.gen-ph), “The New SI and fundamental constants: different meanings assigned to the same data”, of 30 December 2015. It discusses the role of fundamental constants in the proposed “New SI” formulation of the definition of the International System of Units, as presented in the current official documents and in some relevant literature. The meaning assigned to their use turns out to differ substantially even among the advocates of the proposal. The note then discusses how the stipulation of a numerical value in an SI unit is obtained according to the specific features of a measurement unit, together with some special problems, not all resolved yet, that the stipulation of a fundamental constant can raise when it becomes part of the definition of the New SI and of one of its base units. Finally, some reasons are given why it is urgent that these basic issues be clarified.


... Thus, we wish to propose the following criterion for selecting an optimal set of DCs for new definitions of the units [9, 19, 20–22]. The set of DCs consisting of h, e, k, and NA is the preferred variant. ...
Article
Different sets of constants the fixed values of which may be selected for new definitions of the four units (kilogram, mole, ampere, and kelvin) of the International System of Units are discussed. The concept of the “order of a constant” in a given system of units is proposed. Criteria for arriving at an optimal selection of defining constants as well as a set of constants consisting of Planck’s constant h, Avogadro’s constant NA, Boltzmann’s constant k, and the magnetic permeability of a vacuum (magnetic constant) μ0 are considered. The proposed set is an alternative to the base set consisting of h, e, k, and NA.
... At present, discussions about the strengths and shortcomings of different new definitions of the base SI units still continue (see, e.g. [7,15,17,27,28]). The preferred version is the DC set consisting of h, e, k, and NA [4]. ...
Article
We discuss different sets of defining constants, fixed values of which are considered in connection with the transition to new definitions of four SI units (the kilogram, the mole, the ampere, and the kelvin). The notion of a constant's order in a given system of units is suggested. We propose an alternative set of fixed constants applicable to new definitions of the four SI units. We analyse and discuss in detail the set consisting of the Planck constant, the Avogadro constant, the Boltzmann constant and the magnetic constant.
Article
Different ways of choosing the fundamental physical constants in connection with the planned transition to new definitions of the fundamental SI units (kilogram, mole, ampere, and kelvin) are discussed. Criteria for an optimum choice of the fundamental physical constants for this transition are considered. The advantages and disadvantages of the new definitions of the kilogram, mole, ampere, and kelvin based on various sets of constants chosen from the atomic mass unit, Avogadro constant, elementary electric charge, Boltzmann constant, Planck constant, and permeability of free space are analyzed.
Technical Report
(in Italian) Detailed digest of the activities in the field of cryogenics carried out by the thermal metrology research group at IMGC-CNR, later INRIM, the Italian metrology institutes. The main text is organised chronologically. The 8 Appendices contain: A - Bibliography; B - International and national collaborations; C - List of the members of the group; D - Fixed-point cells made by the group; E - Contracts; F - International intercomparisons; G - Detailed summary of the activities performed and main results obtained; Z - Update 2015-2018.
Conference Paper
FREE DOWNLOAD. The paper is a digest of some problems to be considered should the proposed use of fundamental constants in the definition of the measurement units of the SI be implemented: (a) more base units becoming multi-dimensional, instead of fixing the present problems in this respect; (b) the multi-dimensionality in the definitions; (c) how the magnitude of the base units is established; (d) the use of CODATA adjusted values of the constants for this specific purpose; (e) formal issues in stipulating algebraic expressions of the definitions, and in the rounding or truncation of the numerical values; (f) formal issues with the use of the integer number NA; (g) limitations on new determinations that might arise from the stipulation of the values of several constants, if CODATA is to continue performing meaningful least-squares adjustments of the fundamental constants that take future data into account; (h) implementation at the level of the NMIs and of society.
Article
Recent proposals to re-define some of the base units of the SI make use of definitions that refer to fixed numerical values of certain constants. We review these proposals in the context of the latest results of the least-squares adjustment of the fundamental constants and against the background of the difficulty experienced with communicating the changes. We show that the benefit of a definition of the kilogram made with respect to the atomic mass constant (m_u) may now be significantly stronger than when the choice was first considered 10 years ago.
Article
We discuss the role of fundamental constants and of the measurement data for the Planck, Avogadro and Boltzmann constants and the elementary electric charge in connection with the planned transition to new definitions of four base SI units (the kilogram, mole, ampere and kelvin) in terms of fixed values of these constants. It is proposed to choose the new definition of each base SI unit in terms of a particular fundamental physical constant using a number of criteria, or principles, such as continuity with the current SI, sufficient stability of the new unit standards, and concordance between the physical dimensions of the unit and of the corresponding fundamental constant. It is argued that redefining the kilogram and the mole by fixing the values of the atomic mass unit and the Avogadro constant satisfies all these criteria and offers further advantages over the version with a fixed Planck constant: a well-founded approach to the definition of the ampere and the opportunity to preserve the current relationship between the definitions of the mole and the kilogram. It is also argued that the kelvin can be redefined independently of the other three units.
Article
Plans are underway to redefine the International System of Units (SI) around 2018. The New SI specifies the values of certain physical constants to define units. This article explains the New SI in a way that could be used to present it to high-school physics classes.
Article
The full paper can be downloaded freely from http://www.metrologybytes.net/documents2014.php The International System of Units (SI) was first adopted in 1960, as the most recent implementation of the Metre Treaty signed in 1875. Basic features of the original SI are that (a) seven units are chosen as “base units”, all the others being “derived units”, and (b) the definitions of the base units should not create interdependence. In this way, the SI conforms to the basic principle of the Metre Treaty that each signatory country can realise its own choice of primary national standards directly from the definitions of the units, without needing to resort to calibrations obtained from another country, and without any obligation to realise them for all the units. A mismatch with the above features already occurs to some extent in the present definitions of the SI base units. This contribution, strictly based on metrological considerations, illustrates how the present proposal for new definitions of the base units, called the “New SI”, would extend this mismatch. In this frame, the meaning of the concepts of hierarchy and traceability in metrology is also discussed. By outlining some of the consequences, a discussion is stimulated on the status of base unit, on the meaning of calibration at the level of the standards of the unit definitions, and on the interdependence of countries’ standards.
Article
The paper illustrates and discusses some problems that should be taken into account, should the proposed use of fundamental constants in the definition of measurement units of the SI be implemented: (a) more base units being multi-dimensional, instead of fixing the present problems in this respect; (b) the multidimensionality in the definitions; (c) the use of CODATA adjusted values of the constants for this specific purpose; (d) formal issues in stipulating algebraic expressions of the definitions, and in respect to the rounding or truncation of the numerical values in their transformation from uncertain to exact values; (e) formal issues with the use of the integer number NA; (f) limitations that can arise from the stipulation of the values of several constants for the CODATA Task Group to continue performing in future meaningful least squares adjustments of the fundamental constants taking into account future data.
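To make the rounding-versus-truncation issue in point (d) concrete, the following minimal Python sketch (with a purely illustrative, hypothetical numerical value, not an actual CODATA adjusted value of any constant) shows how the two choices yield different exact stipulated values and different relative shifts with respect to the adjusted value.

```python
# Illustrative sketch only: the value below is a hypothetical adjusted value,
# not an actual CODATA result for any constant.
from decimal import Decimal, ROUND_HALF_EVEN, ROUND_DOWN

adjusted = Decimal("6.626070147")   # hypothetical adjusted numerical value
quantum  = Decimal("1e-8")          # keep 8 decimal places in the stipulated value

rounded   = adjusted.quantize(quantum, rounding=ROUND_HALF_EVEN)  # round to nearest
truncated = adjusted.quantize(quantum, rounding=ROUND_DOWN)       # drop extra digits

for label, exact in (("rounded", rounded), ("truncated", truncated)):
    shift = (exact - adjusted) / adjusted
    print(f"{label:9s} {exact}   relative shift = {shift:.2E}")
```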
Article
Since the beginning of modern measurement science, experimenters have faced the problem of dealing with systematic effects, as distinct from, and opposed to, random effects. Two main schools of thought stemmed from the empirical and theoretical exploration of the problem: one dictating that the two species be kept and reported separately, the other indicating ways to combine them into a single numerical value for the total uncertainty (often indicated as ‘error’). The second way of thinking was adopted by the GUM, which generally assumes that the expected value of the systematic effects is null by requiring, for all systematic effects taken into account in the model, that corresponding ‘corrections’ be applied to the measured values before the uncertainty analysis is performed. As to the value of the measurand intended to be the object of measurement, classical statistics calls it the ‘true value’, admitting that a value should exist objectively (e.g. the value of a fundamental constant), and that any experimental operation aims at obtaining an ideally exact measure of it. However, owing to the uncertainty affecting every measurement process, this goal can be attained only approximately, in the sense that nobody can ever know exactly how much any measured value differs from the true value. The paper discusses the credibility of the numerical value attributed to an estimated correction, compared with the credibility of the estimate of the location of the true value, and concludes that the true value of a correction should be considered as imprecisely evaluable as the true value of any ‘input quantity’, and of the measurand itself. It follows that the distinction between ‘input quantities’ and ‘corrections’ is neither justified nor useful.
Article
This paper intends to tackle, in the context of measurement and the definition of measurement units, a problem well known in computing science, the inherent propagation and accumulation of rounding errors throughout the intermediate steps of numerical calculation, and some issues in notation, namely of integer numbers.
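As a minimal illustration of the accumulation effect discussed in this paper (the factors and the 6-digit working precision below are arbitrary choices, not taken from the paper), compare rounding the running result at every intermediate step with carrying full precision throughout:

```python
# Minimal sketch (not taken from the paper): rounding the running product of a
# chained calculation at every intermediate step, versus carrying full precision
# and rounding only at the end.
from decimal import Decimal, localcontext

factors = [Decimal("1.602177"), Decimal("6.022141"), Decimal("1.380649"),
           Decimal("2.718282"), Decimal("3.141593")]

def chained_product(values, sig_digits):
    # multiply step by step; each intermediate product is rounded to the
    # working precision of sig_digits significant figures
    with localcontext() as ctx:
        ctx.prec = sig_digits
        prod = Decimal(1)
        for v in values:
            prod = prod * v   # result rounded to the context precision
        return prod

full   = chained_product(factors, 28)  # effectively no intermediate rounding here
coarse = chained_product(factors, 6)   # only 6 significant digits kept per step
print("relative accumulated rounding error:", (coarse - full) / full)
```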
Article
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
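A minimal sketch of the simplest of these interval statistics, assuming the data come as (lower, upper) endpoint pairs (the intervals below are invented for illustration): the sample mean of interval data is itself an interval, obtained from the means of the endpoints, whereas exact variance bounds require more elaborate algorithms.

```python
# Minimal sketch: descriptive statistics for interval-valued data.
# The (lower, upper) measurement intervals below are invented for illustration.
intervals = [(9.8, 10.2), (9.9, 10.4), (10.0, 10.1), (9.7, 10.0)]

lowers = [lo for lo, hi in intervals]
uppers = [hi for lo, hi in intervals]

# The sample mean of interval data is itself an interval: the mean of the lower
# endpoints gives its lower bound, the mean of the upper endpoints its upper bound.
n = len(intervals)
mean_interval = (sum(lowers) / n, sum(uppers) / n)
print("interval mean:", mean_interval)

# Exact bounds for the variance are harder: in general they require a search over
# point values chosen inside each interval, and the upper bound is hard to compute.
```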
Article
The paper discusses a classification of inter-comparisons that is relevant to identifying the proper statistical method for combining the data provided by each participant in the inter-comparison. The proposed approach for Class 2 ICs constructs a single probabilistic model for the reference probability distribution function, based on the use of a mixture density model. This approach allows the reference value to be estimated simply as the expected value of the mixture density function. The method does not require strong assumptions, such as large N, or limitations on the local probability distributions, such as normality of the density in each participant laboratory. It is particularly valuable for inter-comparisons of physical-state realisations, e.g. for temperature standards, where the population associated with an IC can be viewed as a super-population. However, Class 2 inter-comparisons probably include a wider range of ICs, such as those where a single standard is circulated and measured. The paper also compares the main features of the statistical treatments suitable for the IC outcomes in both the Class 1 and Class 2 cases. Some aspects of the treated problems, especially the meaning of uncertainty in Class 2 ICs and the consequent approach, still deserve a deeper subsequent insight.
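The following sketch illustrates, with invented participant data, the central computation implied by the mixture-density approach: since the expected value of a mixture is the weighted sum of the component expected values, the reference value follows directly from the participants' means and weights, whatever the shape of each local distribution.

```python
# Minimal sketch of the mixture-density idea described above: the reference
# distribution is modelled as a weighted mixture of the participants' local
# densities, so the reference value is the expected value of the mixture.
# Laboratory names, means and weights are invented for illustration; no
# particular local density shape (e.g. normal) needs to be assumed.
participants = [
    {"lab": "A", "mean": 273.1598, "weight": 1.0},
    {"lab": "B", "mean": 273.1601, "weight": 1.0},
    {"lab": "C", "mean": 273.1604, "weight": 1.0},
]

total_weight = sum(p["weight"] for p in participants)
# E[mixture] = sum_i w_i * E[component_i] / sum_i w_i
reference_value = sum(p["weight"] * p["mean"] for p in participants) / total_weight
print("reference value:", reference_value)
```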
Article
In many practical situations, we only know the upper bound D on the (absolute value of the) measurement error d, i.e., we only know that the measurement error is located on the interval [-D,D]. The traditional engineering approach to such situations is to assume that d is uniformly distributed on [-D,D], and to use the corresponding statistical techniques. In some situations, however, this approach underestimates the error of indirect measurements. It is therefore desirable to directly process this interval uncertainty. Such "interval computations" methods have been developed since the 1950s. In this chapter, we provide a brief overview of related algorithms, results, and remaining open problems.
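A minimal sketch with invented numbers contrasting the two treatments mentioned above for a simple sum of measured values: the statistical approach that assumes each error is uniformly distributed on [-D, D], and the interval computation that returns guaranteed bounds.

```python
# Minimal sketch, with invented numbers: propagating "error at most D" through
# a sum, (i) assuming a uniform error distribution on [-D, D] and combining
# standard deviations, versus (ii) direct interval computation of the bounds.
import math

readings = [10.1, 20.3, 5.6]       # measured values (illustrative)
D = 0.05                           # common bound on each absolute error

total = sum(readings)

# (i) statistical treatment: the standard deviation of a uniform error on
# [-D, D] is D/sqrt(3); independent errors add in quadrature.
sigma_total = math.sqrt(len(readings)) * D / math.sqrt(3)

# (ii) interval computation: the guaranteed enclosure of the sum.
interval_total = (total - len(readings) * D, total + len(readings) * D)

print(f"sum = {total}, statistical sigma = {sigma_total:.4f}, enclosure = {interval_total}")
```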
Article
The universally accepted method of expressing physical measurements for world commerce, industry, and science is about to get a facelift, thanks to our improved knowledge of fundamental constants.
Article
A case study is presented of a recent proposal by the major metrology institutes to redefine four of the physical base units, namely kilogram, ampere, mole, and kelvin. The episode shows a number of features that are unusual for progress in an objective science: for example, the progress is not triggered by experimental discoveries or theoretical innovations; also, the new definitions are eventually implemented by means of a voting process. In the philosophical analysis, I will first argue that the episode provides considerable evidence for confirmation holism, i.e. the claim that central statements in fundamental science cannot be tested in isolation; second, that the episode satisfies many of the criteria which Kuhn requires for scientific revolutions even though one would naturally classify it as normal science. These two observations are interrelated since holism can provide within normal science a possible source of future revolutionary periods.
Chapter
In general, a vocabulary is a “terminological dictionary that contains designations and definitions from one or more specific subject fields” (ISO 1087–1:2000, subclause 3.7.2). The present Vocabulary pertains to metrology, the “science of measurement and its application”. It also covers the basic principles governing quantities and units. In this Vocabulary, it is taken for granted that there is no fundamental difference in the basic principles of measurement in physics, chemistry, laboratory medicine, biology or engineering. Furthermore, an attempt has been made to meet conceptual needs of measurement in fields such as biochemistry, food science, forensic science and molecular biology. Development of this 3rd edition of the VIM has raised some fundamental questions about different current philosophies and descriptions of measurement.
Article
This paper is the first of two parts presenting the result of a new evaluation of atomic masses (Ame2003). In this first part we give full information on the used and rejected input data and on the procedures used in deriving the tables in the second part. We first describe the philosophy and procedures used in selecting nuclear-reaction, decay, and mass spectrometric results as input values in a least-squares evaluation of best values for atomic masses. The calculation procedures and particularities of the Ame are then described. All accepted data, and rejected ones with a reported precision still of interest, are presented in a table and compared there with the adjusted values. The differences with the earlier evaluation are briefly discussed and information is given of interest for the users of this Ame. The second paper for the Ame2003, last in this issue, gives a table of atomic masses, tables and graphs of derived quantities, and the list of references used in both this evaluation and the Nubase2003 table (first paper in this issue).Amdc: http://csnwww.in2p3.fr/AMDC/
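As an aside, the simplest special case of such a least-squares evaluation, combining several independent measurements of a single quantity by inverse-variance weighting, can be sketched as follows (the numerical values are invented; the actual Ame adjustment treats many correlated input data simultaneously).

```python
# Minimal sketch of the simplest least-squares adjustment: several independent
# measurements of the same quantity combined by inverse-variance weighting.
# The (value, standard uncertainty) pairs below are invented for illustration.
measurements = [(1.007825032, 0.000000004),
                (1.007825030, 0.000000006),
                (1.007825035, 0.000000010)]

weights = [1.0 / u**2 for _, u in measurements]
best = sum(w * x for (x, _), w in zip(measurements, weights)) / sum(weights)
u_best = (1.0 / sum(weights)) ** 0.5
print(f"adjusted value = {best:.9f} +/- {u_best:.9f}")
```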
CCU, New SI, Draft 9th edition of the SI Brochure, issued on 11 December 2015, in Draft Documents at http://www.bipm.org/en/measurement-units/new_si/ (accessed 4 January 2016)
Schwartz R., Richard Ph., Ehrlich C., Miki Y., OIML-Bulletin-New-SI-Draft-v4-20120831_Draft.doc, 2012, available online at arXiv.org > physics > arXiv:1601.00857v2
Pavese F., Accred. Qual. Assur. 19 (2014) 307–314
Pavese F., Mathematical and statistical tools in metrological measurement, 2013, chapter in Physical Methods, Instruments and Measurements [Ed. UNESCO-EOLSS Joint Committee], in Encyclopedia of Life Support Systems (EOLSS), developed under the auspices of UNESCO, Eolss Publishers, Oxford, UK, http://www.eolss.net
Pavese F., arXiv:1512.03668 [physics.data-an], 9 December 2015
Audi G., Wang M., Wapstra A. H., Kondev F. G., McCormick M., Xu X., Pfeiffer B., Atomic weights and isotopic compositions, Chinese Physics C 2012; 36:1287–1602
Pfeiffer B., Venkataramaniah K., Czoka U., Scheidenberger C., Atomic mass compilation 2012, Atomic Data and Nuclear Data Tables 100 (2014) 403–535
BIPM, IEC, IFCC, ILAC, ISO, IUPAC, IUPAP, and OIML, Guide to the Expression of Uncertainty in Measurement (GUM), 2nd edn, 1995, ISBN 92-67-10188-9 (electronic version JCGM 100:2008 from BIPM/JCGM at http://www.bipm.org/utils/common/documents/jcgm/JCGM_100_2008_E.pdf)