Ebook: Metrology and Physical Constants
The reliability and accuracy of systems of measurement continue to advance. We are about to enter the era of the most stable measurement system we can imagine with the anticipated new definitions of the SI units of measurement: a direct link between fundamental physics and metrology that will eliminate the current definition of the kilogram, until now based on an artefact.
This book presents selected papers from Course 185 of the Enrico Fermi International School of Physics, held in Varenna, Italy, in July 2012 and jointly organized with the Bureau International des Poids et Mesures (BIPM). The papers delivered at the school covered some of the most advanced topics in the discipline of metrology, including nano-technologies; quantum information and quantum devices; biology and medicine; food; surface quality; ionising radiation for health, environment, art and archaeology; and climate. The continuous and striking advances in basic research on atomic frequency standards, operating both in the visible and in the microwave range, and their applications to satellite systems are also considered, within the framework of a historical review of the international organization of metrology, as are the problems inherent in uncertainty statements and definitions.
This book will be of interest to all those whose work involves scientific measurement at the highest levels of accuracy.
Course 185 of the International School of Physics “Enrico Fermi”, Metrology and Physical Constants, was held in Varenna in July 2012 and was organized by the Italian Physical Society (SIF), the Istituto Nazionale di Ricerca Metrologica (INRIM), Torino, and the Bureau International des Poids et Mesures (BIPM). Over the last twelve years, international courses on metrology have been organized both in Varenna (2000, 2006, 2012) and at the BIPM (2003, 2008). The programmes of the courses in Italy were developed with the participation of the BIPM, through the presence of its Director pro tempore, as had already been the case for the earlier courses held in Varenna (1976) and in Lerici (1989). The National Metrology Laboratories were also well represented, providing many of the teachers and students, supported by their institutions.
At the 2012 Varenna course, the international audience of PhD students, post-docs and young scientists had the opportunity to hear presentations on the most advanced topics of the discipline. Space was given to topics that constitute new areas of development in metrology, such as nano-technologies, quantum information and quantum devices, biology and medicine, food, surface quality, ionizing radiation for health, environment, art and archaeology, and climate. Particular attention was paid to the determination of the physical constants directly involved in the anticipated new definitions of the SI units of measurement, which are to be based on fixed values assigned to fundamental constants. This direct link between fundamental physics and metrology will provide, according to our knowledge of physics, the most stable measurement system we can imagine, and will eliminate the current definition of the kilogram, until now based on an artefact. The “New SI” will also bring a high level of stability to more recent measurement areas, which can be summarized by the expression “Metrology in Chemistry”. The continuous and striking advances in basic research on atomic frequency standards, operating both in the visible and in the microwave range, and their applications to satellite systems were recalled, and the problems inherent in uncertainty statements and definitions were reviewed. These topics were presented within a historical review of the international organization of metrology since the signature of the Metre Convention: the development of the SI, the mission and role of the BIPM, and the development of the Mutual Recognition Arrangement (CIPM MRA).
Unfortunately, not all the lectures given during the course could be included in these proceedings. These presentations were nevertheless very important in completing the scientific panorama of the school and deserve to be mentioned: Steven Choquette, “Biochemical measurement methods I and II”; Kristian Helmerson, “Cold atoms, optical clocks and atom interferometers I and II”; Stephen A. Wise, “Chemical Metrology for Environment and Human Health Assessment”.
The sponsors of this course were the Lecco Chamber of Commerce, the National Institute of Nuclear Physics, the National Research Council, the firm Thermo Fisher Scientific and the CRT Foundation. The remarkable scientific and economic support of INRIM must be acknowledged, together with the organizing effort of SIF. Patronage was given by Commission A (Metrology) of the Union Radio-Scientifique Internationale (URSI), the Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA), and the Politecnico di Torino.
The Directors wish to warmly thank all the lecturers and seminar speakers for sharing their expertise with the students, both during the scheduled lessons and by being available for discussion. Particular gratitude must be expressed to Andrea Mario Rossi, the scientific secretary of the school, who liaised with teachers, students and SIF before the start of the course and kept in contact with the authors during the preparation of these proceedings.
Moreover, the extremely valuable help and friendly co-operation of Mrs. Barbara Alzani and Mrs. Ramona Brigatti, acting on behalf of SIF, must be acknowledged.
The editions of the Varenna courses on metrology have been occasions for cultural exchange between teachers and students, and among the students themselves, and fruitful contacts have been maintained. The Directors hope that, for many students, attendance at this edition of the Varenna school will prove a seminal point in their career development as “metrologists”, and that they will become future leaders in their specific fields.
E. Bava and M. Kühne
This article outlines the origins and history of the Metre Convention, the BIPM and the International System of Units (SI), with particular reference to the historical development of units based on fundamental constants or invariants of nature. The idea of, and the intention to proceed towards, a unit system based on invariants of nature has long existed, but it has only recently become a practical possibility. The adoption by the 24th General Conference on Weights and Measures in October 2011 of a Resolution outlining the principles of such a system is the culmination of more than two hundred years of advances in physics and metrology.
In 1875 the “Convention of the Metre” was signed in Paris by representatives of seventeen nations, who agreed to establish an international system of units based on the metre and the kilogram. The Metre Convention founded the International Bureau of Weights and Measures (Bureau International des Poids et Mesures, BIPM) as a permanent scientific laboratory with the task of conserving the international prototypes of the metre and the kilogram and of comparing national standards with the international prototypes in order to establish worldwide uniformity of measurements. Since 1875 metrology has become much more complex, but the fundamental mission of the BIPM has not changed significantly and can still be defined as supporting worldwide uniformity of measurement. A historical review of the BIPM and its mission and role in the 21st century is presented.
The CIPM MRA (Mutual recognition of national measurement standards and of calibration and measurement certificates issued by national metrology institutes) was drawn up by the Comité International des Poids et Mesures (CIPM) and implemented in 1999. It has been signed by most of the National Metrology Institutes throughout the world and, through its supporting website, the KCDB (BIPM key comparison database), provides formal evidence of the technical equivalence of national measurement standards at the highest level and of the international recognition of the services provided by National Metrology Institutes. The value of the arrangement, and the benefits it offers to other sectors, including accreditation, legislation and industry, are now widely acknowledged in the world of metrology.
There is much confusion on the topic of measurement uncertainty. Yet measurement uncertainty is both a pivotal concept in measurement theory and, above all, a basic requisite in practice, from the physics laboratory measuring some exotic property of Nature to the shop floor. The purpose of this paper is to give the author's view on measurement uncertainty in an unambiguous way, thus privileging clarity over diplomacy, for which he apologizes once and for all. Accordingly, the scope of the paper is to discuss the fundamental metrological concepts and associated terms, as given in the International Vocabulary of Metrology (VIM), in the light of their relevance to the topic of uncertainty as treated in the Guide to the expression of uncertainty in measurement (GUM). In this scheme, the focus is on the concepts of error and uncertainty and on their intimate connection, often masked by misunderstanding when not buried under the misconception that they are opposite and competing concepts. It will be shown that probability theory is the correct framework in which error and uncertainty are reconciled in a convenient and rigorous way. The author is convener of the Joint Committee for Guides in Metrology (JCGM) Working Group 1 (GUM); the opinions expressed in this paper do not necessarily represent the view of this Working Group.
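As a concrete illustration of uncertainty treated within probability theory, the following minimal sketch propagates the distributions assigned to the inputs of a measurement model by Monte Carlo sampling, in the spirit of GUM Supplement 1 (JCGM 101). The model R = V/I, the function shape and all numerical values are invented for illustration and are not taken from the paper.

```python
# Minimal Monte Carlo propagation of uncertainty (JCGM 101 style):
# the measurand Y = f(X1, X2) inherits a probability distribution
# from the distributions assigned to the input quantities.
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

# Inputs for a hypothetical resistance measurement R = V / I:
V = rng.normal(5.000, 0.002, N)       # volts: Gaussian, u(V) = 2 mV
I = rng.normal(1.0000e-3, 0.5e-6, N)  # amperes: Gaussian, u(I) = 0.5 uA

R = V / I                             # sample of the measurand
print(f"R = {R.mean():.1f} ohm, u(R) = {R.std(ddof=1):.1f} ohm")
```

The standard deviation of the output sample plays the role of the combined standard uncertainty, with no linearization of the model required.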
In this first part, the general aspects of accepted or recommended parameter definitions used to evaluate the frequency stability of quasi-periodic signals are introduced, based on the most common signal model suitable for discussion and development. The parameters belong to two different areas, the time domain and the frequency domain, each with its own experimental techniques and theoretical approaches. The two domains are, however, linked and support each other in acquiring experimental data to be fruitfully applied in science and technology. This paper is mainly concerned with developments in experimental methods and with the Allan variance and Modified Allan variance analyses.
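As a concrete companion to the time-domain discussion, the sketch below computes the overlapping Allan deviation from fractional-frequency samples; the function name and the white-noise test data are illustrative assumptions, not material from the paper.

```python
import numpy as np

def overlapping_allan_deviation(y, tau0, m):
    """Overlapping Allan deviation of fractional-frequency samples y,
    taken at sampling interval tau0, for averaging factor m (tau = m*tau0)."""
    # Integrate frequency to phase (time-error) data: x[k+1] = x[k] + y[k]*tau0.
    x = np.concatenate(([0.0], np.cumsum(y) * tau0))
    tau = m * tau0
    # Overlapping second differences of the phase at stride m.
    d = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]
    avar = np.sum(d**2) / (2.0 * d.size * tau**2)
    return np.sqrt(avar)

# White frequency noise should follow sigma_y(tau) ~ tau^(-1/2).
rng = np.random.default_rng(0)
y = 1e-12 * rng.standard_normal(100_000)
for m in (1, 10, 100):
    print(f"tau = {m:5d} s  sigma_y = {overlapping_allan_deviation(y, 1.0, m):.3e}")
```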
Atomic clocks exhibit the highest performance levels in time and frequency accuracy and stability; moreover, their reliability and lifetime make them indispensable references in modern applications such as telecommunication networks and navigation systems. Their stability is, however, affected by environmental factors, by aging and by component failures. In this second paper, attention is paid to the estimation of the frequency instability introduced by deterministic processes, such as drifts or periodic modulations, and by variations of device characteristics due to aging or failures. It is therefore important to use techniques that analyze and characterize clock stability as a function of time, that is, under non-stationary conditions. Depending on the problem under discussion, the analysis is carried out by exploiting the properties of the modified Hadamard variance weighted with binomial coefficients, or by using the Dynamic Allan Variance (DAVAR). The Total variance method (Totvar), which addresses the problem of the limited number of data available for the statistical evaluation of slowly varying noise, is also described. A few examples are given with simulated or real experimental data.
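For comparison with the Allan estimator above, the following minimal sketch implements the plain overlapping Hadamard deviation (not the binomially weighted variant discussed in the paper). Its third differences of the phase cancel a linear frequency drift exactly, which is what makes Hadamard-type statistics attractive for drifting clocks; all names and numbers are illustrative assumptions.

```python
import numpy as np

def overlapping_hadamard_deviation(y, tau0, m):
    """Overlapping Hadamard deviation of fractional-frequency samples y.
    Third differences of the phase cancel a linear frequency drift exactly,
    which the Allan variance does not."""
    x = np.concatenate(([0.0], np.cumsum(y) * tau0))
    tau = m * tau0
    d = x[3 * m:] - 3.0 * x[2 * m:-m] + 3.0 * x[m:-2 * m] - x[:-3 * m]
    hvar = np.sum(d**2) / (6.0 * d.size * tau**2)
    return np.sqrt(hvar)

# A pure linear frequency drift contributes (numerically) nothing:
y_drift = 1e-15 * np.arange(10_000)  # y[k] = D*k*tau0 with D = 1e-15 per s
print(overlapping_hadamard_deviation(y_drift, tau0=1.0, m=100))  # ~0
```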
Optical techniques are extremely well received in the field of cultural heritage conservation because they allow completely safe testing, a requirement that is essential given the uniqueness and fragility of artworks. A variety of optical investigation methods applied to paintings are by now an integral part of the restoration process, both to study and evaluate the conservation state of the artwork and to plan the restoration intervention and monitor its various phases. We present several techniques, developed at the Istituto Nazionale di Ottica CNR-INO in Florence, that have proven very useful in the analysis of paintings. They are not yet routinely applied, because research in these fields is still at an early stage. The prototypes developed are currently housed at the Opificio delle Pietre Dure in Florence, a key institution for conservation and restoration, where INO's researchers work in an Optical Metrology Laboratory in which the new instruments are calibrated and tested.
The Earth's climate is undoubtedly changing; however, the timescale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators against a background of natural variability requires measurements over a time-base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that reliable judgements can be made decades apart. The SI and the network of National Metrology Institutes were developed to address such requirements. However, ensuring that SI traceability of sufficient accuracy can be established and maintained for instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demands of the climate community in the solar-reflective domain, e.g. solar irradiances and Earth reflectances/radiances. It discusses how meeting these uncertainties would facilitate significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a “primary standard” and the replication of the terrestrial traceability chain extend the SI into space, in effect realising a “metrology laboratory in space”.
In the present paper an overview of metrology at the nanometric scale is given. After an introductory part on terminology and definitions concerning generic objects at the nanoscale, nanoparticles are taken as an example to describe the overall features of these systems and the typical characterization issues that arise from a metrological point of view. The case of nanoparticles is then examined in greater depth with reference to gold nanoparticles, whose peculiar properties derive from nanoscale effects such as surface plasmon resonance; these are discussed in detail, along with the state of the art in the main fabrication methods and applications in different sectors. Finally, a case study on the detection, by means of gold nanoparticles, of melamine, a contaminant used for food adulteration, is presented and analyzed.
An overview is presented of the methods used to describe the behaviour of ferromagnetic materials on small spatial scales. Methods based on energy minimisation (micromagnetics) are discussed. It is shown that these methods naturally identify a characteristic length (the exchange length) that controls the emergence of phenomena inherent to nanomagnetism. It is then shown how these methods can be extended to dynamical conditions by introducing the Landau-Lifshitz equation for magnetization dynamics. The characteristic time, field and energy scales associated with this description are discussed. Finally, the extension of dynamical methods to the description of spin-transfer-driven magnetization dynamics is discussed. It is shown that in this case one can take full advantage of the equivalence between the dynamics of interest and general aspects of nonlinear dynamical system theory. This equivalence permits one to identify properties of the dynamics that are particularly robust because they are the consequence of general geometrical and topological constraints that the magnetization state space imposes on the dynamics.
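For reference, the two central objects named above can be written in their standard textbook forms (the notation below is an assumption, not taken from the paper): the Landau-Lifshitz equation for the magnetization M under the effective field H_eff, with gyromagnetic ratio γ and damping parameter α, and the exchange length defined by the exchange constant A and the saturation magnetization M_s.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Landau-Lifshitz equation and exchange length (standard forms).
\begin{align}
  \frac{\partial \mathbf{M}}{\partial t}
    &= -\gamma\,\mathbf{M}\times\mathbf{H}_{\mathrm{eff}}
       - \frac{\gamma\alpha}{M_s}\,
         \mathbf{M}\times\left(\mathbf{M}\times\mathbf{H}_{\mathrm{eff}}\right),\\[4pt]
  l_{\mathrm{ex}} &= \sqrt{\frac{2A}{\mu_0 M_s^2}}
\end{align}
\end{document}
```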
This paper reviews the state of the art and the metrological tools used to characterize the parameters of the individual optical components employed in Quantum Communication systems. These parameters are critical to Quantum Communication performance: characterizing them allows device performance to be improved through selection from characterized components, and provides traceable measurements. A statement is made of the estimated uncertainties for each measurement.
The role of ionizing radiation metrology is outlined in its major sectors of interest: medicine and radiation protection. The reliability of results is an essential requirement in radiation measurements for both medical and protection purposes. This calls primarily for measurement traceability to the primary standards, and hence for appropriate instrument calibration. In ionizing radiation measurements the quantities directly measured are not necessarily the quantities of ultimate interest, and conversion procedures are then necessary to obtain one quantity from another. The essential aspects of these procedures are presented, together with the principal physical quantities, the primary standards and the calibration methods in widest use. Only the general aspects of these subjects are described in this lecture, as it is addressed to physicists and metrologists not familiar with ionizing radiation measurements.
High-precision measurements of the quantized Hall resistance and the theories of the quantum Hall effect (QHE) suggest that the quantized value depends exclusively on fundamental constants and can be used as a “mise en pratique” for a realization of the unit of electrical resistance in a new International System of Units (new SI). Such a new SI will be based on fixed values for fundamental physical constants or properties of atoms. Together with a similar application of the Josephson effect, all electrical units could then be realized with a much higher precision than in the present SI. In addition, high-precision realizations of the units ampere, farad, henry, coulomb and watt, with connections to the kilogram via the watt balance, will be possible. In such a new SI the conventional values of the von Klitzing constant and the Josephson constant, which are used to maintain worldwide uniformity of electrical units outside the official SI units, become obsolete, and the magnetic permeability of vacuum μ0 has to be determined experimentally.
The quantum Hall effect allows the standard for resistance to be defined in terms of the elementary charge and Planck's constant alone. The effect comprises the quantization of the Hall resistance of two-dimensional electron systems in rational fractions of the resistance quantum R_K = h/e² = 25812.8074434(84) Ω (Mohr P. J. et al., Rev. Mod. Phys., 84 (2012) 1527). Despite 30 years of research into the quantum Hall effect, the level of precision necessary for metrology, a few parts per billion, had been achieved only in silicon and III-V heterostructure devices. In this lecture we show that graphene, a single layer of carbon atoms, beats these well-established semiconductor materials as the system of choice for the realisation of the quantum resistance standard. We briefly describe graphene technology, discuss the structure and electronic properties of graphene, including the unconventional quantum Hall effect, and then present in detail the route that led to the most precise quantum Hall resistance universality test ever performed.
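As a quick numerical check of the relation quoted above, the snippet below evaluates h/e² with the values of h and e later fixed in the revised SI; these are assumptions for illustration and differ in the ninth digit from the 2012 CODATA value cited in the abstract.

```python
# von Klitzing constant from the defining relation R_K = h / e^2.
h = 6.62607015e-34    # Planck constant, J s (exact in the revised SI)
e = 1.602176634e-19   # elementary charge, C (exact in the revised SI)

R_K = h / e**2
print(f"R_K     = {R_K:.7f} ohm")      # ~25812.8074593 ohm
print(f"R_K / 2 = {R_K / 2:.7f} ohm")  # the widely used i = 2 plateau
```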
The base quantity of most importance for measurements in chemistry is amount of substance, and its unit in the SI system is the mole. This paper introduces the ideas behind the mole and amount of substance and summarises the reasons why a new definition for the mole is being considered.
This article is about how science answers the apparently simple question “What is the temperature?”, a question asked every day in every field of science and engineering. We consider first the purpose of temperature measurement, and compare and contrast temperature metrology with the metrology of mass and length. We discuss current practice, namely the International Temperature Scale of 1990 (ITS-90), and then look at the rationale for a possible redefinition of the unit of temperature and the role of measurements of the Boltzmann constant in that process. We then review possible techniques for measuring the Boltzmann constant, and finally reflect on a recent development in experiments to determine the Boltzmann constant using accurate measurements of the speed of sound in argon.
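To show why the speed of sound gives access to the Boltzmann constant, here is a back-of-envelope sketch (not the actual experiment): for a monatomic ideal gas u₀² = γkT/m with γ = 5/3, so a measurement of u₀ in argon at the triple point of water yields k. All numerical values below are rough illustrative assumptions.

```python
# Boltzmann constant from the zero-density speed of sound in argon:
# u0^2 = gamma * k * T / m  =>  k = m * u0^2 / (gamma * T).
u0 = 307.8           # m/s, speed of sound in argon at T (approx.)
T = 273.16           # K, triple point of water
M = 39.948e-3        # kg/mol, molar mass of argon (approx.)
N_A = 6.02214076e23  # 1/mol, Avogadro constant
gamma = 5.0 / 3.0    # heat-capacity ratio of a monatomic ideal gas

m = M / N_A                    # mass of one argon atom
k = m * u0**2 / (gamma * T)    # Boltzmann constant estimate
print(f"k ~ {k:.4e} J/K")      # ~1.38e-23 J/K
```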
The Avogadro constant, the number of entities in one mole of substance, links the atomic and macroscopic properties of matter. Since the molar Planck constant is very well known via the measurement of the Rydberg constant, the Avogadro constant is also closely related to the Planck constant. In addition, its accurate determination is of paramount importance for a new definition of the kilogram in terms of a fundamental constant. Here we describe a new and unique approach to the determination of the Avogadro constant by “counting” the atoms in 1 kg single-crystal spheres that are highly enriched in the ²⁸Si isotope. This approach has enabled us to apply isotope dilution mass spectrometry to determine the molar mass of the silicon crystal with unprecedented accuracy. The value obtained, N_A = 6.02214082(18) × 10²³ mol⁻¹, is now the most accurate input datum for a new definition of the kilogram.
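The counting relation behind the silicon route can be sketched numerically: a silicon crystal has 8 atoms per cubic unit cell of lattice parameter a, so N_A = 8M/(ρa³). The values below are rough natural-silicon figures used purely for illustration; the experiment uses enriched ²⁸Si and far more accurate determinations of M, ρ and a.

```python
# "Counting" atoms in a silicon crystal: N_A = 8 * M / (rho * a^3).
M = 28.0855e-3   # kg/mol, molar mass of natural silicon (approx.)
rho = 2329.0     # kg/m^3, density of silicon (approx.)
a = 5.431e-10    # m, lattice parameter of the cubic unit cell (approx.)

N_A = 8 * M / (rho * a**3)   # 8 atoms per unit cell of volume a^3
print(f"N_A ~ {N_A:.4e} mol^-1")  # ~6.02e23
```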
Since 1889 the International Prototype of the Kilogram has served as the definition of the unit of mass in the International System of Units (SI). It is the last material artefact to define a base unit of the SI, and it influences several other base units. This situation is no longer acceptable in a time of ever-increasing measurement precision. It is therefore planned to redefine the unit of mass by fixing the numerical value of the Planck constant. At the same time three other base units, the ampere, the kelvin and the mole, will be redefined. As a first step, the kilogram redefinition requires a highly accurate determination of the Planck constant in the present SI, with a relative uncertainty of the order of 1 part in 10⁸. The most promising experiment for this purpose, and for the future realization of the kilogram, is the watt balance. It compares mechanical and electrical power and makes use of two macroscopic quantum effects, thus creating a relationship between a macroscopic mass and the Planck constant. In this article the background to the choice of the Planck constant for the kilogram redefinition is discussed and the role of the Planck constant in physics is briefly reviewed. The operating principle of watt balance experiments is explained and the existing experiments are reviewed. An overview is given of all presently available experimental determinations of the Planck constant, and it is shown that further investigation is needed before the redefinition of the kilogram can take place.
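The operating principle can be summarized in three standard textbook relations (a schematic sketch, not a transcription of any particular experiment): the weighing and moving phases share the same flux-gradient factor BL, and measuring the electrical quantities via the Josephson and quantum Hall effects makes the product UI proportional to the Planck constant.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Watt balance principle; f_1, f_2 are the Josephson frequencies
% entering the voltage and current measurements.
\begin{align}
  \text{weighing phase:}\quad & m g = B L\, I,\\
  \text{moving phase:}\quad & U = B L\, v
    \quad\Rightarrow\quad m g v = U I,\\
  \text{with } U,\ I \text{ via Josephson and quantum Hall effects:}\quad
  & U I \propto f_1 f_2\, h
    \quad\Rightarrow\quad m = \text{const}\times\frac{f_1 f_2}{g\,v}\, h.
\end{align}
\end{document}
```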
It is now accepted that the International System of Units will be redefined in terms of fundamental constants. At present, the best estimates of the values of the fundamental constants are given by a least-squares adjustment carried out under the auspices of the Committee on Data for Science and Technology (CODATA) Task Group on Fundamental Constants. Among the fundamental constants, the fine-structure constant plays a particular role: it is dimensionless and therefore independent of the system of units. The fine-structure constant sets the scale of the electromagnetic interaction and can be measured in various fields of physics. Some of the experiments from which the fine-structure constant can be deduced are reviewed, even though its present value is determined by only two experiments. The specific role of the fine-structure constant in the proposed new SI is underlined.
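As a numerical illustration of the defining relation α = e²/(4πε₀ℏc), the snippet below evaluates it from CODATA-recommended values, assumed here for illustration only (ε₀ is no longer exact in the revised SI).

```python
# Fine-structure constant from its defining relation.
import math

e = 1.602176634e-19      # elementary charge, C (exact in the revised SI)
h = 6.62607015e-34       # Planck constant, J s (exact in the revised SI)
c = 299792458.0          # speed of light, m/s (exact)
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m (CODATA 2018 value)

hbar = h / (2 * math.pi)
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha   = {alpha:.9e}")    # ~7.297e-3
print(f"1/alpha = {1/alpha:.6f}")  # ~137.036
```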