Ebook: Metrology: from Physics Fundamentals to Quality of Life
Metrology is a constantly evolving field, and one which has developed in many ways in the last four decades.
This book presents the proceedings of the Enrico Fermi Summer School on the topic of Metrology, held in Varenna, Italy, from 26 June to 6 July 2016. This was the 6th Enrico Fermi summer school devoted to metrology, the first having been held in 1976. The 2016 program addressed two major new directions for metrology: the work done in preparation for a possible re-definition of four of the base units of the SI in 2018, and the impact of the application of metrology to issues addressing quality of life – such as global climate change and clinical and food analysis – on science, citizens and society. The lectures were grouped into three modules: metrology for quality of life; fundamentals of metrology; and physical metrology and fundamental constants. Topics covered included food supply and safety; biomarkers; monitoring climate and air quality; the new SI units; measurement uncertainty; fundamental constants; electrical metrology; optical frequency standards; and photometry and light metrology.
The book provides an overview of the topics and changes relevant to metrology today, and will be of interest to both academics and all those whose work involves any of the various aspects of this field.
We are delighted to introduce the proceedings of the sixth Enrico Fermi Summer School to have addressed the topic of Metrology. The school was held in Varenna from June 26th to July 6th, 2016, and was organised by the Italian Physical Society (SIF), the Bureau International des Poids et Mesures (BIPM), and the Italian Istituto Nazionale di Ricerca Metrologica (INRIM).
Metrology is an active field that has developed in many ways since the first of these schools devoted to metrology was held in 1976. The successive schools have reflected these developments. The sixth is no exception, and may in time come to be seen as the one that signalled the greatest changes.
The programme addressed two major new directions for metrology. The first was the work done in preparation for a possible re-definition of four of the base units of the SI in 2018. The main principles of the possible redefinition of the base units were explained, the challenges involved in the accurate measurement of fundamental constants were described, and attention was brought to the proposed changes in the dissemination of the units following the new definitions.
Secondly, the lectures included a deep reflection on the impact of the application of metrology to chemical measurements on science, citizens and society. The need for precise and traceable measurements in chemistry was exemplified by their application to issues addressing quality of life, global climate change, clinical and food analysis.
For the first time, the lectures were grouped into three modules held over 10 days. All students took part in a core module on the Fundamentals of Metrology. They were then able to attend one or both of the optional modules: Metrology for Quality of Life, and Physical Metrology and Fundamental Constants. In this way, students could choose to be part of the school for the whole 10 days, or for only 7 days, according to the topics of greatest interest to them. The topics addressed in the modules were:
Module I: Metrology for Quality of Life (26th June to 29th June 2016)
Metrology in chemistry, Food supply and safety, Biomarkers, Methods and materials for clinical measurements, Climate and air quality monitoring, Redefinition of the mole.
Module II: Fundamentals of Metrology (30th June to 2nd July 2016)
The New International System of Units, Fundamental constants, Quantum metrology, Nanotechnology for metrology, Measurement uncertainty, the Organization of international metrology.
Module III: Physical Metrology and Fundamental Constants (3rd July to 6th July 2016)
Electrical metrology, the Future of the standard of mass, Temperature metrology, Optical frequency standards, Metrology in space, Photometry and light metrology.
As always, the success of a school is due to the dedication and effort of many people and institutions. Our special thanks go to:
– the lecturers who prepared their lessons and manuscripts and were available to spend time with the students, visiting their poster sessions, discussing ideas and participating in the award jury.
– the 77 students and observers who were selected from 120 applications and who created a stimulating and warm atmosphere throughout the period of the school.
– the scientific secretary, Natascia De Leo from INRIM, who devoted a lot of her time, energy, and creativity to the success of the school.
We particularly want to thank the sponsors: the Fondazione Cariplo, CalPower, FEI, Europhysics Letters, METAS, SIM, and NIST. They supported the participation of students who did not have access to sufficient funding of their own, as well as the best-poster award, which encouraged students to present their work clearly and effectively.
Finally, we wish to thank Prof. Luisa Cifarelli, SIF President, for promoting the 6th edition of a school devoted to metrology, Mrs Barbara Alzani (SIF), Ramona Brigatti and Arianna Minonzio for their invaluable support in the management and organisation of the school.
P. Tavella, M.J.T. Milton and M. Inguscio
Reliability of medical tests is a major public-health challenge: it underpins correct diagnosis and the adjustment of treatments. The input of metrology is necessary to evaluate and improve the comparability and reliability of medical tests. As required by Directive 98/79/EC on in vitro diagnostic medical devices, establishing the traceability of results to higher-order reference methods and materials helps to harmonize results obtained in different laboratories, even if they do not use the same method. Using reference methods to value-assign quality control materials then makes it possible to assess the trueness of routine methods. A prerequisite, however, is that the quality control materials are commutable, that is to say, they should mimic the behavior of unmodified patient samples. In this lecture, the importance of reference methods and commutability will be explained through different examples.
Reference methods and Certified Reference Materials are necessary to assess and improve the comparability and reliability of medical tests. However, the development of reference measurement systems is more complex for some biomarkers than for others. While the state of the art in metrology is mature enough to address most metabolites and electrolytes quite easily, absolute quantification of protein biomarkers is much more difficult, for many reasons. Hyphenated quantification techniques are often necessary to measure low concentrations of large proteins in complex matrices. As different isoforms can have different clinical significance, the structural heterogeneity of proteins complicates the situation even further and requires careful definition of the measurand. Because of the relationship between folding and activity, absolute quantification of some complex proteins remains a challenge, and some biomarkers are so large that they can even be considered bionanoparticles. Through different case studies, this presentation will highlight a number of challenges associated with the absolute quantification of biological entities and the establishment of traceability chains in clinical diagnostics.
The Earth's climate is changing, but the rate of change and the scale and timing of its impacts are far from understood and remain the subject of much debate. Differences of up to 5 °C can be seen between climate forecast models, making it difficult for policy makers to take the necessary measures to mitigate and adapt to a warming planet. Global observations of the Earth by satellites are the only way to provide the data necessary to improve the fidelity of the predictions and to test and constrain the models. These data must be collected over decades to allow small trends to be sufficiently aggregated so that they can be reliably detected against a background of natural variability. This places very high demands on the performance of satellite sensors: no single satellite will have the lifetime necessary to monitor change on its own, so the accuracy must be sufficient to remove any prospect of bias from instrument drift propagating into the long time-base measurements, which will necessarily be derived from a combination of satellites. At present few satellites are designed specifically for climate, and none with the ability to demonstrate robust SI traceability in orbit at the uncertainty levels necessary (close to those of primary standards in the laboratory). This paper describes some of the scientific challenges and culminates in a proposed satellite mission designed explicitly to address them, one which, in so doing, also has the ability to upgrade the rest of the Earth Observing system through reference cross-calibration.
The physical quantity “amount of substance” is measured in the unit “mole” (symbol: mol). Because of its current definition (via 12 g of 12C), it is closely related to another SI base unit, the kilogram. In principle there is an important difference between mass and amount of substance: for amount of substance, not only the amount of material is relevant, but also the unambiguous specification of the entities forming that amount of material. The practical realization and dissemination of the unit mole is usually based on materials characterized with regard to their purity. The amount of substance is then derived from the mass, the molar mass, and said purity. In the wake of the redefinition of the International System of Units (SI), which aims at a set of defining fundamental constants, the definition of the mole will also be rewritten. In its new definition it will be based on a fundamental constant, the Avogadro constant. In this way the link to the definition of the kilogram will also be broken, underpinning the importance of the mole as a base unit in its own right.
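The derivation described above can be written compactly. As an illustration (with m the mass, w the mass-fraction purity, M the molar mass, and N the number of specified entities):

```latex
% Practical realization: amount of substance from a purity-characterized material
n = \frac{m \, w}{M}
% Under the revised definition, the mole follows directly from a fixed
% Avogadro constant (the value later adopted by the 26th CGPM in 2018):
n = \frac{N}{N_A}, \qquad N_A = 6.022\,140\,76 \times 10^{23}\ \mathrm{mol}^{-1}
```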
Reducing air pollution and preventing climate change are major challenges for the future. Monitoring the concentrations and emissions of the atmospheric gases that contribute to these effects is vital in assessing the impact of policies to reduce both air pollutant and greenhouse gas emissions. For these data to be reliable, the monitoring instrumentation must be calibrated with gas standards whose values are traceable to Primary Reference Gas Mixtures. Such gas standards are produced by National Metrology Institutes. The paper describes the level of uncertainty and compatibility that can be achieved for a number of greenhouse gas and air quality gas standards, based on the methods used to establish their values: CH4 in air and NO in nitrogen (prepared by static gravimetry); NO2 and HCHO (prepared by dynamic measurements); O3 and impurities in NOx gases (by spectroscopy); CO2 and O3 cross-sections (by manometric methods).
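As an illustrative sketch of the static-gravimetric approach mentioned above, the mole fraction of a component follows from the weighed masses and molar masses of the mixture's constituents. The component names, masses, and molar masses below are hypothetical, chosen only to show the arithmetic:

```python
def mole_fraction(masses, molar_masses, analyte):
    """Mole fraction of one component in a gravimetrically prepared
    gas mixture: x_a = (m_a / M_a) / sum_j (m_j / M_j)."""
    moles = {k: masses[k] / molar_masses[k] for k in masses}
    return moles[analyte] / sum(moles.values())

# Hypothetical CH4-in-air mixture from weighed masses (grams):
x = mole_fraction({"CH4": 0.160, "air": 99.84},
                  {"CH4": 16.043, "air": 28.965},
                  "CH4")
print(f"x(CH4) = {x:.6f}")
```

In practice the uncertainty budget also covers balance calibration, purity of the parent gases, and adsorption effects; the formula above is only the central value.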
The accurate measurement of organic compounds in a wide variety of matrices is important to a great number of sectors, including health, the environment, forensics, food safety, and pharmaceuticals. The instruments used to make such measurements require calibration, and for well-defined chemical entities the calibration hierarchy leads to a pure material that has been value-assigned for the mass fraction of the analyte within the material. Such organic primary reference materials are provided as Certified Reference Materials (CRMs) by National Metrology Institutes. This paper reviews the various methods that can be used to value-assign such reference materials, both for small organic molecules such as valine and folic acid and for larger molecules such as peptides.
We live in a world of uncertainty. We measure phenomena and quantities, and the outcomes are uncertain. Our predictions, be they about future climate or stock trends, are based on uncertain data. We continuously take decisions in conditions of incomplete knowledge. The purpose of this paper is to make readers aware of the uncertainty that surrounds our lives, and to give them some tools, both conceptual and technical, to cope with uncertainty, specifically in the domain of measurement. It is shown that probability theory is the appropriate framework in which uncertainty can be treated in a convenient and rigorous way. The author is convener of the Joint Committee for Guides in Metrology (JCGM) Working Group 1 (GUM). The opinion expressed in this paper does not necessarily represent the view of this Working Group.
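As a minimal example of the probabilistic treatment mentioned above, the GUM's first-order law of propagation of uncertainty for a quotient of uncorrelated inputs can be sketched as follows; the quantity R = V/I is purely illustrative:

```python
import math

def combined_uncertainty_ratio(v, u_v, i, u_i):
    """First-order (GUM) uncertainty propagation for R = V / I with
    uncorrelated inputs: (u_R / R)^2 = (u_V / V)^2 + (u_I / I)^2."""
    r = v / i
    u_r = r * math.sqrt((u_v / v) ** 2 + (u_i / i) ** 2)
    return r, u_r

# Example: V = 10.0 V with u(V) = 0.02 V, I = 2.0 A with u(I) = 0.01 A
r, u_r = combined_uncertainty_ratio(10.0, 0.02, 2.0, 0.01)
print(f"R = {r:.4f} ohm, u(R) = {u_r:.4f} ohm")
```

The same first-order expansion generalizes to any differentiable measurement model by summing the squared sensitivity-weighted input uncertainties.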
The CIPM Mutual Recognition Arrangement (CIPM MRA) has now been signed by the representatives of 101 institutes and covers a further 153 institutes designated by these signatories. These come from 57 Member States, 40 Associate States and Economies of the CGPM, and 4 international organizations. Through it, national metrology institutes (NMIs) can demonstrate their measurement abilities and publish internationally recognized statements of their so-called calibration and measurement capabilities (CMCs). All the data are openly available in the CIPM MRA database (the KCDB), which has become an essential reference for the NMIs themselves, for the accredited laboratory community, and for a small number of high-end industrial and other organisations. In this paper we review the situation that led to the development of the CIPM MRA, identifying the three main drivers: the challenges of regulators wanting traceability to national metrology institutes in an increasingly globalised world; the emergence of laboratory accreditation and, with it, the need for laboratories to demonstrate metrological competence; and finally the emergence and strengthening of the Regional Metrology Organizations (RMOs). The paper also addresses the structure of the CIPM MRA, its operational mechanisms and impact, and concludes with some speculative remarks as to how it might evolve in the future.
The proposals to re-define the base units of the SI are expected to be implemented in 2018. This paper describes the development of the proposals from their first publication in 2006 through to 2016. They will provide greater opportunities to perform primary realizations of the base units and thereby address one of the longest-standing goals of measurement science.
The Metre Convention was signed in Paris on 20 May 1875. Its stated purpose was to assure the international unification and perfection of the metric system. At its origin was a set of recommendations made at a Conference on geodesy held in Berlin in 1867 which called for the manufacture of a new European prototype of the metre and the creation of a European international bureau of weights and measures. The response of the French Government was to create an International Metre Commission to consider the question. The Commission duly recommended what the Berlin Conference had proposed and a Diplomatic Conference on the Metre took place in Paris. It opened on 1 March 1875 and culminated on 20 May in the signing of the Metre Convention by the representatives of 17 States. Included in the Convention was the creation of an International Bureau of Weights and Measures, a General Conference on Weights and Measures and an International Committee for Weights and Measures. The international organization comprising these three organs still exists and is the means through which Governments today arrange for and support the International System of Units, SI, the successor to the metric system. This article gives a brief description of the discussions and events surrounding all this.
This article is about the development of the metric system from its origins at the time of the French Revolution to the present day. In November 2018, the 26th General Conference on Weights and Measures (CGPM) will be invited to adopt a new definition of the International System of Units, SI, based on fixed numerical values of a set of seven defining constants, broadly the fundamental constants of physics. From these, new definitions of the seven base units of the SI will be deduced. It will then be a little more than two hundred years since a Committee of the Académie Royale des Sciences made a proposal to base a new unit of length on a fraction of the meridian of Paris and thereby initiated the creation of the metric system. The redefinition in 2018 will at last put into practice their original proposal for a system independent of time and place, accessible to all and belonging to no one nation. The key is the new possibility of replacing the present definition of the kilogram, an artefact of platinum-iridium kept at the International Bureau of Weights and Measures at Sèvres, by one based on a fixed numerical value of a fundamental constant, the Planck constant.
My presentation on “Frequency Combs” was intended as a tutorial on the basic technology, without extended reference to the latest developments in this field. I repeated parts of my contributions to Course CLXVI, “Metrology and Fundamental Constants”, held at the same school in 2006. Since I did not cover new material, the 2006 proceedings are reproduced here.
The first lecture begins with a brief overview of the origin of the metric system and the international system of units (SI). Even from the earliest days, the concept was to create a system of units based on nature. However, as one can easily see, what we actually created in practice was an artifact-based system of units, which has since 1960 been slowly transforming into a set of units that are increasingly based on nature. The first such standard was the unit of time, followed by length, and for all practical purposes the electrical units have been quantum based since the early 1990s. With the planned redefinition in 2018, the new SI, or quantum SI, will be based on a set of principles grounded in the Standard Model of physics. This lecture describes how the SI began and how it has evolved to the present day. Toward the end of the lecture a summary of quantum standards and metrology today is provided, before presenting a brief picture of a future that will include integrated embedded standards providing far better metrology for both the factory floor and consumer technology.
The second lecture in this series starts by providing a definition of quantum technology, quantum metrology, and quantum-based measurements along with a short summary of the properties required of the devices and underlying technology. These properties include that the devices must be deployable, usable, flexible, and manufacturable. The lecture then provides a brief introduction to single-photon technologies including their potential application as a source of certifiable random numbers. The lecture continues with an explanation of how this technology may be the basis for a future redefinition of the candela. This is followed by an overview of photon pressure and its use for calibrating small mass and force. The lecture then presents new concepts for measuring ultra-high vacuum and how atom-based sensors provide a means for measuring electric and magnetic fields in addition to chip-scale atomic clocks. The lecture concludes with a brief description of the dissemination of the quantum SI including the development of embedded “chip-scale” metrology in the broader infrastructure.
Gravity is the most mysterious of all interactions, even though it was the first to be formalized. It is the only interaction for which the equivalence principle holds between the gravitational charge and the inertial mass, with a number of far-reaching consequences. In addition, gravity is so weak with respect to all other interactions that it is not surprising that most precise tests of general relativity are performed in outer space. General relativity tells us that gravity concerns the curvature of space and time, so testing gravity means applying metrology at its best. In this review I discuss some of the most important tests of general relativity performed in space, starting from the observables, discussing the experimental methods, the current results and some of the future developments.
This paper reviews the existing primary standards for the measurement of optical radiation: the blackbody, synchrotron radiation, and the electrical substitution radiometer. A summary of recent and significant advances in this field is given, namely the carbon nanotube absorber in the electrical substitution radiometer and the development of the predictable quantum efficient detector. Photon-counting principles for the realization of radiometric and photometric quantities are introduced, and finally the route to the realization of the candela, the unit of luminous intensity, is depicted.
Every day people make millions of successful temperature measurements. The people who make these measurements frequently identify ways in which they would like temperature measurement to be improved. Typically people would like results with lower uncertainty and at lower cost, particularly at extremes of temperature. But nobody has ever expressed — to me at least — any dissatisfaction with the definition of what we mean by “one degree”. Nonetheless, in 2018, the CIPM will introduce a new definition of the kelvin and degree Celsius. In this paper I discuss the background to this redefinition and explain why — despite not addressing users' pressing needs — the redefinition is desirable and indeed long overdue.
The international system of units will be redefined in terms of fundamental constants in 2018, especially the kilogram, which is still defined, nowadays, by a material artefact. For many years, two experiments have been carried out to define the kilogram on the basis of an atomic or fundamental physical constant: the X-ray crystal density (XRCD) method to determine the Avogadro constant (NA) and the watt balance to determine the Planck constant (h). The two constants can be compared using other fundamental constants obtained from atomic physics. An overview of the determinations of these fundamental constants from atomic physics since 1998 is given. Their contributions to the accuracy of the comparison between the two methods are underlined. The consequences of a possible evolution of their estimates are also considered.
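The comparison referred to above rests on the molar Planck constant, which links NA and h through constants known from atomic physics with very small relative uncertainties (R∞ the Rydberg constant, α the fine-structure constant, Ar(e) the relative atomic mass of the electron, Mu the molar mass constant, c the speed of light):

```latex
N_A \, h = \frac{c \, A_r(\mathrm{e}) \, M_u \, \alpha^2}{2 R_\infty}
```

Because the right-hand side is known far more accurately than either NA or h alone, a determination of one constant by the XRCD method can be compared directly with a determination of the other by the watt balance.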
The lecture notes that follow are meant to provide a written accompaniment to the lectures given by one of the authors (WDP) in July 2016 at the “Metrology: from physics fundamentals to quality of life” Enrico Fermi International School of Physics in Varenna. The notes were originally written for an earlier Metrology school, but still provide a good summary of the material presented in 2016.
The third lecture in this series provides a summary of the primary realization of mass in the new SI and concludes with the relationship between the new Quantum SI and the field of quantum information science. Before describing how mass will be realized in the new SI, the lecture briefly reviews the problems with the current SI, which include the artefact-based definition of mass and the fact that the quantum electrical standards are part of conventional units and not the SI. The lecture gives a very brief summary of how the kilogram will be realized using the X-ray crystal density (XRCD) method, and then provides a more detailed mise en pratique based on the NIST-4 watt balance as a primary realization. The lecture also provides an overview of future technologies for the primary realization of small mass and force. The lecture then proceeds to look at the future of metrology by drawing a connection between the Quantum SI and quantum information science. The new Quantum SI will allow us to test the Standard Model of physics and to build sensors and technology good enough to provide high-resolution gravity gradiometers, improved geodesy, and detectors that may detect dark matter. In concluding, the lecture returns to the value of embedded sensors and presents a view of the coming second quantum revolution.
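In outline, the watt-balance realization mentioned above equates mechanical and electrical power in two phases, weighing (mg = BLI) and moving (U = BLv), so that the geometric factor BL cancels:

```latex
m g v = U I \quad\Longrightarrow\quad m = \frac{U I}{g v}
% With U measured against the Josephson effect (K_J = 2e/h) and I via a
% resistor calibrated against the quantum Hall effect (R_K = h/e^2),
% the product UI is proportional to h times measured frequencies,
% so the mass m is realized directly in terms of the Planck constant.
```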
Precise timekeeping and navigation systems have a lot in common. Both are based on ultraprecise atomic clocks, on the capacity to measure and synchronise clocks, and on the formation of a common reference time traced to the international standard time. These similarities are described, showing how time and frequency metrology has found one of its major applications in navigation systems, and how the pressing needs of navigation have promoted research in time metrology. The paper mostly focuses on the clock and navigation equations, the necessary algorithms, and the definition of the time scales used as references in navigation and in timekeeping.
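In their simplest form, the navigation equations referred to above relate each measured pseudorange ρi to the unknown receiver position (x, y, z) and clock offset δt, given the known satellite positions (xi, yi, zi) and satellite clock corrections δti; four satellites suffice for the four unknowns:

```latex
\rho_i = \sqrt{(x_i - x)^2 + (y_i - y)^2 + (z_i - z)^2} + c\,(\delta t - \delta t_i),
\qquad i = 1, \dots, 4
```

The clock term c δt makes explicit why navigation accuracy depends directly on the quality of the clocks and of their synchronisation to a common reference time.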
The analysis of time series of measurements, and the important instabilities they may reveal, is presented, describing some of the techniques that find useful application in the different fields of metrology.
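One widely used technique of this kind is the Allan deviation, which characterizes instability as a function of averaging time. A minimal sketch of the non-overlapping estimator at the basic sampling interval (illustrative only, not the author's implementation):

```python
import math

def allan_deviation(y):
    """Non-overlapping Allan deviation of fractional-frequency samples y
    taken at a fixed interval: sqrt( mean((y[k+1] - y[k])^2) / 2 )."""
    diffs = [(y[k + 1] - y[k]) ** 2 for k in range(len(y) - 1)]
    return math.sqrt(sum(diffs) / (2.0 * len(diffs)))

# A steady frequency drift of 1e-9 per sample is clearly revealed:
print(allan_deviation([1e-9 * k for k in range(100)]))
```

Repeating the computation on averages over longer and longer intervals traces out the full stability curve, whose slope distinguishes white noise, flicker noise, and drift.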