Ebook: Foundations of Quantum Theory
This volume provides a summary of the lectures presented at the International School of Physics "Enrico Fermi" on the Foundations of Quantum Theory, organized by the Italian Physical Society in Varenna, Italy, from 8 to 13 July 2016, in collaboration with the Wilhelm und Else Heraeus-Stiftung. It was the first "Enrico Fermi" Summer School on this topic since 1977.
Its main goal was to provide an overview of the recent theoretical and experimental developments in an active field of research, the foundations of quantum mechanics. The field is characterized by a dichotomy of unparalleled agreement between theory and experiment on the one hand, and an enormous variety of interpretations of the underlying mathematical formalism on the other hand.
This proceedings volume of the "Enrico Fermi" Summer School of July 2016 contains 21 contributions on a range of topics: the history and interpretations of quantum theory; the principle of complementarity and wave-particle duality; quantum theory from first principles; the reality of the wave function; the concept of the photon; measurement in quantum theory; the interface of quantum theory and general relativity; and quantum optical tests of quantum theory.
The International School of Physics “Enrico Fermi” on the Foundations of Quantum Theory was organized by the Italian Physical Society in Villa Monastero, Varenna, Italy, during July 8–13, 2016, in collaboration with the Wilhelm und Else Heraeus-Stiftung. In the great tradition of the Fermi Schools the main goal was to provide an overview of the recent theoretical and experimental developments of an active field of research, in this case the foundations of quantum mechanics. The timing was especially appropriate considering that the last “Enrico Fermi” Summer School on this topic took place in 1977.
Quantum mechanics is characterized by a dichotomy of unparalleled agreement between theory and experiment on the one hand, and an enormous variety of interpretations of the underlying mathematical formalism on the other. David Mermin summarized the prevailing attitude towards this dichotomy in the famous phrase “shut up and calculate!”(1)
(1) See for example: N. D. Mermin, What's wrong with this pillow?, Physics Today, 42 (4) (1989) 9. It is interesting that this quote is often attributed to Richard P. Feynman; on this question we recommend N. D. Mermin, Could Feynman have said this?, Physics Today, 57 (5) (2004) 10.
However, triggered by the rapid advance of experimental techniques in quantum optics, and by the development of the field of quantum technology, which takes advantage of the correlations of entangled quantum systems, the question of the interpretation of quantum mechanics has recently received renewed attention. Moreover, the old conundrum of the physical reality of the wave function has now been tested by experiments using single photons.
These two examples serve merely as illustrations of this active field. Indeed, the topics discussed at our school included, but were not limited to, the history and interpretations of quantum theory, the principle of complementarity and wave-particle duality, quantum theory from first principles, the reality of the wave function, the concept of the photon, measurement in quantum theory, the interface of quantum theory and general relativity, and quantum optical tests of quantum theory.
The present volume summarizes the lectures presented at our School which was attended by more than 80 participants including students, lecturers and seminar speakers from all over the world. All young scientists presented their research in two poster sessions which were each introduced by a “poster-flash” session. We are most grateful to Europhysics Letters for supporting a prize for the best posters. Dennis Rätzel (First Prize), Yael Avni (Second Prize), and Da-Wei Wang (Third Prize) together with the winners of the Fourth Prize Sven Abend, Lorenzo Catani and Piotr Roztocki were invited to summarize their contributions in this volume.
The Proceedings of the International School of Physics “Enrico Fermi” on the Foundations of Quantum Theory open from a historical perspective with two contributions by Nancy Thorndike Greenspan describing the life and the science of Max Born, based on her book entitled The End of the Certain World.
In the spring of 1925 Werner Heisenberg had escaped from Göttingen to the island of Helgoland to cure his hay fever. It was there that he discovered quantum mechanics. Considering the radical change of principles involved, it is not surprising that his seminal article in Zeitschrift für Physik is hard to read. Manfred Kleber in his contribution summarizes concisely the underlying ideas of Heisenberg's transition from Born's Atommechanik to Matrizenmechanik.
Although quantum mechanics originated from matrix mechanics, today we almost always employ the formulation pioneered by Erwin Schrödinger based on a time-dependent wave function. The history and the derivation of the Schrödinger equation constitute the topic of the lectures by Wolfgang P. Schleich. His main theme is the linearization of the nonlinear wave equation of statistical mechanics.
Gerd Leuchs in his contribution complements the preceding, more mathematical approach to the Schrödinger wave equation with a more intuitive one. He employs a combination of the analogy between matter and water waves, and dispersion relations.
In 1935 Schrödinger identified entanglement as the trademark of quantum mechanics. Indeed, it is at the very heart of Bell's inequalities, and many alien features of quantum theory can be traced back to it. Edward Fry in his lectures summarizes in an impressive way the early history of this field, emphasizing the important role of Grete Hermann. Already in 1935 she discovered the flaw in von Neumann's proof that it is impossible to complete quantum mechanics. Da-Wei Wang extends these two-particle considerations to three particles and proposes a new way to generate mesoscopic Greenberger-Horne-Zeilinger states. In the same spirit Piotr Roztocki employs in his contribution a frequency comb to create scalable quantum states.
The lectures by Marlan O. Scully illuminate the problem of time in the process of a quantum measurement. When and how does the observer change or reduce the state vector? Three frequently employed scenarios illustrate how “before” and “after” arguments can be misleading: The Einstein-Podolsky-Rosen situation, Wigner's friend and the quantum eraser.
John Archibald Wheeler (1911–2008) in his seminal article It from bit has vividly argued that quantum theory is information theory. In this way he can be considered the father of quantum information. He often stated that it should be possible to derive quantum mechanics from information theory. Christopher A. Fuchs has followed this path and has proposed a new interpretation of quantum mechanics, Quantum Bayesianism (QBism). In his lectures he first summarizes the most prominent interpretations of quantum mechanics and then provides an introduction to QBism. In the same spirit Giacomo Mauro D'Ariano derives free quantum field theories from elementary principles of information theory.
Across from Varenna, at the western end of Lake Como, is the place where Niels Bohr in 1927 introduced the principle of complementarity, stimulated by Heisenberg's uncertainty relation. Sabine Wölk in her lecture notes employs simple measurements on quantum systems to compare complementarity and entanglement, both of which have their roots in non-commuting operators.
The field of experimental quantum optics has opened new avenues towards tests of the foundations of quantum mechanics. Here the process of spontaneous parametric down conversion (SPDC) plays a central role and has created an avalanche of applications. Ralf Menzel in his lectures emphasizes the role of the mode function of the electromagnetic field as the carrier of the photon, and reviews his experiments on stimulated coherence and complementarity.
Quantum imaging is another product of SPDC. In his lecture Robert W. Boyd summarizes this active field by giving three examples: ghost imaging, imaging based on interaction-free measurements and imaging based on Mandel's induced coherence.
The reality of the wave function is an often debated question. So far it has mostly been part of philosophical discussions. However, SPDC has moved these questions from the realm of Gedanken experiments to real ones. The lectures by Andrew White addressed these issues. Unfortunately, due to time constraints he was not able to provide us with a paper. Likewise Aephraim Steinberg could not contribute. Fortunately, the notes by Lorenzo Catani et al. address some aspects of these questions. In particular, they discuss contextuality as a resource in quantum computation.
We recall that quantum mechanics originated from the analysis of blackbody radiation. In contrast to conventional wisdom, Max Planck did not quantize the light in the resonator but the mechanical oscillators in the walls. Quantized electrodynamics had to wait until the Drei-Männer-Arbeit of Born, Heisenberg and Pascual Jordan. They rederived the result of Albert Einstein concerning the fluctuations of the radiation field in the thermal state from a field-theoretical approach. The Casimir effect, that is the attraction of two uncharged conducting plates, is another consequence of these fluctuations. Yael Avni in her contribution summarizes her recent work with Ulf Leonhardt on Casimir forces in spherically symmetric dielectric media.
The theories of special and general relativity together with quantum mechanics are rightfully considered the major revolutions in physics of the 20th century. Despite the fact that by now they are almost 100 years old, general relativity and quantum mechanics have not been unified yet. The lectures of Daniel M. Greenberger provide insight into the reasons for this resistance and identify the strange roles of proper time and mass. He also discusses consequences originating from considering them as dynamical variables.
The field of atom optics is perfectly suited to probe this interface of quantum mechanics and general relativity. On the one hand we use the wave nature of atoms for interferometry; on the other hand, due to their mass, the atoms feel gravity. Ernst M. Rasel in his lectures provides an introduction to atom interferometry, discusses a quantum test of the equivalence principle and gives an outlook on experiments in space. The contribution of Sven Abend et al. expands on this theme and discusses a new avenue based on an atom-chip gravimeter.
Since light carries energy, it must also gravitate. Already in 1931 Richard Tolman, together with Paul Ehrenfest and Boris Podolsky, showed that a pencil of light leads to a curvature of spacetime. The contribution by Dennis Rätzel et al. summarizes gravitational properties of light.
Not included in this volume are two other highlights of our school. Nancy Thorndike Greenspan had discovered a movie taken by the Nobel Prize winner Irving Langmuir at the Solvay Meeting of 1927. It was impressive to see the famous quantum physicists of the time “in action” rather than sitting around a table. Moreover, our school ended with the playing of Mozart's Piano Concerto in A major, KV 488, recorded around 1965 by the Bavarian Radio Symphony Orchestra under the conductor Rudolf Albert. The soloist was Werner Heisenberg. We owed this pleasure to Manfred Kleber who had found this treasure. Many thanks, Manfred, for sharing it with us!
All activities were inspired by the breathtaking beauty of Lake Como, the Villa Monastero and its gardens, and by the rich heritage of the Enrico Fermi International School of Physics. The success of the school, measured by the exceptionally large number of interactions between the participants and the extremely lively discussions during and immediately after the talks, in the park and on several excursions, is also due to the excellent organizational and administrative support provided by the staff of the Italian Physical Society. We are also most grateful to the Wilhelm und Else Heraeus-Stiftung for its generous monetary support.
E.M. Rasel, W.P. Schleich and S. Wölk
Germany and German science underwent seismic upheavals from the start of World War I to the end of World War II. The political history exemplifies nationalistic aggression. The scientific history showcases breathtaking insights into understanding nature. Both histories revolutionized the world's future. How did the individual scientist maintain his focus and discipline to make these breakthroughs in the midst of chaos? The lives of Max Born and his friends and colleagues give a glimpse into what they endured and what they accomplished.
Quantum mechanics has many fathers. The contributions of some have been lost to its story in part because of political events, the personalities of the fathers, and the overarching Copenhagen Interpretation that highlighted the ideas of Niels Bohr and Werner Heisenberg. One physicist who made fundamental contributions but who is little acknowledged is Max Born. His mathematical formulations provided much of the basis for the solution as well as the interpretation and completion of Heisenberg's mathematical insight.
The development of quantum mechanics is inseparably connected with Niels Bohr and Werner Heisenberg. We review the period during which the Bohr model was developed, reached its limits, and was finally replaced by Heisenberg's quantum mechanics. We show how the theory was obtained by a team of brilliant scientists. In this lecture we bring together historical aspects and mathematical details.
We obtain the Schrödinger equation from the Hamilton-Jacobi equation of classical mechanics together with the law of conservation of matter. It is the quantum current in this continuity equation which ensures the linearity of quantum mechanics.
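To make the logic concrete, here is a minimal sketch in standard Madelung-type notation (the notation, not necessarily the exact route taken in the lectures). Writing the wave function in polar form,
\[
\psi(\mathbf{r},t)=\sqrt{\rho(\mathbf{r},t)}\,e^{iS(\mathbf{r},t)/\hbar},
\]
the conservation of matter and the quantum Hamilton-Jacobi equation read
\[
\frac{\partial\rho}{\partial t}+\nabla\cdot\Big(\rho\,\frac{\nabla S}{m}\Big)=0,
\qquad
\frac{\partial S}{\partial t}+\frac{(\nabla S)^2}{2m}+V-\frac{\hbar^2}{2m}\,\frac{\nabla^2\sqrt{\rho}}{\sqrt{\rho}}=0,
\]
and these two coupled nonlinear equations combine into the single linear equation
\[
i\hbar\,\frac{\partial\psi}{\partial t}=-\frac{\hbar^2}{2m}\,\nabla^2\psi+V\psi.
\]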
For any kind of wave phenomenon one can find ways to derive the respective dispersion relation from experimental observations and measurements. This dispersion relation determines the structure of the wave equation and thus characterizes the dynamics of the respective wave. Different wave phenomena are thus governed by different differential equations. Here we want to emphasize the experimental approach to matter waves, but before doing so we will discuss and test the procedure for other types of waves, in particular water waves.
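As an illustration of this procedure (a textbook sketch rather than a transcription of the lecture notes): combining the de Broglie relations \(E=\hbar\omega\) and \(p=\hbar k\) with the nonrelativistic energy \(E=p^2/2m\) gives the matter-wave dispersion relation
\[
\omega(k)=\frac{\hbar k^2}{2m},
\]
and the simplest wave equation whose plane-wave solutions \(e^{i(kx-\omega t)}\) obey it is
\[
i\hbar\,\frac{\partial\psi}{\partial t}=-\frac{\hbar^2}{2m}\,\frac{\partial^2\psi}{\partial x^2}.
\]
Deep-water waves, by contrast, obey \(\omega(k)=\sqrt{gk}\) and are therefore governed by a different differential equation.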
A brief history of the beginnings of quantum mechanics will be presented together with arguments regarding its interpretation, including the very important, but ignored, argument by Grete Hermann. The early discussions were basically philosophical, and it was not until Bell produced his inequality that an experimental test became a real possibility. Nevertheless, in the two decades following Bell's development of his inequality, many physicists had a negative attitude towards questioning quantum mechanics and a disdain for doing experiments to test a Bell inequality. Even so, four experiments were done in the 1970s by young physicists at the beginning of their careers; those initial experiments and their results will be briefly described. It should be noted that by the 1980s, with the completion of the experiments led by the young Alain Aspect, the culture had begun to change, and many experiments have since been done, culminating in the three recent tests of Bell inequalities that for the first time simultaneously closed both the detection and locality loopholes.
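For reference, the form of Bell's inequality tested in these experiments is the CHSH inequality: for correlation functions \(E(a,b)\) of measurements with settings \(a,a'\) on one particle and \(b,b'\) on the other, any local hidden-variable theory satisfies
\[
|S|=|E(a,b)-E(a,b')+E(a',b)+E(a',b')|\le 2,
\]
whereas quantum mechanics predicts values up to \(|S|=2\sqrt{2}\) for a maximally entangled pair with suitably chosen settings.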
A mechanism of chiral spin-wave rotation is introduced to systematically generate mesoscopic Greenberger-Horne-Zeilinger states.
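For context, the mesoscopic states in question are of the standard \(N\)-particle Greenberger-Horne-Zeilinger form
\[
|\mathrm{GHZ}_N\rangle=\frac{1}{\sqrt{2}}\big(|0\rangle^{\otimes N}+|1\rangle^{\otimes N}\big),
\]
with \(N\) well beyond the two- and three-particle cases discussed above.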
In the classical regime, frequency comb sources have revolutionized high-precision metrology and spectroscopy; in this paper, we discuss recent developments, which are extending their use to scalable quantum state generation.
The famous Einstein-Podolsky-Rosen (EPR) “paradox” is a good example of how “before” and “after” arguments can be misleading. Wigner's friend is an even better case, raising questions such as: when and how does the observer change or “reduce” the state vector? Perhaps the best and most insightful Lehrbeispiel of how to think about before and after issues comes from the quantum eraser. In this case Bayesian logic helps clear up before and after confusion via detailed, but simple, calculations.
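A minimal version of this bookkeeping (a schematic illustration, not Scully's full calculation): for orthogonal path states \(\psi_1\) and \(\psi_2\) entangled with a which-path marker, the unconditioned detection probability at the screen shows no fringes,
\[
P(x)=\tfrac{1}{2}\,|\psi_1(x)|^2+\tfrac{1}{2}\,|\psi_2(x)|^2,
\]
while conditioning on the two outcomes \(\pm\) of an eraser measurement on the marker restores two complementary fringe patterns,
\[
P(x\,|\,\pm)=\tfrac{1}{2}\,|\psi_1(x)\pm\psi_2(x)|^2.
\]
Nothing changes retroactively at the screen; only the sorting of the data according to the later eraser outcome reveals the fringes.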
This article situates QBism among other well-known interpretations of quantum mechanics. QBism has its roots in personalist Bayesian probability theory, is crucially dependent upon the tools of quantum information, and in its latest developments has set out to investigate whether the physical world might be of a type sketched in the philosophies of pragmatism, pluralism, nonreductionism, and meliorism. Beyond conceptual issues, the technical side of QBism is focused on the mathematically hard problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves an entity called a symmetric informationally complete quantum measurement, or SIC. Contemplation of it gives a way to think of the Born rule as an addition to the rules of probability theory, applicable when an agent considers gambling on the consequences of her actions on the external world, duly taking into account a new universal capacity: namely, Hilbert-space dimension. The article ends by showing that the egocentric elements in QBism represent no impediment to pursuing quantum cosmology and even open up possibilities never dreamt of in the usual formulations.
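To display the sense in which the Born rule becomes an addition to probability theory, consider a SIC with outcomes \(i=1,\dots,d^2\) in Hilbert-space dimension \(d\). If \(p(i)\) is the agent's probability for outcome \(i\) of a counterfactual SIC measurement and \(r(j|i)\) her conditional probability for outcome \(j\) of the measurement actually performed, the Born rule takes the form (a sketch following Fuchs' papers; conventions may vary slightly)
\[
q(j)=\sum_{i=1}^{d^2}\Big[(d+1)\,p(i)-\frac{1}{d}\Big]\,r(j|i),
\]
to be compared with the law of total probability \(q(j)=\sum_i p(i)\,r(j|i)\); the deformation is governed solely by the dimension \(d\).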
Recently the new information-theoretic paradigm of physics advocated by John Archibald Wheeler and Richard Feynman has concretely shown its full power, with the derivation of quantum theory and of free quantum field theory from informational principles. The paradigm has opened for the first time the possibility of avoiding physical primitives in the axioms of the physical theory, allowing a re-foundation of the whole of physics on logically solid grounds. In addition to such methodological value, the new information-theoretic derivation of quantum field theory is particularly interesting for establishing a theoretical framework for quantum gravity, with the idea of obtaining gravity itself as emergent from the quantum information processing, as also suggested by the role played by information in the holographic principle. In these lecture notes I review how free quantum field theory is derived without using mechanical primitives, including space-time, special relativity, Hamiltonians, and quantization rules. The theory is provided by the simplest quantum algorithm encompassing a countable set of quantum systems whose network of interactions satisfies the following three simple principles: homogeneity, locality, and isotropy. The inherent discrete nature of the informational derivation leads to an extension of quantum field theory in terms of quantum cellular automata and quantum walks. A simple heuristic argument sets the scale to the Planck one, and the currently observed regime where discreteness is not visible is the so-called “relativistic regime” of small wave vectors, which holds for all energies ever tested (and even much larger ones), where the usual free quantum field theory is perfectly recovered. In the present quantum discrete theory Einstein's relativity principle can be restated without using space-time, in terms of invariance of the eigenvalue equation of the automaton/walk under change of representations. Distortions of the Poincaré group emerge at the Planck scale, whereas special relativity is perfectly recovered in the relativistic regime. Discreteness, on the other hand, has some advantages compared to the continuum theory: 1) it contains it as a special regime; 2) it leads to some additional features with a general-relativistic flavor: the existence of an upper bound for the particle mass (with the physical interpretation of the Planck mass), and a global de Sitter invariance; 3) it provides its own physical standards for space, time, and mass within a purely mathematical, dimensionless context. The lecture notes end with future perspectives.
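To give a flavor of such a dynamics (a minimal one-dimensional sketch in standard quantum-walk notation, not the full derivation of the notes): a discrete-time walk on a two-component field is, in momentum space, a unitary of the form
\[
W(k)=\begin{pmatrix} n\,e^{ik} & i\,m\\ i\,m & n\,e^{-ik}\end{pmatrix},\qquad n^2+m^2=1,
\]
with dispersion relation \(\cos\omega(k)=n\cos k\). For small \(k\) and \(m\) this reduces to \(\omega^2\approx k^2+m^2\), the dispersion relation of a Dirac particle: the usual free field theory is recovered in the relativistic regime of small wave vectors.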
Since the beginning of quantum mechanics, many puzzling phenomena that distinguish the quantum from the classical world have appeared, such as complementarity, entanglement, and contextuality. All of these phenomena are based on the existence of non-commuting observables in quantum mechanics. Furthermore, these effects generate advantages which allow quantum technologies to surpass classical technologies. In these lecture notes, we investigate two prominent examples of these phenomena: complementarity and entanglement. We discuss some of their basic properties and introduce general methods for their experimental investigation. In this way, we find many connections between the investigation of complementarity and that of entanglement. One of these connections is given by the Cauchy-Schwarz inequality, which helps to formulate quantitative measurement procedures to observe complementarity as well as entanglement.
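Two standard quantitative statements of this kind (textbook forms, quoted here for orientation): the wave-particle duality relation bounding interference visibility \(V\) and which-way distinguishability \(D\),
\[
V^2+D^2\le 1,
\]
and the Cauchy-Schwarz inequality for intensity correlations between two modes,
\[
\big(g^{(2)}_{12}\big)^2\le g^{(2)}_{11}\,g^{(2)}_{22},
\]
which holds for all classical fields, so that its violation witnesses nonclassical correlations.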
In quantum physics we are confronted with new entities which consist indivisibly of an energy packet and a coupled wave. The complementarity principle for certain properties of these quantum objects may be their main mystery. Photons are especially useful for investigating these complementary properties. A series of new experiments using spontaneous parametric down-conversion (SPDC) as a tool has allowed a detailed analysis of the physical background of this complementarity and offers a new conceptual perspective. Based on these results, a straightforward explanation of these sometimes counterintuitive effects is given.
We present a brief overview of the field of quantum imaging, concentrating on some recent results. Quantum imaging is a specific example of quantum metrology, and we thus start out with a discussion of quantum metrology including the generation of squeezed light and the generation of entangled photon pairs through the process of spontaneous parametric downconversion (SPDC). We then proceed to review three different examples of quantum imaging, namely ghost imaging, imaging based on interaction-free measurements, and imaging based on Mandel's induced coherence.
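For reference, SPDC in two modes (signal \(s\), idler \(i\)) is described by the effective interaction Hamiltonian (standard notation)
\[
\hat H=i\hbar\kappa\,\hat a_s^\dagger\hat a_i^\dagger+\mathrm{h.c.},
\]
which, acting on the vacuum, generates the two-mode squeezed state
\[
|\psi\rangle=\frac{1}{\cosh r}\sum_{n=0}^{\infty}(\tanh r)^n\,|n\rangle_s|n\rangle_i,
\]
whose perfect photon-number correlations between signal and idler are the resource exploited in ghost imaging.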
Spekkens' toy model (SM) is a non-contextual hidden-variable model built to support the epistemic view of quantum theory, in which quantum states are states of partial knowledge about a deeper underlying reality. Despite being a classical model, it has reproduced many features of quantum theory (entanglement, teleportation, …): (almost) everything but contextuality, which therefore seems to be the inherent quantum feature. In addition to its importance in the foundations of quantum theory, the notion of contextuality seems to be a crucial resource for quantum computation. In particular it has been proven that, in the case of odd prime discrete dimensional systems, contextuality is necessary for universal quantum computation in state-injection schemes of computation based on stabilizer quantum mechanics (SQM). The latter is a subtheory of quantum mechanics which is very popular in the field of quantum computation and quantum error correction. State-injection schemes consist of a classically simulable part (like SQM) and a resource state that boosts the computation to a quantum improvement. In the odd-dimensional case, SM is operationally equivalent to SQM. In the even-dimensional case, the equivalence only holds in terms of structure, not in terms of statistical predictions. This is because qubit-SQM shows contextuality, while qudit-SQM (in odd dimensions) does not. We believe that SM can be a valid tool to study contextuality as a resource in the field of quantum computation. Restricted versions of SM compatible with quantum mechanics (QM) can be used as the non-contextual, classically simulable part of state-injection schemes, thus opening further scenarios in which to study whether contextuality is necessary for quantum computational speed-up.
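A compact illustration of the state-independent contextuality of qubit-SQM is the Peres-Mermin square of two-qubit Pauli observables (a standard example, quoted here for orientation rather than taken from the notes):
\[
\begin{array}{ccc}
X\otimes I & I\otimes X & X\otimes X\\
I\otimes Y & Y\otimes I & Y\otimes Y\\
X\otimes Y & Y\otimes X & Z\otimes Z
\end{array}
\]
The three observables in each row and in each column commute; every row and the first two columns multiply to \(+I\otimes I\), while the third column multiplies to \(-I\otimes I\), so no noncontextual assignment of values \(\pm 1\) can satisfy all six product constraints.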
The most well-known manifestation of the Casimir effect is the attraction of two uncharged conducting plates. However, it turns out that Casimir forces are all around us: they originate from vacuum fluctuations of the electromagnetic field that excite dipoles in dielectric and conducting materials. These dipoles then interact with each other, generating measurable forces between macroscopic bodies: the Casimir force. A naive calculation of the Casimir force produces infinities, and though extensive work has been done in the field, there is still no universal prescription to renormalize the force. In this paper, we introduce the subject of Casimir forces and focus on the Casimir self-stress inside a homogeneous sphere. We discuss previous calculations and suggest an additional renormalization scheme that could solve the problem.
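For orientation, the textbook result for two ideal plates at separation \(d\) is the attractive pressure
\[
\frac{F}{A}=-\frac{\pi^2\hbar c}{240\,d^4},
\]
finite only because the divergent vacuum energy in the presence of the plates is subtracted from the divergent vacuum energy without them; it is precisely this kind of subtraction that lacks a universal prescription in inhomogeneous geometries such as a dielectric sphere.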
Mass is conventionally introduced into physical theories as a passive parameter, m0. As such, it plays no dynamical role in the theory, nor can it change. But in practice, particles decay and recombine, changing their mass. They also acquire binding energies, which change their mass, and may have an energy uncertainty, and hence also a mass uncertainty. Similarly, the proper time of a particle is defined along its trajectory. But quantum mechanically, trajectories can be split and recombined, or they may not be well-defined at all. So the proper time also has a dynamical role to play. We also show that there is a natural extension of the equivalence principle that is needed to include unstable particles. Both proper time and mass should be treated as quantum-mechanical operators, whose values are determined by measurement. The Hamiltonian formalism has a natural extension to include them as an extra coordinate and conjugate momentum, allowing one to construct both a classical and a quantum theory of particles that can decay, have binding energies and obey the uncertainty principle.
We examine here some of the effects that are produced by considering mass and proper time as dynamical variables. First we consider Galilean invariance and point out that the Bargmann theorem, according to which masses cannot be superposed in non-relativistic (NR) quantum theory, is no longer valid. We also point out that while Galilean invariance is a consistent requirement of the NR Schrödinger equation as such, it provides a poor description of the NR limit of Lorentz invariance, as the proper time leaves a residue that is independent of c in this limit. Next we show that there is an inevitable uncertainty relation between mass and proper time and give several examples. Finally, we show that the classical limit is different for non-gravitational forces and for gravitational forces that lead to the equivalence principle.
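Schematically, treating proper time \(\tau\) and mass \(m\) as conjugate variables (a sketch of the idea; sign and factor conventions may differ from the lectures) gives
\[
[\hat\tau,\,\hat m c^2]=i\hbar \quad\Longrightarrow\quad \Delta\tau\,\Delta m\ \ge\ \frac{\hbar}{2c^2},
\]
so a particle with a sharply defined mass cannot carry a sharply defined proper time; for an unstable particle this reproduces the familiar relation between lifetime and mass width.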
We provide an introduction to the field of atom optics and review our work on interferometry with cold atoms, and in particular with Bose-Einstein condensates. Here we emphasize applications of atom interferometry with sources of this kind. We discuss tests of the equivalence principle, a quantum tiltmeter, and a gravimeter.
We introduce two generations of quantum gravimeters using Bose-Einstein condensates generated in atom-chip–based set-ups. The first one is a prototype gravimeter implemented in QUANTUS-1, which allowed us to demonstrate the first atom-chip–based gravity determination. The second device is the next-generation quantum gravimeter QG-1, targeting sub-μGal uncertainties for mobile applications.
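The quantity determined in such a light-pulse gravimeter is the Mach-Zehnder interferometer phase (standard result)
\[
\Delta\phi=k_{\mathrm{eff}}\,g\,T^2,
\]
with \(k_{\mathrm{eff}}\) the effective two-photon wave vector of the beam-splitting pulses and \(T\) the pulse separation time; since the phase grows quadratically with \(T\), longer free-fall times translate directly into finer resolution of \(g\).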
In this article, recent work by the authors on the gravitational properties of light is reviewed. In the first part, the gravitational field of a laser pulse of finite lifetime is investigated in the framework of linearized gravity. In the second part, the dependence of the differential cross section for gravitational photon-photon scattering on the polarization entanglement between the photons is investigated in Perturbative Quantum Gravity. These investigations are of conceptual interest regarding the properties of light and its constituents, the photons.
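The framework of the first part is linearized gravity: writing \(g_{\mu\nu}=\eta_{\mu\nu}+h_{\mu\nu}\) with \(|h_{\mu\nu}|\ll 1\), the field of the pulse follows, in harmonic gauge and in terms of the trace-reversed perturbation \(\bar h_{\mu\nu}\), from
\[
\Box\,\bar h_{\mu\nu}=-\frac{16\pi G}{c^4}\,T_{\mu\nu},
\]
sourced by the energy-momentum tensor \(T_{\mu\nu}\) of the electromagnetic pulse.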