History of quantum field theory
In particle physics, the history of quantum field theory starts with its creation by Paul Dirac, when he attempted to quantize the electromagnetic field in the late 1920s. Major advances in the theory were made in the 1940s and 1950s, leading to the introduction of renormalized quantum electrodynamics (QED). QED was so successful and accurately predictive that efforts were made to apply the same basic concepts to the other forces of nature. By the late 1970s, these efforts had successfully applied gauge theory to the strong nuclear force and the weak nuclear force, producing the modern Standard Model of particle physics.
Efforts to describe gravity using the same techniques have, to date, failed. The study of quantum field theory is still flourishing, as are applications of its methods to many physical problems. It remains one of the most vital areas of theoretical physics today, providing a common language to several different branches of physics.
Early developments
Quantum field theory originated in the 1920s from the problem of creating a quantum mechanical theory of the electromagnetic field. In particular, de Broglie in 1924 introduced the idea of a wave description of elementary systems in the following way: "we proceed in this work from the assumption of the existence of a certain periodic phenomenon of a yet to be determined character, which is to be attributed to each and every isolated energy parcel". In 1925, Werner Heisenberg, Max Born, and Pascual Jordan constructed just such a theory by expressing the field's internal degrees of freedom as an infinite set of harmonic oscillators, and by then applying the canonical quantization procedure to these oscillators; their paper was published in 1926. This theory assumed that no electric charges or currents were present and today would be called a free field theory.
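A schematic sketch of this construction, in modern notation and natural units rather than that of the 1926 paper: a free field in a box of volume V is expanded in Fourier modes, and each mode is quantized as an independent harmonic oscillator,
\[ \phi(\mathbf{x},t)=\sum_{\mathbf{k}}\frac{1}{\sqrt{2V\omega_{\mathbf{k}}}}\left(a_{\mathbf{k}}\,e^{i(\mathbf{k}\cdot\mathbf{x}-\omega_{\mathbf{k}}t)}+a_{\mathbf{k}}^{\dagger}\,e^{-i(\mathbf{k}\cdot\mathbf{x}-\omega_{\mathbf{k}}t)}\right), \qquad [a_{\mathbf{k}},a_{\mathbf{k}'}^{\dagger}]=\delta_{\mathbf{k}\mathbf{k}'}, \]
so that the Hamiltonian becomes a sum of oscillator energies, \( H=\sum_{\mathbf{k}}\omega_{\mathbf{k}}\left(a_{\mathbf{k}}^{\dagger}a_{\mathbf{k}}+\tfrac{1}{2}\right) \).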
The first reasonably complete theory of quantum electrodynamics, which included both the electromagnetic field and electrically charged matter as quantum mechanical objects, was created by Paul Dirac in 1927. This quantum field theory could be used to model important processes such as the emission of a photon by an electron dropping into a quantum state of lower energy, a process in which the number of particles changes—one atom in the initial state becomes an atom plus a photon in the final state. It is now understood that the ability to describe such processes is one of the most important features of quantum field theory.
The final crucial step was Enrico Fermi's theory of β-decay. In it, fermion species nonconservation was shown to follow from second quantization: creation and annihilation of fermions came to the fore and quantum field theory was seen to describe particle decays.
Incorporating special relativity
It was evident from the beginning that a proper quantum treatment of the electromagnetic field had to somehow incorporate Einstein's relativity theory, which had grown out of the study of classical electromagnetism. This need to put together relativity and quantum mechanics was the second major motivation in the development of quantum field theory. Pascual Jordan and Wolfgang Pauli showed in 1928 that quantum fields could be made to behave in the way predicted by special relativity under coordinate transformations. A further boost for quantum field theory came with the discovery of the Dirac equation, which was originally formulated and interpreted as a single-particle equation analogous to the Schrödinger equation; unlike the Schrödinger equation, however, the Dirac equation satisfies both Lorentz invariance, that is, the requirements of special relativity, and the rules of quantum mechanics. The Dirac equation accommodated the spin-1/2 value of the electron and accounted for its magnetic moment, as well as giving accurate predictions for the spectrum of hydrogen.
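In modern notation the Dirac equation for the electron field reads
\[ \left(i\gamma^{\mu}\partial_{\mu}-m\right)\psi=0 \qquad (\hbar=c=1), \]
where the four \(\gamma^{\mu}\) are 4×4 matrices satisfying \(\{\gamma^{\mu},\gamma^{\nu}\}=2\eta^{\mu\nu}\); it is first order in both time and space derivatives, which is what allows it to be Lorentz covariant while retaining a positive-definite probability density.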
The attempted interpretation of the Dirac equation as a single-particle equation could not be maintained for long, however, and finally it was shown that several of its undesirable properties could be made sense of by reformulating and reinterpreting the Dirac equation as a true field equation, in this case for the quantized "Dirac field" or the "electron field", with the "negative-energy solutions" pointing to the existence of anti-particles. This work was performed first by Dirac himself with the invention of hole theory in 1930, and then by Wendell Furry, Robert Oppenheimer, Vladimir Fock, and others. Schrödinger, during the same period in which he discovered his famous equation in 1926, also independently found its relativistic generalization, known as the Klein–Gordon equation, but dismissed it since, without spin, it predicted impossible properties for the hydrogen spectrum. All relativistic wave equations that describe spin-zero particles are said to be of the Klein–Gordon type.
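In the same notation, the Klein–Gordon equation for a spin-zero field \(\phi\) is
\[ \left(\partial_{\mu}\partial^{\mu}+m^{2}\right)\phi=0, \]
whose plane-wave solutions have \(E=\pm\sqrt{\mathbf{p}^{2}+m^{2}}\), exhibiting the same negative-energy branch that the field-theoretic reinterpretation associates with antiparticles.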
Uncertainty, again
A subtle and careful analysis in 1933 by Niels Bohr and Léon Rosenfeld showed that there is a fundamental limitation on the ability to simultaneously measure the electric and magnetic field strengths that enter into the description of charges in interaction with radiation, imposed by the uncertainty principle, which must apply to all canonically conjugate quantities. This limitation is crucial for the successful formulation and interpretation of a quantum field theory of photons and electrons, and indeed of any perturbative quantum field theory. The analysis of Bohr and Rosenfeld explains the fluctuations in the values of the electromagnetic field that differ from the classically "allowed" values far from the sources of the field. Their analysis was crucial to showing that the limitations and physical implications of the uncertainty principle apply to all dynamical systems, whether fields or material particles. Their analysis also convinced most physicists that any notion of returning to a fundamental description of nature based on classical field theory, such as what Einstein aimed at with his numerous and failed attempts at a classical unified field theory, was simply out of the question. Fields had to be quantized.
Second quantization
The third thread in the development of quantum field theory was the need to handle the statistics of many-particle systems consistently and with ease. In 1927, Pascual Jordan tried to extend the canonical quantization of fields to the many-body wave functions of identical particles, using a formalism which is known as statistical transformation theory; this procedure is now sometimes called second quantization. In 1928, Jordan and Eugene Wigner found that the quantum field describing electrons, or other fermions, had to be expanded using anti-commuting creation and annihilation operators, due to the Pauli exclusion principle. This thread of development was incorporated into many-body theory and strongly influenced condensed matter physics and nuclear physics.
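Schematically, the Jordan–Wigner prescription replaces commutators by anticommutators for the fermionic creation and annihilation operators,
\[ \{a_{i},a_{j}^{\dagger}\}=\delta_{ij}, \qquad \{a_{i},a_{j}\}=\{a_{i}^{\dagger},a_{j}^{\dagger}\}=0, \]
so that \((a_{i}^{\dagger})^{2}=0\): no single-particle state can be occupied twice, which is precisely the Pauli exclusion principle.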
The problem of infinities
Despite its early successes, quantum field theory was plagued by several serious theoretical difficulties. Basic physical quantities, such as the self-energy of the electron and the energy shift of electron states due to the presence of the electromagnetic field, gave infinite, divergent contributions—a nonsensical result—when computed using the perturbative techniques available in the 1930s and most of the 1940s. The electron self-energy problem was already a serious issue in classical electromagnetic field theory, where the attempt to attribute to the electron a finite size or extent led immediately to the question of what non-electromagnetic stresses would need to be invoked, which would presumably hold the electron together against the Coulomb repulsion of its finite-sized "parts". The situation was dire, and had certain features that reminded many of the "Rayleigh–Jeans catastrophe". What made the situation in the 1940s so desperate and gloomy, however, was the fact that the correct ingredients for the theoretical description of interacting photons and electrons were well in place, and no major conceptual change was needed analogous to that which was necessitated by a finite and physically sensible account of the radiative behavior of hot objects, as provided by the Planck radiation law.
Renormalization procedures
This "divergence problem" was solved in the case of quantum electrodynamics through the procedure known as renormalization in 1947–49 by Hans Kramers, Hans Bethe, Julian Schwinger, Richard Feynman, and Shin'ichiro Tomonaga; the procedure was systematized by Freeman Dyson in 1949. Great progress was made after realizing that all infinities in quantum electrodynamics are related to two effects: the self-energy of the electron/positron, and vacuum polarization. Renormalization requires paying very careful attention to just what is meant by, for example, the very concepts "charge" and "mass" as they occur in the pure, non-interacting field equations. The "vacuum" is itself polarizable and, hence, populated by virtual particle pairs, and is therefore a seething and busy dynamical system in its own right. This was a critical step in identifying the source of the "infinities" and "divergences". The "bare mass" and the "bare charge" of a particle, the values that appear in the free-field equations, are abstractions that are simply not realized in experiment. What we measure, and hence what we must take account of with our equations, and what the solutions must account for, are the "renormalized mass" and the "renormalized charge" of a particle. That is to say, the "shifted" or "dressed" values that these quantities must have when due systematic care is taken to include all deviations from their "bare values" are dictated by the very nature of quantum fields themselves.
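Schematically, and as a heuristic sketch rather than any particular historical formulation, the renormalization program writes the measured parameters as the bare ones plus field-induced shifts,
\[ m_{\text{phys}} = m_{0} + \delta m, \qquad e_{\text{phys}} = Z_{e}\, e_{0}, \]
where \(\delta m\) and \(Z_{e}\) are computed perturbatively and are individually divergent; only the combinations \(m_{\text{phys}}\) and \(e_{\text{phys}}\), fixed by experiment, appear in physical predictions.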
Gauge invariance
The first approach that bore fruit is known as the "interaction representation", a Lorentz-covariant and gauge-invariant generalization of the time-dependent perturbation theory used in ordinary quantum mechanics, developed by Tomonaga and Schwinger, who generalized earlier efforts of Dirac, Fock and Podolsky. Tomonaga and Schwinger invented a relativistically covariant scheme for representing field commutators and field operators that is intermediate between the two main representations of a quantum system, the Schrödinger and the Heisenberg representations. Within this scheme, field commutators at separated points can be evaluated in terms of "bare" field creation and annihilation operators. This allows one to keep track of the time evolution of both the "bare" and the "renormalized", or perturbed, values of the Hamiltonian, and expresses everything in terms of the coupled, gauge-invariant "bare" field equations. Schwinger gave the most elegant formulation of this approach. The next and most famous development is due to Richard Feynman, who devised rules for assigning a "graph" or "diagram" to each term in the scattering matrix. These diagrams correspond directly to the measurable physical processes one needs to calculate, and they revolutionized how quantum field theory calculations are carried out in practice. Two classic textbooks from the 1960s, James D. Bjorken and Sidney David Drell's Relativistic Quantum Mechanics and J. J. Sakurai's Advanced Quantum Mechanics, thoroughly developed the Feynman graph expansion techniques using physically intuitive and practical methods following from the correspondence principle, without worrying about the technicalities involved in deriving the Feynman rules from the superstructure of quantum field theory itself. Although both Feynman's heuristic and pictorial style of dealing with the infinities and the formal methods of Tomonaga and Schwinger worked extremely well and gave spectacularly accurate answers, the true analytical nature of the question of "renormalizability", that is, whether any theory formulated as a "quantum field theory" would give finite answers, was not worked out until much later, when the urgency of trying to formulate finite theories for the strong and electroweak interactions demanded its solution.
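The common object behind both formulations is the scattering matrix, which in the interaction picture can be written as a time-ordered exponential,
\[ S = T \exp\!\left(-i\int d^{4}x\, \mathcal{H}_{\mathrm{int}}(x)\right), \]
whose expansion in powers of the coupling produces exactly the terms to which Feynman's rules assign a diagram.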
Renormalization in the case of QED was largely fortuitous: the smallness of the coupling constant, the fact that the coupling has no dimensions involving mass (it is the so-called fine-structure constant), and the zero mass of the gauge boson involved, the photon, together rendered the small-distance/high-energy behavior of QED manageable. Also, electromagnetic processes are very "clean" in the sense that they are not badly suppressed, damped, or hidden by the other gauge interactions. By 1965 James D. Bjorken and Sidney David Drell observed: "Quantum electrodynamics has achieved a status of peaceful coexistence with its divergences ...".
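The coupling in question is the dimensionless fine-structure constant,
\[ \alpha = \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c} \approx \frac{1}{137}, \]
whose smallness means that each additional order of QED perturbation theory is suppressed by roughly a further factor of \(\alpha\).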
The unification of the electromagnetic force with the weak force encountered initial difficulties due to the lack of accelerator energies high enough to reveal processes beyond the Fermi interaction range. Additionally, a satisfactory theoretical understanding of hadron substructure had to be developed, culminating in the quark model.
Non-abelian gauge theory
Through the somewhat brute-force, ad hoc and heuristic early methods of Feynman, and the abstract methods of Tomonaga and Schwinger, elegantly synthesized by Freeman Dyson during the period of early renormalization, the modern theory of quantum electrodynamics established itself. It is still the most accurate physical theory known, the prototype of a successful quantum field theory. Quantum electrodynamics is the most famous example of what is known as an Abelian gauge theory. It relies on the symmetry group U(1) and has one massless gauge field, the U(1) gauge symmetry dictating the form of the interactions involving the electromagnetic field, with the photon being the gauge boson. Beginning in the 1950s with the work of Yang and Mills, following the previous lead of Weyl and Pauli, deep explorations illuminated the types of symmetries and invariances any field theory must satisfy. QED, and indeed all field theories, were generalized to a class of quantum field theories known as gauge theories. That symmetries dictate, limit and necessitate the form of interaction between particles is the essence of the "gauge theory revolution". Yang and Mills formulated the first explicit example of a non-abelian gauge theory, Yang–Mills theory, with an attempted explanation of the strong interactions in mind. The strong interactions were understood, in the mid-1950s, to be mediated by the pi-mesons, the particles predicted by Hideki Yukawa in 1935 on the basis of his profound reflections concerning the reciprocal connection between the mass of any force-mediating particle and the range of the force it mediates, as allowed by the uncertainty principle (see the estimate below). In the absence of dynamical information, Murray Gell-Mann pioneered the extraction of physical predictions from sheer non-abelian symmetry considerations, and introduced non-abelian Lie groups to current algebra and so to the gauge theories that came to supersede it.
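Yukawa's connection between mass and range follows from the uncertainty principle: a virtual quantum of mass m can exist for a time of order \(\hbar/(mc^{2})\) and so travel a distance of order
\[ R \sim \frac{\hbar}{mc} \approx \frac{197\ \text{MeV}\cdot\text{fm}}{mc^{2}}, \]
which for the pion (\(m_{\pi}c^{2}\approx 140\ \text{MeV}\)) gives roughly 1.4 fm, the observed range of the nuclear force.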
The 1960s and 1970s saw the formulation of a gauge theory now known as the Standard Model of particle physics, which systematically describes the elementary particles and the interactions between them. The strong interactions are described by quantum chromodynamics, based on "color" SU(3). The weak interactions require the additional feature of spontaneous symmetry breaking, elucidated by Yoichiro Nambu, and the adjunct Higgs mechanism, considered next.
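In group-theoretic terms the Standard Model is the gauge theory of
\[ SU(3)_{c} \times SU(2)_{L} \times U(1)_{Y}, \]
with the strong interactions carried by the SU(3) color gauge fields (the gluons) and the electroweak interactions by the SU(2)×U(1) gauge fields discussed next.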
Electroweak unification
The electroweak interaction part of the Standard Model was formulated by Sheldon Glashow, Abdus Salam, and John Clive Ward in 1959 with their discovery of the SU(2)×U(1) group structure of the theory. In 1967, Steven Weinberg invoked the Higgs mechanism for the generation of the W and Z masses while keeping the mass of the photon zero. The Goldstone and Higgs idea for generating mass in gauge theories was sparked in the late 1950s and early 1960s, when a number of theoreticians noticed a possibly useful analogy to the breaking of the U(1) symmetry of electromagnetism in the formation of the BCS ground state of a superconductor. The gauge boson involved in this situation, the photon, behaves as though it has acquired a finite mass. There is a further possibility that the physical vacuum does not respect the symmetries implied by the "unbroken" electroweak Lagrangian from which one arrives at the field equations. The electroweak theory of Weinberg and Salam was shown to be renormalizable, and hence consistent, by Gerardus 't Hooft and Martinus Veltman. The Glashow–Weinberg–Salam theory is a triumph and, in certain applications, gives an accuracy on a par with quantum electrodynamics.
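In the resulting theory the Higgs field's vacuum expectation value \(v \approx 246\ \text{GeV}\) sets the gauge-boson masses,
\[ m_{W} = \tfrac{1}{2} g v, \qquad m_{Z} = \tfrac{1}{2}\sqrt{g^{2}+g'^{2}}\, v = \frac{m_{W}}{\cos\theta_{W}}, \]
while the photon, coupled to the unbroken combination of generators, remains exactly massless.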
Quantum chromodynamics
In the case of the strong interactions, progress concerning their short-distance/high-energy behavior was much slower and more frustrating. For the strong interactions, as with the electroweak fields, there were difficult issues regarding the strength of the coupling, the mass generation of the force carriers, and their non-linear self-interactions. Although there has been theoretical progress toward a grand unified quantum field theory incorporating the electromagnetic force, the weak force and the strong force, empirical verification is still pending. Superunification, incorporating the gravitational force, is still very speculative, and is under intensive investigation by many of the best minds in contemporary theoretical physics. Gravitation is a tensor field description of a spin-2 gauge boson, the "graviton", and is further discussed in the articles on general relativity and quantum gravity.
Quantum gravity
From the point of view of the techniques of quantum field theory, and as the numerous efforts to formulate a consistent quantum gravity theory attest, gravitational quantization has been the reigning champion for bad behavior. There are technical problems underlain by the fact that the gravitational coupling constant has dimensions involving inverse powers of mass, and, as a simple consequence, the theory is plagued by perturbatively badly behaved non-linear self-interactions: gravity is itself a source of gravity, analogously to gauge theories (whose couplings are, by contrast, dimensionless), leading to uncontrollable divergences at increasing orders of perturbation theory.
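The dimensional argument can be made explicit: Newton's constant defines the Planck mass,
\[ G_{N} = \frac{\hbar c}{M_{\mathrm{Pl}}^{2}}, \qquad M_{\mathrm{Pl}} c^{2} \approx 1.22\times 10^{19}\ \text{GeV}, \]
so graviton exchange amplitudes grow with energy roughly as \((E/M_{\mathrm{Pl}}c^{2})^{2}\), and by power counting the perturbative expansion requires an ever-growing number of counterterms: the theory is non-renormalizable in the traditional sense.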
Moreover, gravity couples to all energy equally strongly, as per the equivalence principle, which makes the notion of ever really "switching off", "cutting off", or separating the gravitational interaction from other interactions ambiguous, since, with gravitation, we are dealing with the very structure of space-time itself.
Furthermore, it has not been established that a theory of quantum gravity is necessary.
Contemporary framework of renormalization
Parallel breakthroughs in the understanding of phase transitions in condensed matter physics led to novel insights based on the renormalization group. They involved the work of Leo Kadanoff and of Kenneth Geddes Wilson and Michael Fisher, extending the work of Ernst Stueckelberg and André Petermann and of Murray Gell-Mann and Francis Low, which led to the seminal reformulation of quantum field theory by Kenneth Geddes Wilson in 1975. This reformulation provided insights into the evolution of effective field theories with scale, and it classified all field theories, renormalizable or not. The remarkable conclusion is that, in general, most observables are "irrelevant", i.e., the macroscopic physics is dominated by only a few observables in most systems.
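A sketch of the classification in Wilson's sense: in four spacetime dimensions, the dimensionless coupling associated with an operator of mass dimension \(\Delta\) flows, at lowest order near the free-field fixed point, as
\[ g(\mu) \sim g(\mu_{0})\left(\frac{\mu}{\mu_{0}}\right)^{\Delta-4}, \]
so couplings of operators with \(\Delta>4\) shrink toward low energies ("irrelevant"), while those with \(\Delta<4\) grow ("relevant"); only the handful of relevant and marginal couplings control the macroscopic physics.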
During the same period, Kadanoff introduced an operator algebra formalism for the two-dimensional Ising model, a widely studied mathematical model of ferromagnetism in statistical physics. This development suggested that quantum field theory describes the scaling limit of the Ising model. Later, the idea developed that a finite number of generating operators could represent all of the model's correlation functions.
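For orientation, the model in question assigns spins \(s_{i}=\pm 1\) to the sites of a two-dimensional lattice with energy
\[ H = -J \sum_{\langle ij\rangle} s_{i} s_{j}, \]
the sum running over nearest-neighbour pairs; at its critical temperature the long-distance correlations are described by a (conformal) quantum field theory.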
The renormalization group spans a set of ideas and methods to monitor changes of the behavior of the theory with scale, providing a deep physical understanding which sparked what has been called the "grand synthesis" of theoretical physics, uniting the quantum field theoretical techniques used in particle physics and condensed matter physics into a single powerful theoretical framework.
The gauge field theory of the strong interactions, quantum chromodynamics (QCD), relies crucially on the renormalization group for its distinguishing characteristic features: asymptotic freedom and color confinement.
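Asymptotic freedom follows from the sign of the one-loop beta function of QCD,
\[ \mu \frac{dg}{d\mu} = -\frac{g^{3}}{16\pi^{2}}\left(11 - \frac{2}{3} n_{f}\right) + \dots, \]
which is negative for \(n_{f} \le 16\) quark flavours, so the coupling weakens at short distances (asymptotic freedom) and grows at long distances, consistent with color confinement.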
Modern developments
- Algebraic quantum field theory
- Axiomatic quantum field theory
- Topological quantum field theory