Supplement to Quantum Field Theory

The History of QFT

The Early Development

The historical development of QFT remains instructive to the present day. Its first achievement, namely the quantization of the electromagnetic field, is “still the paradigmatic example of a successful quantum field theory” (Weinberg 1995). Ordinary QM cannot give an account of photons, which constitute the prime case of relativistic ‘particles’. Since photons have zero rest mass and correspondingly travel in vacuum at the speed of light \(c\), it is ruled out that a non-relativistic theory such as ordinary QM could give even an approximate description. Photons are implicitly contained in the emission and absorption processes which have to be postulated, for instance, when one of an atom’s electrons makes a transition from a higher to a lower energy level or vice versa. However, only the formalism of QFT contains an explicit description of photons. In fact, most topics in the early development of quantum theory (1900–1927) were related to the interaction of radiation and matter and should thus be treated by quantum field theoretical methods. However, the approach to quantum mechanics formulated by Dirac, Heisenberg and Schrödinger (1926/27) started from atomic spectra and did not rely very much on problems of radiation.

As soon as the conceptual framework of quantum mechanics was developed, a small group of theoreticians immediately tried to extend its methods to electromagnetic fields. A good example is the famous three-man paper by M. Born, W. Heisenberg, and P. Jordan (1926). P. Jordan in particular was acquainted with the literature on light quanta and made important contributions to QFT. The basic analogy was that in QFT the field quantities, i.e., the electric and magnetic field, should be represented by matrices in the same way as position and momentum are represented by matrices in QM. The ideas of QM were thus extended to systems having an infinite number of degrees of freedom.
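The analogy can be indicated schematically in modern notation (a sketch in today's conventions, not the notation of the 1926 paper): the canonical commutation relation of QM,

\[ [\hat{q}_j, \hat{p}_k] = i\hbar\,\delta_{jk}, \]

is carried over to a field \(\varphi\) and its conjugate momentum density \(\pi\),

\[ [\hat{\varphi}(\mathbf{x},t), \hat{\pi}(\mathbf{y},t)] = i\hbar\,\delta^3(\mathbf{x}-\mathbf{y}), \]

where the discrete index \(j\) has become the continuous spatial label \(\mathbf{x}\), so that the field is treated as a mechanical system with infinitely many degrees of freedom.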

The inception of QFT is usually dated to 1927, with Dirac’s famous paper on “The quantum theory of the emission and absorption of radiation” (Dirac 1927). Here Dirac coined the name quantum electrodynamics (QED), the part of QFT that was developed first. Dirac supplied a systematic procedure for transferring the characteristic quantum phenomenon of discreteness of physical quantities from the quantum mechanical treatment of particles to a corresponding treatment of fields. Employing the quantum mechanical theory of the harmonic oscillator, Dirac gave a theoretical description of how photons appear in the quantization of the electromagnetic radiation field. Later, Dirac’s procedure became a model for the quantization of other fields as well. During the following three years the first approaches to QFT were further developed. P. Jordan introduced creation operators for fields obeying Fermi statistics. For an elementary discussion of quantum statistics (Fermi and Bose), see the entry on quantum theory: identity and individuality.
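The core of Dirac’s oscillator analogy can be sketched as follows (again in modern notation, which postdates the 1927 paper): each normal mode of the radiation field, labelled by wave vector \(\mathbf{k}\) and polarization \(\lambda\), behaves as an independent harmonic oscillator, so that the free-field Hamiltonian takes the form

\[ \hat{H} = \sum_{\mathbf{k},\lambda} \hbar\omega_{\mathbf{k}} \left( \hat{a}^{\dagger}_{\mathbf{k}\lambda}\hat{a}_{\mathbf{k}\lambda} + \tfrac{1}{2} \right), \qquad [\hat{a}_{\mathbf{k}\lambda}, \hat{a}^{\dagger}_{\mathbf{k}'\lambda'}] = \delta_{\mathbf{k}\mathbf{k}'}\,\delta_{\lambda\lambda'}. \]

The integer eigenvalues of \(\hat{a}^{\dagger}_{\mathbf{k}\lambda}\hat{a}_{\mathbf{k}\lambda}\) are interpreted as the number of photons in mode \((\mathbf{k},\lambda)\); \(\hat{a}^{\dagger}\) and \(\hat{a}\) respectively create and annihilate a photon, which is how emission and absorption processes enter the formalism explicitly.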

So the methods of QFT could be applied to equations resulting from the quantum mechanical (field-like) treatment of particles like the electron (e.g., the Dirac equation). Schweber points out (Schweber 1994, p. 28) that the idea and procedure of this “second quantization” go back to Jordan, in a number of papers from 1927 (see references in Schweber 1994, pp. 695f), while the expression itself was coined by Dirac. Some difficult problems concerning commutation relations, statistics and Lorentz invariance could be solved. The first comprehensive account of a general theory of quantum fields, in particular the method of canonical quantization, was presented in Heisenberg & Pauli 1929. Whereas Jordan’s second quantization procedure applies to the coefficients of the normal modes of the field, Heisenberg & Pauli 1929 started with the fields themselves and subjected them to the canonical procedure. Heisenberg and Pauli thus established the basic structure of QFT which can be found in any introduction to QFT up to the present day. Fermi and Dirac, Fock and Podolsky presented different formulations which played a heuristic role in the following years.
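For fields obeying Fermi statistics, as in Jordan’s treatment, the commutation relations are replaced by anticommutation relations; schematically (a standard textbook formulation, not Jordan’s original notation):

\[ \{\hat{a}_i, \hat{a}^{\dagger}_j\} = \delta_{ij}, \qquad \{\hat{a}_i, \hat{a}_j\} = \{\hat{a}^{\dagger}_i, \hat{a}^{\dagger}_j\} = 0, \]

which implies \((\hat{a}^{\dagger}_i)^2 = 0\), so that no mode can be occupied twice: the Pauli exclusion principle is thereby built into the formalism.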

Quantum electrodynamics, the historical as well as systematic entrée to QFT, rests on two pillars (see, e.g., the short and lucid “Historical Introduction” in Scharf 1995). The first pillar results from the quantization of the electromagnetic field, i.e., it is about photons as the quantized excitations or ‘quanta’ of the electromagnetic field. This procedure will be described in some more detail in the section on the particle interpretation. As Weinberg points out, the “photon is the only particle that was known as a field before it was detected as a particle”, so that it is natural that QED began with the analysis of the radiation field (Weinberg 1995, p. 15). The second pillar of QED is the relativistic theory of the electron, with the Dirac equation at its centre.

The Emergence of Infinities

Quantum field theory started with a theoretical framework that was built in analogy to quantum mechanics. Although there was no unique and fully developed theory, quantum field theoretical tools could be applied to concrete processes. Examples are the scattering of radiation by free electrons (“Compton scattering”), the collision between relativistic electrons, and the production of electron-positron pairs by photons. Calculations to the first order of approximation were quite successful, but most people working in the field thought that QFT still had to undergo a major change. On the one hand, some calculations of effects for cosmic rays clearly differed from measurements. On the other hand, and more threatening from a theoretical point of view, calculations of higher orders of the perturbation series led to infinite results. The self-energy of the electron as well as vacuum fluctuations of the electromagnetic field seemed to be infinite. The perturbation expansions did not converge to a finite sum, and even most individual terms were divergent.
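The character of these divergences can be indicated schematically in later notation (a textbook-level sketch, not a formula from the period): cutting the momentum integrals off at a scale \(\Lambda\), the electromagnetic self-energy correction to the electron mass grows logarithmically,

\[ \delta m \;\approx\; \frac{3\alpha}{2\pi}\, m \ln \frac{\Lambda}{m}, \qquad \alpha \approx 1/137, \]

so that the correction becomes infinite when the cut-off is removed, \(\Lambda \rightarrow \infty\).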

The various forms of infinities suggested that the divergences were more than failures of specific calculations. Many physicists tried to avoid the divergences by formal tricks (truncating the integrals at some value of momentum, or even simply ignoring infinite terms), but such rules were not reliable, violated the requirements of relativity, and were not considered satisfactory. Others came up with the first ideas for coping with infinities by a redefinition of the parameters of the theory, using a measured finite value (for example of the charge of the electron) instead of the infinite ‘bare’ value (“renormalization”).

From the point of view of philosophy of science it is remarkable that these divergences did not give enough reason to discard the theory. The years from 1930 to the beginning of World War II were characterized by a variety of attitudes towards QFT. Some physicists tried to circumvent the infinities by more or less arbitrary prescriptions, others worked on transformations and improvements of the theoretical framework. Most of the theoreticians believed that QED would break down at high energies. There was also a considerable number of proposals in favour of alternative approaches. These proposals included changes in the basic concepts (e.g., negative probabilities), interactions at a distance instead of a field theoretical approach, and a methodological change to phenomenological methods that focus on relations between observable quantities without an analysis of the microphysical details of the interaction (the so-called S-matrix theory, whose basic elements are amplitudes for the various scattering processes).

Despite the feeling that QFT was imperfect and lacking in rigour, its methods were extended to new areas of application. In 1933 Fermi’s theory of beta decay started from the conceptions describing the emission and absorption of photons, transferred them to beta radiation, and analyzed the creation and annihilation of electrons and neutrinos (weak interaction). Further applications of QFT outside of quantum electrodynamics succeeded in nuclear physics (strong interaction). In 1934 a new type of field (scalar fields), described by the Klein-Gordon equation, could be quantized (another instance of “second quantization”). This new theory for matter fields could be applied a decade later when new particles, the pions, were detected.
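The Klein-Gordon equation that governs such a scalar field \(\varphi\) reads, in modern notation,

\[ \left( \frac{1}{c^2}\frac{\partial^2}{\partial t^2} - \nabla^2 + \frac{m^2 c^2}{\hbar^2} \right) \varphi(x) = 0, \]

i.e., it is the simplest relativistic wave equation for a field of mass \(m\); quantizing \(\varphi\) along the lines sketched above yields spinless ‘quanta’, which is why the equation later fitted the pions.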

The Taming of Infinities

After the end of World War II more reliable and effective methods for dealing with infinities in QFT were developed, namely coherent and systematic rules for performing relativistic field theoretical calculations, and a general renormalization theory. At three famous conferences between 1947 and 1949, developments in theoretical physics were confronted with relevant new experimental results. In the late forties there were two different ways of addressing the problem of divergences. One of these was discovered by Feynman, the other one (based on an operator formalism) by Schwinger and, independently, by Tomonaga. In 1949 Dyson showed that the two approaches are in fact equivalent. Thus Freeman Dyson, Richard P. Feynman, Julian Schwinger and Sin-itiro Tomonaga became the inventors of renormalization theory. The most spectacular experimental successes of renormalization theory were the calculations of the anomalous magnetic moment of the electron and of the Lamb shift in the spectrum of hydrogen. These successes were so outstanding because the theoretical results were in better agreement with high-precision experiments than anything in physics before. Nevertheless, mathematical problems lingered on and prompted a search for rigorous formulations (to be discussed in the main article).

The basic idea of renormalization is to avoid divergences that appear in physical predictions by shifting them into a part of the theory where they do not influence empirical propositions. Dyson showed that a rescaling of charge and mass (‘renormalization’) is sufficient to remove all divergences in QED to all orders of perturbation theory. In general, a QFT is called renormalizable if all infinities can be absorbed into a redefinition of a finite number of coupling constants and masses. A consequence is that the physical charge and mass of the electron must be measured and cannot be computed from first principles. Perturbation theory gives well-defined predictions only in renormalizable quantum field theories, and luckily QED, the first fully developed QFT, belongs to this class of renormalizable theories. There are various technical procedures for renormalizing a theory. One way is to cut off the integrals in the calculations at a certain value \(\Lambda\) of the momentum which is large but finite. This cut-off procedure is successful if, after taking the limit \(\Lambda \rightarrow \infty\), the resulting quantities are independent of \(\Lambda\) (Part II of Peskin & Schroeder 1995 gives an extensive description of renormalization).
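A schematic one-loop example may make this concrete (the coefficient is the standard textbook value; see Part II of Peskin & Schroeder 1995). In QED the divergent part of the photon self-energy grows only logarithmically with the cut-off, so the ‘bare’ charge \(e_0\) and the measured physical charge are related, to lowest non-trivial order, by

\[ e_{\text{phys}}^2 \;\approx\; e_0^2 \left( 1 - \frac{e_0^2}{12\pi^2} \ln \frac{\Lambda^2}{m^2} \right). \]

One fixes \(e_{\text{phys}}\) by measurement and lets \(e_0\) depend on \(\Lambda\) in exactly the compensating way; all predictions are then expressed in terms of \(e_{\text{phys}}\) and remain finite as \(\Lambda \rightarrow \infty\).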

Feynman’s formulation of QED is of special interest from a philosophical point of view. His so-called space-time approach is visualized by the famous Feynman diagrams that look as if they depict paths of particles. Feynman’s method of calculating scattering amplitudes is based on the functional integral formulation of field theory (for an introduction to the theory and practice of Feynman diagrams see, e.g., chapter 4 in Peskin & Schroeder 1995). A set of graphical rules can be derived so that the probability of a specific scattering process can be calculated by drawing a diagram of that process and then using the diagram to write down the mathematical expressions for its amplitude. The diagrams provide an effective way to organize and visualize the various terms in the perturbation series, and they seem to display the flow of electrons and photons during the scattering process. External lines in the diagrams represent incoming and outgoing particles, internal lines are connected with ‘virtual particles’, and vertices with interactions. Each of these graphical elements is associated with a mathematical expression that contributes to the amplitude of the respective process. The diagrams are part of Feynman’s very efficient and elegant algorithm for computing the probabilities of scattering processes. The idea of particles travelling from one point to another was heuristically useful in constructing the theory, and, moreover, this intuition is useful for concrete calculations. Nevertheless, an analysis of the theoretical justification of the space-time approach shows that its success does not imply that particle paths have to be taken seriously. General arguments against a particle interpretation of QFT clearly exclude that the diagrams represent paths of particles in the interaction area. Feynman himself was not particularly interested in ontological questions.
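To illustrate how the graphical elements translate into mathematics (a standard textbook example in modern conventions, following chapter 4 of Peskin & Schroeder 1995, not Feynman’s original notation): the simplest diagram for electron–muon scattering has four external fermion lines, one internal photon line, and two vertices, and yields the tree-level amplitude

\[ i\mathcal{M} = \bar{u}(p')\,(-ie\gamma^{\mu})\,u(p)\; \frac{-ig_{\mu\nu}}{q^{2}}\; \bar{u}(k')\,(-ie\gamma^{\nu})\,u(k), \qquad q = p' - p, \]

where each vertex contributes a factor \(-ie\gamma^{\mu}\), the internal line contributes the photon propagator \(-ig_{\mu\nu}/q^{2}\), and the external lines contribute the spinors \(u, \bar{u}\) of the incoming and outgoing particles.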

The Standard Model of Particle Physics

By the beginning of the 1950s QED had become a reliable theory which no longer counted as preliminary. It had taken two decades from writing down the first equations until QFT could be applied to interesting physical problems in a systematic way. The new developments made it possible to apply QFT to new particles and new interactions. In the following decades QFT was extended to describe not only the electromagnetic force but also the weak and strong interactions, so that new Lagrangians had to be found which contain new classes of ‘particles’ or quantum fields. The research aimed at a more comprehensive theory of matter and, in the end, at a unified theory of all interactions. New theoretical concepts had to be introduced, mainly connected with non-Abelian gauge theories and spontaneous symmetry breaking. See also the entry on symmetry and symmetry breaking. Today there are trustworthy theories of the strong, weak, and electromagnetic interactions of elementary particles which have a structure similar to that of QED. A combined theory associated with the gauge group \(\text{SU}(3) \otimes \text{SU}(2) \otimes \text{U}(1)\) is considered as ‘the standard model’ of elementary particle physics; its electroweak part was developed by Glashow, Weinberg and Salam in the 1960s. According to the standard model there are, on the one hand, six types of leptons (e.g., the electron and its neutrino) and six types of quarks, where the members of both groups are fermions with spin 1/2. On the other hand, there are spin 1 particles (thus bosons) that mediate the fundamental interactions between the elementary particles, namely the photon for the electromagnetic interaction, two \(W\) bosons and one \(Z\) boson for the weak interaction, and the gluons for the strong interaction. Altogether there is good agreement with experimental data; for example, the masses of the \(W^+\) and \(W^-\) bosons (detected in 1983) confirmed the theoretical prediction within one per cent deviation.

Further Reading. The first chapter in Weinberg 1995 is a very good short description of the earlier history of QFT. Detailed accounts of the historical development of QFT can be found, e.g., in Darrigol 1986, Schweber 1994 and Cao 1997a. Various historical and conceptual studies of the standard model are gathered in Hoddeson et al. 1997 and of renormalization theory in Brown 1993.
