Theorists at work

Field Theory and Fundamental Interactions

Overview of the topics

Uniting forces

In 1864, James Clerk Maxwell explained electricity and magnetism as two manifestations of a single, unified electromagnetic force. As a direct result he predicted the existence of electromagnetic waves and showed that light is an electromagnetic wave. A hundred years later, Sheldon Glashow, Abdus Salam and Steven Weinberg independently discovered how the electromagnetic force could be described in the same theory as the weak force. In this way they linked the phenomena of radioactivity and thermonuclear fusion in stars with the more familiar effects of electricity and magnetism. The "electroweak" theory of Glashow, Salam and Weinberg predicted that there must be a neutral carrier of the weak force (called the Z0) as well as charged carriers (the W+ and W-), and that the Z0 must give rise to weak neutral reactions, previously unseen. The first few of these "weak neutral current" interactions were seen at CERN in the Gargamelle bubble chamber in 1973.
Ten years later, the UA1 and UA2 experiments at CERN proved the existence of both the Z0 and the Ws. Carlo Rubbia and Simon van der Meer of CERN were awarded the 1984 Nobel Prize in Physics, respectively for the discovery of these particles and for the development of "stochastic cooling", the method that made the discovery possible. Electroweak theory now forms a central part of the description of forces in the Standard Model. The strong force, meanwhile, is described by a mathematically similar theory in which eight different kinds of gluon carry the force between quarks. This theory is called quantum chromodynamics.

The successful unification of the electromagnetic and weak forces has led physicists to explore ways of also including the strong force in a unified scheme (a "grand unified theory"), and even to contemplate including gravity, thus unifying all the forces of Nature in a single "super force". However, much experimental and theoretical work is needed before such a goal is achieved.


New name for an old theory

The theories used today in the Standard Model of particle physics to describe the interactions of particles are all gauge theories. The term gauge refers to a particular feature of these theories, gauge symmetry, which many researchers view as one of the most fundamental features of physics. Yet as early as the 1860s the Scotsman James Clerk Maxwell formulated a theory of electromagnetism which, in today's terminology, is a gauge theory. His theory, which still holds, united electricity with magnetism and predicted, among other things, the existence of radio waves.

We can illustrate the concept of gauge symmetry as follows. Electric and magnetic fields can be expressed using potential functions. These can be exchanged (gauge-transformed) according to a certain rule without changing the fields. The very simplest transformation is to add a constant to the electrical potential. Physically this illustrates the well-known fact that electrical potential can be calculated from an arbitrary zero point, since only the differences in potential are of significance. This is why a squirrel can walk along a high-voltage cable without being injured. That the zero point can be moved in this way is perceived by physicists as a symmetry in the theory, gauge symmetry.

The order in which one performs two gauge transformations is immaterial. We normally say that electromagnetism is an abelian gauge theory, after the Norwegian mathematician Niels Henrik Abel, who lived between 1802 and 1829.
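
In formulas (a brief sketch added here; the notation is standard but the equations are not spelled out in the text above), the fields are built from a scalar potential φ and a vector potential A, and a gauge transformation with an arbitrary function χ leaves the fields unchanged. Adding a constant to the electrical potential is the simplest special case, and because two gauge functions simply add, the order of two transformations does not matter.

```latex
\[
\mathbf{E} = -\nabla\phi - \frac{\partial \mathbf{A}}{\partial t},
\qquad
\mathbf{B} = \nabla \times \mathbf{A}
\]
\[
\phi \;\to\; \phi - \frac{\partial \chi}{\partial t},
\qquad
\mathbf{A} \;\to\; \mathbf{A} + \nabla\chi
\qquad \text{(the fields are unchanged for any } \chi\text{)}
\]
\[
\chi = -Ct \;\Rightarrow\; \phi \to \phi + C,
\qquad
\chi_1 + \chi_2 = \chi_2 + \chi_1 \;\Rightarrow\; \text{the order is immaterial (abelian)}
\]
```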

Quantum mechanics raises problems

Directly after quantum mechanics had been formulated around 1925, attempts were made to unify the wave functions of quantum mechanics and the fields of electromagnetism into a quantum field theory. But problems arose. The new quantum electrodynamics became complicated and attempts to perform calculations often gave unreasonable results. One reason was that quantum theory predicts that the electromagnetic fields close to, for example, an electron or a proton can spontaneously generate quantities of very short-lived particles and anti-particles, so-called virtual particles.
A system of only one electron suddenly became a multi-particle problem!

The problem was solved in the 1940s by Sin-Itiro Tomonaga, Julian Schwinger and Richard P. Feynman (who shared the 1965 Nobel Prize in Physics for their contributions). The method developed by these three is called renormalization and, simply expressed, means that individual particles can be viewed "somewhat at a distance". In this way it is unnecessary to consider the virtual particle pairs individually: the "cloud" of virtual particles can be allowed to obscure the central, original particle. As a result, the original particle gains a new charge and a new mass, among other things. In modern terminology, Tomonaga, Schwinger and Feynman renormalized an abelian gauge theory.



Quantum electrodynamics has been tested with greater accuracy than any other theory in physics. Thus, for example, Hans Dehmelt (Nobel Prize in Physics 1989) succeeded in measuring the electron's magnetism in an ion trap with an accuracy of 12 digits. The first 10 digits agreed directly with the calculated results.




Unified electromagnetic and weak interaction



The discovery and study of radioactivity and the subsequent development of atomic physics during the first half of the twentieth century produced the concepts of strong and weak interaction. In simple terms, the strong interaction holds the atomic nucleus together while the weak interaction allows certain nuclei to decay radioactively. As early as the 1930s a first quantum field theory for the weak interaction was formulated. This theory suffered from problems even worse than those of quantum electrodynamics, and not even the renormalization method of Tomonaga, Schwinger and Feynman could solve them.

But in the mid-1950s the researchers Chen Ning Yang and Robert L. Mills found a first example of a quantum field theory with new features, a non-abelian gauge theory. As opposed to the abelian variant, in which gauge transformations can be performed in any order, the result of non-abelian transformations depends on the order. This gives the theory a more complicated mathematical structure but also opens up new possibilities. (A simple example of non-abelian transformations is rotations in space. Try it yourself with a pencil, as shown in Figure 2b.)
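
The rotation example can also be checked numerically. The short sketch below is added for illustration and is not part of the original text; it multiplies two 3×3 rotation matrices in both orders and shows that the results differ.

```python
import numpy as np

def rot_x(theta):
    """Rotation matrix for an angle theta (in radians) about the x-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s, c]])

def rot_y(theta):
    """Rotation matrix for an angle theta (in radians) about the y-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s],
                     [0, 1, 0],
                     [-s, 0, c]])

a = np.pi / 2  # quarter turns, as with the pencil in Figure 2b

first_y_then_x = rot_x(a) @ rot_y(a)   # rotate about y first, then about x
first_x_then_y = rot_y(a) @ rot_x(a)   # rotate about x first, then about y

print(np.allclose(first_y_then_x, first_x_then_y))  # False: the order matters
```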

The new possibilities of the theory were not fully exploited until the 1960s when a number of researchers collaborated in the development of a non-abelian gauge theory that unites electromagnetism and weak interaction into an electro-weak interaction (Nobel Prize 1979 to Sheldon L. Glashow, Abdus Salam and Steven Weinberg). This quantum field theory predicted the new particles W and Z which were detected in 1983 at the European CERN accelerator laboratory in Geneva (Nobel Prize 1984 to Carlo Rubbia and Simon van der Meer).




History repeats itself



While the theory of electro-weak interaction developed in the 1960s was a great step forward, the research community at first found it difficult to accept. When the theory was used to calculate in more detail the properties of the new W and Z particles (and many other physical quantities), it gave unreasonable results. The situation resembled that of the 1930s, before Tomonaga, Schwinger and Feynman had succeeded in renormalizing quantum electrodynamics. Many researchers were pessimistic about the possibility of taking such a theory further.

One person who had not given up hope of being able to renormalize non-abelian gauge theories was Martinus J. G. Veltman. He had developed a computer program for handling the lengthy algebraic expressions that such calculations involve, and around 1970 his doctoral student Gerardus 't Hooft obtained promising partial results on how the renormalization could be carried out.
With the help of Veltman's computer program 't Hooft's partial results were now verified, and together they worked out a detailed calculation method. The non-abelian gauge theory of electro-weak interaction had become a functioning theoretical machinery and it was possible, just as it had been for quantum electrodynamics 20 years earlier, to start performing precise calculations.


The theory's predictions verified



As described above, the theory of the electro-weak force predicted the existence of the new W and Z particles right from the start. But it was only through 't Hooft's and Veltman's work that more precise predictions of physical quantities involving the properties of W and Z could be made. Large quantities of W and Z particles have recently been produced under controlled conditions at the LEP accelerator at CERN. Comparisons between measurements and calculations have consistently shown close agreement, supporting the theory's predictions.

One particular quantity obtained with 't Hooft's and Veltman's calculation method, based on CERN results, is the mass of the top quark, the heavier of the two quarks in the model's third family. This quark was observed directly for the first time in 1995 at Fermilab in the USA, but its mass had been predicted several years earlier. Here too, agreement between experiment and theory was satisfactory.
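
How can a particle's mass be predicted before the particle itself has been seen? As a sketch in modern notation (added here; the formula is not quoted in the text), the quantum corrections entering precision W and Z measurements grow with the square of the top-quark mass, for example through the standard leading-order correction to the ratio of the W and Z masses,

```latex
\[
\Delta\rho \;\simeq\; \frac{3\, G_F\, m_t^{2}}{8\sqrt{2}\,\pi^{2}},
\]
```

where G_F is the Fermi constant, so sufficiently precise measurements of W and Z properties constrain m_t indirectly.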



When can we expect the next great discovery?




An important ingredient in the theory 't Hooft and Veltman have developed is an as yet undemonstrated particle termed the Higgs particle (Fig. 1). In the same way as other particles have been predicted by theoretical arguments and later demonstrated experimentally, researchers are now awaiting direct observation of the Higgs particle. Using calculations similar to those for the mass of the top quark, there is a chance that one of the existing accelerators can be persuaded to produce some Higgs particles. However, the only accelerator now under construction that is powerful enough for a more detailed study of the new particle is the Large Hadron Collider (LHC) at CERN, and researchers must be patient for a few years yet, since the LHC is not expected to be complete until 2005.

The mass mystery

The various matter and force-carrying particles weigh in with a wide range of masses. The photon, carrier of the electromagnetic force, and the gluons that carry the strong force, are completely massless, while the conveyors of the weak force, the W and Z particles, each weigh as much as 80 to 90 protons or as much as a reasonably sized nucleus. The most massive fundamental particle found so far, the super heavyweight, is the top quark. It is twice as heavy as the W and Z particles, and weighs about the same as a nucleus of gold! The electron, on the other hand, is approximately 350,000 times lighter than the top quark, and the neutrinos may even have no mass at all.




Quarks
Particle        Mass (GeV/c²)
U (up)          0.005
D (down)        0.01
C (charm)       1.5
S (strange)     0.2
T (top)         180
B (bottom)      4.7

Leptons
Particle            Mass (GeV/c²)
electron-neutrino   < 7×10⁻⁹
electron            0.000511
muon-neutrino       < 0.0003
muon                0.106
tau-neutrino        < 0.03
tau                 1.7771
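
As a rough numerical check of the comparisons made in the text above (a sketch added here; the proton and W masses are standard values inserted for the comparison, since they are not listed in the tables):

```python
# Approximate masses in GeV/c^2. The top-quark and electron values are taken
# from the tables above; the proton and W masses are standard values added here.
m_proton   = 0.938
m_w        = 80.4
m_top      = 180.0
m_electron = 0.000511

print(round(m_w / m_proton))      # ~86: a W weighs roughly as much as 80-90 protons
print(round(m_top / m_w, 1))      # ~2.2: the top quark is about twice as heavy as a W
print(round(m_top / m_electron))  # ~350,000: the electron is that much lighter than the top
```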






Why there is such a range of masses is one of the remaining puzzles of particle physics. Indeed, how particles get masses at all is not yet properly understood. In the simplest theories, all particles are massless, which is clearly wrong, so something has to be introduced to give them their various weights. In the Standard Model, the particles acquire their masses through a mechanism named after theorist Peter Higgs. According to the theory, all the matter particles and force carriers interact with another particle, known as the Higgs boson. It is the strength of this interaction that gives rise to what we call mass: the stronger the interaction, the greater the mass.
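
For the matter particles this can be written very compactly (a sketch in modern notation, added here and not part of the original text): the mass is set by the particle's coupling strength y to the Higgs field and the field's constant background value v,

```latex
\[
m \;=\; \frac{y\, v}{\sqrt{2}},
\qquad v \approx 246~\mathrm{GeV},
\]
```

so a larger coupling y means a stronger interaction with the Higgs boson and hence a larger mass, and the wide range of masses in the tables above reflects a wide range of couplings.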

Experiments have yet to show whether this theory is correct. The search for the Higgs boson (or bosons!) has already begun at the LEP collider at CERN, and will continue into the 21st century with CERN's next machine, the Large Hadron Collider.