How a Math Trick Saved Particle Physics

Renormalization is arguably the most important breakthrough in theoretical physics over the past 50 years.





You don't need to analyze the behavior of individual water molecules to understand the behavior of drops, or analyze drops to understand waves. The ability to shift focus between different scales is the essence of renormalization.



In the 1940s, pioneering physicists stumbled upon a new layer of reality. Particles were out, and fields were in: all-encompassing, agitated entities that fill space in the manner of an ocean. One small ripple in such a field might represent an electron, another a photon, and their interactions seemed capable of explaining all electromagnetic phenomena.



There was just one problem: the theory rested on hopes and prayers. Only with the help of a technique called "renormalization," which carefully concealed infinite quantities, could researchers sidestep the theory's nonsensical predictions. The scheme worked, but even those who developed the theory suspected it might be a house of cards held together by a tortured mathematical trick.



"I would call it a 'crazy process,'" Richard Feynman later wrote. "Having to resort to such tricks has prevented us from proving that the theory of quantum electrodynamics is mathematically consistent."



The theory was vindicated decades later, thanks to a seemingly unrelated branch of physics. Researchers studying magnetization discovered that renormalization is not about infinities at all. It is about the universe's separation into kingdoms of independent sizes, and that perspective now governs many corners of physics.



David Tong, a theoretical physicist at the University of Cambridge, writes that renormalization is "possibly the most important breakthrough in theoretical physics in the last 50 years."



A Tale of Two Charges



By some measures, field theories are the most successful theories in all of science. Quantum electrodynamics (QED), one of the pillars of the Standard Model of particle physics, has made theoretical predictions that agree with experiments to within parts per billion.



But in the 1930s and 1940s, the theory's future was far from certain. Approximating the complex behavior of fields often produced meaningless, infinite answers, leading some theorists to suspect that field theories were a dead end.



Feynman and others went looking for new perspectives, perhaps ones that would bring particles back onto the stage, but came back with a trick instead. They found that the equations of QED make reasonable predictions, provided they are patched with the cryptic procedure of renormalization.



The exercise goes something like this. When a QED calculation produces an infinite sum, cut it short. Stuff the part that wants to become infinite into a fixed coefficient in front of the sum. Swap that coefficient for a finite measurement taken in the laboratory. Finally, let the newly tamed sum run back to infinity.
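

To make the recipe concrete, here is a toy version (the notation and the toy integral are mine, not the article's). Suppose a prediction $P$ contains a logarithmically divergent integral, cut off at a large momentum $\Lambda$:

$$P = g_0 + g_0^2 \int_m^{\Lambda} \frac{dk}{k} = g_0 + g_0^2 \ln\frac{\Lambda}{\mu} + g_0^2 \ln\frac{\mu}{m}.$$

Stuff the divergent piece into a coupling measured at the laboratory scale $\mu$, defining $g(\mu) \equiv g_0 + g_0^2 \ln(\Lambda/\mu)$. Re-expressed in terms of $g(\mu)$, the prediction reads $P = g(\mu) + g(\mu)^2 \ln(\mu/m) + O(g^3)$, and the cutoff $\Lambda$ can now be sent to infinity without leaving a trace.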



To some physicists, this recipe looked like a shell game. "It just cannot be called sensible mathematics," wrote the eminent quantum theorist Paul Dirac.



The crux of the problem - and the first step toward its eventual solution - was how physicists handled the electron's charge.



In the scheme above, the electric charge comes from the coefficient - the value that swallows the infinity during the mathematical shuffling. To theorists left guessing at the physical meaning of renormalization, QED seemed to hint that the electron has two charges: a theoretical one, which is infinite, and the measured one, which is finite. Perhaps the charge at the electron's core is infinite. In practice, though, the effects of the quantum field (which can be pictured as a virtual cloud of positively charged particles) cloak the electron so that experimenters measure only a modest net charge.



Two physicists, Murray Gell-Mann and Francis Low, formalized this idea in 1954. They connected the electron's two charges through a single "effective" charge that varies with distance. The closer you get (the deeper you penetrate the electron's positive cloud), the more charge you see.
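

That distance-dependent charge survives in textbooks as the "running coupling." In the standard one-loop QED form (a textbook result rather than a formula from this article; sign and logarithm conventions vary):

$$\alpha(q^2) = \frac{\alpha(\mu^2)}{1 - \dfrac{\alpha(\mu^2)}{3\pi}\,\ln\dfrac{q^2}{\mu^2}},$$

where $\alpha = e^2/4\pi$ is the fine-structure constant measured at a reference momentum $\mu$. Probing at larger momentum $q$ (shorter distance) shrinks the denominator, so the effective charge grows as you burrow into the cloud.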



Their work was the first to link renormalization with the idea of scale. It suggested that quantum physicists had found the right answer to the wrong question. Instead of worrying about infinities, they should have focused on connecting the tiny with the huge.



Renormalization is "a mathematical version of a microscope," said Astrid Eichhorn, a physicist at the University of Southern Denmark who uses renormalization to search for theories of quantum gravity. "And conversely, you can start with the microscopic system and zoom out. It is a combination of a microscope and a telescope."



Magnets Save the Day



The second clue came from the world of condensed matter, where physicists were puzzling over how a crude model of a magnet managed to nail the fine details of certain transitions. The Ising model was little more than a grid of atomic arrows, each of which could point only up or down, and yet it predicted the behavior of real magnets with improbable accuracy.
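

The model is simple enough to simulate in a few lines. Below is a minimal sketch using standard Metropolis dynamics; the lattice size, temperature, and step count are illustrative choices, not parameters from the article:

```python
import numpy as np

def metropolis_ising(L=16, T=2.0, steps=200_000, seed=0):
    """Sample an L-by-L grid of up/down arrows (+1/-1) at temperature T
    with the Metropolis algorithm (units where J = k_B = 1)."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(steps):
        i, j = rng.integers(0, L, size=2)
        # Energy cost of flipping arrow (i, j): 2 * spin * (sum of 4 neighbors).
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    return spins

# Below the critical temperature (about 2.269 in these units) most arrows
# align and the net magnetization is large; above it, disorder wins.
print(abs(metropolis_ising(T=1.5).mean()), abs(metropolis_ising(T=3.5).mean()))
```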



At low temperatures, most of the atomic arrows line up and the material is magnetized. At high temperatures, disorder takes over and the lattice demagnetizes. But at the critical transition point between the two, islands of aligned atoms appear at all sizes. Crucially, the way certain quantities are distributed at this critical point turns out to be identical in the Ising model, in real magnets of different materials, and even in unrelated systems, such as the high-pressure transition at which water becomes indistinguishable from steam. The discovery of this so-called universality was as bizarre as finding that elephants and egrets have exactly the same top speed.



Physicists don't usually work with objects of different sizes at the same time. However, this universal behavior in the vicinity of the critical point forced them to deal with lengths of all scales at once.



Leo Kadanoff, a condensed matter researcher, figured out how to handle this in 1966 with a technique he called "block spins." The Ising lattice, too complex to tackle head-on, was divided into blocks of modest size, a few arrows on each side. He computed the average orientation of each group of arrows and replaced the whole block with that value. Repeating the process smoothed away the lattice's fine details, zooming out to capture the system's overall behavior.
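

A single coarse-graining step is easy to sketch in code. Here is a minimal majority-rule version (the block size b = 3 and the random test grid are my choices; an odd b conveniently avoids ties):

```python
import numpy as np

def block_spin(spins, b=3):
    """One Kadanoff step: replace each b-by-b block of +1/-1 arrows
    with the block's majority orientation (b odd, so no ties occur)."""
    L = spins.shape[0] - spins.shape[0] % b   # trim so blocks tile evenly
    blocks = spins[:L, :L].reshape(L // b, b, L // b, b)
    return np.sign(blocks.sum(axis=(1, 3))).astype(int)

# Each pass shrinks the grid by a factor of b while preserving the
# large-scale pattern of aligned islands: 81 -> 27 -> 9 -> 3.
grid = np.random.default_rng(1).choice([-1, 1], size=(81, 81))
for _ in range(3):
    grid = block_spin(grid)
    print(grid.shape)
```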





Block-spin renormalization averages a grid of many individual spins, replacing them with blocks of ever-increasing size.



Finally, Ken Wilson, a former student of Gell-Mann who worked in both particle physics and condensed matter physics, united the ideas of Gell-Mann and Low with those of Kadanoff. His "renormalization group," which he first described in 1971, justified the tortured calculations of QED and provided a ladder of scales for universal systems. The work earned him a Nobel Prize and changed physics forever.



Paul Fendley, a condensed matter scientist at the University of Oxford, believes Wilson's renormalization group is best thought of as a "theory of theories" connecting the microscopic with the macroscopic.



Take a magnetic grid. At the microscopic level, it is easy to write an equation linking two neighboring arrows. But extrapolating that formula to trillions of particles is all but impossible. You are thinking at the wrong scale.



Wilson's renormalization group describes how a theory of building blocks becomes a theory of structures. You start with a theory of small pieces, say the atoms of a billiard ball. Turn the crank of Wilson's mathematical machinery and you get a related theory describing groups of those pieces, perhaps the molecules of the billiard ball. Keep cranking and you zoom out further, to ever larger groupings: clusters of molecules, then sectors of the billiard ball, and so on. Eventually you can calculate something interesting, such as the path of the entire ball.
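

The humblest version of such a crank is a textbook calculation (not one from this article): decimating a one-dimensional Ising chain. Summing out every other spin yields the same model with a weaker coupling, K' = (1/2) ln cosh 2K, and iterating the map traces the flow from scale to scale:

```python
import math

def decimate(K):
    """One RG step for the 1D Ising chain: summing out every other spin
    returns the same model with coupling K' = 0.5 * ln(cosh(2K))."""
    return 0.5 * math.log(math.cosh(2 * K))

K = 1.0   # dimensionless coupling J/kT of the fine-grained chain
for step in range(1, 7):
    K = decimate(K)
    print(f"step {step}: K = {K:.4f}")

# K drains toward the K = 0 fixed point: zoomed out far enough, the 1D
# chain always looks disordered, so it never magnetizes at finite temperature.
```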



This is the magic of the renormalization group: it helps determine which quantities are worth measuring and which convoluted microscopic details can be ignored. A surfer cares about the height of the waves, not the jostling of water molecules. In subatomic physics, renormalization tells physicists when they can work with a relatively simple proton instead of the tangled mess of its interior quarks.



Wilson's renormalization group also suggested that the misfortunes of Feynman and his contemporaries came from trying to understand the electron from infinitely close up. "Theories cannot be expected to work down to arbitrarily small distance scales," said James Fraser, a philosopher of physics at Durham University in Britain. Physicists now understand that truncating sums and shuffling infinities is the right way to do the math when your theory has a minimum grid size. "Cutting off the sums compensates for our ignorance of what is happening at the lower levels," said Fraser.



In other words, QED and the Standard Model simply cannot say what the electron's charge is at a distance of zero nanometers. Such physical theories are called "effective": they work best over well-defined distance ranges. A chief goal of high-energy physics is to figure out what happens as particles draw ever closer together.



From Big to Small



Today, Feynman's "crazy process" is used in physics as routinely as algebra, and its applications account for both the field's greatest advances and its current challenges. Under renormalization, complicated submicroscopic subtleties tend to disappear. They may be real, but they do not affect the big picture. "Simplicity is a blessing," Fendley said. "There is something divine in it."



This mathematical fact reflects nature's tendency to sort itself into largely independent worlds. Engineers designing a skyscraper ignore individual steel molecules. Chemists analyze molecular bonds while remaining blissfully ignorant of quarks and gluons. The separation of phenomena by length scale, quantified by the renormalization group, has allowed scientists over the centuries to move gradually from large to small, rather than attacking all scales at once.



Yet at the same time, renormalization's hostility to microscopic detail works against modern physicists eager to discover signs of the next realm of the microworld. The separation of scales implies that they will have to dig deep to overcome nature's tendency to hide its smallest details from curious giants like us.



"Renormalization helps us simplify the problem," said Nathan Seiberg, a theoretical physicist at the Institute for Advanced Study in Princeton. "However, it also hides what is happening at short distances. You can't get everything at once."


