randombio.com | Saturday, March 08, 2025

The universe is not really fine-tuned
How physicists' attempts to explain the mass of the Higgs boson almost turned into a proof of the existence of God
The fine-tuning theory says that the physical constants of the universe
are too finely balanced to have happened by chance. If they were slightly
different, it's claimed, life could not exist.
Some sources cite Paul Davies as an authority for the fine-tuning argument. But in his arXiv article titled “How bio-friendly is the universe?” Davies called the relevant idea “biological determinism,” and he was lukewarm about it:
[The emergence of life] might require exceptional, fluky, physical conditions (such as the chance formation of some extremely unusual molecules). Alternatively it might require additional, yet-to-be-elucidated, laws or principles, possibly themselves requiring an element of fine-tuning. . . . I shall refer to this second distinct aspect of biophilicity as biological determinism. It is the assertion that life will be almost inevitable given earth-like conditions.
Davies's idea was that life will evolve with whatever is available. If carbon were unable to form complex molecules, life would use some other element. But the idea that the physical constants of the universe were too finely balanced to happen by chance had religious people getting really excited for a while.
Before we get into that, let's discuss two other arguments that are sometimes cited as objections to fine-tuning: the multiverse and the anthropic objection. According to some philosophers, quantum indeterminacy could be explained by the idea that every quantum decision splits the universe in two, effectively creating an infinite number of worlds where every possibility is played out.
For example, in the famous two-slit experiment, a particle may travel through one slit or the other. Each quantum decision creates a new world. A one-watt light bulb creates about 1.1 × 10^18 photons per second. According to the theory, each one would have to decide which slit to pass through. In just one second, that means we'd have 2^(1.1 × 10^18) ≈ 10^(3.3 × 10^17) universes, a number with more than 300 quadrillion digits. That's effectively an infinite number.
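The digit count is easy to check: a number 2^N has N · log10(2) decimal digits. A quick sketch (my arithmetic, using the photon count quoted above):

```python
import math

photons = 1.1e18  # photons per second from the one-watt bulb mentioned above

# Each photon's binary which-slit "decision" doubles the branch count,
# so one second yields 2**photons universes. The number of decimal
# digits in 2**N is N * log10(2).
digits = photons * math.log10(2)
print(f"2**(1.1e18) has about {digits:.2e} decimal digits")  # ~3.3e17
```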
An infinity of universes from one tiny light bulb. It sounds like something out of Hinduism, where some guy named Ashvatthama turns an ordinary blade of grass into a ‘stupefyingly’ powerful weapon capable of destroying the world. Of course we know photons don't really go through one slit or the other. It's just a model in our heads. Even so, the universe is constantly experiencing vast numbers of quantum-level events, and with that huge number of universes, most would be indistinguishable from our own. As for the anthropic principle, the claim that we observe a bio-friendly universe simply because no other kind could contain observers, it is not a theory, but a just-so story.
Physicist Yorikiyo Nagashima puts it this way: physics predicts things; the anthropic principle does not.
The anthropic principle seems capable of finding an answer to any question. But it may not predict anything in which, we believe, physics justification lies.[1]
Most of the fine-tuning claims are based on physical constants. For instance, it's claimed that if the strength of gravity were too high or too low, or if the strong nuclear force had been 50% stronger, the universe would be vastly different and life probably couldn't exist. The Stanford Encyclopedia of Philosophy article on fine-tuning (plato.stanford.edu) has a long list. A guy named Victor Stenger even wrote a book on the subject, laying out philosophical arguments for why fine-tuning is false.
But the real problem with fine-tuning is that it's an argument from ignorance. Fine-tuning is a bug, not a feature. It comes from what physicists call the hierarchy problem. According to physicist Yorikiyo Nagashima [1], there are actually two hierarchy problems: the little one and the big one.
The little hierarchy problem concerns the mass of the Higgs boson, the particle that gives the fermions (and the W and Z bosons) their mass. Unlike the gauge boson masses, the Higgs mass term is not protected by any symmetry, which means its radiative corrections can be enormous. To get the theory to predict the observed mass of 125 GeV, you have to balance four large numbers against each other. Specifically, we would need [2,3]:
- Starting (“tree”) value, which says the Higgs should be somewhere between 800 GeV and 10 TeV. This depends on Λ, the unification scale, where the forces are united and become the same. It may or may not be the same as the Planck scale, which is the smallest length scale and therefore the highest energy.
- Yukawa coupling with the top quark, ∼ −(2000 GeV)², that is to say, interactions with fermions
- Gauge boson loop, ∼ +(700 GeV)²
- Higgs loop, ∼ +(500 GeV)²
What these four things mean in physical terms isn't important. What's important is that these large positive and negative numbers should add up to 125 GeV, and they probably could with sufficient hammering. But it's scientifically questionable because by ‘bending’ the numbers just a little you could explain almost any mass you wanted. That is the little hierarchy problem. As Chris Quigg says:
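Taking the quoted numbers at face value, a quick sketch (my arithmetic, not Nagashima's actual calculation) shows how delicate the balance is: the tree value has to land on a very particular spot inside the 800 GeV–10 TeV window, and shifting one loop term by a mere 1% wrecks the answer.

```python
import math

# Rough contributions to the Higgs mass-squared quoted above, in GeV^2 [2,3].
top_yukawa = -(2000.0)**2   # top-quark Yukawa loop
gauge_loop = +(700.0)**2    # gauge boson loop
higgs_loop = +(500.0)**2    # Higgs self-coupling loop
observed   = 125.0          # measured Higgs mass, GeV

# Tree-level value needed so that all four terms sum to (125 GeV)^2:
m_tree = math.sqrt(observed**2 - (top_yukawa + gauge_loop + higgs_loop))
print(f"required tree value ~ {m_tree:.0f} GeV")  # ~1810 GeV

# "Bend" the top loop by just 1% and see how far the prediction moves:
nudged = m_tree**2 + 0.99 * top_yukawa + gauge_loop + higgs_loop
print(f"Higgs mass if the top loop shifts by 1%: {math.sqrt(nudged):.0f} GeV")  # ~236 GeV
```

A one-percent nudge nearly doubles the predicted mass, which is exactly why bending the numbers a little could explain almost any mass you wanted.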
If the reference scale is indeed very large, either various contributions to the Higgs mass must be precariously balanced or new physics must control it.[3]
Both the Standard Model and the Grand Unification Theories also suffer from the “big hierarchy problem”, which means they predict particles over such a huge range of masses that extraordinarily precise fine-tuning is needed to get the formulas to match reality. Specifically, the Grand Unification Theory scale, which is around 10^15 GeV, is about 10^13 times higher than the electroweak scale of 250 GeV. Because mass radiative corrections for scalar bosons like the Higgs increase quadratically, that means the corrections to the Higgs mass have to be exact to within one part in 10^13 squared, or one part in 10^26.
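The one-part-in-10^26 figure is just order-of-magnitude arithmetic with the two scales quoted above:

```python
# Order-of-magnitude check of the big hierarchy problem (my arithmetic,
# using the scales quoted above).
gut_scale = 1e15   # GeV, Grand Unification scale
ew_scale  = 250.0  # GeV, electroweak scale

ratio  = gut_scale / ew_scale  # 4e12, which rounds to the 10^13 in the text
tuning = ratio**2              # quadratic corrections: ~1 part in 10^25 to 10^26
print(f"scale ratio: {ratio:.0e}")
print(f"required precision: 1 part in {tuning:.1e}")
```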
This is not a feature of reality, but a failure of the theory. Nagashima says physicists have considered three different ways of fixing it.
1. Assume that the Higgs isn't a real particle, but a composite (as in the so-called Technicolor model) or maybe even a quasiparticle similar to those proposed to explain superconductivity.
2. Add one or more extra spatial dimensions. Gogberashvili [4] and many others [5,6,7] proposed that there are five dimensions, where the extra one is curled up according to string theory, or M-theory. This would explain many things, including why gravity is so weak compared to the other forces.
3. Supersymmetry, where there is a whole new set of gigantic partner particles. These particles couldn't exist today; they would have existed only around the time of the Big Bang. If the masses of the partners were only around 1 TeV or so, it would alleviate the hierarchy problem. So far, these particles remain undiscovered, and the only way to find them would be to create them in a supercollider.
There are 29 fixed constants in the Standard Model and 120 in the minimal supersymmetric model, called the MSSM. The clue is in the name: they aren't faithful representations of reality but models, invented by mortals to help us calculate things and to help us decide what we do and don't know. No one knows why these parameters have the values they do. Why does a neutron have two down quarks (charge = −1/3) and one up quark (charge = +2/3), a combination that just happens to make it exactly electrically neutral? Just because some of our theories rely on precise subtraction of big numbers doesn't mean nature works that way.
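The neutron example is just quark arithmetic, which exact fractions make explicit:

```python
from fractions import Fraction

# Quark charges in units of the electron charge (values quoted above).
down = Fraction(-1, 3)
up   = Fraction(2, 3)

neutron = up + 2 * down   # udd: one up quark, two down quarks
proton  = 2 * up + down   # uud: two up quarks, one down quark

print(f"neutron charge = {neutron}, proton charge = {proton}")  # 0 and 1
```

The cancellation is exact, and nothing in the Standard Model explains why it has to be.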
To physicists, fine-tuning is a problem. That doesn't mean they've discovered it's unsolvable so they throw up their hands and admit defeat. They recognize the theory is incomplete. The radiative corrections are only needed because that is how particle physicists have decided to calculate things. It doesn't mean they're real.
There's also a theological argument against fine-tuning. A fine-tuned universe would be unstable. If it's true that indeterminism is baked into the universe, then minute instabilities and fluctuations would self-amplify and create chaos, maybe even destroy the universe. No Supreme Being would create a universe with such a big mathematical instability. Sooner or later it would disintegrate, its fabric of space-time changing into something else, much the way PVC cement turns to goo when it gets exposed to air, and He'd have to throw it out and create it all over again. A whole week down the drain.
It may be fun to imagine what would happen if the gravitational constant or the forces of nature were different. The world would be a lot different if a hydrogen atom weighed, say, ten pounds. But we need to keep in mind that nobody knows why these things have the properties we measure. It's not proof of a multiverse; and if you use unanswered questions in science as evidence of existence or non-existence of something, you're bound to be disappointed.
[1] Nagashima Y. (2014). Beyond the Standard Model of Elementary Particle Physics. Wiley-VCH Verlag GmbH & Co., p. 523.
[2] Nagashima Y. (2014). ibid., p. 17.
[3] Quigg C. (2013). Gauge Theories of the Strong, Weak, and Electromagnetic Interactions, 2nd ed. Princeton University Press, p. 244.
[4] Gogberashvili M. (2002). Hierarchy problem in the shell universe model. Int. J. Mod. Phys. D 11, 1635–1638. arXiv:hep-ph/9812296.
[5] Krause A. (2003). A small cosmological constant and backreaction of non-finetuned parameters. arXiv:hep-th/0007233.
[6] Randall L., Sundrum R. (1999). Large mass hierarchy from a small extra dimension. Phys. Rev. Lett. 83, 3370–3373. arXiv:hep-ph/9905221.
[7] Arkani-Hamed N., Dimopoulos S., Dvali G. R. (1998). The hierarchy problem and new dimensions at a millimeter. Phys. Lett. B 429, 263–272. arXiv:hep-ph/9803315.
mar 08 2025, 4:53 am. updated mar 09 2025, 5:44 am