Are The Fundamental Constants Finely Tuned? | The Naturalness Problem
06/26/25 | 17m 21s | Rating: NR
Did God have any choice in creating the world?
So asked Albert Einstein.
He was being poetic.
What he really meant was whether the universe could have been any other way.
Could it have had different laws of physics, driven by different fundamental constants?
Or is this one vast and complex universe the inevitable result of an inevitable and unique underlying principle, perhaps expressible as a supremely elegant Theory of Everything?
Einstein certainly seems to have thought this should be the case: that God had no choice in whether or how to create the world.
It seems like a rather armchair-philosophical and perhaps unanswerable question, but the modern problem of naturalness may lead to an answer.
Naturalness or fine-tuning problems in general are those in which some property of the universe seems oddly specific, as though whatever process decided on the parameters of the natural laws cared that this particular property has the value it does.
We've been talking about this a bit, including in two recent episodes where we discussed why many physicists think that the small value of the mass of the Higgs boson seems unnatural. This is called the hierarchy problem.
The idea, if you recall, is that there should be processes influencing the Higgs mass that operate at very high energies, which should in turn lead to a very high Higgs mass unless there's some precise suppression of those influences.
Those influences are interactions with quantum fields, and the suppression is supposed to be that they cancel each other out.
No natural mechanism has yet been discovered to achieve such high-precision, coordinated canceling, but the alternative to coordinated canceling is fortuitous canceling.
If the latter is true then it looks like the various cancellations were finely tuned to achieve that small Higgs mass, which feels unnatural.
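To get a feel for the numbers involved, here's an illustrative order-of-magnitude sketch (the figures are the commonly quoted rough values, not a real calculation): the measured Higgs mass is about 125 GeV, while the Planck scale, where quantum gravity is expected to matter, is about 10^19 GeV. If contributions to the squared Higgs mass are of order the Planck scale squared, the required cancellation is fantastically precise.

```python
# Order-of-magnitude sketch of the hierarchy problem (illustrative only).
# Naive quantum corrections to the squared Higgs mass are of order the
# cutoff scale squared; here we take the cutoff to be the Planck scale.
m_higgs = 125.0       # measured Higgs mass, GeV (approximate)
m_planck = 1.22e19    # Planck scale, GeV (approximate)

# If one contribution to m_higgs**2 is of order +m_planck**2, the other
# contributions must cancel it to this fractional precision:
tuning = (m_higgs / m_planck) ** 2
print(f"required cancellation: 1 part in {1 / tuning:.1e}")
```

The answer is roughly one part in 10^34, which is the usual statement of how finely tuned the cancellation would have to be.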
Another supposed fine-tuning problem is the apparently very small value of the cosmological constant.
So we observe that the expansion of the universe is accelerating, and a possible culprit is vacuum energy: a faint energy density in the quantum fields that fill all of space.
We call this influence dark energy and describe it with the cosmological constant in the Einstein equations.
And for basically the same reason that physicists are confused by the low Higgs mass, many are also confused that dark energy isn't extremely strong due to the contributions of high-energy components of those fields.
And again, quantum cancellations could reduce this energy. It's not such a stretch to imagine a perfect cancellation if there's a high level of symmetry in those contributions.
But to cancel almost but not quite perfectly seems an oddly specific result.
Finely tuned, in fact.
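Just how oddly specific? The mismatch is often quoted as roughly 120 orders of magnitude. A back-of-the-envelope version (the input scales below are approximate and illustrative): the observed dark energy density corresponds to an energy scale of a few milli-electronvolts, while a naive vacuum-energy estimate cut off at the Planck scale gives a density of order the Planck scale to the fourth power.

```python
import math

# Crude sketch of the cosmological constant problem (orders of magnitude only).
# In natural units, energy densities scale as (energy scale)**4.
observed_scale = 2.4e-12   # ~2.4 meV expressed in GeV: the scale implied by dark energy
planck_scale = 1.22e19     # Planck scale in GeV

ratio = (planck_scale / observed_scale) ** 4
print(f"naive estimate exceeds observation by ~10^{math.log10(ratio):.0f}")
```

That factor of roughly 10^122 is why an almost-but-not-quite-perfect cancellation looks so suspicious.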
So far weve been talking about these issues in very mechanistic terms.
That's a potential source of confusion, because in a sense the mechanisms don't matter.
There's a much more general way to express the problem that gets to the heart of the questions we started with: Is our very particular universe inevitable?
I'm going to get to that after a few more words to put the mechanistic picture to bed.
Quantum field theory describes essentially everything in terms of myriad interactions between quantum fields, expressed in that theory as virtual particle interactions.
Many, perhaps most physicists understand this virtual stuff to be just a mathematical tool representing the messy interactions of quantum fields.
But even if the Feynman sums in QFT are mathematical fictions, we can still consider the complex field interactions they represent as being real in some sense, whatever real really means.
The astonishing success of the standard model of particle physics, itself a QFT, means we kind of have to take quantum field interactions seriously.
That includes ideas like field interactions contributing to particle mass and even the idea of field interactions canceling each others influence.
There's also independent evidence that this virtual activity is in a sense real: for example the Casimir effect, in which a measurable force is induced by excluding supposed virtual field modes from between two conducting plates.
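For the idealized case of two perfectly conducting parallel plates, the predicted Casimir pressure is P = π²ħc / (240 a⁴), where a is the plate separation. A quick sketch of the numbers:

```python
import math

# Casimir pressure between ideal parallel conducting plates:
# P = pi^2 * hbar * c / (240 * a^4)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s

def casimir_pressure(a):
    """Magnitude of the attractive pressure (Pa) for plate separation a (meters)."""
    return math.pi**2 * hbar * c / (240 * a**4)

# At a one-micron separation the force per unit area is tiny but measurable:
print(f"{casimir_pressure(1e-6):.2e} Pa")   # ~1.3e-03 Pa
```

Note the steep a⁻⁴ dependence: halving the separation makes the pressure sixteen times stronger, which is why these experiments work at sub-micron distances.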
So, the clockwork behind the Standard Model, the mechanism of quantum field interactions and cancellations, makes very sensible predictions in some cases, and nonsensical ones in others.
In particular, prior to being fixed, the QFT behind the standard model predicted the Higgs mass and dark energy should be enormous. Can we just ignore those predictions?
In the case of the standard model, the answer was yes.
At least, if all we want to do is to use the theory in its domain of applicability.
The supposed exploding masses of the particles can be dealt with if we just input by hand their true masses as measured in the lab.
The process is called renormalization, and it involves adding some made-up canceling terms to get the exploding masses back down to where we know they should be.
With this change, those particles' masses are no longer predicted by the theory, but rather become free parameters. In this way the Standard Model became internally self-consistent.
It doesn't predict very large masses, but that's by construction.
It's the unconstrained, un-renormalized quantum field theory underneath the Standard Model that predicts large masses.
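The bookkeeping pattern of renormalization can be caricatured in a few lines (this is a toy illustration of the pattern, not the actual QFT calculation; all values are made up): the theory produces an enormous naive contribution, and a counterterm is chosen by hand so that the total matches the measured value.

```python
# Toy model of the renormalization bookkeeping (not real QFT).
# Using Python integers so the huge cancellation is exact. Values are illustrative.
measured_mass_sq = 125**2          # GeV^2: the value we measure in the lab
loop_correction = 10**38           # enormous naive contribution from high-energy modes

# The counterterm is chosen by hand to cancel the unwanted contribution,
# a cancellation spanning some 34 decimal places:
counterterm = measured_mass_sq - loop_correction

renormalized = loop_correction + counterterm
print(renormalized)  # 15625 -- the measured value, by construction
```

The point of the sketch is the last comment: the output equals the input, because the measured mass was fed in by hand. Nothing about the small value was predicted.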
During the development of the Standard Model, some physicists had hoped that the quantum field theory on which the model is built would be able to fully explain the masses of the particles.
They were not pleased that this feature didn't make it into the release version of the theory.
Richard Feynman, who led the development of quantum electrodynamics, the electromagnetic part of the Standard Model, called renormalization a "shell game" and a "dippy process".
He said such "hocus-pocus has prevented us from proving that the theory of quantum electrodynamics is mathematically self-consistent."
Paul Dirac said he was "very dissatisfied with the situation, because this so-called 'good theory' does involve neglecting infinities which appear in its equations, ignoring them in an arbitrary way."
There are two takeaways here.
One is that the underlying mechanics of the Standard Model does lead to confusing stuff about the particles' masses if you remove the hocus-pocus of renormalization.
And some would argue that, sans hocus-pocus, if you take this mechanism seriously, there appear to be finely tuned cancellations needed to get the masses reasonable.
The other takeaway is that the Standard Model fails to realize the dream of Einstein, Feynman and Dirac.
They, and perhaps we, want a singular ultimate theory, one from which every property of the world should be derivable in a non-dippy, non-arbitrary, and mathematically self-consistent way.
The Standard Model isn't that, and so we can imagine a more true, more general theory, perhaps a theory of quantum gravity, of which the Standard Model, general relativity, and everything else are slightly crappy approximations, each with only a limited range of validity.
We talked about this stuff last time: how we have these layers of theory, with theories describing larger scales and lower energies being coarse-grainings of deeper theories.
But all of those so-called effective theories are entirely defined by the theory at the bottom; at least that's how the world works in the reductionist paradigm.
We like to call the more coarse-grained, more zoomed-out, more emergent theory the infrared or IR theory, while the deeper, more fundamental theory is the ultraviolet or UV theory.
The reason is by analogy with the ultraviolet catastrophe, which we covered in the last episode.
So let's say we have an IR theory that's a coarse-graining of a UV theory.
Both are defined by their own set of parameters.
For example, there are 19 free parameters in the Standard Model that seem arbitrary in the context of that theory alone.
The free parameters of the IR theory are not free in the context of the UV theory from which it comes.
Instead they are calculable in principle from the parameters of the UV theory.
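A concrete example of that calculability (with approximate input values; this is a tree-level sketch, not a precision calculation): the Fermi constant G_F, which sets the strength of the weak force in the low-energy theory, is fixed by the deeper electroweak theory's weak coupling g and W boson mass via G_F/√2 = g²/(8 M_W²).

```python
import math

# An IR parameter computed from UV parameters: the Fermi constant from
# electroweak inputs, via the tree-level relation G_F/sqrt(2) = g**2 / (8*M_W**2).
g = 0.65          # SU(2) weak coupling (approximate)
m_w = 80.4        # W boson mass, GeV (approximate)

g_fermi = math.sqrt(2) * g**2 / (8 * m_w**2)
print(f"G_F ~ {g_fermi:.2e} GeV^-2")   # close to the measured ~1.166e-5 GeV^-2
```

In Fermi's original theory of the weak interaction, G_F was just a free parameter measured in the lab; in the deeper electroweak theory it's an output.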
What are the parameters of the deepest UV theory? We don't know.
Knowing that fully means knowing the Theory of Everything.
But we can at least imagine that the UV theory of our universe lives in some sort of platonic space of all imaginable such theories.
The UV theory that generates our universe has a specific location in that space defined by its specific parameters.
If we locate that specific UV theory, we should be able to calculate the precise parameters of the IR theory that inevitably emerges from it.
At least if Einstein has his way.
Einstein would also like the UV theory itself to be inevitable, but really we have no idea how those parameters were chosen.
OK, how does naturalness play into this picture? Imagine making a new universe by choosing a random spot in theory-space.
The resulting UV theory defines the IR parameters like the Higgs mass etc.
You haven't chosen your UV theory with any intention of getting specific IR parameters, so you probably wouldn't expect them to be unusual or interesting in any way.
If they are, it would seem unnatural, not as random as you imagined.
Lets try an analogy.
You shoot an arrow into a barn wall.
Blindfolded.
You look to see where you hitjust some random patch of wood.
Do you act all surprised, like, "What are the chances that I hit this square inch in particular? How lucky!" No, you had to hit somewhere, and that's where you hit.
Perfectly natural.
On the other hand, imagine if there was a target painted on the wall with a very tiny bullseye.
You hit the bullseye by chance without even knowing it was there.
You'd be justified in thinking it a crazy fluke, and not just some spot on the wall.
In case you missed it, in this analogy the barn wall is the space of possible UV theories.
Each spot on the wall is a different set of parameters for that theory: each a different underlying master theory, leading to a different universe with different IR parameters.
The bullseye is a very particular UV theory, and it's special because it happens to lead to a very special IR theory: in our case, a small Higgs mass and small cosmological constant.
And the blind archer?
God, perhaps?
Or perhaps we could think of it as representing the mechanism by which the parameters of the UV theory are set.
This could be random, meaning the whole barn wall is fair game.
Or those parameters could be set by some mechanism that's intrinsic to the UV theory, a mechanism that only permits a single outcome: only one spot on the barn wall was ever possible in the first place.
And once you know the UV theory, you see that no other set of UV parameters are possible.
That seems to be what Einstein preferred.
God has no choice.
But whether the UV parameters are set randomly or are unique and inevitable, the archer is still blind and we do still have the problem of fine tuning.
How so?
Well, the blindness of the archer doesn't really represent the freedom of the UV theory to have different parameters.
What it really represents is a combination of two things: 1) Our own lack of knowledge of those parameters.
We know nothing about them, and so they could be anything; the whole barn wall is open as far as our prior knowledge is concerned.
And 2) the fact that the UV theory is blind to the IR theory.
So even if the mechanisms that set the parameters of the UV theory have no choice in where they land, they aren't doing that with the emergent IR theory in mind.
At least, that's the default assumption of reductionism.
So that's the situation we're in: we find an arrow in the bullseye, and it seems that the arrow was put in place by a process that had no idea that the bullseye existed.
We can talk about all of this in terms of Bayesian reasoning, by assessing likelihoods relative to our prior knowledge.
When physicists talk about a prior distribution for the parameter space of the UV theory, they typically aren't talking about some real process that sets the parameters of that theory.
Instead, they usually mean a probability distribution that quantifies our degree of knowledge and ignorance about some aspect of the world.
It's the possible range for the parameters given what little we know: what we call a Bayesian prior.
In reality the true parameters of the UV theory are what they are, presumably a set of well defined values.
But with little prior knowledge, their possible distribution with respect to that knowledge is vast.
In our Bayesian reasoning we'd use a wide Bayesian prior, because its job is exclusively to represent our knowledge.
Now let's say we know that this wide-open UV theory-space maps to some as-yet unmeasured infrared parameter.
We go to measure the parameter and ask: how much of the theory space in our Bayesian prior could produce an IR parameter like the one we just measured? If the answer is a vanishingly small fraction, then we might be suspicious.
Thats the case with the Higgs mass and cosmological constant.
It's the arrow being suspiciously in the bullseye. We would be justified in assuming that the archer actually peeked, or that there's some powerful collusion in the UV theory that propagates down to the IR theory, in this case working to make the IR theory the low-energy theory that it is.
And it doesn't really matter if the mechanism for suppressing IR parameters such as the Higgs mass is quantum cancellations or anything else.
As long as our UV theory natively lives at much higher energy than the IR theory, either collusion between the high-energy mechanics or extreme chance is needed to yield that stable low energy theory.
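That prior-volume argument can be sketched numerically. This is a cartoon, not real physics: a made-up UV parameter is drawn from a wide uniform prior, and we ask what fraction of the prior lands within a narrow window around a hypothetical "measured" IR value.

```python
import random

# Cartoon of the Bayesian naturalness argument. A made-up UV parameter is
# drawn from a wide uniform prior; for simplicity the "IR parameter" is
# just the UV parameter itself. We estimate what fraction of the prior
# volume produces an IR value inside a tiny window around the measurement.
random.seed(0)

prior_width = 1.0       # wide prior over the UV parameter (arbitrary units)
measured_ir = 0.42      # hypothetical measured IR value
window = 1e-3           # how close counts as "matching" the measurement

n = 1_000_000
hits = sum(1 for _ in range(n)
           if abs(random.uniform(0, prior_width) - measured_ir) < window)

# Fraction should come out near 2 * window / prior_width, i.e. ~2e-3:
print(f"fraction of prior volume matching: ~{hits / n:.1e}")
```

A fraction like 2 in 1,000 is unremarkable. The Higgs mass and cosmological constant correspond to fractions more like 1 in 10^34 and 1 in 10^122, which is what makes them look suspicious.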
So thats why it is reasonable to wonder at things like the smallness of the Higgs mass and cosmological constant.
Not because they're actually unnatural, but because they seem unnatural given the mechanisms we think are at work behind them.
Nature is obviously being perfectly natural, and so it means we're missing something.
Either the archer peeked: that's the UV theory being influenced by the IR theory; it sees the target.
Or the archer shot many, many arrows, and the whole barn wall is a pin-cushion of alternate universes, and of course we're in one of the extremely rare bullseye universes with a survivable Higgs mass and so on.
I keep saying we'll come back to these ideas of UV-IR mixing and anthropic reasoning, and we still will.
But the choice between these two may be our answer to Einsteins question.
Did God have a choice in creating the world?
If no, then there was only one arrow, and that seems to demand a connection between the fundamental and the emergent that we don't fully understand.
If yes, then the arrow could have landed anywhere on that wall, and perhaps did, solving naturalness but generating a multiverse.
Of course, in that case we can just redefine the multiverse as the world and say that it is singular and inevitable, and Einstein gets to be right again.
Just like he was with all that stuff about curved spacetime.