Dark Forest: Should We Not Contact Aliens?
03/14/24 | 14m 12s | Rating: NR
In 1974 we sent the Arecibo radio message towards Messier 13. The message was mostly symbolic; we weren’t really expecting a reply. Yet surely other civilisations out there are doing the same thing. So, why haven’t we heard anything? What if the silence from the stars is a hint that we shouldn’t be so outgoing? What if aliens are deliberately keeping quiet for fear that they might be destroyed?
In 1974 the Arecibo radio telescope beamed a message towards the few hundred thousand stars of Messier 13, a globular cluster near the edge of the Milky Way.
It'll take tens of thousands of years to arrive, so it's no surprise we haven't heard back yet.
But if there are other civilizations out there, surely many of them have a head start on us, and so could have been shouting into the void louder and longer than we have.
So why haven't we spotted their own efforts to make contact?
What if the silence from the stars is a hint that we shouldn't be so outgoing?
What if aliens are deliberately keeping quiet for fear that they might be destroyed?
For over 60 years we have been listening for messages from aliens.
Yet we haven't heard, or seen, anything or anyone.
This is dubbed the Great Silence by author David Brin.
With over five thousand exoplanet discoveries confirmed to date, it seems increasingly likely that there has to be life out there.
The mismatch between the number of possible origins of life and the apparent absence of any other technological life is known as the Fermi Paradox, which of course we've discussed once or twice in the past.
But today we're going to explore one of the more terrifying solutions to the Fermi Paradox: the idea that there are other technological species in our galaxy, but they're all silent, because those that broke their silence were quickly destroyed.
This is the dark forest hypothesis, and it's the core idea of Liu Cixin's book of the same name, the sequel to The Three-Body Problem.
Liu Cixin coined the term, although the idea has been proposed by others.
But the imagery of the universe as a dark forest is so chilling, let's lean into it.
Imagine you are a lone hunter stalking through dense forest.
Visibility is almost zero, so you have no idea what's hiding over the rise or behind the next tree.
You suspect there may be other hunters but youre not sure.
You carry a deadly weapon and know you could destroy other hunters in an instant but only if you spot them first.
You are forced to assume that they have the same capability, which means they could also destroy you.
So what do you do?
Do you call out and hope to find allies?
Friends?
Or do you keep very still and quiet and hope no one finds you?
Or do you keep stalking, ready to destroy any other hunter you find, out of fear that they've also chosen this last option?
The dark forest solution to the Fermi Paradox proposes that almost all civilizations will choose silence, whether a passive silence or a watchful, trigger-happy silence.
And those that do not so choose are no longer with us.
It's certainly a great creepy premise for a sci-fi novel, but the idea is really interesting because it's based on some reasonably concrete game theory.
And also some pretty solid physics.
Let's look at the scenario again without the forest metaphor.
We have two planets that have developed advanced civilizations.
Let's call them the Alicians and the Bobarians, or A and B for short.
Both A and B are capable of sending messages across the universe and could, if required, destroy another planet relatively easily and without personal risk.
We'll come back to how.
We can model their interactions in game theory as a sequential game, just like chess, in which players choose their actions in sequence.
Each civilization acts in response to the other's last action, leading to a tree of possibilities and some payoff associated with each branch.
Each civilization should choose its responses to maximise its expected payoff.
For example, if the civilizations make friends, that would have a positive payoff for both: they gain strength in numbers and maybe share technology.
If one civilization destroys the other that would be a relatively neutral payoff for the destroyer, but an extremely negative payoff for the destroyee.
How negative?
Well, nothing could be worse, so maybe an infinitely negative payoff, or infinite cost.
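The payoff structure just described can be jotted down in a few lines. This is a sketch with made-up finite numbers (the +10 and -1 are arbitrary illustrations, not from the episode); only the signs, and the infinity on extinction, carry the argument.

```python
import math

# Illustrative payoffs for a single civilization, by outcome.
# The finite values are arbitrary stand-ins; only their signs
# and the infinite extinction cost matter to the argument.
PAYOFF = {
    "friendship":  10.0,       # strength in numbers, shared technology
    "ignored":      0.0,       # nothing happens, nothing gained
    "destroyer":   -1.0,       # small finite cost in resources and guilt
    "destroyed": -math.inf,    # extinction: nothing could be worse
}

# The key asymmetry: no finite gain can offset an infinite loss.
assert PAYOFF["friendship"] + PAYOFF["destroyed"] == -math.inf
```

The exact magnitudes never matter in what follows; any finite cost of destroying is dwarfed by the infinite cost of being destroyed.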
So let's look at one possible scenario.
Civilization B intercepts a signal from A, but A doesn't know about B.
Maybe the Alicians are actively sending out signals hoping for a response.
Or maybe their own radio communications are just really loud, the equivalent of walking noisily through the forest.
So when B notices A, what options do they have?
They could ignore A, reply to A, or destroy A.
Let's say they ignore A.
A will remain blissfully oblivious of the existence of B, and B will continue its antisocial ways.
This scenario has a zero payoff for both A and B.
Slightly boring, but at least nobody gets hurt.
What if B instead chooses to destroy A?
There'll be a finite cost to destroying a planet in resources, but for an advanced civilization that's pretty small.
And there might also be a resource gain if that solar system had any good loot.
Maybe there's a cost in guilt at annihilating billions of sentient creatures, but the point is that if there's a cost, it's finite.
For the Bobarians anyway.
At worst, B experiences a finite negative payoff by taking the destroy option, while A experiences an infinite negative payoff.
Finally, if B chooses to reply to As message, A now knows about the existence of B.
This gives additional knowledge to A that they previously did not have.
Then, A has the same three choices that B had, including the potential to attack and destroy B.
So of all the options (ignore, destroy, reply), only reply leads to a potential infinite negative payoff for B.
If you just sum the potential payoffs of the three branches, it's clear that B should never reply, assuming the Bobarians care about their own survival.
In fact, if we extend the tree a little more we uncover new outcomes. For example, in the ignore branch there's a possible future branching in which A discovers the existence of B and gets its turn to play the game.
That means both ignore and reply have the potential for infinite cost and by that logic a civilization should always choose to destroy any other civilization they detect.
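That chain of reasoning is easy to make mechanical. A minimal sketch, assuming worst-case reasoning: B rates each option by the worst outcome it can lead to, since A might choose destruction whenever A learns of B's existence. The finite value is, again, an arbitrary illustration.

```python
import math

# Worst-case payoff to B for each option, assuming A might choose
# to destroy B whenever A gets a turn. The -1.0 is illustrative.
WORST_CASE = {
    "destroy": -1.0,       # finite cost, and A never gets a turn
    "ignore":  -math.inf,  # A may still discover B later and destroy it
    "reply":   -math.inf,  # A now certainly knows about B
}

# The dominant strategy is the option with the best worst case.
dominant = max(WORST_CASE, key=WORST_CASE.get)
print(dominant)  # destroy
```

Only the destroy branch avoids a potentially infinite loss, so it wins regardless of the finite numbers chosen.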
So, if we assume that civilizations that can detect other civilizations also know game theory, they know the game-theoretic dominant strategy: destroy the other.
Or, if they place very heavy weight on the guilt cost, at the very least they will remain very very quiet.
Hence the Great Silence and the solution to the Fermi Paradox.
This is a pretty dark conclusion, so let's see if we can find flaws in this reasoning, in the hope that the universe isn't as terrifying as the dark forest proposes.
First up, let's look at some of the assumptions.
We said that these civilizations can destroy each other relatively easily.
That's actually a reasonable assumption if both civilizations have concentrated population centers, say, on one or a handful of planets.
At the risk of telling you how to destroy planets: perhaps the simplest approach is the relativistic kill vehicle.
An advanced civilization worth its salt should be able to harness, say, a percent of its home star's energy towards accelerating one or more masses to a good fraction of the speed of light.
Send those bodies to the offending world and you at the very least ionize its atmosphere and vaporize its oceans.
Moreover, the target planet really has no way to protect itself.
The vehicle is traveling at a good fraction of the speed of light, so the target will have little warning before the weapon arrives, and not much they could do about it anyway.
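To put rough numbers on that: relativistic kinetic energy is (γ − 1)mc². The projectile mass and speed below are my own illustrative choices, but even a modest asteroid-sized mass at half the speed of light carries a staggering amount of energy compared to the largest nuclear device ever detonated.

```python
import math

C = 299_792_458.0      # speed of light, m/s
TSAR_BOMBA_J = 2.1e17  # ~50 megaton yield of the largest nuclear test, joules

def relativistic_ke(mass_kg: float, beta: float) -> float:
    """Kinetic energy (gamma - 1) * m * c^2 of a mass moving at beta * c."""
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

# A small asteroid (about 10^9 kg) at half the speed of light:
ke = relativistic_ke(1e9, 0.5)
print(f"{ke:.2e} J, roughly {ke / TSAR_BOMBA_J:.1e} Tsar Bombas")
```

That works out to around 10^25 joules, tens of millions of Tsar Bombas delivered in a single impact.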
It doesn't even matter if the Bobarians don't think the Alicians have the technology to do this.
Technological progress can be exponential, and so enormous advances can happen on the timescale of the light travel time between two star systems.
Say those systems are 100 light years apart.
In 100 years, Moore's law yields a quadrillion-fold increase in computing power.
Other technologies have also been shown to obey exponential improvement rates, albeit with different doubling times.
Even if that rate is more staggered, one thing is clear: whatever technological state you perceive your neighbours to be in is probably not the state they'll be in when they receive your reply, or when they find you.
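That quadrillion figure checks out under the classic statement of Moore's law, a doubling roughly every two years: a century is 50 doublings, and 2^50 is about 1.1 × 10^15.

```python
# Classic Moore's law doubling period of roughly two years.
DOUBLING_TIME_YEARS = 2
YEARS = 100

doublings = YEARS // DOUBLING_TIME_YEARS  # 50 doublings in a century
growth = 2 ** doublings
print(f"{growth:e}")  # about 1.1e15, i.e. a quadrillion-fold increase
```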
The vast distances between the stars are really the driver of the dark forest hypothesis.
They ensure that both sides have potential access to the instant-kill option, and in general they're what allows us to approximate this as a sequential game.
Civilizations can't interact with each other in real time; they can't feel out the intentions of the other, or react to perceived hostility.
They can only choose their move, knowing that if the other side chooses hostility in response it likely means complete annihilation.
Those vast distances also mean you're not safe even if you send a friendly message: how does the receiver of the message know that you're truthful, or that you'll stay friendly over the centuries?
So the physics behind the hypothesis seems to hold together.
But there are other assumptions too; for example, there's some psychology.
We assume that all aliens will ascribe an overwhelmingly large cost or negative payoff to the extinction outcome.
Seems fair: the better-them-than-us philosophy seems mostly universal among humans.
And if all species crawled out of the mud via Darwinian processes then they should all have competitive and self-preserving tendencies.
But it's also possible that we're projecting our own primitive psychological tendencies unfairly.
We don't know what value systems advanced aliens might have.
Perhaps there's a psychological transition that essentially all civilizations go through in which they, like, realise the value of all sentience or something.
Perhaps the ones that don't transition exterminate themselves.
In that case they may not value their own existence as infinitely higher than that of their neighbours.
There's also another aspect to the cost analysis besides fear and empathy, and that's curiosity.
Humans did not spread across the globe or invent fire or build great civilizations or discover the laws of physics by each of us placing infinite value in our personal survival.
Yes we sought the resources and strength needed for survival, but we were also curious.
And that curiosity proved a huge survival advantage in the end.
It's that interplay, that balance between the desire to be safe and the curiosity to peer over the next horizon, or round the next tree, that kept us walking through the metaphorical forest to reach the metaphorical, I dunno, sun-dappled meadow of the modern world.
Anyone we meet out there will also have curiosity and wariness in different measures.
Some will want to know what other life and cultures and minds are like.
And maybe that curiosity will outweigh their fear.
This is wild speculation.
The point is, we can't really assume that the payoff calculation will be the same for everyone.
Others may be playing different games.
So, how well are we playing the game?
So far, we've sent out a few messages here and there.
The Arecibo message will reach the M13 globular cluster in something like 27,000 years.
We've sent a motley array of signals to 30 or so stars that'll arrive over the next few decades to few centuries (some have already arrived).
It'll take a pretty advanced civilization actively looking for signals to see any of them.
Some signals have reached their destinations and there's been time for a response; for example, the invasion fleet from Altair was due in 2015.
Most likely there aren't advanced civilizations quite that close, and we haven't been shouting too loudly.
We're walking pretty quietly through the forest, maybe breaking a twig or two.
We haven't really started playing the game yet, at least as far as we know.
So should we try to contact aliens?
Well, maybe it's wise to be a bit cautious.
On the other hand, there's one last assumption in the dark forest hypothesis, and that's the idea that any civilization gets to make unilateral decisions that follow the perfect logic of game theory.
We don't decide things as a species.
Individuals decide, and collective action emerges in very complex ways.
Even now, individual humans, or at least smallish groups, could start projects that make quite a lot of noise on a fairly short timescale.
And maybe we will.
And maybe when we meet our first non-human civilization we'll wish we'd respected the Great Silence.
Or maybe the curiosity and empathy of the individuals from both civilizations will win the day.
Perhaps we'll continue through the dark forest side by side and stronger for it, in hope of finding brighter metaphorical landscapes of space time.