Since the dawn of civilisation people have speculated about apocalyptic bangs and whimpers that could wipe us out. Now a team from Oxford University’s Future of Humanity Institute and the Global Challenges Foundation has come up with the first serious scientific assessment of the gravest risks we face.
Although civilisation has ended many times in popular fiction, the issue has been almost entirely ignored by governments. “We were surprised to find that no one else had compiled a list of global risks with impacts that, for all practical purposes, can be called infinite,” says co-author Dennis Pamlin of the Global Challenges Foundation. “We don’t want to be accused of scaremongering but we want to get policy makers talking.”
The report itself says: “This is a scientific assessment about the possibility of oblivion, certainly, but even more it is a call for action based on the assumption that humanity is able to rise to challenges and turn them into opportunities. We are confronted with possibly the greatest challenge ever and our response needs to match this through global collaboration in new and innovative ways.”
There is, of course, room for debate about risks that are included or left out of the list. I would have added an intense blast of radiation from space, either a super-eruption from the sun or a gamma-ray burst from an exploding star in our region of the galaxy. And I would have included a sci-fi-style threat from an alien civilisation either invading or, more likely, sending a catastrophically destabilising message from an extrasolar planet. Both are, I suspect, more probable than a supervolcano.
But the 12 risks in the report are enough to be getting on with. A few of the existential threats are “exogenic”, arising from events beyond our control, such as asteroid impact. Most emerge from human economic and technological development. Three (synthetic biology, nanotechnology and artificial intelligence) result from dual-use technologies, which promise great benefits for society, including reducing other risks such as climate change and pandemics — but could go horribly wrong.
Assessing the risks is complex because of the interconnections between them, and the probabilities given in the report are deliberately conservative. For instance, extreme global warming could trigger ecological collapse and a failure of global governance.
The authors do not attempt to pull their 12 risks together into an overall probability of civilisation ending within the next 100 years, but Stuart Armstrong of Oxford’s Future of Humanity Institute says: “Putting the risk of extinction below 5 per cent would be wildly overconfident.”
Unknown risks
This is a catch-all category covering the unknown unknowns: an amalgamation of all the risks we have not thought of, or that seem ridiculously unlikely in isolation (such as sending signals to extraterrestrial civilisations that attract deadly alien attention). Together they represent a significant apocalyptic threat.
Asteroid impact
An asteroid at least 5km across — big enough to end civilisation, if not wipe out human life — hits Earth about once every 20 million years. But programs to map hazardous objects are making progress and, given enough warning, a concerted effort by the world’s space powers might succeed in deflecting an incoming asteroid on to a non-collision path.
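The quoted impact rate can be turned into a rough per-century figure. A minimal back-of-envelope sketch, assuming impacts follow a constant-rate (Poisson) process; the numbers are illustrative and not taken from the report itself:

```python
import math

# Assumed average interval between civilisation-ending (>= 5km) impacts,
# taken from the figure quoted above: roughly one every 20 million years.
mean_interval_years = 20_000_000
horizon_years = 100  # the report's 100-year window

# Under a constant-rate (Poisson) model, the chance of at least one
# impact within the horizon is 1 - exp(-rate * horizon).
rate_per_year = 1 / mean_interval_years
p_impact = 1 - math.exp(-rate_per_year * horizon_years)

print(f"{p_impact:.2e}")  # about 5e-06: one chance in 200,000 per century
```

On these assumptions the per-century risk is tiny, which is consistent with the report treating asteroid impact as a low-probability exogenic threat.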
Artificial intelligence
AI is the most discussed apocalyptic threat at the moment. But no one knows whether there is a real risk of extreme machine intelligence taking over the world and sweeping humans out of their way. The study team therefore gives a very wide probability estimate.
Supervolcano
An eruption ejecting thousands of cubic kilometres of material into the atmosphere — far larger than anything experienced in human history — could lead to a “volcanic winter”, with effects similar to an asteroid impact or nuclear war. Such events are known from the geological record to have caused mass extinctions. And with today’s technology, there is not much we could do to prevent its effects.
Ecological collapse
A full collapse of the global ecosystem, so that the planet could no longer sustain a population of billions, is one of the most complex risks in the study. Because so many unknown sequences of events would be involved, the team does not even guess at a probability.
Bad global governance
This category covers mismanagement of global affairs so serious that it is the primary cause of civilisation collapse (rather than a secondary response to other disasters). One example would be the emergence of an utterly incompetent and corrupt global dictatorship. The probability is impossible to estimate.
Global system collapse
This means economic and/or societal collapse, involving civil unrest and a breakdown of law and order that makes the continuation of civilised life impossible anywhere on Earth. There are too many unknowns to give a probability estimate.
Extreme climate change
Conventional modelling of climate change induced by human activity (adding carbon dioxide to the atmosphere) has focused on the most likely outcome: global warming by up to 4C. But there is a risk that feedback loops, such as the release of methane from Arctic permafrost, could produce an increase of 6C or more. Mass deaths through starvation and social unrest could then lead to a collapse of civilisation.
Nuclear war
A nuclear war between the US and Russia was the chief apocalyptic fear of the late 20th century. That threat may have receded but, with the proliferation of nuclear weapons, there is still a risk of a conflict serious enough to cause a “nuclear winter”, as a pall of smoke in the stratosphere shuts out sunlight for months. That could put an end to civilised life regardless of the bombs’ material impact.
Global pandemic
An apocalyptic disease would combine incurability (like Ebola), lethality (like rabies), extreme infectiousness (like the common cold) and a long incubation period (like HIV/Aids). If such a virus spread around the world before people were aware of the danger, the international health system would have to move with unprecedented speed and resources to save mankind.
Synthetic biology
Genetic engineering of new super-organisms could be enormously beneficial for humanity. But it might go horribly wrong, with the emergence and release, accidentally or through an act of war, of an engineered pathogen targeting humans or a crucial part of the global ecosystem. The impact could be even worse than any conceivable natural pandemic.
Nanotechnology
Ultra-precise manufacturing on an atomic scale could create materials with wonderful new properties, but it could also be used to build frightening new weapons. There is also the “grey goo” scenario of self-replicating nanomachines taking over the planet.