Wondering just how human civilization might eventually come to an end? Well, some British researchers have decided to help answer that question with a handy list of 12 scenarios for the end of it all. Step right up and take your pick.
Computers becoming sentient and turning against us? An unstoppable super virus? A supervolcano creating an unsurvivable "volcanic winter"? Yes, they've made the list, as have a number of other chilling possibilities.
The list has been prepared by Oxford University's Future of Humanity Institute and the Global Challenges Foundation, and is being called the first serious scientific appraisal of the apocalyptic risks mankind could face.
While some of the list's scenarios would involve events out of our control, like an asteroid impact or that previously mentioned supervolcano, most would arise as a consequence of human advancements and activities, the researchers say.
Such advancements, especially those based on technology, present the potential to offer great benefits to humans but could also put us on the road to our demise.
"[This research] is about how a better understanding of the magnitude of the challenges can help the world to address the risks it faces, and can help to create a path towards more sustainable development," the study's authors say.
Although the whole idea of a doomsday list may seem like a downer, the researchers say their report's primary goal is to inspire action and dialogue.
"It is a scientific assessment about the possibility of oblivion, certainly, but even more it is a call for action based on the assumption that humanity is able to rise to challenges and turn them into opportunities," the researchers write.
The report classifies the risks into four categories: Current risks, Exogenic risks, Emerging risks, and Global Policy risks.
- Current risks include extreme climate change, nuclear war, ecological catastrophe, global pandemic and global system collapse.
- Exogenic (external) risks are a major asteroid impact or supervolcanic eruption.
- Emerging risks include synthetic biology, nanotechnology, artificial intelligence, and uncertain (as yet unknown) risks.
- And finally there's Global Policy risk, defined as the consequence of future bad global governance.
The report's authors have taken pains to emphasize that despite the report's focus on the almost unimaginable impacts of the listed "end scenarios," their hope is that it will start a dialogue about a coordinated approach to "address this group of risks and turn them into opportunities."