Deadly viruses and killer robots may cause human extinction, a report has warned.
The report, titled "Global Catastrophic Risks" and compiled by researchers from Oxford University's Global Priorities Project and the Global Challenges Foundation, ranks catastrophic events that could wipe out as much as 10 percent of the population.
Global Catastrophic Risks
Global Priorities Project director Sebastian Farquhar said that many of these events may not be imminent but could eventually occur and change the world disastrously. Farquhar added that devastating historical events could happen again.
"Many of these risks are changing and growing, as technologies change and grow and reshape our world," said Farquhar. "But there are also things we can do about the risks."
Within the next five years, the report stated, supervolcanic eruptions, asteroid strikes, and other unknown catastrophes will pose the biggest threat to human existence.
As technology continues to advance, the possibility of artificial intelligence (AI) wiping out humans grows. Elon Musk's OpenAI, for one, recently released a developer toolkit called OpenAI Gym, which primarily serves to train AI systems. The Oxford report did not rule out the possibility that, in the future, such systems could overpower humans.
Farquhar said that if these AIs develop goals that differ from the values of humanity, adverse outcomes become a possibility. Powerful AIs could become killer robots if they struggle to understand what humans consider valuable.
The report also listed long-term threats such as climate change, engineered pandemics, and nuclear war, whose effects are already starting to reshape the world. Advances in synthetic biology could also give rise to deadly viruses.
Farquhar warned that militant groups could turn to genetic manipulation of viruses to kill millions of people.
Is The World Prepared For Such Events?
The report [PDF] found that, unfortunately, the international community is ill-prepared for such disastrous events. The researchers recommend that international leaders work together to coordinate responses to pandemics, mutually agree to eliminate nuclear weapons, and study the potential risks of biotechnology and artificial intelligence systems.