How probable is human #catastrophe or #extinction over the next century?
☢️ or 🦠 or 🤖 🟰 💀❓
Well, artificial intelligence, nuclear events, and engineered pathogens seem to raise the highest concerns.
Check the full report recently published by the Forecasting Research Institute (#OpenAccess). They interviewed *superforecasters*, i.e. forecasters with a historically accurate track record on short-run questions, and *experts* on nuclear war, climate change, AI, biological risks, and existential risk more broadly. (Note: be aware that this is a North America-based report.)
The results are fascinating. In general, the domain experts are more pessimistic than the superforecasters: a 20% vs. 9% chance of catastrophe, and a 6% vs. 1% chance of extinction, respectively.
The Economist published a nice graphic summarizing the results, broken down by type of threat.
#catastrophe #extinction #openaccess #future #ai #nuclear #pathogens #apocalypse #superforecasting
We are the only species that can contemplate its own demise (but that doesn't imply we want to; #superforecasting human #extinction):
https://www.vox.com/future-perfect/23785731/human-extinction-forecasting-superforecasters
What can we learn about making good #predictions from the 1800s? A few things...
#predictions #Superforecaster #superforecasting #futurism #technology
Novels can predict wars 5 years in advance
#literature #war #foreignPolicy #novel #politics #policy #forecast #superforecasting