MJ Valente · @mjvalente
34 followers · 88 posts · Server hcommons.social

How probable is human extinction or catastrophe over the next century?

☢️ or 🦠 or 🤖 🟰 💀❓

Well, artificial intelligence, a nuclear event, and engineered pathogens seem to raise the highest concerns.

Check the full report recently published by the Forecasting Research Institute. They interviewed *superforecasters*, i.e. forecasters with historically accurate track records on short-run questions, and *experts* on nuclear war, climate change, AI, biological risks, and existential risk more broadly. (Note: be aware that this is a North America-based report.)

static1.squarespace.com/static

The results are fascinating. In general, the domain experts are more pessimistic than the superforecasters: 20% vs. 9% chance of catastrophe, and 6% vs. 1% chance of extinction, respectively.

The Economist published a nice graphic summarizing the results, broken down by type of threat.

#catastrophe #extinction #openaccess #future #ai #nuclear #pathogens #apocalypse #superforecasting

Last updated 1 year ago

Brian Knutson · @knutson_brain
728 followers · 1931 posts · Server sfba.social

We are the only species that can contemplate its own demise (but that doesn't imply we want to):
vox.com/future-perfect/2378573

#superforecasting #extinction

Last updated 1 year ago
