
What will be the median p(doom) of AI researchers after AGI is reached?
Above 5% — 84%
Above 10% — 67%
Above 20% — 25%
Above 50% — 9%
Above 80% — 5%
AGI defined as an AI that is better at AI research than the average human AI researcher not using AI.
p(doom) defined as human extinction or outcomes that are similarly bad.
In Katja Grace's 2022 survey, median values were 5% for "extremely bad outcome (e.g., human extinction)" and 5-10% for human extinction.
All answers which are true resolve Yes.
This question is managed and resolved by Manifold.
Related questions
Will we get AGI before 2047? (83% chance)
What will be the average P(doom) of AI researchers in 2025? (20% chance)
How much will AI advances impact EA research effectiveness, by 2030?
Will we get AGI before 2037? (76% chance)
Will we get AGI before 2039? (78% chance)
ML researchers' median probability of existential risk from AI (20)
Will AGI retaliate on AI doomers in a way that makes AI doomers regret it? (3% chance)
Will a Nobel prize be awarded for the invention of AGI before 2050? (26% chance)
Doom if AGI by 2040? (45% chance)
The probability of extremely good AGI outcomes, e.g. rapid human flourishing, will be >24% in the next AI experts survey (54% chance)