Will at least 16 AI x-risk researchers move to doomsday bunkers in the Southern Hemisphere by 2026-01-01?
Ṁ1k · Ṁ2.2k · Resolved NO · Jan 12
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ122 |
| 2 | | Ṁ48 |
| 3 | | Ṁ40 |
| 4 | | Ṁ19 |
| 5 | | Ṁ13 |
People are also trading

- Will an AI Doomer turn to violence by the end of 2026? — 31% chance
- Are we about to hit another AI winter in 2026? — 10% chance
- Will anyone commit terrorism in order to slow the progression of AI before 2027? — 24% chance
- Does an AI disaster kill at least 100 people before 2029? — 79% chance
- Will someone take desperate measures due to expectations of AI-related risks by January 1, 2035? — 91% chance
- Will someone take desperate measures due to expectations of AI-related risks by January 1, 2030? — 67% chance
- Will there be a highly risky or catastrophic AI agent proliferation event before 2035? — 81% chance
- Does an AI disaster kill at least 1,000 people before 2029? — 44% chance
- Will there be an AI Winter between 2022 and 2026? — 10% chance
- Does an AI disaster kill at least 100,000 people before 2029? — 17% chance
A trader (predicted YES) commented:
@mariopasquato The pressure on humanity to eventually build aligned ASI doesn't stop with a nuclear war. It will still be essential to have as many safety-oriented researchers working on the problem as possible.
And I suspect there is so little AI x-risk-oriented intellectual capital in locations likely to survive a nuclear war today that even $10M spent on a properly located and stocked compound plus an AI lab, staffed with a couple dozen high-calibre rationalists (e.g. Nate Soares), could increase the probability of existential security, given a nuclear war, by a factor of more than 1/20.