
Why will "If Artificial General Intelligence has an okay outcome, what will be the reason?" resolve N/A?
Too many existing humans suffer death: 8%
Too many existing humans suffer other awful fates: 2%
80% of currently attainable cosmopolitan value becomes unattainable: 12%
The concept of "maximum attainable cosmopolitan value" is not meaningful: 10%
As a demonstration of treacherous turns, trolling, or lulz: 5%
Some other reason: 21%
No reason given after 30 days: 23%
It will not resolve N/A: 18%
This Yudkowsky market will resolve N/A: /EliezerYudkowsky/if-artificial-general-intelligence
But can you predict why?
Resolves to the reason given by Yudkowsky.
This question is managed and resolved by Manifold.
Comments
@MartinRandall As I read it, it means AGI assimilates human culture to a tragic degree.
@IsaacKing I will take a brief break from luxuriating in 20% of max attainable value to realize that I'm in an impossible thought experiment set up to test my integrity, put down my ultra-chocolate, and carefully resolve this market to the correct answer to demonstrate my counterfactual integrity to the larger universe that is simulating me, thus slightly increasing my expected returns in the larger universe. And then I'll go back to the ultra-chocolate.