
Will anyone train a TokenFormer model at scale before 2026?
Resolved NO on Jan 6
Will anyone train a TokenFormer model using at least (the equivalent of) 200,000 H100-hours before 2026?
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ395 |
| 2 | | Ṁ182 |
| 3 | | Ṁ11 |
Related questions
- Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day? (99% chance)
- Will models be able to do the work of an AI researcher/engineer before 2027? (25% chance)
- AI: Will someone train a $1B model by 2028? (82% chance)
- Will a GPT-4 quality model be trained for under $10,000 by 2030? (90% chance)
- Before 2028, will anyone train a GPT-4-level model in a minute? (29% chance)
- AI: Will someone train a $1T model by 2050? (81% chance)
- AI: Will someone train a $1T model by 2080? (69% chance)
- In Jan 2027, it will be standard practice for non-AI-building tech companies to finetune and train their own models (35% chance)
- AI: Will someone train a $1T model by 2030? (25% chance)
- Will a GPT-3 quality model be trained for under $10,000 by 2030? (99% chance)