AI model training time decreases fourfold by mid-2027?
2027 · 36% chance

Hinge #2 bet proposed in this post: https://www.lesswrong.com/posts/45oxYwysFiqwfKCcN/untitled-draft-keg3

"If the fastest models trained in mid‑2027 require one‑quarter the elapsed time of equally large runs eighteen months earlier without a node shrink below four nanometres, that will be strong evidence that software‑only acceleration is real, and is not highly bottlenecked by compute or real-world experiments slowing the AI researchers down."

The other bets:

#1: https://manifold.markets/rayman2000/nvidias-datacenter-revenue-and-bigt

#3: https://manifold.markets/rayman2000/revenue-per-deployed-h100-exceeds-1

#4: https://manifold.markets/rayman2000/g7-country-manages-three-years-of-6

#5: https://manifold.markets/rayman2000/a-topthree-ai-lab-delays-a-frontier

Meta-market for majority of the five bets: https://manifold.markets/rayman2000/will-a-majority-of-the-5-hingequest?play=true


I don't understand this question. What does "fastest models" mean? Equally large as in the same number of FLOP? What if this was achieved just by having a 4x bigger cluster?

I'm not seeing how this operationalizes software intelligence explosions.

@JoshYou Yeah, I agree that this one is a little unclear. My understanding of the intent is something like "Does the number of floating-point operations required to train a model of equal capability fall by 4x?". I guess the title should maybe be training efficiency rather than training speed.

Not sure whether we will have the information to know whether this has happened: we would need to know the number of FLOPs models are trained with, and labs would have to be training "equal" models. Although if there are breakthroughs of this magnitude, they will probably be publicly known.

I think my resolution plan is:
- See if the original post author indicates a resolution
- If not, make a best effort to resolve this by the intent described above.
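That resolution criterion can be sketched as a simple ratio check. The function and the FLOP figures below are placeholders I made up for illustration, not real training-run numbers:

```python
# Hypothetical sketch of the resolution criterion discussed above:
# did the training compute needed for an equally capable model drop
# by at least 4x between the baseline run and the mid-2027 run?
# All FLOP figures are made-up placeholders, not real runs.

def efficiency_gain(baseline_flops: float, later_flops: float) -> float:
    """Ratio of baseline training compute to later training compute."""
    return baseline_flops / later_flops

baseline = 4e25  # placeholder: FLOPs for the early-2026 run
later = 9e24     # placeholder: FLOPs for an equally capable mid-2027 run

gain = efficiency_gain(baseline, later)
print(f"{gain:.2f}x")  # prints 4.44x for these placeholder numbers
print("resolves YES" if gain >= 4 else "resolves NO")
```

The hard part is not the arithmetic but the inputs: establishing that two runs produced models of "equal capability" and obtaining their actual FLOP counts.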
