By March 14, 2025, will there be an AI model with over 10 trillion parameters?
Mar 14 · 5% chance

Finished training.

It should be competitive with the SOTA of the time. Rough estimates, of course, but not too far behind.

I will not trade in this market.



23d

@mods

State of the Art (SOTA) in AI and machine learning refers to the best-performing models or techniques for a particular task at a given time, typically measured by standard benchmarks or evaluations:


- Paper with specific details on BaGuaLu https://dl.acm.org/doi/10.1145/3503221.3508417

Other resources suggesting this model was trained with over 10 trillion parameters:


https://keg.cs.tsinghua.edu.cn/jietang/publications/PPOPP22-Ma%20et%20al.-BaGuaLu%20Targeting%20Brain%20Scale%20Pretrained%20Models%20w.pdf

https://www.nextbigfuture.com/2023/01/ai-model-trained-with-174-trillion-parameters.html

Kind regards

1mo

The evaluation shows that BaGuaLu can train 14.5-trillion-parameter models with a performance of over 1 EFLOPS using mixed-precision and has the capability to train 174-trillion-parameter models, which rivals the number of synapses in a human brain.
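To put those figures in perspective, here is a back-of-envelope memory estimate, assuming 2 bytes per parameter (fp16/bf16 mixed precision); the paper's actual storage layout may differ:

```python
def param_memory_tb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate terabytes needed to hold the model weights alone,
    assuming a fixed number of bytes per parameter."""
    return n_params * bytes_per_param / 1e12

print(param_memory_tb(14.5e12))  # -> 29.0 TB for the 14.5T-parameter run
print(param_memory_tb(174e12))   # -> 348.0 TB for the 174T-parameter capability claim
```

Weights alone at this scale already exceed any single node's memory, which is why such runs are sharded across an entire supercomputer.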

https://dl.acm.org/doi/10.1145/3503221.3508417

https://keg.cs.tsinghua.edu.cn/jietang/publications/PPOPP22-Ma%20et%20al.-BaGuaLu%20Targeting%20Brain%20Scale%20Pretrained%20Models%20w.pdf

https://www.nextbigfuture.com/2023/01/ai-model-trained-with-174-trillion-parameters.html

bought Ṁ250 NO · 3mo

Y Combinator used this as a joke headline in November. https://www.youtube.com/watch?v=lbJilIQhHko

1y

If I take Llama and "mixture of experts" it a thousand times, then the resulting system has a lot of parameters, is state of the art (if not efficient in parameter use), and isn't THAT costly to run 🤔
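The arithmetic behind this joke can be sketched as follows. This is a minimal illustration, not a real implementation: it assumes each "expert" is a full copy of the base model and naive top-2 routing, so total parameters grow linearly with the number of experts while per-token compute stays flat.

```python
def moe_params(base_params: float, n_experts: int, top_k: int = 2):
    """Return (total_params, active_params_per_token) for a naive
    mixture-of-experts where every expert is a full base-model copy
    and only top_k experts run per token."""
    total = base_params * n_experts   # all experts count toward size
    active = base_params * top_k      # but only top_k execute per token
    return total, active

# Hypothetical numbers: a 13B Llama-sized base replicated 1000 times.
total, active = moe_params(13e9, n_experts=1000, top_k=2)
print(total)   # total == 1.3e13, i.e. 13 trillion parameters
print(active)  # active == 2.6e10, i.e. only ~26B run per token
```

So a thousand-way copy of a 13B model clears the 10-trillion bar on paper while running roughly two copies' worth of compute per token, which is exactly the "lots of parameters, not that costly" loophole.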

1y

@Mira do it

1y

Does any open-source model resolve this to YES, or only enterprise models?

... can I train a 10-trillion-parameter MNIST classifier? :)

1y

@BarrDetwix yes you can :) it'll be a tad bit costly though. And I think I should amend the description to say that it should be competitive with the SOTA of the time.

Apologies for the title change; I will compensate if you want me to. I've changed the title from LLM to AI model because VLMs (vision language models) and similar architectures will soon become popular.
