LLMs by EOY 2025: Will Retentive Learning Surpass Transformers? (Subsidised Ṁ400)

Will Retentive Learning be widely considered better than Transformer-based architectures for frontier LLMs by the end of 2025?

Original paper: https://arxiv.org/abs/2307.08621

Based on this video: https://www.youtube.com/watch?v=ec56a8wmfRk

Resolves as "Yes" if:
By the end of the year 2025, Retentive Learning architectures are generally considered superior to Transformer-based architectures for Large Language Models (LLMs), as evidenced by peer-reviewed academic publications, benchmark performance metrics, or widespread industry adoption.



Resolves as "No" if:
By the end of the year 2025, Retentive Learning architectures are generally considered inferior to Transformer-based architectures for Large Language Models (LLMs), as evidenced by peer-reviewed academic publications, benchmark performance metrics, or a lack of widespread industry adoption.
