GPT-5 level model runnable on phones by 2030?
2030 · 41% chance

Saw a HN comment proposing they will: https://news.ycombinator.com/item?id=45561528

"They get more efficient every year and home chips get more capable every year. A GPT-5 level model will be on every phone running locally in 5 years."

  • Update 2025-10-15 (PST) (AI summary of creator comment):
    - Benchmark: creator is open to suggestions from traders.
    - Token speed requirement: must be roughly equivalent to current OpenAI API speeds (not minutes per token).
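To make the speed criterion concrete, here is a minimal sketch of how one might check it. The ~50 tokens/second reference figure is an assumption for illustration, not a number from the market or from OpenAI:

```python
# Hypothetical check of the market's speed criterion: local generation
# should be in the same ballpark as API throughput, not "minutes per token".
# API_REFERENCE_TPS is an assumed figure, not published by the market creator.

API_REFERENCE_TPS = 50.0  # assumed typical API throughput, tokens/second


def meets_speed_bar(tokens_generated: int, seconds_elapsed: float,
                    min_fraction: float = 0.5) -> bool:
    """True if local throughput is at least min_fraction of the reference."""
    tps = tokens_generated / seconds_elapsed
    return tps >= min_fraction * API_REFERENCE_TPS


print(meets_speed_bar(600, 10))   # 60 tok/s: comparable to the API
print(meets_speed_bar(10, 600))   # one token per minute: clearly fails
```

Any real resolution would of course hinge on the creator pinning down the threshold, which the comments below ask about.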


What benchmarks are we comparing to determine if a given model is "GPT-5 level"? Also are there any restrictions on speed? It's technically possible to "run" a full DeepSeek model on a phone at a speed of minutes per token.

@ProjectVictory Feel free to suggest a benchmark; token speed should be roughly equivalent to what comes out of the OpenAI API now.

bought Ṁ10 YES

Gpt 5 can run on a 24 GB GPU according to the internet, and some phones are already close to that. Following the trend of exponential growth we have seen for the past decades, this seems very likely.

@MaxE "Gpt 5 can run on a 24 GB GPU according to the internet" - can you provide a source for that? I think you are mixing it up with GPT-OSS models, proprietary GPT-5 model is way larger, although nobody reported exact size.

@ProjectVictory I couldn't find a good source or much information about how well it can run because all the results are about its training compute requirements
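The 24 GB figure above is unsourced, but the underlying arithmetic is easy to sketch: weight memory is roughly parameter count times bytes per parameter. The model sizes below are illustrative assumptions (OpenAI has not published GPT-5's size), and the estimate ignores KV cache and activation memory:

```python
# Back-of-envelope memory estimate for running an LLM locally.
# Parameter counts are illustrative assumptions, not known GPT-5 sizes.

def weight_memory_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate weight storage only (ignores KV cache and activations)."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9


for params in (20, 120, 600):      # hypothetical model sizes, in billions
    for bits in (16, 8, 4):        # fp16, int8, int4 quantization
        gb = weight_memory_gb(params, bits)
        print(f"{params}B params @ {bits}-bit ≈ {gb:.0f} GB of weights")
```

By this arithmetic a model fitting in 24 GB at 4-bit quantization would be under ~48B parameters, which is why the size of the actual model matters so much to the question.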

opened a Ṁ50 YES at 51% order

If it's not, that probably means things have collapsed and Manifold doesn't exist.
