Will there be an LLM (as good as GPT-4) that was trained with 1/10th the energy consumed to train GPT-4, by 2026?
2026
98.2%
chance

The total energy consumed to train GPT-4 can be estimated at around 50-60 million kWh.

1/10th of this energy = 5-6 million kWh

1/100th of this energy = 0.5-0.6 million kWh

See calculations below:
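The threshold arithmetic above can be sketched as a few lines of Python (the 50-60 million kWh baseline is the market's own estimate, not a confirmed figure):

```python
# Energy thresholds derived from the estimated 50-60 million kWh for GPT-4's training.
gpt4_energy_kwh = (50e6, 60e6)  # estimated range, in kWh

tenth = tuple(e / 10 for e in gpt4_energy_kwh)      # 1/10th of the energy
hundredth = tuple(e / 100 for e in gpt4_energy_kwh)  # 1/100th of the energy

print(f"1/10th:  {tenth[0] / 1e6:.1f}-{tenth[1] / 1e6:.1f} million kWh")
print(f"1/100th: {hundredth[0] / 1e6:.2f}-{hundredth[1] / 1e6:.2f} million kWh")
# → 1/10th:  5.0-6.0 million kWh
# → 1/100th: 0.50-0.60 million kWh
```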


@mods Resolves as YES. The creator's account is deleted, but DeepSeek v3 is much better than the original GPT-4 and was trained with an energy consumption of less than 5 million kWh.

Detailed calculation:
From the paper https://github.com/deepseek-ai/DeepSeek-V3/blob/main/DeepSeek_V3.pdf, we find that DeepSeek V3 required only 2.788 million H800 GPU hours for its full training.

The H800 GPU has a maximum power draw of 0.35 kW, see https://www.techpowerup.com/gpu-specs/h800-pcie-80-gb.c4181

Thus, the GPUs used at most 0.9758 million kWh (= 0.35 kW × 2.788 million hours) during training. Accounting for system power draw and other inefficiencies, we apply a factor of 2 to estimate an upper bound of at most 2 million kWh in total energy consumption for training the model. This is clearly below the 5 million kWh threshold required for resolving this market as YES.

DeepSeek V3 is not only as good as the original GPT-4 but substantially better; see https://lmarena.ai/
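The upper-bound estimate in the resolution comment can be checked with a short calculation. The 2x overhead factor is the comment's own assumption for system power draw and other inefficiencies:

```python
# Upper-bound energy estimate for DeepSeek V3 training, using the comment's figures.
gpu_hours = 2.788e6   # H800 GPU-hours, from the DeepSeek-V3 technical report
tdp_kw = 0.35         # H800 maximum power draw, in kW
overhead_factor = 2   # assumed multiplier for system-level power draw

gpu_energy_kwh = gpu_hours * tdp_kw                        # ≈ 0.9758 million kWh
total_upper_bound_kwh = gpu_energy_kwh * overhead_factor   # ≈ 1.95 million kWh

threshold_kwh = 5e6  # 1/10th of the low-end GPT-4 estimate
print(f"GPU energy:  {gpu_energy_kwh / 1e6:.4f} million kWh")
print(f"Upper bound: {total_upper_bound_kwh / 1e6:.3f} million kWh")
print(f"Below threshold: {total_upper_bound_kwh < threshold_kwh}")  # True
```

Even with the doubled overhead, the estimate comes in well under the 5 million kWh threshold.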




bought Ṁ500 YES

Jensen Huang (CEO of NVIDIA) said that with Blackwell GPUs, you could train GPT-4 drawing only about 4 MW of power. Looks like we can get there even without algorithmic improvements.

Approximately 4x efficiency improvement from silicon alone, based on the latest GPUs being announced now (specifically the MI300X vs. the A100).
