Does a ChatGPT Query Consume 3Wh?
May 2025
An often-quoted number is that each ChatGPT query uses about 3 Watt-hours of electricity, and I got nerd-sniped into finding out how that value was computed. But before sharing what I found, I'd like to answer a different question:
Should We Care?
Not today.
Most of the incentives, and hence most of the effort, have gone into making AI produce better answers, so less brainpower has been dedicated to minimizing the energy consumption of running the models. As more resources are directed at the issue, we should see significant gains in energy efficiency. We already got a glimpse of what is possible with DeepSeek-R1, where a strong incentive to reduce the compute required for building and running the model resulted in impressive gains that translate directly into lower energy use.
Public attention is a scarce resource, and heated arguments about energy use distract from discussing problems that can't be solved by throwing more engineering effort at them. What we need to care about today is the significant impact AI already has on our lives.
The 3Wh Estimate
Tracking references to the 3Wh figure, they all lead back to the same paper: The growing energy footprint of artificial intelligence (de Vries, 2023), which performs the following calculation to arrive at an estimate of 2.9 Watt-hours per ChatGPT query:
|   | Value      | Unit |                               |
|---|------------|------|-------------------------------|
|   | 3,617      |      | A100 servers powering ChatGPT |
| × | 6.5        | kW   | Peak power use per server     |
| × | 24         | h    | Hours per day                 |
| ÷ | 13,000,000 |      | Daily active users            |
| ÷ | 15         |      | Requests per user             |
| ≈ | 2.9        | Wh   | Energy use per query          |
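The arithmetic above can be reproduced in a few lines of Python; the input values are the paper's, and the variable names are mine:

```python
# Reproducing the de Vries (2023) estimate of energy per ChatGPT query.
servers = 3_617            # A100 servers assumed to power ChatGPT
peak_kw = 6.5              # assumed peak power draw per server, in kW
hours_per_day = 24         # fleet assumed to run at peak around the clock
daily_users = 13_000_000   # assumed daily active users
queries_per_user = 15      # assumed requests per user per day

# Total fleet energy per day, converted from kWh to Wh.
daily_wh = servers * peak_kw * 1_000 * hours_per_day

# Spread over all queries made in a day.
wh_per_query = daily_wh / (daily_users * queries_per_user)
print(round(wh_per_query, 1))  # → 2.9
```

Note that the result is entirely determined by the five inputs, so it is only as trustworthy as they are.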
The implicit assumption that OpenAI's servers always run at peak power consumption is probably incorrect, as they would melt if that were the case. But the real issue is the provenance of the fleet-size and usage numbers: they are sourced from an analyst report that provides no public sourcing for them. It appears to have simply made them up.
In other words, 3 Watt-hours per query is no more than a guess.