Category: AI Energy
-

AI’s power problem: the slow grid
Near term, expect more self-provisioned power while transmission and substations catch up. Long term, nuclear is a perfect fit for data centers.
-

The environmental impact of Google Gemini AI text prompts
Google has published the first breakdown of an in-production, global fleet-level AI system serving real queries. Results show per text prompt energy consumption of 0.24 Wh.
-

ChatGPT energy usage is 0.34 Wh per query
Now that we have a number for AI energy consumption, what can we use it for? What else do we need to know to make it useful?
-
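
A rough illustration of what a per-query figure enables: scaling it to an annual total. This is a sketch only; the per-query values (0.34 Wh for ChatGPT, 0.24 Wh for Gemini) come from the articles above, but the daily query volume is a purely hypothetical assumption.

```python
# Scale a per-query energy figure to an annual total.
WH_PER_QUERY = 0.34            # reported ChatGPT figure (Wh per query)
QUERIES_PER_DAY = 1_000_000_000  # hypothetical assumption, NOT a reported number

wh_per_day = WH_PER_QUERY * QUERIES_PER_DAY   # Wh consumed per day
twh_per_year = wh_per_day * 365 / 1e12        # convert Wh/day to TWh/year

print(f"{twh_per_year:.3f} TWh/year")  # ~0.124 TWh/year under these assumptions
```

Even under this assumed volume, the result is a small fraction of the 240-340 TWh global data center total cited below, which is why query volume and fleet-level overheads matter as much as the per-query number itself.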

If only data centers would participate in demand response
Even for AI training workloads, data center demand response remains an academic exercise – intriguing but impractical.
-

How useful is GPU manufacturer TDP for estimating AI workload energy?
Manufacturer-provided Thermal Design Power (TDP) figures are often used to estimate the energy consumption of GPU AI workloads, but how useful are they?
-

Data center energy and AI in 2025
Global data center energy consumption was 240-340 TWh in 2022, but AI is now a major driver of future projections. An update on the 2024 US Data Center Energy report.
-

Expect more overestimates of AI energy consumption
We’ve started to see AI doomerism spread to predictions of the vast quantities of energy AI is undoubtedly going to consume.
-

Overestimating AI’s water footprint
Researchers need to be more careful about the inputs into their models. Overestimates undermine the goal of reducing the environmental impact of IT.
-

Influencing the carbon emissions of AI
There is a correlation between training time and energy consumption, but that doesn’t mean there is a correlation between training time and carbon emissions.