In a study published in Joule, Alex de Vries-Gao estimated past, present and future electricity usage by AI data centers.
The International Energy Agency reported that in 2024 data centers accounted for up to 1.5% of global energy usage, a share that is rising rapidly.
De Vries-Gao noted that data centers handle more than AI queries; they also support workloads such as cloud storage and Bitcoin mining.
AI developers acknowledge the heavy computing power needed to run large language models such as ChatGPT.
Some companies are beginning to generate their own electricity to meet demand.
However, over the past year, AI companies have become less transparent about their energy use.
Therefore, de Vries-Gao estimated power consumption based on published data.
He analyzed chips manufactured by Taiwan Semiconductor Manufacturing Company, which supplies Nvidia and others.
He combined estimates from AI hardware analysts with revenue reports, hardware sales figures and power consumption data.
Using this data, he calculated that AI providers will consume approximately 82 terawatt-hours of electricity in 2025, roughly equivalent to Switzerland's total electricity usage.
Assuming that AI demand doubles by the end of the year, AI could account for about half of global data center electricity consumption.
De Vries-Gao warned that growing AI power demand risks not only higher electricity prices but also environmental harm.
“If most AI providers use grid electricity, coal-based generation can cause greenhouse gas emissions to skyrocket,” he said.
Such an increase in emissions would accelerate global warming.