Artificial intelligence (AI) systems are on track to consume nearly half of all datacentre power by the end of this year, a stark new analysis reveals. The projection highlights the rapidly growing energy footprint of AI technologies and raises significant concerns about sustainability, reports The Guardian.

The estimates, produced by Alex de Vries-Gao, founder of the tech sustainability website Digiconomist, align with broader warnings from bodies such as the International Energy Agency (IEA). The IEA has forecast that by the end of this decade, AI's energy appetite could rival the current electricity consumption of a country the size of Japan.

De Vries-Gao’s research, slated for publication in the sustainable energy journal Joule, calculates the power consumed by the specialized chips essential for training and operating AI models. These include processors manufactured by industry giants Nvidia and AMD, as well as chips from other suppliers such as Broadcom.

According to the IEA, global datacentres (excluding cryptocurrency mining operations) consumed approximately 415 terawatt-hours (TWh) of electricity last year. De Vries-Gao’s analysis suggests that AI could already be responsible for as much as 20% of this figure.
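For context, a quick back-of-the-envelope calculation (a sketch based only on the figures quoted above, not on the study's own methodology) shows what that share implies in absolute terms:

```python
# Rough arithmetic using the figures quoted in this article; the 20% share is
# the upper end of de Vries-Gao's estimate, not a measured value.
datacentre_twh = 415     # global datacentre electricity use last year (IEA, excluding crypto mining)
ai_share_upper = 0.20    # estimated upper bound for AI's share

ai_twh_upper = datacentre_twh * ai_share_upper
print(f"Implied AI electricity use: up to {ai_twh_upper:.0f} TWh per year")  # about 83 TWh
```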

The calculations consider several critical variables, including the energy efficiency of individual datacentres and the substantial power required for cooling systems that prevent servers from overheating under the intense workloads demanded by AI. Datacentres, often described as the backbone of AI, are facing mounting pressure to address their escalating energy needs.
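To illustrate how such variables interact, the sketch below scales a hypothetical chip power figure by average utilisation and a facility overhead factor covering cooling and power delivery. The numbers are invented for illustration and are not taken from de Vries-Gao's analysis:

```python
# Simplified illustration with made-up figures; not the study's actual inputs or method.
chip_power_gw = 10.0   # hypothetical combined draw of AI accelerators at full load
utilisation = 0.8      # hypothetical average utilisation of that hardware
overhead = 1.3         # hypothetical facility overhead for cooling, power delivery, etc.

facility_power_gw = chip_power_gw * utilisation * overhead
print(f"Illustrative facility-level AI power demand: {facility_power_gw:.1f} GW")  # 10.4 GW
```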

Putting figures on that trajectory, de Vries-Gao projects that by the end of 2025, AI systems could account for up to 49% of total datacentre power consumption, again excluding energy used for cryptocurrency mining. This would translate into an AI power demand of 23 gigawatts (GW), roughly double the Netherlands' total electricity demand.
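A rough conversion shows how that comparison holds up, assuming the 23 GW figure represents continuous demand and taking the Netherlands' annual electricity consumption to be roughly 110 TWh (an approximation, not a figure from the report):

```python
# Rough check of the Netherlands comparison; both inputs are approximations.
ai_demand_gw = 23         # projected AI power demand by the end of 2025
hours_per_year = 8760
netherlands_twh = 110     # assumed annual electricity consumption of the Netherlands

ai_twh_per_year = ai_demand_gw * hours_per_year / 1000   # 23 GW sustained for a year -> TWh
ratio = ai_twh_per_year / netherlands_twh
print(f"AI demand over a full year: ~{ai_twh_per_year:.0f} TWh, about {ratio:.1f}x the Netherlands")
```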

However, the report also acknowledges factors that could temper this explosive growth in hardware demand. Waning popularity of AI applications such as ChatGPT, or geopolitical tensions that curb AI hardware production, such as export controls, could slow the trend. De Vries-Gao pointed to existing restrictions on Chinese access to advanced chips, which reportedly contributed to the development of the DeepSeek R1 AI model, designed to operate with fewer chips.

