AI energy demand could hinder net zero progress and outstrip supply
According to a new study from McKinsey & Company, the fervour for digital and AI capabilities is creating greater energy needs – even as the world struggles to meet its pre-GenAI net zero targets. While the study suggests the technology will eventually wean itself from its huge carbon footprint, several more reports suggest that this is a long way from being realised.
When AI and GenAI first made waves in late 2022, the sudden thirst for the technology came at a time when decarbonisation in the US and UK economies was slowing – and far from comprehensive. The potential problems of the technology are in many ways self-evident, then.
A 2024 report by Goldman Sachs found that an average ChatGPT query uses nearly 10 times as much electricity as a Google search query. And with power grids still relying heavily on fossil fuels to sate the energy demands of businesses and the public, the popularity of AI is causing a spike in emissions. While many companies working on AI, including ChatGPT maker OpenAI, do not disclose their emissions, in the summer Google released a new sustainability report offering a glimpse of the data, and revealed its greenhouse gas emissions had risen by 48% compared with 2019. It attributed that surge to its data centre energy consumption and supply chain emissions.
A new study from McKinsey & Company has further elaborated on these huge demands. According to the strategy consulting firm, power demand from data centres in the United States is expected to reach 606 terawatt-hours (TWh) by 2030, up from 147 TWh in 2023. Senior partner Alastair Green and his co-authors found that this would amount to 11.7% of total US power demand.
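As a rough check on those figures, the growth rate they imply can be worked out directly. The 147 TWh, 606 TWh and 11.7% figures are from the McKinsey study; the annual growth rate and implied total US demand below are back-of-envelope derivations, not numbers from the report:

```python
# Back-of-envelope check on the McKinsey data-centre projections.
demand_2023 = 147.0  # TWh, from the study
demand_2030 = 606.0  # TWh, from the study
years = 2030 - 2023

# Implied compound annual growth rate (CAGR) over 2023-2030
cagr = (demand_2030 / demand_2023) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 22% per year

# If 606 TWh is 11.7% of total US power demand, the implied total is:
total_us_2030 = demand_2030 / 0.117
print(f"Implied total US demand in 2030: {total_us_2030:,.0f} TWh")
```

In other words, the projection assumes data-centre demand compounding at over a fifth per year for seven years running.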
McKinsey’s study seems confident that this will not be too much of a problem. The firm asserted that the emissions intensity of power generated in the US will drop from 400kg per MWh in 2023 to 110kg by 2040 – thanks in part to the phasing out of coal and the increasing use of onshore wind. While that still falls short of the levels needed for the country’s net zero targets (70kg per MWh by 2040), the researchers suggested this would open up opportunities for energy investors to provide clean power for data centre growth.
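The gap between that trajectory and the net zero target is easy to quantify. The 400kg, 110kg and 70kg figures are from the McKinsey study; the percentages are derived here for illustration:

```python
# Emissions intensity of US power generation, kg CO2 per MWh.
intensity_2023 = 400.0  # 2023 level, from the study
intensity_2040 = 110.0  # 2040 projection, from the study
target_2040 = 70.0      # level needed for net zero targets by 2040

# The projected fall in intensity between 2023 and 2040
drop = 1 - intensity_2040 / intensity_2023
print(f"Projected fall in intensity: {drop:.1%}")  # 72.5%

# How far the 2040 projection overshoots the net zero target
gap = intensity_2040 / target_2040 - 1
print(f"Projected 2040 level sits {gap:.0%} above the target")
```

That is: even a 72.5% cut in intensity would still leave US power generation more than half again as carbon-intensive as the net zero pathway requires.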
The McKinsey report explains, “Across the power value chain, investors can participate in and enable solutions to meet the demand for data centres and accelerate growth. Current progress and limitations alike illuminate three clear areas in which investors may be able to make the most impact: power access and sources, power equipment, and trades and technicians.”
Less optimistic
However, another study from Zhejiang University, published in Frontiers of Environmental Science & Engineering, makes for much less optimistic reading. While many proponents of AI have argued that the technology will become more efficient as it advances – bringing its emissions more in line with net zero goals – the study found that improved AI systems actually require more computing power, and therefore more energy, to run. OpenAI’s current GPT-4, for instance, requires 12 times more energy than its predecessor.
“The exponential growth in AI capabilities mirrors a concerning rise in its environmental impact,” said Meng Zhang, lead researcher from Zhejiang University. “This study underscores the urgent need for the AI industry to adopt greener practices and sustainable standards. Our goal is to equip policymakers with the data needed to address AI’s carbon footprint through proactive regulations.”
Elsewhere, a report from Gartner has suggested that the surging demand for GenAI workloads might even outstrip the supply of energy in the near future. According to the analyst firm, 40% of existing AI datacentres could be operationally constrained by power shortages by 2027, because of how rapidly energy consumption is expected to rise as the number of server farms hosting AI workloads grows. Gartner added that the power needed by datacentres to run incremental AI-optimised servers will hit 500 terawatt-hours (TWh) per year in 2027 – 2.6 times higher than in 2023.
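Gartner's 2.6x multiple implies both a 2023 baseline and an annual growth rate. The 500 TWh and 2.6x figures are Gartner's; the derived values below are back-of-envelope, not from the report:

```python
# What Gartner's figures imply about the 2023 baseline and growth rate.
demand_2027 = 500.0  # TWh/year for incremental AI-optimised servers (Gartner)
growth_factor = 2.6  # multiple versus 2023 (Gartner)
years = 2027 - 2023

# Implied 2023 demand for the same category of servers
implied_2023 = demand_2027 / growth_factor
print(f"Implied 2023 demand: {implied_2023:.0f} TWh")  # ~192 TWh

# Implied compound annual growth rate over 2023-2027
cagr = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")  # ~27% per year
```

An annual growth rate in that range is what underpins Gartner's warning that utility capacity cannot be expanded fast enough.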
Bob Johnson, vice-president analyst at Gartner, commented, “The explosive growth of new hyperscale datacentres to implement GenAI is creating an insatiable demand for power that will exceed the ability of utility providers to expand their capacity fast enough.”