POWER OF AI: Wild predictions of power demand from AI put …

The rapid push to adopt artificial intelligence and machine learning in various segments of the energy sector will have far-reaching impacts that are only just starting to be understood. However, the electricity market, which already faces serious challenges from rapid renewables growth and widespread electrification, is expecting some significant net demand gains as a result of the new technology.

This feature begins to explore some of the main power sector impacts and will be part of a longer series that addresses AI in several important areas of the energy industry.

For access to all installments of the POWER OF AI series, join us on Platts Connect.

The volume of electricity needed for artificial intelligence remains unclear, but it appears the technology will lead to a significant net increase in US power consumption, though some applications could reduce power demand, industry experts told S&P Global Commodity Insights.

"Regarding US power demand, it's really hard to quantify how much demand is needed for things like ChatGPT," David Groarke, managing director at consultant Indigo Advisory Group, said in a recent phone interview. "In terms of macro numbers, by 2030 AI could account for 3% to 4% of global power demand. Google said right now AI is representing 10% to 15% of their power use or 2.3 TWh annually."

However, Google could significantly increase its power demand if generative AI were used in every Google search, according to academic research conducted by Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics.

Citing research by semiconductor analysis firm SemiAnalysis, de Vries estimated in a commentary published Oct. 10 in the journal Joule that using generative AI such as ChatGPT in every Google search would require more than 500,000 of Nvidia's A100 HGX servers, totaling 4.1 million graphics processing units, or GPUs. At a power demand of 6.5 kW per server, that would mean daily electricity consumption of 80 GWh and annual consumption of 29.2 TWh.
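Those figures hang together arithmetically. A quick back-of-the-envelope check reproduces them, assuming eight GPUs per A100 HGX server (an assumption made here for illustration, not a figure stated in this article):

```python
# Back-of-the-envelope check of the de Vries/SemiAnalysis estimate.
# Assumption (not from the article): one A100 HGX server hosts 8 GPUs.
gpus = 4_100_000
gpus_per_server = 8
servers = gpus / gpus_per_server            # 512,500 -- "more than 500,000"

kw_per_server = 6.5
power_gw = servers * kw_per_server / 1e6    # kW -> GW: ~3.3 GW continuous draw
daily_gwh = power_gw * 24                   # ~80 GWh per day
annual_twh = daily_gwh * 365 / 1_000        # ~29.2 TWh per year

print(f"{servers:,.0f} servers -> {power_gw:.2f} GW, "
      f"{daily_gwh:.0f} GWh/day, {annual_twh:.1f} TWh/yr")
```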

But such widespread adoption with current hardware and software is unlikely due to economic and server supply chain constraints, de Vries said in the commentary. That many Nvidia servers do not currently exist, and producing them could cost as much as $100 billion.

"In summary, while the rapid adoption of AI technology could potentially drastically increase the energy consumption of companies such as Google, there are various resource factors that are likely to prevent such worst-case scenarios from materializing," De Vries said in the commentary.

Close attention to datacenter geography and demand trends will be increasingly important for grid operators as AI adoption progresses.

Power demand from operational and currently planned datacenters in US power markets is expected to total about 30,694 MW once all the planned facilities are in service, according to an analysis of data from 451 Research, part of S&P Global Market Intelligence. Investor-owned utilities are set to supply 20,619 MW of that capacity.

To put those numbers into perspective, US Lower 48 power demand is forecast to total about 473 GW in 2023 and rise to about 482 GW in 2027, according to an S&P Global Commodity Insights analytics forecast.
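For a rough sense of scale, a sketch comparing those figures follows, with a caveat: the 30,694 MW is connected capacity, while the Lower 48 figures are demand, so the true share of energy actually consumed would be somewhat lower:

```python
# Share of forecast Lower 48 demand represented by operating and planned
# datacenters. Caveat: 30,694 MW is connected capacity, while the Lower 48
# figures are demand, so this is closer to an upper bound.
datacenter_mw = 30_694
lower48_2023_gw = 473
lower48_2027_gw = 482

share_2023 = datacenter_mw / (lower48_2023_gw * 1_000)   # ~6.5%
share_2027 = datacenter_mw / (lower48_2027_gw * 1_000)   # ~6.4%
print(f"{share_2023:.1%} of 2023 demand, {share_2027:.1%} of 2027 demand")
```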

However, those forecasts do not yet assume any radical adjustments for AI adoption.

If significant forecast adjustments need to be made, the earliest indications will likely come from the utilities that serve the big datacenters.

Dominion Energy serves the largest datacenter market in the world in Loudoun County, Virginia, about 30 miles west of Washington, DC. The Richmond, Virginia-headquartered investor-owned utility has pointed out that electricity demand from datacenters in Virginia increased by about 500% from 2013 to 2022.

Since 2019, 81 datacenters with a combined capacity of 3.5 GW have connected to Dominion's power system, the utility said in a late June presentation to Mid-Atlantic grid operator PJM Interconnection.

"From 2023 to 2030, we are looking at about an 80% increase in US data center power demand, going from about 19 GW to about 35 GW," Stephen Oliver, vice president of corporate marketing and investor relations at Navitas Semiconductor, said in an interview.

Initial power demand for training AI is high and more concentrated than that of traditional datacenter applications. "A typical rack which consumes 30 kW to 40 kW, with AI processors, like NVIDIA Grace Hopper H100, it's 2x-3x the power in the same rack, so we need new technology in the power converters," Oliver said.
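Taking Oliver's figures at face value, the implied draw for an AI rack is on the order of 60 kW to 120 kW, a simple multiplication the article leaves implicit:

```python
# Implied per-rack power draw for AI racks, using Oliver's figures.
typical_rack_kw = (30, 40)     # conventional rack
ai_multiplier = (2, 3)         # "2x-3x the power in the same rack"

low = typical_rack_kw[0] * ai_multiplier[0]    # 60 kW
high = typical_rack_kw[1] * ai_multiplier[1]   # 120 kW
print(f"Implied AI rack draw: {low} kW to {high} kW")
```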

"We see it popping up across the globe and while familiar names like Amazon Web Services, Microsoft and Google operate the data centers themselves, the hardware is designed and built by Taiwan-based companies like Delta, Lite On and Chicony," he said.

"We need to use new technology without taking up space within the cabinets," he said.

It is useful to look at AI in two broad categories, Groarke said. The first is narrow AI, which is more contained and not especially energy intensive, with use cases like load forecasting and predictive maintenance. The second is "inference usage," such as running a prompt that returns an answer, which adds to power consumption along with the underlying computing hardware in datacenters, he said.

The swath of applications that is really energy intensive is the large language model side, which needs more memory and storage: neural networks that require thousands of GPUs, he said.

Constance Crozier, assistant professor at Georgia Tech's H. Milton Stewart School of Industrial and Systems Engineering, said that training something like ChatGPT uses about 1 billion times the energy of running a single query -- but for end uses this popular, the aggregate energy consumed by inference can become comparable or even larger.
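Crozier's ratio implies a simple break-even point: after roughly a billion queries, cumulative inference energy matches the one-time training cost. A minimal sketch, where the per-query energy and daily query volume are illustrative assumptions rather than figures from the article:

```python
# Break-even between one-time training energy and cumulative inference
# energy, using Crozier's ~1-billion-to-1 ratio. The per-query energy and
# daily query volume are illustrative assumptions, not article figures.
TRAINING_TO_QUERY_RATIO = 1e9

per_query_kwh = 0.001                                  # assumed
training_kwh = TRAINING_TO_QUERY_RATIO * per_query_kwh

# The per-query figure cancels out: break-even always lands at ~1e9 queries.
breakeven_queries = training_kwh / per_query_kwh
print(f"Inference overtakes training after {breakeven_queries:,.0f} queries")

queries_per_day = 10_000_000                           # assumed volume
print(f"... roughly {breakeven_queries / queries_per_day:,.0f} days at that rate")
```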

Power demand for AI also comes from training these models, which pull in "huge amounts of data from the web, and that seems to be doubling every year," Groarke said.

Global data volumes double every few years, and untangling that growth from datacenter usage is really challenging, he said.

"The intensity of the training of the models is using the most power," Groarke said. "We need new ways of creating those models, and a lot of this power demand is predicated on how AI is adopted," he added.

AI's power usage is not yet on par with cryptocurrency's, "but we could see that as companies adopt large language models," he said.

Many companies are working on embedding large language models in their own networks. Within the large language model industry, there is an effort to reduce model complexity and increase the efficiency of hardware. "There is an awareness of not just building these models for the sake of it," Groarke said.

Some AI applications are learning to control systems in ways that will reduce power demand, Crozier said, adding that Google's DeepMind is trialing control of room temperatures, and there is a project looking at temperature control of server rooms.

"There are a lot of efficiency gains to be made in building energy efficiency," she said.

"I have seen academic literature for managing data centers and being smarter with how to allocate load to certain servers," Crozier said.

There are control problems that AI could be used to improve in the future, Crozier said, adding that there are also some non-AI methods that could do the same thing, but they are more complicated because they need more extensive modeling.

"There is interest in this area because you can start something, train it, and see if it can better control buildings," Crozier said.

Virtual power plants are more about shifting power demand than reducing it. Electric vehicle charging is another example where demand could be shifted to times when there is a surplus of renewable power supply.

This is also true of AI training algorithms, which can be paused. "I could imagine a situation where we train these algorithms at times when the grid has fewer constraints, although strong economic incentives would be necessary," she said.
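A minimal sketch of what such grid-aware training could look like, assuming a checkpointable training loop and a real-time grid signal; the Trainer interface, the get_grid_price feed, and the $50/MWh threshold are all hypothetical stand-ins, not real utility or ML-framework APIs:

```python
import time

# Hypothetical grid-aware training loop: pause checkpointable training when
# a grid stress signal (here a wholesale-price proxy) exceeds a threshold.
# get_grid_price() and Trainer are illustrative stand-ins, not real APIs.
PRICE_CEILING = 50.0  # $/MWh, assumed threshold


def get_grid_price() -> float:
    """Stand-in for a real-time price or carbon-intensity feed."""
    return 42.0  # placeholder value


class Trainer:
    def step(self) -> None:
        """Run one training step."""

    def checkpoint(self) -> None:
        """Persist model state so paused work is not lost."""


def grid_aware_training(trainer: Trainer, total_steps: int) -> None:
    done = 0
    while done < total_steps:
        if get_grid_price() > PRICE_CEILING:
            trainer.checkpoint()   # save progress, then wait out the peak
            time.sleep(300)
            continue
        trainer.step()
        done += 1


grid_aware_training(Trainer(), total_steps=10)
```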

There are also "non-sexy things" like building energy efficiency that can be improved, and "there is a lot of low-hanging fruit when we think about where this power is going to come from," Crozier said.

Much of the power demand will come from large shared computers, with academics and others booking time to use them, which gives those machines higher utilization rates, she said.

AI is very GPU-heavy, but other big simulations that are non-AI are also GPU-heavy, she said.

Machines are also colocated, so a clean distinction between AI and non-AI workloads cannot be drawn. "From a GPU standpoint, the big push is for AI, but it's not 100%," Crozier said.

The Massachusetts Institute of Technology and Google did not return requests for comment. Microsoft declined to comment.

S&P Global Commodity Insights reporter Darren Sweeney produces content for distribution on S&P Capital IQ Pro. S&P Global Commodity Insights is a division of S&P Global Inc.
