
How AI Is Breaking Our Grid, and What We Can Do About It

September 10th, 2024

If you follow the news, you know that generative AI is the most exciting technology development since the beginning of the Internet. Billions of people, millions of use cases, and thousands of new companies are leading the charge toward a new world where AI is central to our digital lives.

Generative AI use has climbed faster than any previous technology, including smartphones and tablets, and according to a June 2024 Bain survey, 87% of companies have deployed generative AI or are piloting it. Data center operators are building new capacity as quickly as they can.

There’s only one problem. There’s a dark secret behind all the excitement.

It’s become clear that generative AI is already intensely power-hungry, and as demand for AI grows, so does the pressure on our electrical infrastructure.

It’s uncertain whether the energy industry can cope with this demand, and even if producers could rapidly scale generation capacity, it’s questionable whether aging grids could handle the load. In many parts of the world, temperature extremes already put so much pressure on local grids that regulators frequently institute power conservation measures. Over the past few years, we’ve seen an uptick in brownouts and blackouts, and the energy demands of AI are only making these problems worse.

Will we have enough power to cope?

It’s hard to know. To make matters worse, power demands are increasing even for existing data centers. The issue stems from AI’s need for accelerated computing infrastructure, which draws more electricity and generates more heat that must be removed with specialized cooling technology. Both the servers and the cooling equipment consume extra energy. These facilities also use vast amounts of drinkable water for cooling, much of which is polluted or lost to the atmosphere. By one estimate, every 10-50 ChatGPT prompts consume a full 16 oz bottle of water, and training AI infrastructure consumes millions of gallons each day.
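To put those figures in perspective, here’s a rough back-of-the-envelope sketch in Python. The per-prompt range comes from the bottle estimate above; the daily prompt volume is an invented assumption, used only to show the scale.

```python
# Back-of-the-envelope estimate of inference water use.
# Assumptions (illustrative, not measured): one 16 oz (~473 mL)
# bottle of water per 10-50 prompts, and a hypothetical volume
# of 100 million prompts per day.

BOTTLE_ML = 473              # 16 oz in milliliters
PROMPTS_PER_BOTTLE = (10, 50)
DAILY_PROMPTS = 100_000_000  # assumption, not a reported figure

for prompts in PROMPTS_PER_BOTTLE:
    ml_per_prompt = BOTTLE_ML / prompts
    liters_per_day = ml_per_prompt * DAILY_PROMPTS / 1_000
    gallons_per_day = liters_per_day / 3.785
    print(f"{prompts} prompts/bottle -> ~{ml_per_prompt:.0f} mL per prompt, "
          f"~{gallons_per_day:,.0f} gallons per day")
```

Even at the conservative end of that range, inference alone would consume hundreds of thousands of gallons a day, before training is counted.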

Data center operators are in a difficult position. Not only are they struggling to secure enough power and water, but they’re also struggling to hit their sustainability targets.

Today, fossil fuels provide much of the power. Natural gas, because it’s readily available, is the preferred fuel for additional generation. However, the demand is so high that many electric utilities are deferring plans to close coal-fired power plants. Data center operators would like to use wind and solar, and some do, but renewable energy production can’t keep up with the demand from AI.

So what solutions are being applied to these problems?

Many data center operators are working on their own power generation. Not only are they deploying natural gas capacity, but they’re also partnering with geothermal and nuclear startups to break away from a reliance on fossil fuels. 

Electric distribution providers are also getting into the act with grid hardening initiatives. The Biden administration has helped with funding, but power companies are still playing catch-up, sometimes forced to delay new data center connections because they simply cannot supply the power.

More granular monitoring of electricity availability and distribution is part of the solution, and many companies, from global multinationals like Legrand to startups, have developed sensors that monitor power and send alerts when problems emerge. AI software can also help utilities shift loads from one transformer or distribution line to another when demand spikes or technical issues appear.
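As a simplified illustration of what that kind of monitoring looks like in software, here’s a minimal sketch. The circuit names, load limit, and readings are hypothetical; real systems poll PDUs or smart meters over protocols such as SNMP or Modbus.

```python
# Minimal sketch of threshold-based power monitoring.
# All names and numbers here are hypothetical examples.

LOAD_LIMIT_KW = 80.0  # alert when a circuit exceeds this draw

def check_readings(readings_kw: dict[str, float]) -> list[str]:
    """Return an alert message for each circuit over the limit."""
    return [
        f"ALERT: {circuit} at {load:.1f} kW (limit {LOAD_LIMIT_KW:.1f} kW)"
        for circuit, load in readings_kw.items()
        if load > LOAD_LIMIT_KW
    ]

# Example poll: one reading per monitored circuit.
sample = {"rack-12/feed-A": 76.4, "rack-12/feed-B": 91.2}
for alert in check_readings(sample):
    print(alert)
```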

Data center operators are also trying new cooling technologies that don’t require as much power or water. Siting data centers in cooler climates, bringing water directly to the chips, and even immersing hardware in specially engineered fluids all show promise in reducing both energy consumption and water use.

But one critical way forward is to do more work per watt: employ technologies that are simply more energy efficient. For example, ARM processors, used by hyperscalers like Google and Amazon, minimize energy waste. Redesigning data centers to cut energy use is another way forward.
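Performance per watt is the metric behind that idea: useful output divided by power drawn. Here’s a toy comparison in Python; the throughput and power figures are invented for illustration, not benchmarks of any real hardware.

```python
# Compare two hypothetical servers on performance per watt.
# Throughput and power numbers are made up for illustration.

servers = {
    "x86 node": {"requests_per_sec": 12_000, "watts": 450},
    "ARM node": {"requests_per_sec": 11_000, "watts": 300},
}

for name, spec in servers.items():
    perf_per_watt = spec["requests_per_sec"] / spec["watts"]
    print(f"{name}: {perf_per_watt:.1f} requests/sec per watt")
```

In this made-up example, the ARM node handles fewer requests per second overall, yet does roughly a third more work per watt, which is exactly the trade-off that matters when power, not hardware, is the constraint.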

None of these changes are easy, and they’re all taking time.

The AI frenzy has data center demand rising 15-20% each year through 2030, with some estimates suggesting data centers could reach a whopping 16% of total U.S. power consumption. Without complete visibility into, and control of, power utilization, the industry is rushing toward a brick wall. At Legrand, we’re helping market leaders all over the world build more efficient infrastructure for AI that reduces pressure on utilities, grids, aquifers, and the planet.
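As a postscript for readers who want to check how quickly those growth rates compound, here’s one last sketch. The 2024 baseline share is a placeholder assumption, not a published figure.

```python
# Compound 15-20% annual growth from 2024 through 2030.
# The starting share of U.S. power consumption is an assumption
# chosen only to illustrate the effect of compounding.

BASELINE_SHARE = 0.04  # hypothetical: 4% of U.S. consumption in 2024
YEARS = 6              # 2024 -> 2030

for rate in (0.15, 0.20):
    share = BASELINE_SHARE * (1 + rate) ** YEARS
    print(f"{rate:.0%}/yr growth -> ~{share:.1%} of U.S. power by 2030")
```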