Google Agrees To Pause AI Workloads To Protect The Grid When Power Demand Spikes

Google will pause non-essential AI workloads to protect power grids, the advertising giant announced on Monday.

The web giant already does this sort of thing for non-essential workloads like processing YouTube vids, which it moves to datacenters where power is available rather than continuing to run them in places where demand for energy strains the grid.

Under an agreement with Indiana Michigan Power (I&M) and the Tennessee Valley Authority (TVA), Google will use the same techniques for AI workloads.

The announcement comes as states served by the power companies brace for a heat wave that will likely strain the grid as residents run air conditioners and drive up demand for energy. Amid debate about datacenters’ consumption of power and water, the last thing that the Chocolate Factory needs is folks blaming its AI Mode search function for a power outage when temperatures top 100°F (37.8°C).

Under the agreement, if energy demand surges or there’s a disruption in the grid due to extreme weather, I&M and TVA can now request that Google reduce its power use by rescheduling workloads or limiting non-urgent tasks until the issue is resolved.

By dynamically adjusting how much power its bit barns are allowed to consume, a process Google calls “demand response”, the web giant argues that new datacenters can be interconnected more quickly, presumably because utilities are less concerned about them causing brownouts or outages.

“By including load flexibility in our overall energy plan, we can manage AI-driven growth even where power generation and transmission are constrained,” Google wrote in a blog post on Monday.
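Google hasn’t published the internals of its demand-response tooling, but the general pattern can be sketched in a few lines of Python. Everything below (the DemandResponseEvent class, the is_deferrable flag, the five-minute buffer) is hypothetical and purely illustrative: when a utility signals a curtailment window, non-urgent batch jobs get pushed past the end of the window while latency-sensitive services keep running.

```python
# Hypothetical sketch of demand-response-aware scheduling. Class and field
# names are invented for illustration; this is not Google's implementation.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Job:
    name: str
    is_deferrable: bool          # e.g. batch transcodes or ML training runs
    scheduled_at: datetime


@dataclass
class DemandResponseEvent:
    start: datetime              # curtailment window signalled by the utility
    end: datetime


def apply_curtailment(jobs: list[Job], event: DemandResponseEvent) -> list[Job]:
    """Push deferrable jobs past the curtailment window; leave urgent ones alone."""
    for job in jobs:
        in_window = event.start <= job.scheduled_at < event.end
        if job.is_deferrable and in_window:
            job.scheduled_at = event.end + timedelta(minutes=5)
    return jobs


if __name__ == "__main__":
    now = datetime.now()
    jobs = [
        Job("youtube-transcode-batch", True, now + timedelta(hours=1)),
        Job("search-serving", False, now + timedelta(hours=1)),
    ]
    event = DemandResponseEvent(start=now, end=now + timedelta(hours=4))
    for job in apply_curtailment(jobs, event):
        print(job.name, "->", job.scheduled_at.isoformat(timespec="minutes"))
```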

Training and running AI models can easily consume tens or even hundreds of megawatts of power for hours, days, or weeks at a time, depending on how big or complex they are. However, certain workloads don’t need to run 24/7. Advances in checkpointing mean that a model could be trained exclusively at night, when spare grid capacity is at its greatest.
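The article doesn’t describe Google’s checkpointing setup, but the pattern is common in open frameworks. The PyTorch-style sketch below, with an invented is_off_peak() stand-in for a real grid or tariff signal, shows the general idea: save model and optimizer state every few steps, stop when the cheap-power window closes, and resume from the last checkpoint the next night.

```python
# Minimal checkpoint/resume training sketch (PyTorch). The is_off_peak()
# check is a hypothetical stand-in for a real grid or tariff signal;
# nothing here reflects Google's actual training stack.
import os
from datetime import datetime

import torch
import torch.nn as nn

CKPT = "train_state.pt"
model = nn.Linear(64, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
step = 0

# Resume from the last checkpoint if a previous night's run left one behind.
if os.path.exists(CKPT):
    state = torch.load(CKPT)
    model.load_state_dict(state["model"])
    opt.load_state_dict(state["optimizer"])
    step = state["step"]


def is_off_peak() -> bool:
    """Hypothetical rule: train only between 22:00 and 06:00 local time."""
    hour = datetime.now().hour
    return hour >= 22 or hour < 6


while step < 10_000 and is_off_peak():
    x = torch.randn(32, 64)
    loss = model(x).pow(2).mean()      # dummy objective for illustration
    opt.zero_grad()
    loss.backward()
    opt.step()
    step += 1
    if step % 100 == 0:                # periodic checkpoint so work survives a pause
        torch.save({"model": model.state_dict(),
                    "optimizer": opt.state_dict(),
                    "step": step}, CKPT)
```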

Datacenter demand-response is still a nascent technique and is only being employed at a handful of Google bit barns. Further complicating matters, the approach is incompatible with certain high-demand workloads, such as Search, Maps, and its cloud business, Google notes.

Google may have the ability to spin down its own machine learning workloads as it pleases, but it can’t exactly pause its cloud customers’ AI jobs without causing a few headaches.

Demand-response isn’t the only way Google is looking to curb the power demands of its AI infrastructure build-out, and it’s not hard to see why. The search giant plowed $14 billion into servers in just 91 days of its 2025 fiscal year, with plans to spend upwards of $85 billion by year’s end.

The company also continues to invest in alternative energy sources, including geothermal, solar, wind, hydroelectric, and nuclear. Google aims to field small modular reactors just as soon as it can get its hands on one, and in May signed an agreement with Elementl Power to support the development of three potential reactor sites in the US. ®

