AI-driven computing poses climate risks because of its high energy consumption, pushing data center operators to maximize their use of renewable energy and shift operations around the globe to cut emissions. (Unsplash)

Google-Developed Technique Could Mitigate Rising Energy Consumption in Data Centers Caused by AI Growth

Major technology companies are in a race to prevent a looming environmental crisis caused by the extensive data centers they are constructing globally.

Google’s pioneering technology is gaining currency as ever more power-hungry artificial intelligence comes online: software hunts for clean electricity in parts of the world where the grid has a surplus of solar and wind power, then ramps up data center operations there. Doing so can cut both carbon emissions and costs.

There is an urgent need to figure out how to run data centers in ways that maximize the use of renewable energy, said Chris Noble, founder and CEO of Cirrus Nexus, a cloud-computing manager that relies on data centers owned by Google, Microsoft and Amazon.

The climate risks posed by artificial intelligence-based computing are far-reaching, and they will get worse without a major shift from fossil-fuel-based electricity to clean power. Nvidia Corp. CEO Jensen Huang has said artificial intelligence has reached a “tipping point.” He has also said that the cost of data centers will double within five years to power the rise of new software.

According to the International Energy Agency, data centers and transmission networks already account for up to 1.5% of global electricity consumption. Together, they emit roughly as much carbon dioxide each year as Brazil.

Hyperscalers, as the biggest data center owners such as Google, Microsoft and Amazon are known, have all set climate goals and face internal and external pressure to meet them. Those lofty goals include cutting the carbon dioxide emissions from their operations.

But the rise of artificial intelligence has already wreaked havoc on those goals. Graphics processing units have been key to the rise of large language models, and they use more electricity than the CPUs that power other forms of computing. According to IEA estimates, training a single artificial intelligence model uses more electricity than 100 households consume in a year.

“The growth of artificial intelligence far outstrips the ability to generate clean electricity for it,” Noble said.

In addition, AI’s energy consumption is volatile, resembling a sawtooth curve more than the smooth line most data center operators are used to. That makes decarbonization a challenge, not to mention maintaining grid stability.

AI growth is being driven by North American companies, which are keeping computing power — and energy use — concentrated there, said Dave Sterlace, Hitachi Energy’s global data center account director. It’s a trend he didn’t expect two years ago.

To reduce data center carbon dioxide emissions, hyperscalers and other data center providers have financed vast amounts of solar and wind capacity and used credits to offset emissions. (In the case of offsets, some have failed to have a meaningful impact on emissions.)

But that alone is not enough, especially as the use of artificial intelligence increases. That’s why operators are turning to a strategy used by Alphabet Inc.’s Google called load shifting. The idea: cut emissions by changing when and where data centers run.

Today, most data centers tend to operate in a “steady state,” with fairly constant energy consumption. That leaves them at the mercy of the grid they are connected to, running on whatever mix of natural gas, nuclear and renewable generation it carries on a given day, a mix constrained by the lack of transmission lines between regions. Tech giants are therefore looking for opportunities to shift data center operations around the world, daily or even hourly, to tap into excess renewable generation.
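At its core, that kind of shifting comes down to picking, at each scheduling interval, the connected region whose grid is cleanest at that moment. The sketch below is a minimal illustration of that selection step only; the region names, carbon-intensity figures and the `pick_greenest_region` helper are invented for the example and do not reflect Google’s or Cirrus Nexus’s actual systems.

```python
# Minimal, hypothetical sketch of the selection step in carbon-aware load
# shifting. Region names and carbon-intensity numbers are invented; a real
# system would pull live grid data rather than hard-coded values.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    carbon_intensity: float  # grams of CO2 per kWh on the local grid right now
    has_capacity: bool       # whether spare servers are available to take the load

def pick_greenest_region(regions: list[Region]) -> Region:
    """Choose the connected region whose grid is cleanest at this moment."""
    candidates = [r for r in regions if r.has_capacity]
    return min(candidates, key=lambda r: r.carbon_intensity)

if __name__ == "__main__":
    snapshot = [
        Region("netherlands", carbon_intensity=120.0, has_capacity=True),  # sunny afternoon
        Region("virginia", carbon_intensity=380.0, has_capacity=True),
        Region("california", carbon_intensity=450.0, has_capacity=True),   # sun not yet up
    ]
    target = pick_greenest_region(snapshot)
    print(f"shift flexible batch jobs to: {target.name}")
```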

Google launched the first effort to match its power consumption at certain data centers with carbon-free power on an hourly basis, with the goal of keeping the machines running on clean energy 24/7. No one has fully achieved that goal yet. And the strategy of moving loads around the world can certainly be complicated by countries’ data-sovereignty policies, which seek to limit and secure the flow of information across borders. But what Cirrus Nexus and Google are testing could still be a critical piece of reducing emissions.

Cirrus Nexus, based in Manhattan, surveys the world’s power grids, measuring emissions every five minutes to find the least polluting computing resources for itself and its customers, which range from medicine to accounting. The company got a chance to put that search into practice last summer.
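Put into a running service, the same idea becomes a simple polling loop: re-measure grid emissions on a fixed cadence and migrate flexible jobs whenever the cleanest region changes. The five-minute interval comes from the article; the region names, the `fetch_grid_carbon_intensity` stub and the `migrate_workloads` call below are hypothetical placeholders, not Cirrus Nexus’s real tooling.

```python
# Hypothetical sketch of a carbon-aware scheduling loop that re-checks grid
# emissions every five minutes, as the article describes Cirrus Nexus doing.
# The region names, fetch_grid_carbon_intensity() and migrate_workloads()
# are placeholders, not real Cirrus Nexus APIs.

import random
import time

REGIONS = ["eu-west", "us-east", "us-west"]  # hypothetical region names
INTERVAL_SECONDS = 5 * 60                    # re-measure emissions every five minutes

def fetch_grid_carbon_intensity(region: str) -> float:
    """Stand-in for a real carbon-intensity feed; returns made-up gCO2/kWh values."""
    return random.uniform(50.0, 500.0)

def migrate_workloads(target_region: str) -> None:
    """Stand-in for whatever mechanism actually moves flexible compute jobs."""
    print(f"moving flexible workloads to {target_region}")

def scheduling_loop(iterations: int = 3) -> None:
    """Every interval, find the cleanest region and migrate if the ranking changed."""
    current = None
    for _ in range(iterations):
        intensities = {r: fetch_grid_carbon_intensity(r) for r in REGIONS}
        greenest = min(intensities, key=intensities.get)
        if greenest != current:
            migrate_workloads(greenest)
            current = greenest
        time.sleep(INTERVAL_SECONDS)  # shorten this when experimenting locally

if __name__ == "__main__":
    scheduling_loop()
```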

The Netherlands experienced its sunniest June on record, which pushed down the price of solar power on the grid and made running servers there both cheaper and less carbon intensive. As the sun set in the Netherlands, Cirrus Nexus shifted its computing load to California, taking advantage of the solar power that was just coming online as the day began in the Golden State.

According to data shared with Bloomberg, by chasing the sun from Europe to the US West Coast and back, the company cut the computing emissions of certain workloads for itself and its customers by 34 percent compared with relying solely on servers in either location. That kind of operational flexibility brings both advantages and risks.

The ability to tap additional zero-carbon megawatts can help reduce stress on grids, such as during a heat wave or a cold winter storm. But data centers need to work with utilities and grid operators, because large swings in demand can throw power systems into disarray and raise the likelihood of blackouts. Dominion Energy, which is seeing data center demand surge in its Virginia service territory, is working on a program to harness load shifting in data centers to reduce strain on the grid during extreme weather.

In recent years, Google and Amazon have tested flexible data center operation both for their own workloads and for customers using their cloud services. (Cirrus Nexus, for example, uses cloud services provided by Amazon, Microsoft and Google.) In Virginia, Microsoft signed an agreement with Constellation Energy Corp. that guarantees more than 90% of its data center power in that area is carbon-free energy. Reaching 100%, however, remains an ambitious goal for it and the other hyperscalers.

Google’s data centers run on carbon-free energy about 64% of the time globally, with 13 of its regional sites at 85% and seven above 90%, said Michael Terrell, who leads Google’s 24/7 carbon-free energy strategy.
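For context, an hourly carbon-free-energy score of the kind behind that 64% figure can be computed, in simplified form, by matching each hour’s consumption against the carbon-free power available in that same hour, with no carry-over between hours. The function and numbers below are a toy illustration under that assumption, not Google’s actual 24/7 CFE methodology.

```python
# Toy illustration of an hourly carbon-free energy (CFE) score. The numbers
# are invented, and the real 24/7 CFE methodology is more involved; this only
# shows the basic hour-by-hour matching idea.

def hourly_cfe_score(consumption_kwh: list[float], clean_supply_kwh: list[float]) -> float:
    """Share of consumption matched by carbon-free power, hour by hour.

    In each hour, clean supply can cover at most that hour's consumption;
    surplus clean power in one hour does not carry over to another.
    """
    matched = sum(min(c, s) for c, s in zip(consumption_kwh, clean_supply_kwh))
    total = sum(consumption_kwh)
    return matched / total if total else 0.0

if __name__ == "__main__":
    # A toy four-hour "day": plenty of midday solar, little clean power at night.
    consumption = [100.0, 100.0, 100.0, 100.0]
    clean_supply = [40.0, 150.0, 110.0, 16.0]
    print(f"hourly CFE score: {hourly_cfe_score(consumption, clean_supply):.0%}")  # -> 64%
```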

“But if you don’t phase out fossil fuels, you’re not going to fully meet your climate goals,” Terrell said.
