AI’s Energy Demands Are Out of Control. Welcome to the Internet’s Hyper-Consumption Era

Right now, generative artificial intelligence is impossible to ignore online. An AI-generated summary may randomly appear at the top of the results whenever you do a Google search. Or you might be prompted to try Meta’s AI tool while browsing Facebook. And that ever-present sparkle emoji continues to haunt my dreams.

This rush to add AI to as many online interactions as possible can be traced back to OpenAI’s boundary-pushing release of ChatGPT late in 2022. Silicon Valley soon became obsessed with generative AI, and nearly two years later, AI tools powered by large language models permeate the online user experience.

One unfortunate side effect of this proliferation is that the computing processes required to run generative AI systems are far more resource intensive than those behind standard online services. This has led to the arrival of the internet’s hyper-consumption era, a period defined by the spread of a new kind of computing that demands excessive amounts of electricity and water to build as well as operate.

“In the back end, these algorithms that need to be running for any generative AI model are fundamentally very, very different from the traditional kind of Google Search or email,” says Sajjad Moazeni, a computer engineering researcher at the University of Washington. “For basic services, those were very light in terms of the amount of data that needed to go back and forth between the processors.” In comparison, Moazeni estimates generative AI applications are around 100 to 1,000 times more computationally intensive.

The technology’s energy needs for training and deployment are no longer generative AI’s dirty little secret, as expert after expert last year predicted surges in energy demand at data centers where companies work on AI applications. Almost as if on cue, Google recently stopped considering itself to be carbon neutral, and Microsoft may trample its sustainability goals underfoot in the ongoing race to build the biggest, bestest AI tools.

“The carbon footprint and the energy consumption will be linear to the amount of computation you do, because basically these data centers are being powered proportional to the amount of computation they do,” says Junchen Jiang, a networked systems researcher at the University of Chicago. The bigger the AI model, the more computation is often required, and these frontier models are getting absolutely gigantic.
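To make that scaling concrete, here is a minimal back-of-envelope sketch of the linear relationship Jiang describes, in which energy and carbon grow in direct proportion to computation. The per-operation energy cost, grid carbon intensity, and compute figures below are illustrative placeholders, not numbers from the article.

```python
# Back-of-envelope model: energy (and carbon) scale linearly with computation.
# All constants are illustrative placeholders, not figures from the article.

JOULES_PER_FLOP = 1e-10        # assumed effective energy cost per floating-point operation
KG_CO2_PER_KWH = 0.4           # assumed grid carbon intensity

def energy_kwh(total_flops: float) -> float:
    """Energy consumed, assuming a constant cost per operation (linear scaling)."""
    return total_flops * JOULES_PER_FLOP / 3.6e6  # convert joules to kilowatt-hours

def carbon_kg(total_flops: float) -> float:
    """Carbon footprint implied by the same linear model."""
    return energy_kwh(total_flops) * KG_CO2_PER_KWH

# Under this model, ten times the computation means ten times the energy and emissions.
smaller_run = 1e21             # hypothetical total compute for a smaller model (FLOPs)
bigger_run = 10 * smaller_run  # a frontier-scale model needing 10x the compute

print(f"smaller run: {energy_kwh(smaller_run):,.0f} kWh, {carbon_kg(smaller_run):,.0f} kg CO2")
print(f"bigger run:  {energy_kwh(bigger_run):,.0f} kWh, {carbon_kg(bigger_run):,.0f} kg CO2")
```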

Even though Google’s total energy consumption doubled from 2019 to 2023, Corina Standiford, a spokesperson for the company, said it would not be fair to state that Google’s energy consumption spiked during the AI race. “Reducing emissions from our suppliers is extremely challenging, which makes up 75 percent of our footprint,” she says in an email. The suppliers Google points to include the manufacturers of servers, networking equipment, and other technical infrastructure for its data centers, whose production is an energy-intensive process required to create the physical parts for frontier AI models.
