AI’s Growing Appetite for Power: Are Data Centers Ready to Keep Up?

by admin

As artificial intelligence (AI) races forward, its energy demands are straining data centers to the breaking point. Next-gen AI technologies like generative AI (genAI) aren’t just transforming industries; their energy consumption is affecting nearly every data center component, from CPUs and memory to accelerators and networking.

GenAI applications, including Microsoft’s Copilot and OpenAI’s ChatGPT, demand more energy than ever before. By 2027, training and maintaining these AI systems alone could consume enough electricity to power a small country for an entire year. And the trend isn’t slowing down: data center power demand, driven by components such as CPUs, memory, and networking, is estimated to grow 160% by 2030, according to a Goldman Sachs report.

Running large language models also consumes significant energy. A single ChatGPT query, for instance, consumes about ten times the electricity of a traditional Google search. Given AI’s massive power requirements, can the industry’s rapid advancements be managed sustainably, or will they contribute further to global energy consumption? McKinsey’s recent research shows that around 70% of the surging demand in the data center market is geared toward facilities equipped to handle advanced AI workloads. This shift is fundamentally changing how data centers are built and run as they adapt to the unique requirements of these high-powered genAI tasks.
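
For a sense of scale, here is a back-of-envelope sketch in Python. The per-query figures are rough public estimates (the IEA has cited roughly 2.9 Wh per ChatGPT query versus 0.3 Wh per Google search), and the daily query volume is purely hypothetical.

```python
# Back-of-envelope energy comparison. Per-query figures are rough
# public estimates; the query volume is a hypothetical assumption.
CHATGPT_WH_PER_QUERY = 2.9      # assumed estimate (Wh per query)
GOOGLE_WH_PER_QUERY = 0.3       # assumed estimate (Wh per query)
QUERIES_PER_DAY = 100_000_000   # hypothetical daily volume

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
daily_mwh = CHATGPT_WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000

print(f"Per query: ~{ratio:.0f}x the energy of a standard search")
print(f"At that volume: ~{daily_mwh:,.0f} MWh per day")
```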

“Traditional data centers often operate with aging, energy-intensive equipment and fixed capacities that struggle to adapt to fluctuating workloads, leading to significant energy waste,” Mark Rydon, Chief Strategy Officer and co-founder of distributed cloud compute platform Aethir, told me. “Centralized operations often create an imbalance between resource availability and consumption needs, leading the industry to a critical juncture where advancements could risk undermining environmental goals as AI-driven demands grow.”

Industry leaders are now addressing the challenge head-on, investing in greener designs and energy-efficient architectures for data centers. Efforts range from adopting renewable energy sources to creating more efficient cooling systems that can dissipate the vast amounts of heat generated by genAI workloads.

Revolutionizing Data Centers for a Greener Future

Lenovo recently introduced the ThinkSystem N1380 Neptune, a leap forward in liquid cooling technology for data centers. The company asserts that the innovation is already enabling organizations to deploy high-powered computing for genAI workloads with up to 40% less power consumption in data centers. The N1380 Neptune harnesses NVIDIA’s latest hardware, including the Blackwell and GB200 GPUs, allowing it to handle trillion-parameter AI models in a compact setup. Lenovo said it aims to pave the way for data centers that can operate 100 kW+ server racks without the need for dedicated air conditioning.

“We identified a significant requirement from our current consumers: data centers are consuming more power when handling AI workloads due to outdated cooling architectures and traditional structural frameworks,” Robert Daigle, Global Director of AI at Lenovo, told me. “To understand this better, we collaborated with a high-performance computing (HPC) customer to analyze their power consumption, which led us to the conclusion that we could reduce energy usage by 40%.” He added that the company took into account factors such as fan power and the power consumption of cooling units, comparing these with standard systems available through Lenovo’s data center assessment service, to develop the new data center architecture in partnership with NVIDIA.
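
To see the shape of that comparison, here is a minimal sketch that totals facility power for an air-cooled versus a liquid-cooled rack. Every overhead figure is a hypothetical assumption chosen for illustration; these are not Lenovo’s measured numbers.

```python
# Illustrative facility-power comparison for one rack. All overhead
# fractions are hypothetical assumptions, not vendor measurements.
def facility_kw(it_load_kw: float, fan_overhead: float,
                cooling_overhead: float) -> float:
    """Total power = IT load plus server-fan and cooling-plant overheads."""
    return it_load_kw * (1 + fan_overhead + cooling_overhead)

it_load = 100.0  # kW of compute per rack (assumed)

air = facility_kw(it_load, fan_overhead=0.20, cooling_overhead=0.60)
liquid = facility_kw(it_load, fan_overhead=0.02, cooling_overhead=0.08)

savings = 1 - liquid / air
print(f"Air-cooled: {air:.0f} kW, liquid-cooled: {liquid:.0f} kW "
      f"(~{savings:.0%} less facility power)")
```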

AVEVA, a UK-based industrial software company, said it is using predictive analytics to identify issues with data center compressors, motors, HVAC equipment, air handlers, and more.

“We found that it’s the pre-training of generative AI that consumes massive power,” Jim Chappell, AVEVA’s Head of AI & Advanced Analytics, told me. “Through our predictive AI-driven systems, we aim to find problems well before any SCADA or control system, allowing data center operators to fix equipment problems before they become major issues. In addition, we have a Vision AI Assistant that natively integrates with our control systems to help find other types of anomalies, including temperature hot spots when used with a heat imaging camera.”
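
The sketch below shows the general shape of such predictive monitoring: a rolling z-score detector that flags a sensor reading that deviates sharply from its recent baseline. It is a generic illustration with made-up temperatures, not AVEVA’s implementation.

```python
# Generic predictive-monitoring sketch: flag readings that deviate
# sharply from their recent baseline. Not AVEVA's actual system.
from statistics import mean, stdev

def find_anomalies(readings: list[float], window: int = 20,
                   threshold: float = 3.0) -> list[tuple[int, float]]:
    """Return (index, value) pairs that break from the rolling baseline."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append((i, readings[i]))
    return anomalies

# Hypothetical compressor discharge temperatures (°C), drifting upward
temps = [71.0 + 0.01 * t for t in range(200)]
temps[150] = 79.5  # a sudden spike worth catching before an alarm trips
print(find_anomalies(temps))  # -> [(150, 79.5)]
```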

Meanwhile, decentralized computing for AI training and development through GPUs over the cloud is emerging as an alternative. Aethir’s Rydon explained that by distributing computational tasks across a broader, more adaptable network, energy use can be optimized by aligning resource demand with availability, leading to substantial reductions in waste from the outset.

“Instead of relying on large, centralized data centers, our ‘Edge’ infrastructure disperses computational tasks to nodes closer to the data source, which drastically reduces the energy load for data transfer and lowers latency,” said Rydon. “The Aethir Edge network minimizes the need for constant high-power cooling, as workloads are distributed across various environments rather than concentrated in a single location, helping to avoid energy-intensive cooling systems typical of central data centers.”
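
A toy scheduler makes the principle concrete: place each job on the cheapest node with spare capacity instead of funneling everything to one central site. The node names, capacities, and energy costs below are hypothetical, not a description of Aethir’s actual network.

```python
# Toy demand-to-availability scheduler. Nodes and costs are hypothetical.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity_gpus: int
    energy_cost: float      # relative energy cost per GPU-hour (assumed)
    used_gpus: int = 0

    def free(self) -> int:
        return self.capacity_gpus - self.used_gpus

def place(job_gpus: int, nodes: list[Node]) -> Node | None:
    """Greedy placement: cheapest node with enough spare GPUs."""
    candidates = [n for n in nodes if n.free() >= job_gpus]
    if not candidates:
        return None  # no capacity anywhere: queue the job
    best = min(candidates, key=lambda n: n.energy_cost)
    best.used_gpus += job_gpus
    return best

nodes = [Node("edge-eu", 16, 0.8), Node("edge-us", 24, 1.0),
         Node("central", 128, 1.4)]

for gpus in [8, 8, 16, 4]:
    chosen = place(gpus, nodes)
    print(f"{gpus}-GPU job -> {chosen.name if chosen else 'queued'}")
```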

Likewise, companies including Amazon, Microsoft, and Google are experimenting with renewable energy sources to manage rising power needs in their data centers. Microsoft, for instance, is investing heavily in renewable energy sources and efficiency-boosting technologies to reduce its data centers’ energy consumption. Google has also taken steps to shift to carbon-free energy and explore cooling systems that minimize power use in data centers. “Nuclear power is likely the fastest path to carbon-free data centers. Major data center providers such as Microsoft, Amazon, and Google are now heavily investing in this type of power generation for the future. With small modular reactors (SMRs), the flexibility and time to production make this an even more viable option to achieve Net Zero,” added AVEVA’s Chappell.

Can AI and Data Center Sustainability Coexist?

Ugur Tigli, CTO at AI infrastructure platform MinIO, says that while we hope for a future where AI can advance without a huge spike in energy consumption, that’s just not realistic in the short term. “Long-term impacts are trickier to predict,” he told me, “but we’ll see a shift in the workforce, and AI will help improve energy consumption across the board.” Tigli believes that as energy efficiency becomes a market priority, we’ll see growth in computing alongside declines in energy use in other sectors, especially as they become more efficient.

He also pointed out that there’s a growing interest among consumers in greener AI solutions. “Imagine an AI application that performs at 90% efficiency but uses only half the power—that’s the kind of innovation that could really take off,” he added. It’s clear that the future of AI isn’t just about innovation—it’s also about data center sustainability. Whether it’s through developing more efficient hardware or smarter ways to use resources, how we manage AI’s energy consumption will greatly influence the design and operation of data centers.
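
The arithmetic behind that quote is worth spelling out: keeping 90% of baseline performance at half the power works out to roughly 1.8x the performance per watt. The numbers here are illustrative only.

```python
# Performance-per-watt arithmetic for the hypothetical above.
baseline_perf, baseline_watts = 1.00, 1.00     # normalized baseline
efficient_perf, efficient_watts = 0.90, 0.50   # 90% perf at half power

gain = (efficient_perf / efficient_watts) / (baseline_perf / baseline_watts)
print(f"Performance per watt: ~{gain:.1f}x the baseline")  # ~1.8x
```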

Rydon emphasized the importance of industry-wide initiatives that focus on sustainable data center designs, energy-efficient AI workloads, and open resource sharing. “These are crucial steps towards greener operations,” he said. “Businesses using AI should partner with tech companies to create solutions that reduce environmental impact. By working together, we can steer AI toward a more sustainable future.”
