Beyond Cloud: Understanding GenAI’s Impact on Data Centres

Executive Summary

  • Generative AI is rapidly surpassing cloud workloads in power, cooling and infrastructure requirements, driving a dramatic surge in global data centre energy demand.
  • AI workloads are pushing rack densities to unprecedented levels, accelerating the shift to liquid and hybrid cooling.
  • By 2030, AI-driven data centres could consume up to 4% of global electricity, forcing operators to innovate in design, sustainability, and high-density capacity planning.

 

For a decade, the cloud has driven the demand for more energy and infrastructure, but a new contender has entered the ring. Generative AI brings a tidal wave of energy and infrastructure demands of its own, placing fresh stress on data centres. Where the cloud strained the grid, GenAI may yet break it. Unlike typical cloud workloads, GenAI workloads, spanning both training and inference, require substantial power and dedicated cooling systems.

According to a Deloitte study, the use of GenAI will double the electricity consumption of data centres around the world within just five years.

Generative AI has become an integral part of daily personal and business life, from our smartphones and personal computers to AI agents on websites and AI-generated content for businesses. AI agent deployment rose by as much as 25% this year and is projected to double by 2027. With more use cases come more inference workloads, resulting in continuous energy demand.

The Energy Surge

The power needed to support data centres and their most critical components is expected to reach around 96 gigawatts worldwide by 2026, with AI operations consuming 40% of that total.

The predictions don’t stop there: Goldman Sachs projects AI will drive a 165% increase in global data centre power demand by 2030, a surge driven by the high energy needs of AI workloads. That would put data centres at 3 to 4% of total worldwide electricity consumption by the end of 2030.

Furthermore, AI-related data centre capacity, according to GrandView Research, is projected to grow at over 24.7% CAGR through 2030, clearly outpacing the cloud’s rapid growth over the last decade – let’s pause and think about that for a moment.

Why GenAI Is Demanding More Than the Cloud

Cloud computing has been vital over the last decade and has transformed how organisations use IT. Most cloud workloads, whether storage, web apps, databases, or virtual machines, are lightweight from an energy and density perspective compared with generative AI, which is a whole new kettle of fish.
AI relies on huge clusters of GPUs operating in parallel, creating unprecedented power and heat density.

Cloud racks typically draw around 5-10 kW, but AI training racks reach 50-100 kW, with some drawing even more.
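To make the density gap concrete, here is a minimal back-of-the-envelope sketch, using the rack figures above and an assumed 2 MW hall power budget (the budget and per-rack draws are illustrative planning numbers, not figures from any specific facility):

```python
def racks_supported(budget_kw: float, rack_kw: float) -> int:
    """Whole racks a fixed power budget can feed at a given per-rack draw."""
    return int(budget_kw // rack_kw)

BUDGET_KW = 2_000  # assumed 2 MW data hall

cloud_racks = racks_supported(BUDGET_KW, 10)   # cloud rack at ~10 kW
ai_racks = racks_supported(BUDGET_KW, 100)     # AI training rack at ~100 kW

print(cloud_racks, ai_racks)  # 200 cloud racks vs only 20 AI racks
```

The same hall that once held hundreds of cloud racks supports only a few dozen AI racks, which is why power, not floor space, has become the binding constraint.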

Once deployed, AI runs continuously, so its energy demands are relentless compared with cloud workloads. As organisations embed AI agents into their operations, product experiences and customer support, that demand rises around the clock.

This leads us to another issue: cooling. While traditional air cooling was sufficient for cloud workloads, it cannot keep up with the unprecedented heat densities AI has brought to the industry, which has accelerated the adoption of liquid and hybrid cooling solutions.

What This Means for the Future of Data Centres

Colocation operators will increasingly struggle with grid constraints, the influx of power demands, and the cooling needed to keep up. With the future looking ever more energy-hungry, innovation will be key to shifting the data centre industry towards a renewable, sustainable future, with new infrastructure designs that can scale with the pressure and demand.

Cloud is no longer the industry’s only growth engine; GenAI is now the new driver of infrastructure change.
