
Taming the AI surge: The critical role of data centre infrastructure

  • Writer: Josh Addison



From creative studios to financial services, generative AI is reshaping how industries operate, but the unseen engines powering every prompt and inference are being pushed to their limits. Modern AI workloads demand more compute, connectivity, and cooling than traditional applications, forcing data centres to evolve or risk becoming bottlenecks for innovation.

The unstoppable rise of AI

Since the public debut of large language models (LLMs) in late 2022, AI adoption has accelerated at breakneck pace. The largest and most capable LLMs are generative pre-trained transformers (GPTs), which predominantly power generative chatbots such as ChatGPT, Gemini, and Claude. By generating human-like responses in text, speech, and images, and offering features such as web search, app integration, and code execution, these tools are credited with accelerating the AI boom.


But AI-driven workloads can consume up to ten times the energy of typical cloud applications, and, as model sizes scale, power and cooling requirements have spiralled. In the United States alone, data centre electricity demand is projected to grow from roughly 150-175 TWh today to around 560 TWh by 2030, accounting for around 13% of the nation’s total consumption. The CEO of OpenAI, the company behind ChatGPT, has even reported that users saying “please” or “thank you” costs the company “tens of millions of dollars” in electricity bills.
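To put that projection in perspective, the implied growth rate can be worked out in a few lines of Python. This is a rough sketch, not a figure from the article: the midpoint of the 150-175 TWh range and a 2024 baseline (i.e. roughly six years to 2030) are assumptions, since no base year is stated.

```python
# Implied compound annual growth rate (CAGR) for US data centre
# electricity demand, based on the projection cited above.
today_twh = (150 + 175) / 2   # midpoint of the current-range estimate (assumption)
target_twh = 560              # projected demand by 2030
years = 6                     # assumed 2024 baseline year (assumption)

cagr = (target_twh / today_twh) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")
```

Under these assumptions, demand would more than triple over the period, implying compound annual growth somewhere above 20%, which illustrates why grid planners treat the AI build-out as a step change rather than ordinary load growth.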


Why the right infrastructure makes all the difference

Handling AI’s appetite for compute isn’t just about building bigger halls full of servers; it requires a more holistic approach to site selection, power sourcing, risk management, and network design. Hyperscalers, the massive cloud and data management providers such as Amazon Web Services, Google, Microsoft, and Oracle, have led the way in navigating the architectural demands of hyperscale computing.


By designing systems that scale seamlessly under pressure, they’ve turned infrastructure constraints, including power, cooling, connectivity, and resilience, into competitive advantages. Their ability to add compute, memory, networking, and storage across distributed environments has not only enabled the growth of AI but reshaped what’s possible in cloud, big data and distributed storage operations.


In addition to the work of hyperscalers, long-term renewable energy contracts, on-site battery storage, and small modular reactors are being deployed to secure “firm dispatchable” energy. The development of next-generation connectivity also keeps traffic in the optical domain, which improves both latency and power consumption compared with traditional electrical switching.


Cutting losses with optical circuit switching

Alongside this, optical circuit switching is emerging as a breakthrough technology in large-scale AI deployments. Clusters of thousands of graphics processing units (GPUs) communicate over optical fibres at hundreds of gigabits per second, but legacy networks often rely on energy-intensive optical-electrical-optical conversions.


As an example, the POLATIS® optical circuit switch from HUBER+SUHNER helps keep data entirely in the optical domain, enabling dynamic path reconfiguration with minimal latency and significantly lower power draw. This shift not only reduces operational expenditure but also unlocks new data centre topologies that can be scaled to meet changing AI workloads.


Beyond performance, smarter infrastructure can also support sustainability goals. At a data centre near the HUBER+SUHNER Swiss headquarters, waste heat from the cooling system is recycled to warm a nearby cheese dairy, demonstrating how efficient design can deliver benefits well beyond the data centre itself.


Aligning growth with sustainability goals

On the topic of data centre sustainability, recent research from Barclays underscores that AI’s escalating energy demands will outpace projected grid upgrades and existing net-zero pathways unless data centre operators, utilities, and regulators work together. High-capacity transmission networks must be expanded, policy frameworks should incentivise carbon-neutral power for latency-sensitive workloads, and public-private partnerships are essential to secure resilient, low-carbon infrastructure without compromising performance.


In this increasingly interconnected landscape, data centres can no longer grow in isolation but instead must embrace forward-looking energy strategies that balance compute power with a clear path to net zero emissions.


