Intelligent Data Centres Issue 78 | Page 61

FINAL WORD
This volatility creates unique challenges for both electrical and cooling systems.
In short, AI puts more stress on infrastructure, and it does so in ways that older data centre designs simply weren't built to handle.
Cooling is now a strategic concern
As densities increase, the limits of traditional air cooling are being tested. Fans, vents and aisle containment strategies still have a role to play, but on their own, they're not enough. Heat builds quickly and when it isn't removed efficiently, it can lead to serious incidents such as thermal throttling or even catastrophic system failure.
That's why many operators are turning to liquid cooling. Direct-to-chip systems and immersion cooling solutions are becoming more common, particularly in facilities that need to house dense clusters of AI servers.
These cooling methods introduce new considerations: everything from leak prevention and fluid maintenance to regulating flow rates and temperatures in real time. But there are benefits: improved energy efficiency, higher density per square foot and greater long-term flexibility.
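As a rough illustration of what "regulating flow rates and temperatures in real time" involves, the sketch below shows a minimal proportional control loop for a direct-to-chip circuit. All setpoints, gains and limits here are hypothetical, not vendor values:

```python
# Illustrative sketch: a proportional controller nudging coolant flow
# to hold a direct-to-chip loop near a target supply temperature.
# Setpoints, gain and flow limits are hypothetical assumptions.

def adjust_flow(flow_lpm, temp_c, target_c=45.0, gain=0.8,
                min_lpm=10.0, max_lpm=60.0):
    """Return a new flow rate (litres/min) based on temperature error."""
    error = temp_c - target_c          # positive -> too hot, raise flow
    new_flow = flow_lpm + gain * error
    return max(min_lpm, min(max_lpm, new_flow))  # clamp to pump limits

flow = 20.0
for temp in (44.0, 48.0, 52.0, 46.0):  # simulated sensor readings
    flow = adjust_flow(flow, temp)
    print(f"temp {temp:.1f} degC -> flow {flow:.1f} L/min")
```

Real coolant distribution units layer far more onto this (leak detection, redundancy, PID rather than pure proportional control), but the core idea is the same: flow tracks thermal load continuously rather than being set once.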
Electrical infrastructure is under strain
AI workloads don't just generate heat. They demand a different kind of electrical backbone.
The spike-driven nature of AI tasks means power systems must be responsive and resilient. Power distribution units (PDUs), uninterruptible power supplies (UPSs), switchgear and circuit protection systems all need to be sized and selected with this behaviour in mind.
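A quick back-of-envelope calculation shows why sizing to average draw is not enough for spike-driven loads. The figures below are illustrative assumptions, not numbers from the article:

```python
# Back-of-envelope sketch: sizing for peaks rather than averages.
# All figures are illustrative assumptions.

avg_rack_kw = 40.0   # average draw of one dense AI rack
peak_factor = 1.5    # short training-step spikes above average
racks = 50
headroom = 1.2       # design margin on top of the expected peak

peak_kw = avg_rack_kw * peak_factor * racks
ups_kw = peak_kw * headroom

print(f"Average load: {avg_rack_kw * racks:.0f} kW")
print(f"Peak load:    {peak_kw:.0f} kW")
print(f"UPS sizing:   {ups_kw:.0f} kW")
```

Under these assumptions, equipment sized for the 2,000 kW average would be badly undersized for the 3,000 kW peaks the training steps actually produce.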
Failures can cascade quickly when AI nodes are involved. A momentary voltage drop or power quality issue may go unnoticed in a standard IT environment but could interrupt or corrupt an active AI training process, costing hours or even days of processing time.
This has placed a renewed focus on electrical commissioning. Systems must be tested under stress, not just at average operating conditions. Load banks, simulated faults and failover exercises are becoming essential parts of the handover process.
Commissioning for dynamic environments
Data centre infrastructure used to be relatively static. Once a data hall was provisioned, it often ran close to capacity with consistent, stable loads. AI changes that norm. The infrastructure must now handle dynamic scaling by adding and removing servers, integrating new hardware generations, and adjusting cooling and power delivery on the fly.

This requires a different commissioning mindset. Facilities are increasingly deploying digital twins to test new layouts and configurations before rolling them out. These virtual environments help predict airflow patterns, simulate energy loads and even model thermal hotspots – reducing surprises during live deployment.
Moreover, commissioning no longer ends once a system is signed off. With workloads evolving rapidly, operators are introducing continuous commissioning models – re-testing infrastructure periodically as usage changes.
The grid is becoming a bottleneck
Power availability is one of the most pressing concerns for data centre expansion today. In parts of the UK, securing new grid connections can take years. In urban centres, the situation is especially acute. Some developers are being told that capacity won't be available until the 2030s.

Jon Abbott, Technologies Director, Global Strategic Clients at Vertiv
For AI infrastructure, which requires significantly more power than general-purpose compute, this is a major constraint. To work around it, some operators are turning to on-site generation – either through combined heat and power (CHP) systems, natural gas turbines, or hydrogen-ready backup units. Others are investing in large-scale battery energy storage systems (BESS) to help manage peak loads and maintain power continuity.
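The peak-management role of a BESS can be sketched in a few lines: the battery discharges whenever site demand exceeds the grid connection limit and recharges when there is spare headroom. Capacities and the demand profile below are hypothetical:

```python
# Minimal peak-shaving sketch: a battery keeps grid draw at or below
# the connection limit. Capacities and demand figures are hypothetical.

def shave(demand_kw, grid_limit_kw, soc_kwh, cap_kwh, dt_h=0.25):
    """Return (grid_draw_kw, new_soc_kwh) for one 15-minute interval."""
    if demand_kw > grid_limit_kw:                     # discharge to cover excess
        needed = demand_kw - grid_limit_kw
        supplied = min(needed, soc_kwh / dt_h)        # limited by stored energy
        return demand_kw - supplied, soc_kwh - supplied * dt_h
    spare = grid_limit_kw - demand_kw                 # recharge with headroom
    charge = min(spare, (cap_kwh - soc_kwh) / dt_h)   # limited by free capacity
    return demand_kw + charge, soc_kwh + charge * dt_h

soc = 200.0                                           # kWh currently stored
for d in (900.0, 1300.0, 1500.0, 800.0):              # 15-min demand samples
    grid, soc = shave(d, grid_limit_kw=1200.0, soc_kwh=soc, cap_kwh=400.0)
    print(f"demand {d:.0f} kW -> grid {grid:.0f} kW, battery {soc:.0f} kWh")
```

In this toy profile the grid draw never exceeds the 1,200 kW connection limit, even though demand twice rises above it – which is exactly the continuity role the article describes.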
Cooling systems are tightly linked to these developments. Liquid cooling systems rely on consistent energy delivery. If the power goes, so does the cooling – and when racks are drawing tens of kilowatts, thermal runaway can happen fast.
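To see why "fast" is no exaggeration, here is a rough estimate (with illustrative figures, not measurements) of how quickly a dense rack heats a small coolant loop once cooling is lost:

```python
# Rough arithmetic: coolant temperature rise after cooling is lost.
# Rack load and loop volume are illustrative assumptions.

rack_kw = 50.0            # heat load of one dense AI rack
coolant_litres = 30.0     # water volume in the local loop
c_p = 4186.0              # J/(kg*K), specific heat of water
mass_kg = coolant_litres  # ~1 kg per litre of water

rise_per_s = rack_kw * 1000.0 / (mass_kg * c_p)   # K per second
print(f"Coolant heats at ~{rise_per_s:.2f} K/s")
print(f"A 20 K thermal margin is gone in ~{20.0 / rise_per_s:.0f} s")
```

Under these assumptions the loop gains roughly 0.4 K every second, exhausting a 20 K margin in under a minute – hence the emphasis on keeping power to the cooling plant as resilient as power to the IT load itself.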
Reuse of heat is being explored seriously
A broader benefit of AI's thermal intensity is that it creates an opportunity to recapture waste heat. This isn't a new idea, but the practicality has often been limited by low-grade heat and complex routing challenges.
Liquid cooling changes the equation. It delivers higher-temperature thermal energy in a concentrated form, which can be more easily captured and redirected. In some cases, data centres are feeding heat into district heating networks.