
THE MAJORITY (89%) OF IT DECISION-MAKERS WILL DEMAND HIGH-DENSITY, HIGH-PERFORMANCE COMPUTER SYSTEMS BY 2030.
EXPERT OPINION

Data centre operators are in a continual race to meet the changing demands of clients. As AI evolves at a staggering pace, many infrastructures now require high-density compute capabilities for AI-centric applications. New Telehouse research has also found that the majority (89%) of IT decision-makers will demand high-density, high-performance computer systems by 2030. Yet with this surge in power comes another challenge: the need for more robust cooling.

From their very inception, data centres have had integrated cooling mechanisms. Historically, air cooling has been the preferred method of maintaining optimal temperatures. Using the simple principle of circulating cold air around active hardware, it dissipates the heat generated. However, the contemporary workloads we see today, especially those governed by AI, challenge the limits of air cooling.
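A rough heat-balance calculation illustrates why. The sketch below uses illustrative figures (not drawn from the Telehouse research) to estimate the airflow needed to carry away a rack's heat load at a given supply-to-return temperature rise; at the rack densities discussed later in this piece, the volumes quickly become impractical for air alone.

```python
# Illustrative only: rough airflow needed to remove a rack's heat load by air cooling.
# Figures are assumptions for the sake of the example, not measured data.

AIR_DENSITY = 1.2   # kg/m^3, typical for air at data hall conditions
AIR_CP = 1005.0     # J/(kg*K), specific heat capacity of air

def airflow_for_heat_load(heat_kw: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) required to absorb heat_kw at a temperature rise of delta_t_k."""
    mass_flow = (heat_kw * 1000.0) / (AIR_CP * delta_t_k)  # kg/s, from Q = m_dot * cp * dT
    return mass_flow / AIR_DENSITY                          # m^3/s

for rack_kw in (10, 30, 100):
    m3s = airflow_for_heat_load(rack_kw, delta_t_k=12.0)
    cfm = m3s * 2118.88  # convert m^3/s to cubic feet per minute
    print(f"{rack_kw:>3} kW rack: ~{m3s:.1f} m^3/s (~{cfm:,.0f} CFM) of air at a 12 K rise")
```

Liquid has a volumetric heat capacity roughly 3,500 times that of air, which is what shifts this balance so dramatically in favour of liquid cooling at high densities.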
Air cooling systems are fundamentally limited as heat-transfer solutions. This places strain on data centre infrastructure and also creates frustration among customers who are unable to deploy new resource-heavy services. While this is less of a widespread concern now, denser workloads and computing power requirements will grow over the next few years, and air cooling will need to make way for new solutions in these instances. Enter: liquid cooling.

While the appetite for liquid cooling currently remains stable, primarily due to supply chain disruptions, it's gearing up to play a pivotal role in the future of digital infrastructure. As the urgency for enhanced power and cooling escalates, the focus on eco-friendly solutions will also intensify. The inflection point for liquid cooling is near, and providers must prepare now, ahead of this trend.

The rise of liquid cooling

The future landscape of data centre cooling is likely to be dominated by two distinct types of liquid cooling. Conductive liquid cooling taps into the principle of using liquid to siphon heat directly from processor components. It employs heat sinks affixed to heat-generating elements such as the central processor, which are linked to tubes that circulate the liquid, ensuring heat is efficiently transferred away.

Immersive liquid cooling necessitates the submersion of servers in a specially engineered, non-conductive fluid, allowing heat to disperse into the liquid medium. This method, while efficient, requires specific alterations to servers to ensure their safe immersion.

The adoption of liquid cooling offers a myriad of advantages. Notably, it allows data centre operators to increase rack densities, sometimes reaching up to 100kW per rack. This, in turn, empowers clients focused on innovation to roll out the power-intensive workloads essential for their expansion. Moreover, these cooling systems typically consume less energy, alleviating concerns for operators about escalating energy costs.
The lower energy requirement also reduces the operator's carbon footprint and leads to a marked improvement in its PUE (Power Usage Effectiveness). As a bonus, the absence of CRAH/CRAC units liberates space in data halls, and the omission of fans results in quieter operations.
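PUE is the ratio of total facility energy to the energy delivered to IT equipment, so trimming cooling overhead moves it closer to the ideal of 1.0. The short sketch below, using invented example figures rather than data from any particular facility, shows how a reduced cooling load shows up in the number.

```python
# Illustrative PUE (Power Usage Effectiveness) calculation with invented example figures.
# PUE = total facility energy / IT equipment energy; 1.0 would mean zero overhead.

def pue(it_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """Return PUE given IT load and overhead loads, all in kW (or kWh over the same period)."""
    total_kw = it_kw + cooling_kw + other_overhead_kw
    return total_kw / it_kw

# Hypothetical air-cooled hall: 1,000 kW of IT load plus 400 kW of CRAH/CRAC and fan power.
print(f"Air cooled:    PUE = {pue(1000, cooling_kw=400, other_overhead_kw=100):.2f}")

# Same IT load, with liquid cooling assumed to cut the cooling overhead sharply.
print(f"Liquid cooled: PUE = {pue(1000, cooling_kw=150, other_overhead_kw=100):.2f}")
```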
However, the transition isn't devoid of challenges. Implementing liquid cooling introduces complexities, particularly in the initial design and setup phases, and these complexities might ripple through to the maintenance and troubleshooting stages. Operators must be vigilant against potential leaks or spills, which can spell disaster, leading to hardware damage or data loss. Water quality in the cooling and heat rejection system must be well managed by operators. Additionally, the financial implications of repairs, replacements and ensuring optimal water quality cannot be overlooked. Specialty lifting and cleaning equipment should also be considered by the building design team.