Traditionally, network traffic has grown at a rate of 20–30% per year; however, AI is accelerating this growth significantly. Amid this surge, network providers are taking bold steps to ensure that their networks are ready for a future where AI-driven traffic dominates.
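To put that growth figure in perspective, a minimal sketch of what 20–30% compound annual growth implies over a five-year horizon (the rates and horizon here are illustrative assumptions, not figures from the article):

```python
def traffic_multiplier(annual_growth_rate: float, years: int) -> float:
    """Total traffic multiplier after a number of years of compound growth."""
    return (1 + annual_growth_rate) ** years

# At the historical 20-30% range, traffic roughly doubles to
# nearly quadruples within five years:
low = traffic_multiplier(0.20, 5)   # ~2.49x
high = traffic_multiplier(0.30, 5)  # ~3.71x
```

Even at the low end of the historical range, capacity demand compounds quickly, which is why sustained AI-driven acceleration on top of it is prompting providers to re-architect now.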
Telecom’s role in scaling Artificial Intelligence
The AI revolution is not just about compute; it is about connectivity, and it needs the right network foundation for its full potential to be realised. To illustrate, e& UAE, the telecom arm of e&, recently marked a Middle East and Africa regional first by deploying 1.6Tb/s wavelength connectivity using Ciena’s WaveLogic 6 Extreme (WL6e) on its network.
Ciena’s WL6e boosts the network with ultra-high-speed 400G client infrastructure connectivity, supporting 10Gb home services as well as wholesale and domestic business customer traffic with 100G and 400G requirements. It also elevates the delivery of premium customer experiences over e&’s 5G network and optimises the optical network infrastructure to strategically support the traffic sensitivity of AI data centre hubs.
“As requirements for AI compute continue to increase, the training of Large Language Models (LLMs) is expected to become more distributed across different AI data centres. In fact, many operators believe LLM training will take place over some level of distributed
52 www.intelligentdatacentres.com