
EXPERT OPINION

Artificial Intelligence (AI), which has the potential to revolutionise entire industries, has long passed its demo stage. It now runs through the veins of businesses and will soon become the lifeblood of our economy.

From fraud detection and humanoid robotics to autonomous vehicles and real-time language processing, AI is now expected to operate instantly, intuitively and everywhere. But with this shift comes a new bottleneck: latency.
No matter how powerful the model or how abundant the compute power, if the network can’t deliver data with single-digit millisecond precision, AI won’t deliver the intended results. The reality is simple – without ultra-low-latency connectivity, there is no viable future for AI at scale.
That’s why I was pleased to join Tonya Witherspoon of Wichita State University and Hunter Newby of Connected Nation Internet Exchange Points (CNIXP) for a webinar on one of the most overlooked constraints in digital infrastructure: round-trip delay (RTD). While our conversation covered AI, network design and public-private collaboration, the central message was clear: we cannot solve tomorrow’s challenges with yesterday’s networks. Latency isn’t just a technical metric; it’s an economic limiter, a competitive differentiator and now a make-or-break component of AI.
Below are five key discussion points from our webinar, titled ‘Latency Kills: Solving the bottleneck of RTD to unlock the future of AI’, on why solving the latency challenge – both locally and nationally – is the next critical step on the road to AI mastery.
1. Low latency is no longer optional

AI applications are no longer abstract, back-end computations; they are real-time, front-line systems that increasingly underpin daily life. Whether it’s a fintech company performing fraud detection at a keystroke, a vehicle processing sensory data on the move, or a manufacturing plant using robotics for precision tasks, latency has become the hard ceiling of performance. As I’ve said many times before, latency is not just a metric – it’s currency.

For 4K streaming, the boundary is around 15 milliseconds. For high-frequency trading and autonomous driving, it’s under five. And when we enter the realm of humanoid robotics and AI agents that interact like humans, we’re talking about single-digit millisecond responsiveness, and that translates to a physical radius of 50 to 150 miles. Beyond that range, the round-trip delay is too high and the application breaks down.

As Newby puts it, “Fraud detection from the major banks is something that they want to do at the keystroke, on the phone, as it’s occurring. That’s a three or sub-three millisecond requirement. Without the right physical infrastructure in place – land, buildings, fibre and an Internet Exchange – it simply can’t happen. We’re talking about needing thousands of facilities like that across the US, and they don’t exist.” This is the kind of performance that enterprises must now design for, and it’s impossible to achieve without rethinking where infrastructure lives and how data moves.

Adding more compute capacity won’t solve the problem if the data can’t get there and back in time. The only way forward is through proximity; by building the right interconnection points in the right places, we can shrink that round-trip delay and make real-time AI a reality.
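To make the distance arithmetic concrete, here is a rough, back-of-the-envelope sketch in Python (not something presented in the webinar) that converts an RTD budget into a maximum serving radius. The fibre propagation speed, the 1.4x route detour factor and the 1.5 ms fixed overhead are illustrative assumptions of mine; under those assumptions a 3 ms budget allows a radius of roughly 65 miles and a 5 ms budget roughly 155 miles, broadly the 50-to-150-mile range described above.

# Rough sketch: how far can data travel and still meet a round-trip delay (RTD) budget?
# Assumptions (illustrative only): light in fibre covers ~124 miles per millisecond
# (about two-thirds of the speed of light in a vacuum), real fibre routes run ~1.4x
# longer than the straight-line distance, and switching/processing adds ~1.5 ms per round trip.

FIBRE_MILES_PER_MS = 124.0   # propagation speed of light in fibre, ~200,000 km/s
ROUTE_FACTOR = 1.4           # assumed detour of real fibre paths vs. straight line
OVERHEAD_MS = 1.5            # assumed fixed equipment/processing overhead per round trip

def max_radius_miles(rtd_budget_ms: float) -> float:
    """Largest straight-line radius that still fits within the RTD budget."""
    propagation_ms = max(rtd_budget_ms - OVERHEAD_MS, 0.0)
    one_way_ms = propagation_ms / 2.0
    return one_way_ms * FIBRE_MILES_PER_MS / ROUTE_FACTOR

for budget_ms in (3, 5, 9, 15):
    print(f"{budget_ms} ms RTD budget -> roughly {max_radius_miles(budget_ms):.0f} mile radius")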
2. Geography matters

For decades, network infrastructure has mirrored population density and economic gravity, clustering around coastal metros and skipping over large swathes of the country. The result is what Hunter described during the webinar as ‘flyover cities’ – places where fibre may pass through but never breaks out. These cities aren’t disconnected; they just have poor accessibility. And that has real consequences for latency. Every extra mile a packet travels adds delay, which adds cost, which in turn limits the viability of emerging AI services. This is why geography matters.

If we want real-time digital experiences, whether for a bank customer in Kansas or an autonomous vehicle in rural Texas, then we need physical infrastructure built where the people, machines and data actually are.

Hunter offered an analogy that resonated with many: “Think of the Internet like air travel. Nobody wants to take three connecting flights to get somewhere. Everyone wants a direct flight. But right now, for many parts of the US, we’ve built the equivalent of runways with no airports.”

And the stakes are rising. Applications like autonomous vehicles and robotics don’t just benefit from direct interconnection – they require it. Which is why projects like the new IXP at