Intelligent Data Centres Issue 85 | Page 28

INTELLIGENT TECHNOLOGY INNOVATION

EPRI, NVIDIA, Prologis and InfraPartners announce collaboration to produce smaller data centres for the AI era

EPRI has announced a collaboration with Prologis, NVIDIA and InfraPartners to study smaller-scale data centres designed for distributed inference, a form of real-time data processing used across sectors including logistics, healthcare, finance and public services.

The collaborators will assess the deployment of micro data centres – ranging from 5 to 20 megawatts – that can be set up quickly at or near utility substations with available grid capacity. The goal is to bring inference capabilities – the process of generating real-time responses from trained models – closer to where data is generated and consumed, while making better use of underutilised infrastructure and reducing pressure on congested transmission systems.
The companies will explore how smaller, distributed sites can meet computing needs without straining the grid. The collaboration aims to have at least five pilot sites in development across the US by the end of 2026, providing a replicable model for rapid, scalable deployment.
“AI is transforming every industry, and the energy system will need to continue to evolve to meet increasing demand,” said EPRI President and CEO, Arshad Mansoor. “This collaboration with Prologis, NVIDIA, InfraPartners and the utility community highlights the type of innovative actions required to meet the moment. Using existing grid capacity to bring inference compute closer to where it’s needed – quickly and reliably – is a win for all.”
From autonomous logistics to fraud detection and digital diagnostics, inference systems are playing an increasingly important role in supporting real-time decision-making across nearly every sector of the economy. These workloads don’t require hyperscale facilities, but they do demand reliable, fast and location-sensitive compute power. By moving inference closer to the edge of the grid, utilities and infrastructure providers can respond more efficiently to the growing volume and velocity of data.
This approach also supports grid reliability. By co-locating computing capacity with substations that have existing but underused distribution headroom, utilities may reduce transmission congestion, improve system flexibility and help integrate renewable energy.
“As energy demand grows, we need infrastructure solutions that support grid reliability and make better use of what’s already built,” said Parag Soni, Senior Vice President and Global Head of Utility Strategy and Engagement at Prologis. “This collaboration is about using our development and energy expertise to help deliver smarter, more flexible infrastructure right where it’s needed.”
“AI is driving a new industrial revolution that demands a fundamental rethinking of data centre infrastructure,” said Marc Spieler, Senior Managing Director for the Global Energy Industry at NVIDIA. “By deploying accelerated computing resources directly adjacent to available grid capacity, we can unlock stranded power to scale AI inference efficiently. This distributed approach, powered by NVIDIA accelerated computing, maximises existing energy assets, helping to deliver the intelligence required to transform every industry.”
As Artificial Intelligence applications scale across industries, demand for AI inference continues to surge. Meeting this demand not only requires more compute, but AI infrastructure deployed closer to end-users to relieve pressure on congested transmission systems.