AI WILL CONTINUE ITS RAPID ACCELERATION, PROVIDING INSIGHTS AND OPTIMISATION, AS WELL AS HAVING AN IMPACT ON HOW DATA CENTRES ARE DESIGNED AS AI WORKLOADS GROW AND DEVELOP.
therapists and even lawyers, according to analysts. AI assistants will be leveraged in our jobs too, aggregating search engines, knowledge bases and professional resources. AI assistants are also being trained to monitor and improve teamwork, identifying bottlenecks and impediments, and helping people work together more closely and effectively.
The application of AI has also been recognised as having the potential to mitigate 5 to 10% of global greenhouse gas emissions, according to research from Boston Consulting Group.
In this context, data centres have evolved not only to handle AI's computational intensity but also to integrate smarter, more efficient systems that optimise energy use, reduce latency and ensure seamless connectivity.
Edge Computing
IDC expects the Edge Computing market to reach US$350 billion by 2027. AI will be a huge influence here, with Edge deployments bringing the benefits of the technology closer to where they are needed, while AI will also serve to optimise Edge deployments themselves.
And there are other opportunities for refinement and optimisation too. Accelerated computing is the use of specialised hardware to dramatically speed up work, using parallel processing to bundle frequently occurring tasks. It offloads demanding work that can bog down CPUs, which typically execute tasks in serial fashion.
It has been argued that accelerated computing can help businesses strike the right balance by reducing overall energy consumption and costs for compute-heavy workloads. GPUs may consume more power at peak than CPUs, but they also complete tasks much faster. When total energy consumption is compared, GPUs use less energy and deliver faster results, making them a superior option for tasks such as developing Large Language Models and running simulations.
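To make that power-versus-time trade-off concrete, the short sketch below compares total energy (average power multiplied by runtime) for a CPU-only run and a GPU-accelerated run of the same job; the wattages and runtimes are purely illustrative assumptions, not measurements of any particular system.

```python
# Illustrative energy comparison for a compute-heavy job.
# All figures are hypothetical assumptions chosen for the example.

def total_energy_kwh(avg_power_watts: float, runtime_hours: float) -> float:
    """Energy consumed = average power draw x runtime, in kilowatt-hours."""
    return avg_power_watts * runtime_hours / 1000.0

# Assumed CPU-only run: lower power draw, but a much longer runtime.
cpu_energy = total_energy_kwh(avg_power_watts=300, runtime_hours=40)

# Assumed GPU-accelerated run: higher peak power, far shorter runtime.
gpu_energy = total_energy_kwh(avg_power_watts=700, runtime_hours=5)

print(f"CPU-only run:        {cpu_energy:.1f} kWh")   # 12.0 kWh
print(f"GPU-accelerated run: {gpu_energy:.1f} kWh")   #  3.5 kWh

# With these assumed figures the GPU draws more power at any instant,
# yet finishes so much sooner that its total energy use is far lower.
```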
Evolving AI
The evolution of AI has been divided into two main types of workload: training, which involves building models, and inference, which puts those models to work for decision-making, content generation and automation. Recently, AI inference has advanced significantly within data centres, despite the emphasis on Edge Computing for real-time data processing. Contrary to initial expectations that small, efficient inference clusters would be built closer to users, many large companies are repurposing their substantial training clusters for inference tasks because of their availability.
This has led to the emergence of data centre inferencing, where large training clusters are used for inference tasks even though they are overpowered for the purpose.
The trend is expected to shift gradually towards Edge Computing for inference, as Edge devices offer better efficiency, lower latency, stronger data security and greater scope for customisation. Until then, however, data centre inferencing will remain the primary method, despite its inefficiency for smaller tasks.
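As a rough illustration of why latency favours the Edge, the sketch below models end-to-end inference time as a network round trip plus on-device compute; the millisecond figures are assumptions chosen purely for illustration rather than benchmarks.

```python
# Rough model of end-to-end inference latency: network round trip + compute.
# All latency figures below are illustrative assumptions, not benchmarks.

def end_to_end_latency_ms(network_rtt_ms: float, compute_ms: float) -> float:
    """Total time to serve a single inference request, in milliseconds."""
    return network_rtt_ms + compute_ms

# Assumed remote data centre: powerful hardware, but a long network round trip.
datacentre_ms = end_to_end_latency_ms(network_rtt_ms=60.0, compute_ms=15.0)

# Assumed Edge device: slower hardware, but sited next to the data source.
edge_ms = end_to_end_latency_ms(network_rtt_ms=2.0, compute_ms=40.0)

print(f"Data centre inference: {datacentre_ms:.0f} ms per request")  # 75 ms
print(f"Edge inference:        {edge_ms:.0f} ms per request")        # 42 ms

# Even with slower silicon, the Edge device can respond sooner because the
# data never has to travel to a distant facility and back.
```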
New workloads
Compute-intensive technologies such as blockchain will continue to grow
and will have unique requirements as they do. Supply chains, international finance systems and the likes of identity and credentials management will all grow as use cases, with others emerging too.