At the end of 2023, any forecast of how much energy will be required by Generative AI is inexact. Headlines tend towards guesstimates of '5x, 10x, 30x power needed for AI', 'Enough power to run 100,000s of homes', and so on. Meanwhile, reports in specialist publications such as the data centre press talk of power densities rising to 50kW or 100kW per rack.
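To put those rack densities in perspective, a short sketch of the arithmetic helps. The rack counts and the PUE (Power Usage Effectiveness) overhead below are illustrative assumptions, not figures from the article:

```python
# Illustrative only: rough facility power draw at the rack densities cited.
# The 500-rack hall and the PUE of 1.4 are assumptions for the example.

def facility_power_mw(racks: int, kw_per_rack: float, pue: float = 1.4) -> float:
    """Total facility draw in MW: IT load (racks x density) scaled by PUE."""
    return racks * kw_per_rack * pue / 1000

# A hypothetical 500-rack hall at a conventional ~10kW/rack versus
# the ~100kW/rack densities now being discussed for AI workloads:
conventional = facility_power_mw(500, 10)   # 7 MW
ai_dense = facility_power_mw(500, 100)      # 70 MW
```

Under these assumptions, the same hall moves from a 7MW to a 70MW facility, which is why the density figures alone imply an order-of-magnitude jump in power provisioning.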
Why is Generative AI so resource hungry? What moves are being made to calculate its potential energy cost and carbon footprint? Or, as one research paper puts it, what is the 'huge computational cost of training these behemoths'? Today, much of this information is not readily available.
Analysts have produced their own estimates for specific workload scenarios, but with few disclosed numbers from the cloud hyperscalers at the forefront of model building, there is very little hard data to go on.
Where analysis has been conducted, estimates of the carbon cost of AI model building, from training through to inference, have produced some sobering figures.
The explosion of AI implementation has raised many questions in the sector about AI and its power consumption, with minimal information available to answer them. Ed Ansett, Founder and Chairman of i3 Solutions Group, explores how AI will become increasingly demanding for data centres.
According to a report in the Harvard Business Review, researchers have argued that training a 'single Large Language Deep Learning model' such as OpenAI's GPT-4 or Google's PaLM is estimated to use around 300 tons of CO2. Other researchers calculated that training a medium-sized Generative AI model using a technique called 'neural architecture search' used an amount of electricity equivalent to 626,000 tons of CO2 emissions.
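Estimates like these are typically derived by multiplying a training run's energy consumption by the carbon intensity of the grid supplying it. The numbers below are placeholder assumptions for illustration, not figures from the cited research:

```python
# A minimal sketch of how training-emissions estimates are derived:
# energy consumed (MWh) x grid carbon intensity (tons CO2 per MWh).
# Both inputs below are hypothetical values chosen for the example.

def training_co2_tons(energy_mwh: float, tons_co2_per_mwh: float) -> float:
    """Estimated CO2 emissions, in tons, for a single training run."""
    return energy_mwh * tons_co2_per_mwh

# e.g. a hypothetical 1,000 MWh training run on a grid averaging
# 0.4 tons CO2 per MWh:
estimate = training_co2_tons(1000, 0.4)  # 400 tons
```

The same arithmetic explains why published estimates vary so widely: the grid-intensity term can differ several-fold between regions, and the energy term is rarely disclosed by model builders.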
WHY AI IS SO RESOURCE HUNGRY AND HOW IT WILL IMPACT DATA CENTRES
64 www.intelligentdatacentres.com