The widespread adoption of Artificial Intelligence (AI) technologies in existing tech systems is contributing to an increase in energy consumption, with environmental consequences.
‘ChatGPT, the widely-used chatbot developed by OpenAI, guzzles over half a million kilowatt-hours of electricity daily as it handles nearly 200 million user requests worldwide’, as per a report in The New Yorker.
The average American household consumes about 29 kilowatt-hours per day. ChatGPT’s daily energy usage, therefore, exceeds that of an average household by over 17,000 times.
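A quick back-of-the-envelope check of that comparison, using the round figures cited in the report (500,000 kilowatt-hours per day for ChatGPT and 29 kilowatt-hours per day for a household), bears the multiple out:

```python
# Rough check of the household comparison, using the figures cited in the article.
chatgpt_kwh_per_day = 500_000   # reported daily electricity use of ChatGPT
household_kwh_per_day = 29      # average US household daily consumption

ratio = chatgpt_kwh_per_day / household_kwh_per_day
print(f"ChatGPT uses roughly {ratio:,.0f}x an average household's daily electricity")
# -> roughly 17,241x, i.e. "over 17,000 times"
```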
Projections of AI companies’ energy consumption show that if major tech corporations were to integrate generative AI technology into every search, annual electricity consumption could reach around 29 billion kilowatt-hours.
‘This surpasses the annual energy consumption of entire nations such as Kenya, Guatemala, and Croatia’, according to The New Yorker.
According to a report by Business Insider, ‘With the energy-intensive nature of AI, individual AI servers can match or surpass the power consumption of multiple households combined’.
The entire AI sector could consume between 85 and 134 terawatt-hours annually by 2027, equivalent to about half a per cent of global electricity consumption.
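To put that projection in perspective, a rough calculation against an assumed global electricity consumption of about 27,000 terawatt-hours a year (a ballpark figure not taken from the article) gives the sector a share of roughly a third to half a per cent:

```python
# Scale check of the sector-wide projection.
# The global figure of ~27,000 TWh/year is an assumption, not from the article.
ai_sector_twh_low, ai_sector_twh_high = 85, 134
global_twh = 27_000

share_low = ai_sector_twh_low / global_twh * 100
share_high = ai_sector_twh_high / global_twh * 100
print(f"AI sector share of global electricity: {share_low:.2f}%-{share_high:.2f}%")
# -> roughly 0.3%-0.5%, consistent with the 'half a per cent' figure at the upper end
```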
Accurately quantifying the electricity usage of the AI industry remains a challenge due to variability in AI model operations and a lack of transparency from major tech companies.
But estimates based on data from Nvidia, the top chipmaker in the AI market, suggest a surge in energy consumption within the sector in the coming years.