
The rise of AI chatbots could create an ecological nightmare

The race among Silicon Valley giants to integrate AI language generators into internet search engines could cause a four-to-five-fold increase in demand for computing power, along with massive carbon emissions.

The commercial roadmap for AI text generators has now been established. From humble recreational beginnings, writing original song lyrics and poems and even acing school assignments, the technology has become a lucrative obsession for Silicon Valley giants.

In early February, news that Microsoft planned to integrate ChatGPT into Bing sent Google into a reported ‘code red’ scramble.

Following crunch meetings among Alphabet’s bigwigs, the company decided to accelerate the launch of its own AI competitor, Bard, fearing its search engine could see a sizable dip in engagement, and its ad revenue along with it.

Both companies are still jostling to prove their iteration of chatbot tech is the benchmark, and Chinese search company Baidu has since announced its own foray into AI.

New developments are popping up every week, and seeing the sheer magnitude of investment and innovation is exciting. Amid the frenzy for profits, however, one key aspect of the burgeoning space has received far less attention: its potentially huge ecological cost.

Quite literally concealing a dirty secret, proprietors have yet to disclose how much energy it takes to run these tools at scale. Third-party analysis suggests that training OpenAI’s GPT-3 consumed 1,287 MWh of electricity and produced around 550 tons of carbon dioxide.

While that doesn’t seem a horrendous amount in isolation, once you consider the bigger picture, and the energy needed to serve millions of users constantly on each major search engine, the ramifications become really concerning.
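As a rough sanity check on those training figures, dividing the reported emissions by the reported energy gives the implied carbon intensity of the electricity behind the run. The calculation below is our own back-of-envelope arithmetic, not a figure published by OpenAI or the third-party analysts.

```python
# Back-of-envelope check using only the figures reported above.
training_energy_mwh = 1287        # reported electricity used to train GPT-3
training_emissions_tonnes = 550   # reported CO2 from that training run

kg_co2_per_kwh = (training_emissions_tonnes * 1000) / (training_energy_mwh * 1000)
print(f"Implied carbon intensity: {kg_co2_per_kwh:.2f} kg CO2 per kWh")
# ~0.43 kg CO2 per kWh, roughly what a fossil-heavy grid mix would produce
```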

ChatGPT boasts 13 million users a day as a standalone product, while Bing, its soon-to-be host platform, handles half a billion hits daily. Even without the added strain of generative AI tech, data centres already account for around 1% of global greenhouse gas emissions.

Once search engines are coupled with chatbot integrations, Martin Bouchard of Canadian data centre company QScale estimates that four-to-five times more computing power will be needed per internet search, and he believes that is likely the absolute minimum.

‘If they’re going to retrain the model often and add more parameters and stuff, it’s a totally different scale of things,’ Bouchard explained. ‘Current data centres and the infrastructure we have in place will not be able to cope with [adding AI]… it’s too much,’ he warned.
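To put that multiplier in context, here is a purely illustrative sketch. The baseline of 0.3 Wh per conventional search is a figure Google has cited in the past, the daily query volume is an assumed round number, and the sketch assumes energy use scales in line with computing power; none of these inputs comes from Bouchard.

```python
# Hypothetical illustration of what a 4-5x jump in per-search compute could
# mean in energy terms. All inputs are assumptions for illustration only.
baseline_wh_per_search = 0.3        # assumed energy of a conventional search (Wh)
searches_per_day = 8.5e9            # assumed global daily search volume
multipliers = (4, 5)                # Bouchard's estimated range

for m in multipliers:
    extra_mwh_per_day = baseline_wh_per_search * (m - 1) * searches_per_day / 1e6
    print(f"{m}x compute: roughly {extra_mwh_per_day:,.0f} extra MWh per day")
# On these assumptions, the extra draw lands in the thousands of MWh per day,
# several times GPT-3's entire reported training energy, every single day.
```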

As we’ve seen all too often, when new and popular technologies emerge, their ecological impact is an afterthought. The boom in cryptocurrency and NFTs revived dying fossil fuel plants throughout the US, and cloud computing’s carbon footprint could balloon to outrageous levels as it grows into the bedrock of digital transactions.

Considering Microsoft has committed to becoming carbon negative by 2030, we’re taking a very keen interest in its future operations, in particular how it plans to underpin its AI endeavours with green infrastructure.
