The race among Silicon Valley giants to integrate AI language generators into internet search engines could drive a five-fold increase in demand for computing power, along with massive carbon emissions.
The commercial roadmap for AI text generators is now set. From humble beginnings in recreational uses, such as writing original song lyrics and poems, and even acing school assignments, the technology has become a lucrative obsession for Silicon Valley giants.
In early February, news that Microsoft planned to integrate ChatGPT into Bing sent Google into a reported "code red" scramble.
Following crunch meetings among Alphabet's top brass, the company decided to accelerate the launch of its own AI competitor, Bard, fearing a sizable dip in search engine engagement, and with it, ad revenue.
Both companies are still jostling to prove their iteration of chatbot tech is the benchmark, and Chinese search company Baidu has since announced its own foray into AI.
New developments are popping up every week, and the sheer magnitude of investment and innovation is exciting to watch. Amid the frenzy for profits, however, one key aspect of the burgeoning space has received far less attention: its potentially huge ecological cost.
The companies behind these tools are quite literally concealing a dirty secret: they have yet to disclose the enormous amounts of power required to run them at scale. Third-party analysis suggests that training OpenAI's GPT-3 consumed 1,287 MWh of electricity and created 550 tons of carbon dioxide.
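For a sense of scale, those two cited figures together imply a carbon intensity of roughly 0.43 kg of CO2 per kilowatt-hour, comparable to a fossil-heavy electricity mix. The short Python sketch below reproduces that back-of-the-envelope calculation; the constants are simply the third-party estimates quoted above, not official OpenAI disclosures.

```python
# Back-of-the-envelope check on the figures cited above (an illustrative
# sketch using third-party estimates, not official disclosures).

TRAINING_ENERGY_MWH = 1_287      # estimated energy to train GPT-3
TRAINING_EMISSIONS_TONS = 550    # estimated tons of CO2 from that training run

# Convert to kWh and kg to express a per-unit carbon intensity.
energy_kwh = TRAINING_ENERGY_MWH * 1_000
emissions_kg = TRAINING_EMISSIONS_TONS * 1_000

intensity_kg_per_kwh = emissions_kg / energy_kwh
print(f"Implied carbon intensity: {intensity_kg_per_kwh:.2f} kg CO2 per kWh")
# -> roughly 0.43 kg CO2 per kWh
```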