What you need to know
- A new study suggests that by 2027, generative AI could consume as much electricity in a year as a small country, approximately 85-134 terawatt-hours (TWh).
- There’s also rising concern about the large amount of water used to cool the data centers that run AI-powered chatbots like ChatGPT and Bing Chat whenever they answer queries.
- However, for this to happen, several factors must hold steady, such as continued interest in AI and the availability of AI chips.
It’s no secret that most organizations and companies have plunged into generative AI since its emergence last year. Undoubtedly, remarkable feats and untapped opportunities have been unlocked using the technology. For instance, it’s now easier for students to solve complex math problems, and the technology has accelerated medical advancements, among other things.
But all these advances come at a cost, and an expensive one. We already know that OpenAI parts with up to $700,000 daily to run its AI-powered chatbot, ChatGPT. Running the chatbot has become so expensive that the company is reportedly on the verge of bankruptcy, amid user complaints that ChatGPT is getting dumber.
And now, according to a new study spotted by the BBC, it’s feared that the resource-hungry technology could plausibly consume “as much energy as a country the size of the Netherlands by 2027.” This is because AI workloads use far more energy than conventional applications.
However, this depends on the technology’s continued exponential growth. As it stands, the chances of this happening are slim, according to a recent report showing that ChatGPT’s user base declined for three consecutive months. The same can be said of Microsoft’s Bing Chat, whose market share has stagnated for the better part of this year despite Microsoft’s multi-billion dollar investment in the technology.
AI consumes enough energy to power a small country
While conducting this study, Alex De Vries, a PhD student at the VU Amsterdam School of Business and Economics, assumed that certain conditions would hold through 2027. For instance, De Vries assumed that consumer interest in AI technology would continue to grow and that AI chips would remain readily available.
But if recent developments are anything to go by, AI chips might not remain readily available for much longer. NVIDIA, the leading maker of the GPUs used for AI workloads, has been unable to meet the surging demand for these chips, which has slowed the growth and development of AI projects. Microsoft has reportedly taken note of this shortfall and is preparing to debut its first dedicated AI chip next month at its annual Ignite conference, a move intended to cut costs and potentially make the venture more profitable for the company.
The researcher’s study further estimates that if these conditions hold, AI technology will consume approximately 85-134 terawatt-hours (TWh) of electricity annually by 2027, enough to power a small country like the Netherlands for a year.
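For a sense of scale, here’s a quick back-of-envelope sketch (ours, not the study’s) that converts the projected annual range into an average continuous power draw:

```python
# Back-of-envelope conversion: projected annual consumption (TWh/year)
# to average continuous power draw (GW). The 85-134 TWh range is the
# study's 2027 projection as reported; the conversion is pure arithmetic.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

for twh in (85, 134):  # projected annual consumption by 2027
    avg_gw = twh * 1_000 / HOURS_PER_YEAR  # 1 TWh = 1,000 GWh
    print(f"{twh} TWh/year is about {avg_gw:.1f} GW of continuous draw")

# 85 TWh/year is about 9.7 GW of continuous draw
# 134 TWh/year is about 15.3 GW of continuous draw
```

In other words, roughly 10-15 GW running around the clock, on the order of ten large power plants dedicated to AI alone.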
Climate and environmental concerns
There’s already rising concern over the amount of water used to cool the data centers that run chatbots like OpenAI’s ChatGPT and Microsoft’s Bing Chat. Approximately one bottle of water is reportedly consumed for cooling every time these chatbots answer queries. While the research paper didn’t quantify the energy or water required to keep the entire operation running, primarily because tech firms don’t disclose this kind of information, it is evident that substantial resources are involved.
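To illustrate how quickly that adds up, here’s a minimal sketch under loudly stated assumptions: the bottle size (500 ml) and the daily query volume (10 million) are hypothetical placeholders, not figures from the study or this article.

```python
# Hypothetical illustration only: scaling "about one bottle of water per
# answered query" to an assumed daily query volume. Both constants below
# are assumptions for illustration, not reported figures.
BOTTLE_LITERS = 0.5            # assumed standard 500 ml bottle
QUERIES_PER_DAY = 10_000_000   # hypothetical daily query volume

daily_liters = BOTTLE_LITERS * QUERIES_PER_DAY
print(f"~{daily_liters:,.0f} liters "
      f"(~{daily_liters / 1_000:,.0f} cubic meters) of cooling water per day")
# ~5,000,000 liters (~5,000 cubic meters) of cooling water per day
```

Under those assumptions, a single popular chatbot could plausibly account for millions of liters of cooling water per day, which is why where a data center is built matters so much.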
However, strategically placed facilities like Microsoft’s Iowa-based data centers have proven effective and efficient thanks to their location and temperate climate, two factors that significantly reduce the amount of water needed to cool the servers.
Do you think tech firms can keep the generative AI ball rolling for much longer amid the energy concerns? Please share your thoughts with us in the comments.