When OpenAI introduced ChatGPT nearly two years ago, no one anticipated that this chatbot would take the world by storm. As of today, OpenAI’s conversational assistant boasts an impressive 200 million users per week. While this significant user base is a major win for the company, it poses serious challenges for our planet.
Developing a sophisticated language model not only comes at a high financial cost; it also takes a heavy toll on the environment.
A study conducted at the University of California, Riverside found that the supercomputer behind ChatGPT consumes roughly 500 milliliters of water for cooling for every ten to fifty queries.
Now, consider the vast amount of water required for the multitude of chatbots available today.
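To put that figure in perspective, here is a rough back-of-envelope sketch in Python. The 200 million weekly users and the 500-milliliter figure come from the article; the 15 queries per user per week and the 30-query midpoint of the study's range are illustrative assumptions, not reported numbers.

```python
# Rough estimate of ChatGPT's weekly cooling-water use.
# Assumptions (illustrative only): 15 queries per user per week,
# and the study's "500 ml per 10-50 queries" taken at a 30-query midpoint.

WEEKLY_USERS = 200_000_000        # weekly users reported by OpenAI
QUERIES_PER_USER = 15             # assumed, not a reported figure
LITERS_PER_QUERY = 0.5 / 30       # ~500 ml spread over ~30 queries

weekly_liters = WEEKLY_USERS * QUERIES_PER_USER * LITERS_PER_QUERY
print(f"~{weekly_liters / 1000:,.0f} cubic meters of cooling water per week")
# -> ~50,000 cubic meters per week under these assumptions
```

Even with conservative assumptions, the total for a single chatbot runs to tens of thousands of cubic meters of water every week.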
Cooling Towers of Data Centers Deplete Our Water Resources
What you may not realize is that in most data centers, especially those housing AI systems, servers are cooled primarily by cooling towers.
This method is essential for holding temperatures down and preventing overheating. However, much of the water that passes through these towers evaporates and cannot be recovered.
For instance, training the GPT-3 model required approximately 700 cubic meters of water. One can only imagine the staggering amount of water OpenAI has consumed to train the GPT-4 model.
In addition, Microsoft, OpenAI’s principal partner, which has invested up to 13 billion dollars in the company, reported a 34% jump in its water consumption between 2021 and 2022, bringing its usage to the equivalent of more than 2,500 Olympic-sized swimming pools. Researcher Shaolei Ren of the University of California, Riverside attributes much of this surge to the training of Copilot and to Microsoft's collaboration with OpenAI.
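To make that comparison concrete, a standard Olympic pool holds about 2,500 cubic meters of water (50 m by 25 m by 2 m), so a quick conversion, not a figure from Microsoft's own reporting, looks like this:

```python
# Converting "2,500 Olympic-sized swimming pools" into cubic meters.
OLYMPIC_POOL_M3 = 2_500           # 50 m x 25 m x 2 m minimum depth
POOLS = 2_500

total_m3 = POOLS * OLYMPIC_POOL_M3
print(f"{total_m3:,} cubic meters")                       # 6,250,000 cubic meters
# GPT-3 training is estimated at ~700 cubic meters (see above)
print(f"about {total_m3 / 700:,.0f} times GPT-3's estimated training water")
```

That works out to several million cubic meters a year, orders of magnitude more than the estimated water cost of training GPT-3 alone.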
“The issue is that most of us are unaware of what the everyday use of chatbots like ChatGPT really costs. This lack of awareness of the resources needed to run these AI systems makes it impossible to conserve them,” he added.
For its part, Microsoft maintains that its policy includes clear sustainability goals, including monitoring its emissions, accelerating progress with the help of AI, and powering its data centers with renewable energy.
Generative AI Threatens the Water Table in Des Moines, Already Facing Drought
Microsoft, a key partner of OpenAI, has committed to providing the massive computing resources required for the development of future AI models.
This entails building extensive data centers. Iowa, with its vast spaces and temperate climate, appears to be the ideal location for such installations.
However, in Des Moines, the capital of Iowa, each request sent to AI systems like ChatGPT only serves to exacerbate local drought conditions.
These data centers draw thousands of liters of groundwater every day, putting local aquifers at risk of depletion.
The study emphasizes the potential conflicts surrounding water usage by generative AI in the United States.
This concern is magnified by the fact that 44 million Americans already have limited access to clean drinking water.
Moreover, let’s not forget Elon Musk’s Colossus project, which xAI bills as the world’s largest AI supercomputer.