ESTEEMStream.News

ESTEEM Center for Equity in Science, Technology, Engineering, English & Math


Microsoft’s Usage of Billions of Gallons of Water for AI

Both Microsoft and Google saw significant spikes in water usage from 2021 to 2022

Microsoft's reported 34 percent increase brought its water consumption to nearly 1.7 billion gallons in 2022, while Google reported a 20 percent rise over the same period.
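
To put those percentages in perspective, here is a rough back-of-the-envelope calculation in Python (an illustrative sketch based only on the figures reported above, not on any additional company disclosures):

```python
# Rough estimate using only the figures reported above (AP):
# Microsoft's 2022 water consumption was ~1.7 billion gallons after a 34% rise.
total_2022_gal = 1.7e9   # reported 2022 consumption (gallons)
growth_rate = 0.34       # reported year-over-year increase

baseline_2021_gal = total_2022_gal / (1 + growth_rate)
increase_gal = total_2022_gal - baseline_2021_gal

print(f"Estimated 2021 baseline: {baseline_2021_gal / 1e9:.2f} billion gallons")
print(f"Estimated one-year increase: {increase_gal / 1e6:.0f} million gallons")
```

In other words, the 34 percent jump corresponds to roughly 430 million additional gallons of water in a single year.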

According to the AP, Shaolei Ren, a researcher at the University of California, Riverside who has been working to evaluate the environmental impact of generative AI, said it is "fair to say the majority of the growth is due to AI," largely from generative AI and Microsoft's partnership with OpenAI.

An April 2023 report, covered by The Indian Express, revealed that training GPT-3 consumed an estimated 185,000 gallons of water. The researchers estimated that ChatGPT needs about 500 milliliters of water (roughly a standard water bottle) to complete a conversation of around 20 to 50 questions with a user.
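
Scaling that figure down, a quick illustrative calculation (not part of the original study) shows roughly how much water each individual question represents:

```python
# Illustrative estimate based on the ~500 ml per 20-50 question figure above.
water_per_conversation_ml = 500
questions_low, questions_high = 20, 50

per_question_high_ml = water_per_conversation_ml / questions_low   # ~25 ml if only 20 questions
per_question_low_ml = water_per_conversation_ml / questions_high   # ~10 ml if 50 questions

print(f"Roughly {per_question_low_ml:.0f}-{per_question_high_ml:.0f} ml of water per question")
```

That works out to a couple of tablespoons at most per prompt, but spread across the enormous number of daily conversations, the total adds up quickly.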

Data centers, where AI computations run, need large-scale cooling to keep hardware from overheating, and the cooling towers and HVAC systems they rely on consume enormous amounts of water. AI workloads also draw a great deal of electricity, and generating that electricity can itself require significant water, for example at thermoelectric power plants.
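
A simplified sketch of how such a water footprint is often estimated appears below; the per-kilowatt-hour water figures in it are assumptions chosen for illustration, not numbers reported by Microsoft, Google, or the AP:

```python
# Hypothetical sketch: a data center's water footprint is commonly estimated as
# on-site cooling water plus water consumed generating the electricity it uses.
# The values below are illustrative assumptions, not reported figures.

def estimated_water_liters(energy_kwh: float,
                           onsite_cooling_l_per_kwh: float = 1.8,   # assumed cooling water per kWh
                           generation_l_per_kwh: float = 3.1        # assumed water for electricity generation per kWh
                           ) -> float:
    """Total water tied to a workload: cooling water + power-generation water."""
    return energy_kwh * (onsite_cooling_l_per_kwh + generation_l_per_kwh)

# Example: a hypothetical AI training job drawing 1,000,000 kWh
print(f"{estimated_water_liters(1_000_000):,.0f} liters")  # ~4,900,000 liters under these assumptions
```

Because both terms scale with the energy consumed, anything that lowers the energy needed per computation also lowers water use on both fronts, which is the idea behind the efficiency efforts described further below.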

AP reached out to Microsoft, which responded that it was "working on ways to make large systems more efficient, in both training and application."

In order to reach its goals of becoming carbon negative, water positive, and zero waste by 2030, the firm stated it would continue to analyze its emissions, boost its use of clean energy, purchase renewable energy, and invest in eco-friendly options.

Moreover, OpenAI told the news outlet that it "recognize[s] training large models can be energy and water-intensive" and that it is working to improve the efficiency of its models.

One solution being explored is to lower the energy requirements of AI computations. Better algorithms and more efficient hardware can make AI less power-hungry, which in turn reduces both the cooling water used on site and the water consumed in generating electricity.

Back in 2018, Microsoft sank one of its data centers off the coast of Orkney and studied its performance for two years. According to The Verge, the submerged center proved more reliable than a comparable facility on land during that time.

One likely reason is that the surrounding seawater constantly cooled the hardware. Microsoft even said at the time of the center's retrieval that the approach had environmental benefits.
