Five ChatGPT queries can mean an approximate consumption of 500 milliliters of water, or at least that is what researchers at the University of California say. Artificial intelligence is on everyone’s lips, and its popularity has grown so much that its impact on the environment is already being researched. Where the energy consumption associated with cryptocurrency mining was the concern in the past, that scrutiny is now being directed at the AI industry.
The study focuses in particular on the water consumption required to operate Microsoft’s data center in Iowa, United States. It also includes indirect water consumption from other sources connected to the data center, such as the power plant that supplies Microsoft’s servers with energy.
Determining the environmental impact (water costs) of using ChatGPT
This is how researchers at the University of California set out to determine the environmental impact of AI processing, motivated above all by the success of products like ChatGPT. Iowa is home to one of Microsoft’s most powerful supercomputers dedicated to ChatGPT, and Microsoft’s data centers there have been a key driver of much of the recent expansion of AI processing.
One of Microsoft’s most powerful supercomputers packs 285,000 AMD EPYC CPU cores, accompanied by no fewer than 10,000 NVIDIA GPUs and 400 gigabits per second of network connectivity for each GPU server. Its entire computing power is dedicated to bringing GPT-4, OpenAI’s fourth-generation ChatGPT model, to life.
Based on this hardware, the research found that Microsoft’s water consumption in Iowa increased by 34% between 2021 and 2022, to almost 6.4 billion liters. The university researchers attribute this increase to the rise of AI. Their calculations show that the data center uses half a liter of water for every 5 to 50 ChatGPT requests. This range is wide because water consumption depends on variables such as the server’s location and the local climate. Added to this is “indirect water consumption”, which is not taken into account by the companies involved. One example is the water consumed by the power plants that supply data centers with electricity.
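The per-request range reported by the study can be sanity-checked with some simple arithmetic. A minimal sketch, using only the figures quoted above (the half-liter-per-5-to-50-requests range); the function name is illustrative, not from the study:

```python
# Per-query water cost implied by the study's figure of
# 0.5 L of water per 5-50 ChatGPT requests.
LITERS_PER_BATCH = 0.5

def water_per_query_ml(requests_per_half_liter: int) -> float:
    """Milliliters of water attributed to a single request."""
    return LITERS_PER_BATCH / requests_per_half_liter * 1000

# Best case: 50 requests share half a liter -> 10 mL per query.
best = water_per_query_ml(50)
# Worst case: only 5 requests share it -> 100 mL per query,
# so five queries add up to the 500 mL quoted in the headline.
worst = water_per_query_ml(5)

print(f"{best:.0f}-{worst:.0f} mL per query")
```

This also shows why the headline figure is a worst-case reading: at the efficient end of the range, five queries would consume only about 50 mL.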
Microsoft and OpenAI say they are doing their best in terms of efficiency
Microsoft, for its part, can do little about the hardware itself, which depends on AMD and NVIDIA. The company does, however, point to ongoing efficiency improvements, not to mention its use of renewable energy and other efforts in the name of sustainability.
OpenAI, for its part, admitted that training large models involves high energy and water consumption, and says it is working on efficiency improvements that will continue to be pursued as AI scales.
“We will continue to control our emissions, accelerate progress while increasing the use of clean energy to power data centers, purchase renewable energy and make other efforts to achieve our sustainability goals of being carbon negative, water positive and zero waste by 2030,” Microsoft said in a statement to The Associated Press after noting ChatGPT’s impact on water costs.
“We recognize that large training models can consume a lot of energy and water and are working to improve efficiency,” OpenAI said.