The resource consumption of ChatGPT is reaching unprecedented levels. As the popularity of generative AI grows, so too does the need for water and electricity. A new study conducted by The Washington Post and the University of California, Riverside, highlights the significant environmental cost of running OpenAI’s chatbot, even for basic functions.
Water Usage for ChatGPT: How Location Impacts Consumption
The amount of water needed for ChatGPT to generate a simple 100-word email varies depending on where the user is located and their proximity to an OpenAI data center. In regions with scarce water resources and lower electricity costs, data centers often depend on electricity-powered cooling systems rather than water-intensive evaporative cooling. For example, in Texas, the chatbot consumes roughly 235 milliliters of water to write a 100-word email. In contrast, the same email generated in Washington state requires about 1,408 milliliters of water—roughly 1.4 liters.
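The gap between the two states is easy to quantify from the figures above. A minimal sketch of that comparison (the only inputs are the two per-email numbers reported by the study):

```python
# Back-of-envelope comparison of the study's per-email water figures.
texas_ml = 235        # water per 100-word email, Texas data center (ml)
washington_ml = 1408  # water per 100-word email, Washington state (ml)

ratio = washington_ml / texas_ml
print(f"Washington uses {ratio:.1f}x more water per email than Texas")
print(f"That is about {washington_ml / 1000:.2f} liters per email")
```

In other words, the same email costs about six times as much water in Washington state as in Texas.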
ChatGPT’s Electricity Usage: An Energy-Intensive Operation
The electricity demands of ChatGPT are similarly significant. According to The Washington Post, generating a 100-word email through the chatbot consumes enough power to run more than a dozen LED light bulbs for one hour. If only 10% of Americans used ChatGPT to write one email a week for a year, the electricity used would equal the consumption of all households in Washington, D.C.—a city of around 670,000 people—over a 20-day period.
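The article's back-of-envelope can be roughly reconstructed. The per-email energy below is inferred from the "more than a dozen LED bulbs for one hour" comparison, assuming 14 bulbs at 10 W each; the population figure is an approximation. Both are assumptions, not numbers stated in the article:

```python
# Rough reconstruction of the article's estimate, with assumed inputs.
BULBS, BULB_WATTS = 14, 10                 # assumed: "more than a dozen" 10 W LEDs
kwh_per_email = BULBS * BULB_WATTS / 1000  # ~0.14 kWh per 100-word email

us_population = 334_000_000                # assumed, approximate
users = 0.10 * us_population               # 10% of Americans
emails_per_year = 52                       # one email a week for a year

total_kwh = users * emails_per_year * kwh_per_email
print(f"~{total_kwh / 1e6:.0f} GWh per year")
```

Under these assumptions the total comes to roughly 240 GWh per year, the quantity the Post compares to 20 days of household consumption in Washington, D.C.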
The Growing Strain on Resources: Meta, Google, and xAI
This growing demand for resources isn’t limited to ChatGPT. Meta’s training of its Llama 3.1 models required 22 million liters of water. Google’s data centers in The Dalles, Oregon, reportedly consume almost 25% of the town’s available water supply. Meanwhile, xAI’s new Memphis supercluster draws a massive 150 MW from the local utility provider—enough to power approximately 30,000 homes.
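The homes comparison for xAI's supercluster implies a per-home demand figure that is worth making explicit. The 5 kW result below is derived from the article's two numbers, not stated in it, and corresponds to peak demand rather than average household load:

```python
# Implied per-home demand from the article's xAI figures (derived, not quoted).
supercluster_mw = 150            # reported draw of the Memphis supercluster
homes = 30_000                   # "approximately 30,000 homes"

kw_per_home = supercluster_mw * 1000 / homes
print(f"Implied demand: {kw_per_home:.0f} kW per home")
```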
As AI technology advances, its environmental footprint continues to grow, posing significant challenges for the future.