In a rare disclosure on the environmental impact of AI, OpenAI CEO Sam Altman has revealed that every ChatGPT query consumes approximately 0.000085 gallons of water, roughly one-fifteenth of a teaspoon. While the number may appear negligible at first glance, it underscores a growing concern about the sustainability of artificial intelligence technologies at scale.
Speaking at a recent AI and climate forum, Altman highlighted the hidden resource demands behind large language models like GPT-4 and GPT-4o. “People think of AI as software, but the compute infrastructure behind every query has a physical cost — including electricity and water,” he said.
This water usage is primarily attributed to data center cooling systems, which draw large volumes of water to keep servers from overheating during intense computation. As millions of users engage with AI tools daily, these tiny per-query amounts add up, fueling debate over the environmental footprint of advanced AI.
To put the figure into perspective:
- One million ChatGPT queries would use about 85 gallons of water — roughly what one person in the US uses at home in a day.
- At global scale — OpenAI has reported traffic on the order of a billion queries a day — ChatGPT would account for tens of thousands of gallons of water daily.
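The figures above are simple multiplication from Altman's per-query number. A minimal sketch of the arithmetic (the daily query volume here is an illustrative assumption, not an OpenAI disclosure):

```python
# Back-of-the-envelope check of the water figures quoted above.
# Only the per-query figure comes from Altman's disclosure; the
# query volumes are illustrative assumptions.
WATER_PER_QUERY_GAL = 0.000085  # gallons per ChatGPT query


def water_used(queries: int) -> float:
    """Total gallons of cooling water attributed to a given query count."""
    return queries * WATER_PER_QUERY_GAL


print(water_used(1_000_000))      # one million queries -> ~85 gallons
print(water_used(1_000_000_000))  # a billion queries  -> ~85,000 gallons
```

Even at a billion queries a day, the total is modest next to a single data center's overall cooling draw — the debate is less about ChatGPT alone than about what these numbers imply as generative AI usage keeps growing.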
Altman emphasized OpenAI’s efforts to reduce water usage by partnering with more efficient cloud providers and exploring alternative cooling methods. The company recently announced a major collaboration with Google Cloud, known for its relatively sustainable infrastructure.
The revelation comes at a time when scrutiny of AI’s environmental impact is intensifying. Industry peers like Microsoft and Google have also faced pressure to disclose and reduce their carbon and water footprints, especially as demand for generative AI surges.
As AI continues to reshape industries, its hidden environmental costs are entering the spotlight. Altman’s statement marks a significant step in acknowledging these challenges and the need for greener AI innovation.