As data centers multiply and their electricity use rivals that of entire nations, the argument is no longer about whether artificial intelligence will expand, but about how the world will power it without deepening strains on water and energy supplies or derailing climate goals.

At a summit devoted to charting India’s artificial intelligence ambitions, Sam Altman, the chief executive of OpenAI, found himself confronting a more elemental question: how much water and power will this new technological era consume?
Speaking on the sidelines of the India AI Impact Summit in an interview with The Indian Express, Mr. Altman dismissed as exaggerated some of the most viral claims about AI’s environmental toll. Assertions circulating online that ChatGPT consumes gallons of water for every query, he said, were “completely untrue, totally insane,” and have “no connection to reality.”
The scrutiny reflects a broader reckoning. Data centers, the warehouse-sized facilities that power AI systems, have long relied on significant amounts of water to cool servers and prevent overheating. While some newer facilities are designed to reduce or even eliminate water use for cooling, overall demand for computing continues to climb.
A recent report by the water technology company Xylem and Global Water Intelligence projected that water drawn for cooling could more than triple over the next 25 years as computing needs expand, adding pressure to already strained water systems in some regions.
Mr. Altman acknowledged that energy use remains a legitimate concern. “Not per query, but in total – because the world is using so much AI … and we need to move towards nuclear or wind and solar very quickly,” he said.
He also pushed back on comparisons that focus on the enormous energy required to train large AI models, a process that can take weeks or months on thousands of specialized chips. Responding to earlier remarks by Bill Gates, the Microsoft co-founder, who has argued that the human brain’s efficiency suggests AI systems could also become more energy efficient over time, Mr. Altman suggested that critics often frame the comparison incorrectly.
“One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model … But it also takes a lot of energy to train a human,” he said. “It takes like 20 years of life, and all the food you eat before that time, before you get smart.”
“The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way,” he added.
The distinction he drew centers on what technologists call inference, the phase in which a trained model generates responses. Inference generally consumes far less energy than the initial training process, though its cumulative impact grows as millions of users query systems daily.
Mr. Altman’s remarks, particularly his comparison between AI systems and human cognition, quickly circulated online, where anxiety about automation and job displacement remains high. Sridhar Vembu, co-founder and chief scientist of Zoho Corporation, who attended the summit, criticized the analogy. “I do not want to see a world where we equate a piece of technology to a human being,” he wrote in a post on X.
The exchange unfolded against a backdrop of mounting investment in AI infrastructure. Governments and companies are pouring billions of dollars into new data centers to meet surging demand. A May report from the International Monetary Fund found that global data center electricity consumption in 2023 had already reached levels comparable to those of Germany or France, not long after the debut of ChatGPT.
In response, some governments have sought to accelerate approvals for new energy projects, including renewable and nuclear power, to ensure adequate supply. Environmental advocates have warned that such efforts could conflict with net-zero climate commitments if they rely too heavily on fossil fuels.
Local resistance has also emerged. In the United States, some communities have opposed large data center projects over concerns that they could strain power grids and raise electricity costs. Last week, the City Council in San Marcos, Texas, rejected a proposed $1.5 billion data center after months of public opposition.