The rapid development of artificial intelligence has brought global attention to the water and electricity consumption of data centers. Responding to public skepticism that AI systems consume large amounts of water with each query, OpenAI CEO Sam Altman recently dismissed these claims as "completely untrue," saying such rumors are disconnected from reality. He also acknowledged, however, that as global AI usage surges, overall energy consumption is a challenge that must be addressed.
In an interview with The Indian Express on the sidelines of the AI Impact Summit in India, Altman gave a comprehensive response to concerns about AI resource usage, sparking lively discussion within the tech community and on social media platforms.
Altman Denies “Gallons of Water per Query” Claim
During the interview, Altman directly called out the circulating claim that “ChatGPT consumes several gallons of water per query” as “completely false and absurd,” stating that these figures are “completely disconnected from reality.”
Data centers have long relied on water cooling systems to prevent overheating of electronic equipment, but with advances in cooling technology, many new-generation data centers have gradually reduced water dependence, with some facilities operating entirely without water cooling.
Even with these efficiency improvements, the overall trend cannot be ignored. According to a report released last month by water technology company Xylem and Global Water Intelligence, as global computing demand continues to rise, water used for data center cooling is projected to more than triple over the next 25 years, putting pressure on water resource systems.
Altman’s remarks suggest he believes the “water consumption per query” narrative has been exaggerated, but the resource demands of the overall infrastructure still require rational assessment.
Energy Consumption in AI Is the Real Issue
In contrast to the water debate, Altman stated plainly that energy consumption is the more legitimate target for criticism of AI development.
He said, “It’s not about each individual query, but about the overall — because the world is using AI at a large scale. We need to rapidly shift toward nuclear, wind, and solar energy.”
This highlights the real dilemma facing the AI industry: as model sizes grow and applications become more widespread, computational power demand is growing exponentially, and electricity supply must increase accordingly. Balancing innovation with carbon reduction goals has become a difficult challenge for governments and companies worldwide.
According to a report published in May by the International Monetary Fund (IMF), global data center electricity consumption in 2023 reached levels comparable to the total electricity used by Germany or France. That milestone came barely a year after the launch of ChatGPT, underscoring how rapidly generative AI has driven up demand for computing power.
AI vs. Human Brain? Altman Counters Bill Gates’ View
During the interview, Altman was also asked about an earlier perspective from Microsoft co-founder Bill Gates. Gates had observed that the human brain is highly energy-efficient, suggesting that AI could likewise become more energy-efficient over time.
In response, Altman offered a different comparison. He pointed out that many discussions about AI energy consumption focus on the massive energy used during “model training,” but overlook the time and resources required to develop a human.
“Training an AI model indeed requires a lot of energy, but training a human also consumes a lot of energy — that’s 20 years of life and all the food you eat before that,” Altman said.
He further argued that a fairer comparison should be between “the single response after model training” and “the energy a human uses to answer the same question.” Under this measure, he believes AI may have already caught up with humans in terms of energy efficiency.
The process Altman refers to is the “inference” stage in AI, which involves using a trained model to generate new outputs. Generally, inference consumes far less power than training.
Community Debate: Can Humans and Technology Be Equated?
Altman’s comparison of AI and human energy efficiency quickly sparked controversy on social media.
Sridhar Vembu, co-founder and chief scientist of Indian software company Zoho Corporation, posted a criticism on X (formerly Twitter): "I don't want to see a world where technology is equated with humans."
Amid rapid advances in generative AI and concerns over job displacement, such comparisons touch on deeper ethical and social issues.
Global Data Center Expansion Faces Resistance
As governments and tech companies invest billions of dollars in building new data centers to meet AI computational demands, opposition is growing.
Some governments are streamlining approval processes to accelerate new power supplies, but environmental groups warn this could conflict with global net-zero carbon goals.
In the U.S., local communities have expressed concerns about large data center projects, fearing grid stress and rising electricity prices. Last week, the San Marcos city council in Texas rejected a $1.5 billion data center project after months of public opposition.
In response to these challenges, many tech leaders, including Altman, advocate for future data centers to rely more on renewable energy sources and nuclear power.
Altman’s latest comments reflect the core contradiction in the era of generative AI: the tension between technological progress and resource consumption.
On one hand, he denies exaggerated claims about water use; on the other, he admits that energy demand will continue to rise with AI adoption and calls for accelerated energy transition. As global data center electricity consumption approaches national scales, the next phase for the AI industry is not just about model performance but also about reshaping energy infrastructure. Since the advent of ChatGPT, AI has become a fundamental infrastructure of the digital economy. Whether a balance can be struck between fostering innovation and ensuring sustainability remains a long-term challenge for industry and governments alike.
This article covers Altman's response to the water consumption controversy: his assertion that claims about ChatGPT's water usage are "completely untrue," and that energy is the real challenge. Originally published by ABMedia on Chain News.