Sam Altman, CEO of OpenAI, has pushed back against viral claims about ChatGPT’s environmental impact, calling some widely circulated numbers “completely untrue, totally insane.” Recent internet rumors suggested that a single ChatGPT query guzzles 17 gallons of water or uses enough electricity to charge an iPhone one and a half times, painting a picture of an energy-hungry monster lurking behind every question typed into the chatbot.
According to Altman, these claims bear no connection to reality. The actual numbers tell a different story: an average ChatGPT query consumes approximately 0.34 watt-hours of electricity, roughly equivalent to running an oven for one second or powering a high-efficiency lightbulb for a couple of minutes. That per-query figure is small compared with many other online services, and it underscores how much data center efficiency shapes the overall impact.
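A quick back-of-the-envelope check shows the oven and lightbulb comparisons are internally consistent. The wattages below are illustrative assumptions (a typical oven element and LED bulb), not figures from Altman or OpenAI:

```python
# Sanity-check the per-query energy comparisons.
# Assumed wattages (not from the article): ~1,200 W oven element,
# ~10 W high-efficiency LED bulb.

QUERY_WH = 0.34      # reported energy per ChatGPT query, watt-hours
OVEN_WATTS = 1200    # assumed oven power draw
LED_WATTS = 10       # assumed LED bulb power draw

# time = energy / power, converted to convenient units
oven_seconds = QUERY_WH / OVEN_WATTS * 3600
led_minutes = QUERY_WH / LED_WATTS * 60

print(f"Oven time at {OVEN_WATTS} W: {oven_seconds:.2f} s")    # ~1 second
print(f"LED time at {LED_WATTS} W: {led_minutes:.2f} min")     # ~2 minutes
```

Both comparisons land almost exactly where Altman's figures put them: about one second of oven time and about two minutes of LED light.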
The water consumption myth stems from outdated concerns about older data centers that relied on evaporative cooling systems, a technology ChatGPT's infrastructure has largely moved away from. Even so, the practice remains widespread: 56 percent of global data centers still use evaporative cooling in some form, according to a report by Xylem and Global Water Intelligence.
Altman also addressed what he considers unfair comparisons between AI and human energy use. Critics often weigh the massive energy required to train AI models against the tiny amount a human expends to answer a question. But this ignores an important fact: training a human brain takes roughly twenty years of constant energy consumption. The "20 years of life" remark drew laughter from the audience.
Evolution itself has consumed unimaginable amounts of energy over millennia to produce some 100 billion thinking humans. When comparing a trained AI answering questions against a trained human doing the same task, Altman argued, AI has "already caught up on an energy efficiency basis."
However, Altman acknowledges legitimate concerns about the AI industry's total energy footprint. McKinsey & Company estimates that data centers could account for 14 percent of total US power demand by 2050. This growing appetite for electricity has been linked to rising electricity prices, making the issue worth serious attention.
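To give the 14 percent projection some scale, the sketch below applies it to an assumed figure of roughly 4,000 TWh for current annual US electricity consumption (an illustrative assumption, not a number from the article or from McKinsey):

```python
# Rough scale of the McKinsey 2050 projection.
# US_ANNUAL_TWH is an assumed present-day baseline; actual 2050 demand
# will likely be higher, so this reads as a conservative floor.

US_ANNUAL_TWH = 4000       # assumed US electricity consumption, TWh/year
DATA_CENTER_SHARE = 0.14   # projected data center share by 2050

data_center_twh = US_ANNUAL_TWH * DATA_CENTER_SHARE
print(f"Data centers at 14%: ~{data_center_twh:.0f} TWh/year")
```

Even against today's baseline, that share would amount to hundreds of terawatt-hours per year, which is why the infrastructure question draws serious attention.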
The solution, according to Altman, involves rapidly shifting to low-carbon energy sources such as nuclear, wind, and solar power. This shift needs to happen very quickly to keep pace with AI expansion. While individual queries use surprisingly little energy, industry-wide growth demands responsible energy infrastructure development.
Tech companies face no legal requirement to disclose these metrics, making accurate information even more important.