Is AI killing the planet? Maybe. But not how you've been told.
The water panic is mostly nonsense. The energy story is real, but it's a grid problem, not an AI problem. A sober look at the numbers.
Every few weeks, a new headline drops telling you that asking ChatGPT a question is roughly equivalent to draining a swimming pool and setting a coal plant on fire. The discourse is loud, confident, and (as with most loud confident things on the internet) only about half right.
Here’s where I actually land after going down this rabbit hole: AI’s energy consumption is a legitimate and growing concern that deserves serious scrutiny. The water panic, though? It’s a case study in how a stat can go viral, mutate, get cited in bestselling books, and end up completely divorced from reality, all while the people spreading it feel like they’re doing heroic work.
Let’s take these apart.
The water thing is (mostly) bullshit
The viral claim goes something like this: generating a 100-word email with ChatGPT uses 500 milliliters of water. A full water bottle, basically evaporated into the ether every time you ask AI to help you sound less passive-aggressive in a Slack message.
That number comes from a 2023 paper called Making AI Less Thirsty. The study was real. The number was real, for GPT-3, trained in 2020, under a specific set of assumptions about which data center you were hitting and how you counted “water use.” By the time the headline was circulating everywhere, it had already been rendered obsolete by roughly a decade of compute efficiency gains compressed into three years.
Hank Green did a video on this in late 2025 (worth watching if you have 24 minutes) where he laid out exactly why this is such an easy thing to get wrong. The core issue is that there’s no single correct answer to “how much water does an AI query use” because it depends entirely on when you start the clock. Are you counting just the cooling water at the data center? The water used to generate the electricity that powers the data center? The water embedded in training the model? You can get wildly different numbers depending on your methodology, and none of them are technically lying.
The gap between the competing claims is stark: Sam Altman said each ChatGPT query uses about 1/15th of a teaspoon of water, while other estimates put a single 100-word email at up to three water bottles. Neither figure is technically wrong. They’re just measuring different things.
The more honest figure, once you account for modern models and inference efficiency, is somewhere around 5 milliliters per query, roughly a teaspoon. The original 500ml figure significantly overestimated the length of a typical response and was based on models that are now about 10x more efficient than what you’re actually talking to today.
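If you want to see how the same question can honestly produce answers orders of magnitude apart, here's a toy back-of-envelope. Every constant in it is an assumption I picked to make the spread visible (per-query energy, cooling-water intensity, the water footprint of power generation), not a measured value:

```python
# Toy model: per-query water depends on scope and model vintage.
# All constants below are illustrative assumptions, not measurements.

TSP_ML = 4.93  # milliliters per US teaspoon

def water_per_query_ml(energy_wh, cooling_l_per_kwh, grid_l_per_kwh=0.0):
    """Cooling water at the data center, plus (optionally) water consumed
    generating the electricity, per query, in milliliters."""
    kwh = energy_wh / 1000
    liters = (cooling_l_per_kwh + grid_l_per_kwh) * kwh
    return liters * 1000

scenarios = {
    # Modern model, short answer, count only on-site cooling water.
    "modern, cooling only": water_per_query_ml(0.3, cooling_l_per_kwh=1.0),
    # Same query, but also count water used to generate the power.
    "modern, full scope": water_per_query_ml(0.3, 1.0, grid_l_per_kwh=3.0),
    # 2020-era model (~10x energy per token), long response (~10x tokens),
    # thirstier site, full scope.
    "2020-era, long, full scope": water_per_query_ml(30.0, 1.8, grid_l_per_kwh=7.0),
}

for name, ml in scenarios.items():
    print(f"{name:28s} {ml:7.2f} mL  ({ml / TSP_ML:6.2f} tsp)")
```

Same arithmetic, three defensible-sounding answers, running from Altman's fraction of a teaspoon toward the viral half-liter. The methodology is the whole story.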
Even if you take the scarier numbers at face value and zoom out, they don’t hold up. Data centers in Maricopa County (Phoenix, one of the most water-stressed metro areas in the country) are projected to use 905 million gallons of water in 2025. The golf courses in that same county use 29 billion gallons. Data centers: 0.12% of the county’s water. Golf courses: 3.8%.
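Those two percentages are internally consistent, by the way. You can run the quoted figures backward and recover the same county total (a quick check, taking the numbers at face value):

```python
# Sanity check on the Maricopa County figures quoted above.
data_center_gal = 905e6   # projected 2025 data center use (gallons)
golf_gal = 29e9           # golf course use (gallons)

county_total = data_center_gal / 0.0012   # total implied by the 0.12% share
print(f"implied county total: {county_total / 1e9:.0f}B gallons")   # ~754B
print(f"golf share of total:  {golf_gal / county_total:.1%}")       # ~3.8%
print(f"golf vs data centers: {golf_gal / data_center_gal:.0f}x")   # ~32x
```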
I first dug into these numbers a couple of years ago, when the panic started, and the real-world figures on the ground kept refusing to match the headlines. The Wisconsin story stuck with me. Microsoft’s Fairwater data center in Mount Pleasant, a $7.7 billion facility that Brad Smith called the most powerful AI data center ever built, became the center of a transparency fight when environmental groups had to sue the city just to find out how much water it was using. The number that finally came out after months of legal wrangling? 2.8 million gallons of Lake Michigan water per year, roughly four Olympic-sized swimming pools. For context, Foxconn had previously been permitted to draw more than 7 million gallons from the same source every single day. Smith’s response: “Good news. Lake Michigan has nothing to fear from our data center.”
And Wisconsin is water-rich. What about somewhere actually dry? Arizona is ground zero for the data center buildout, nearly 200 facilities and counting, and the predicted water catastrophe just hasn’t materialized. Even in Mesa, where both Apple and Meta own clusters of data centers, industrial water usage accounted for only around 6% of the city’s total potable usage in 2024. The mayor of one fast-growing Phoenix suburb summed up what his town was actually worried about: “The trick now isn’t water… the trick now is getting enough power.”
The towns near the data centers weren’t noticing. Because there wasn’t much to notice.
And that’s before we even get to agriculture. Producing a single kilogram of beef requires around 15,400 liters of water. Globally, daily water usage for beef production alone runs to approximately 2.49 trillion liters. For comparison, the U.S. uses about 20 trillion gallons of water every year just to grow corn, roughly 80 times what global AI-related water use amounts to.
This is the part that makes the discourse so frustrating. The people most likely to post about AI’s water footprint are also, statistically, the people most likely to eat a burger while they’re doing it. That’s not a gotcha. It’s just a genuine misallocation of moral concern. Training GPT-3 used about as much water as the average American’s beef consumption over less than two years; two people giving up beef for a year would more than cover it.
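That comparison holds up on the back of an envelope, with the caveat that two of the inputs are my own assumptions for the check (per-capita US beef consumption, and the paper's direct training-water estimate for GPT-3), not figures from this piece:

```python
# Rough check of the GPT-3-vs-beef comparison. Assumed inputs are flagged.
BEEF_L_PER_KG = 15_400      # water footprint of beef (figure quoted above)
US_BEEF_KG_PER_YR = 26      # ASSUMED: per-capita US beef consumption, kg/yr
GPT3_TRAINING_L = 700_000   # ASSUMED: direct training-water estimate from
                            # the "Making AI Less Thirsty" paper, liters

per_person_yr = BEEF_L_PER_KG * US_BEEF_KG_PER_YR   # ~400,000 L/yr
print(f"one American's beef water: {per_person_yr:,} L/yr")
print(f"GPT-3 training ~ {GPT3_TRAINING_L / per_person_yr:.1f} years of that")

# And the corn comparison: 20 trillion gal/yr at ~80x global AI water use
# implies AI-related use of ~250 billion gallons a year, worldwide.
print(f"implied global AI water: {20e12 / 80 / 1e9:.0f}B gallons/yr")
```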
The water story isn’t nothing. There are real localized concerns about data centers being built in drought-prone areas, and at least one bestselling book cited water usage figures that were off by a factor of one thousand before a correction was issued. But as a global environmental crisis? It doesn’t survive contact with context.
The power thing is real. But it’s really about the grid.
Here’s where I switch gears, because I’m not here to be an AI apologist.
Energy consumption is the actual story, and it is genuinely large and growing.
Lawrence Berkeley National Laboratory forecasts that by 2028, U.S. data centers could consume as much as 12% of the nation’s electricity, with generative AI as the primary driver. That’s a significant structural shift in how we use power as a civilization.
But here’s the thing people keep missing when they throw that number around: power consumption isn’t the problem. Dirty power consumption is the problem. These are not the same thing, and conflating them is how we end up with bad policy thinking.
If every watt powering a data center comes from wind, solar, hydro, or nuclear (which is increasingly the direction the industry is moving), then the energy conversation becomes almost entirely about grid capacity and infrastructure, not emissions. A data center running on clean energy is not a climate problem. It’s an engineering problem. Completely solvable.
The carbon footprint of AI systems alone could land between 32.6 and 79.7 million tons of CO₂ emissions in 2025, roughly comparable to the emissions of New York City. That’s real and it matters. But that number is a function of what the grid looks like right now, not what it has to look like. As renewable capacity grows and AI companies accelerate their clean energy commitments, those emissions figures fall. The energy consumption doesn’t go away, but its climate impact can.
This is why the more interesting question isn’t “how much power does AI use?” It’s “how fast are we decarbonizing the grid relative to how fast AI demand is growing?” Those two curves are in a race, and which one wins matters enormously.
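You can make the race concrete with a two-parameter toy model: demand growing at one rate, grid carbon intensity falling at another. Emissions scale with the product, so they fall over time exactly when (1 + growth) × (1 − decarbonization) < 1. The parameters below are made up for illustration, chosen so the starting point sits near the low end of the 2025 range above:

```python
# Toy model of the two-curves race. Parameters are illustrative, not forecasts.
def emissions_mt(year, demand_twh=100, growth=0.25, intensity=0.4, decarb=0.05):
    """Mt of CO2 after `year` years.

    demand_twh: AI electricity demand in year 0 (TWh/yr)  -- assumed
    growth:     annual demand growth rate                 -- assumed
    intensity:  grid carbon intensity (tCO2/MWh), year 0  -- assumed
    decarb:     annual decline in grid carbon intensity   -- assumed
    TWh x tCO2/MWh comes out directly in megatonnes.
    """
    return (demand_twh * (1 + growth) ** year) * (intensity * (1 - decarb) ** year)

for decarb in (0.05, 0.20):
    path = ", ".join(f"yr{y}: {emissions_mt(y, decarb=decarb):.0f} Mt"
                     for y in (0, 5, 10))
    print(f"grid falling {decarb:.0%}/yr -> {path}")
```

With demand growing 25% a year, a grid decarbonizing at 5% a year lets emissions more than quintuple in a decade; at 20% a year, they stay flat. Two exponents, whole debate.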
To put the scale in everyday terms: a single ChatGPT query uses about the same energy as running a microwave for one second, or a gaming console for six seconds. A family running 1,000 GPT queries a day would still be using only around 1% of their household electricity, and even less of their water. At the individual level, it’s fine. At civilization scale, with usage growing exponentially, it adds up. But it only adds up as a climate problem if the power behind it stays dirty.
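Run that framing through with round numbers and it holds up. All four inputs below are assumptions; the per-query energy is just the microwave-for-a-second figure:

```python
# The household framing with round, assumed inputs.
WH_PER_QUERY = 0.3     # ASSUMED: per-query energy, Wh (microwave for ~1 s)
ML_PER_QUERY = 5       # per-query water, from the teaspoon estimate above (mL)
HOUSE_KWH_DAY = 30     # ASSUMED: average US household electricity, kWh/day
HOUSE_L_DAY = 1100     # ASSUMED: average US household water, L/day (~300 gal)

queries = 1000
print(f"electricity: {queries * WH_PER_QUERY / 1000 / HOUSE_KWH_DAY:.1%}")  # ~1.0%
print(f"water:       {queries * ML_PER_QUERY / 1000 / HOUSE_L_DAY:.2%}")    # ~0.45%
```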
The comparison to meat is instructive here too. Producing 50 grams of protein as beef generates about 19 kilograms of CO₂, equivalent to roughly 617,000 Gemini queries. That sounds like a win for AI, until you realize that beef consumption has been roughly flat for decades, while AI queries are doubling every few months. The trajectory is what matters, not the snapshot. Which is exactly why grid decarbonization has to keep pace.
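For what it's worth, that equivalence implies a per-query footprint of roughly 0.03 grams of CO₂, which is why the snapshot looks so lopsided:

```python
# Implied per-query CO2 from the beef equivalence quoted above.
beef_g_co2 = 19_000   # 50 g of beef protein, in grams of CO2
queries = 617_000
print(f"{beef_g_co2 / queries * 1000:.0f} mg CO2 per query")  # ~31 mg
```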
The other piece that doesn’t get enough attention is location. A bunch of data centers in the Pacific Northwest running on hydro power is a completely different environmental equation than the same infrastructure in the Sonoran Desert running on natural gas. Right now there’s almost no regulatory pressure to make companies site their infrastructure responsibly or source their power cleanly, and that’s the policy gap actually worth fighting about.
The cautionary example here is Meta’s Hyperion data center in Louisiana. Louisiana regulators approved plans to build three new gas power plants specifically to offset the electricity demand from that single facility. That’s the nightmare scenario: not “AI uses electricity,” but “AI uses electricity and we’re burning new fossil fuels to provide it.” Fix the grid, and most of the AI energy story changes.
The transparency problem nobody talks about
Here’s the thing that actually pisses me off about this whole debate: we’re arguing about estimates built on estimates, because the companies operating this infrastructure refuse to tell us the real numbers.
Even the IEA’s figures on data center water and energy consumption don’t clearly break out what’s attributable to AI versus other workloads. Researchers trying to calculate AI’s actual footprint have to reverse-engineer it from company revenue and public disclosures, which are incomplete by design.
There are no federal or state regulations requiring tech companies to disclose their energy and water consumption. The entire field of AI environmental research is built on inference and estimation because the primary sources won’t talk. California Governor Gavin Newsom vetoed legislation last year that would have required data center operators to share site-level water use estimates with local water suppliers. The industry lobbied against it.
So when you see confident-sounding statistics flying around about how many water bottles your GPT query is worth, remember: the companies at the center of this have actively prevented us from knowing the real answer. That’s the part that deserves outrage, not a number that got inflated by a factor of a hundred before going viral.
So what should you actually think?
Water: mostly a moral panic built on a misread paper, laundered through a hundred viral posts, that now lives in people’s heads as settled fact. The actual localized concerns (specific data centers in specific drought-stricken places) are legitimate. The global narrative is not. The towns next to the data centers aren’t running dry. They’re fine.
Power: real, growing, and the only version of this conversation worth having. But the framing matters. AI’s energy consumption isn’t inherently a climate problem. It’s a climate problem if the grid it runs on stays dirty. Clean power means AI’s carbon footprint trends toward zero regardless of how many queries we run. The race to decarbonize has to win. If it does, the energy story largely solves itself.
The frustrating thing is that the noise about water is actually crowding out the signal about energy. It gives AI companies an easy target to debunk, which lets them change the subject from the harder conversation about their actual carbon trajectory and their responsibility to accelerate, not undermine, the clean energy transition.
You can care about AI’s environmental impact, genuinely and seriously, and still demand that the concern be pointed at the right things.
The water bottle thing? Let it go.
The question of whether we can build enough clean power generation to sustain an AI-powered civilization without cooking the planet? That one’s worth every bit of the energy we’re spending on it.
Further reading: Andy Masley’s deep-dive on AI water is the most rigorous public writing I’ve found on the subject. Hank Green’s December 2025 video covers the same ground accessibly. For the energy side, the IEA’s Energy and AI report is dense but worth it.