When most people think about artificial intelligence, they imagine it as software floating invisibly in the cloud — clean, frictionless, and intangible. But behind every AI-generated image, voice assistant reply, or ChatGPT response lies a very real and growing infrastructure problem — one that could trigger the next global resource crunch.
Because the truth is, AI doesn’t just run on code.
It runs on vast amounts of electricity — and increasingly, billions of gallons of water.
AI’s Massive Energy Appetite
Start with the obvious: power.
AI models like GPT-4, which sit at the core of modern platforms, are trained using immense computational power. Training a single GPT-3-class model is estimated to have consumed roughly **1,300 megawatt-hours** of electricity — enough to power over 100 US homes for an entire year — and GPT-4 is believed to have required far more.
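As a rough sanity check on that comparison, the numbers can be cross-checked in a few lines. Both inputs below are assumptions, not figures from this article: the commonly cited ~10,600 kWh/year average US household electricity use, and the widely quoted ~1,300 MWh estimate for training a GPT-3-class model.

```python
# Assumed figures (illustrative, not from this article):
avg_home_kwh_per_year = 10_600   # commonly cited average US household electricity use
training_mwh = 1_300             # widely quoted GPT-3-class training estimate

training_kwh = training_mwh * 1_000
homes_powered_for_a_year = training_kwh / avg_home_kwh_per_year
print(f"~{homes_powered_for_a_year:.0f} US homes powered for a year")  # ~123 homes
```

Treat the result as an order of magnitude: it confirms that a training run of this size is in "100+ homes for a year" territory, nothing more precise.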
But training is just the beginning.
Once deployed, these models are used constantly. Every time you prompt ChatGPT, generate an image, or ask Alexa a question, you trigger a process called inference — and that means lighting up high-performance GPUs in sprawling data centres across the globe.
Inference happens millions of times per second, and it draws an astonishing amount of energy.
Tech giants know this. That’s why chipmakers like Nvidia and AMD are racing to design more power-efficient chips. Software companies are optimizing algorithms. Meanwhile, the big platforms — Microsoft, Amazon, Meta — are pouring billions into energy infrastructure, from renewables to gas to nuclear, just to keep up with demand.
But electricity is only half the story.
AI’s Other Appetite: Water
Here’s what few are talking about…
To keep data centres cool — especially those handling intense AI workloads — operators rely heavily on **evaporative cooling**, a system that uses vast quantities of water to remove heat from high-performance chips.
And the numbers are staggering.
* A medium-sized data centre can consume around 300,000 gallons of water per day — equivalent to the daily use of 1,000 homes.
* Larger AI-focused facilities can use up to 4.5 million gallons daily, depending on design and climate.
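The household comparison in those bullets is easy to verify. The only assumption here (not stated in the article) is the commonly used estimate that a typical US household uses around 300 gallons of water per day:

```python
# Assumption (not from this article): ~300 gallons/day per US household,
# a commonly used average that includes outdoor use.
gallons_per_home_per_day = 300

medium_dc_gallons_per_day = 300_000    # medium-sized data centre (from the article)
large_dc_gallons_per_day = 4_500_000   # large AI-focused facility (from the article)

print(medium_dc_gallons_per_day // gallons_per_home_per_day)  # homes matched by a medium centre
print(large_dc_gallons_per_day // gallons_per_home_per_day)   # homes matched by a large facility
```

That puts a medium centre at about 1,000 households' worth of daily water, and a large AI facility at roughly 15,000 — consistent with the figures above.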
In 2023–2024 alone, data centres in San Antonio and Austin, Texas, consumed 463 million gallons — enough to supply thousands of homes for a year.
And it’s just beginning.
A recent white paper from the Houston Advanced Research Center (HARC) projects that AI data centres in Texas alone will withdraw:
* 49 billion gallons of water in 2025
* Scaling to 399 billion gallons by 2030 — nearly 7% of the state’s total water use
That’s the equivalent of over 600,000 Olympic-size swimming pools.
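The pool equivalence checks out, too. The assumed conversion (not given in the article) is that an Olympic-size pool holds 2,500 cubic metres, or about 660,000 US gallons:

```python
# Assumption: Olympic-size pool = 2,500 m^3, roughly 660,000 US gallons.
gallons_per_olympic_pool = 660_000

projected_2030_gallons = 399_000_000_000  # HARC's 2030 projection from the article
pools = projected_2030_gallons / gallons_per_olympic_pool
print(f"~{pools:,.0f} Olympic-size pools")  # ~604,545 pools
```

About 604,000 pools — in line with the "over 600,000" comparison above.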
Why This Matters
Yes, some companies are working on solutions.
Tech giants like Google and Microsoft have pledged to become “water positive” by 2030, aiming to replenish more water than they consume. Others are relocating data centres to **cooler climates** (like Northern Europe) or using non-potable or recycled water.
But these are early efforts — and demand continues to skyrocket.
The reality is: AI is now competing with households, farms, and ecosystems for a finite resource. And just like carbon emissions, AI’s water footprint is set to become a major sustainability and geopolitical issue.
The Opportunity in the Bottleneck
Every technological revolution creates its own resource bottlenecks — and with them, new avenues for profit.
Water is no different.
Several companies are already stepping up with smart, scalable solutions to help industry navigate this growing crisis. Names like Ecolab, Pentair, Honeywell, and Xylem are leading the way in water efficiency, smart cooling systems, and industrial recycling tech.
They may not be flashy, but they could become some of the most important companies in the AI economy.