Experts say AI’s environmental toll is a problem of measurement, not just consumption
Jacob Ben-David on building data centers in water-scarce regions, and the need for policy incentives to relocate to water-abundant areas.

To solve AI’s thirst problem, the industry doesn’t just need better data centers; it needs better measurement. The gap between corporate claims and independent findings is a crisis of transparency, not a rounding error, and without a standard way to track AI’s appetite for water and energy, the true cost stays hidden.
Jacob Ben-David, IT and AI strategist at a leading cloud technology firm, says the debate over AI's resource consumption is missing the point: before we can solve the problem, we first need to agree on how to measure it.
The measurement vacuum: "You have Sam Altman saying a query uses a fraction of a teaspoon of water, while researchers say it's closer to a bottle," says Ben-David. "The truth is, we don't know exactly how much energy is being used because as an industry, we lack a standard way to measure it." He points out that inconsistent accounting across model architectures, combined with a narrow focus on things like chip cooling, adds up to a "weird way to measure consumption" that allows companies to deliberately downplay the problem. "Before we can address the problem," he says, "we have to clearly define and articulate it."
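To see how a "fraction of a teaspoon" and "closer to a bottle" can both fall out of the same arithmetic, here is a minimal back-of-the-envelope sketch. Every figure in it is an illustrative assumption, not a reported measurement; what swings the answer by an order of magnitude is simply where you draw the measurement boundary.

```python
# Back-of-the-envelope sketch of why per-query water estimates diverge.
# Every figure below is an illustrative assumption, not a number reported
# by any provider or study; the point is the measurement boundary.

ENERGY_PER_QUERY_KWH = 0.003        # assumed energy for one AI query (kWh)

# On-site Water Usage Effectiveness: liters of cooling water per kWh of IT load
ONSITE_WUE_L_PER_KWH = 0.2          # assumed (chip/facility cooling only)

# Water consumed off-site to generate the electricity itself (liters per kWh)
OFFSITE_GRID_WATER_L_PER_KWH = 3.0  # assumed grid average

narrow = ENERGY_PER_QUERY_KWH * ONSITE_WUE_L_PER_KWH
full = ENERGY_PER_QUERY_KWH * (ONSITE_WUE_L_PER_KWH + OFFSITE_GRID_WATER_L_PER_KWH)

print(f"Cooling-only boundary:        {narrow * 1000:.1f} mL per query")
print(f"Cooling + electricity supply: {full * 1000:.1f} mL per query")
```

Under these made-up inputs, the narrow boundary yields well under a teaspoon while the wider one is more than ten times larger, which is Ben-David's point: without an agreed boundary, both camps can be "right."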
A geography of thirst: "The critical question isn't just how much water is used, but where it's being used. That's what determines the impact on the environment and the community," says Ben-David. He explains that data centers are often built in water-scarce areas like Texas, lured by land availability and tax incentives. Those facilities have a far greater environmental toll than ones in water-rich regions. "This is where government needs to step in," Ben-David suggests. "Imagine creating incentives for companies to build their data centers in Scandinavia or Canada, where water is abundant. That's how you make a real difference."
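A rough way to make the "where" explicit is to weight withdrawals by local water stress, as some corporate water-accounting frameworks do. The sketch below uses hypothetical stress multipliers and a hypothetical withdrawal volume to show how an identical draw can carry very different environmental weight in a scarce basin versus an abundant one.

```python
# Illustrative sketch: scarcity-weighted water impact.
# Withdrawal volume and stress multipliers are hypothetical placeholders;
# the takeaway is that identical usage differs in impact by location.

ANNUAL_WITHDRAWAL_ML = 500.0  # assumed megaliters per year for one data center

# Hypothetical water-stress multipliers (higher = scarcer basin)
WATER_STRESS = {
    "West Texas":     4.0,
    "Scandinavia":    0.3,
    "Eastern Canada": 0.5,
}

for region, stress in WATER_STRESS.items():
    print(f"{region:>15}: {ANNUAL_WITHDRAWAL_ML * stress:7.1f} ML scarcity-weighted")
```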
The cure is the cause: "AI is the problem, but it's also the opportunity," Ben-David says. "It will eventually help us develop new, efficient technologies that dramatically reduce resource consumption." He points to immersion cooling, where server racks are submerged in a non-conductive liquid that dissipates heat. "It cools efficiently without using water," Ben-David says. But solutions like this won't scale without incentives.
The Model T moment: For Ben-David, this is a familiar story: the early, inefficient days of a world-changing technology. "We are in the early stages of this revolution." Having spent more than two decades in IT, he's never seen a technology scale as fast as AI. "Today's AI is like the first version of the combustion engine," he explains. "We're going to move on from this, eventually reaching the equivalent of an electric motor with zero environmental impact. I truly believe the problem will fix itself—and the fix is AI."