U.S. data centers consumed 17 billion gallons of water for cooling in 2023. Hyperscale facilities accounted for 84% of that, roughly 14 billion gallons. By 2028, projections show the total nearly doubling to 33 billion gallons. And those figures count only direct water use. Add the water consumed by power plants generating electricity for those same data centers and the total climbs to 211 billion gallons in a single year.
The numbers have been sitting in environmental reports and academic papers for months. What changed is that the New York Times, Forbes, CNBC, IEEE Spectrum, the Chicago Tribune, and Consumer Reports all published major investigations on AI water use within the same quarter. The story crossed from trade press into mainstream consciousness. Data center operators are now fielding questions from municipal officials, community boards, and shareholders who read those headlines over breakfast.
The most damaging disclosure came from internal Microsoft forecasts obtained by the New York Times in January. The company's own models projected water consumption rising from 7.9 billion liters in 2020 to 28 billion liters by 2030. More than tripling. After the Times contacted Microsoft for comment, the company revised that projection down to 18 billion liters, still more than double the 2020 figure, and a number that excludes the water implications of $50 billion in data center deals signed in 2024.
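The growth multiples behind those forecasts are worth making explicit. A minimal sketch using the figures above (variable names are illustrative, not Microsoft's):

```python
# Microsoft's internal 2030 water forecasts versus the 2020 baseline,
# using the figures reported by the New York Times (all in liters).
BASE_2020_L = 7.9e9       # 2020 consumption: 7.9 billion liters
ORIGINAL_2030_L = 28e9    # original internal forecast for 2030
REVISED_2030_L = 18e9     # revised forecast after the Times inquiry

original_multiple = ORIGINAL_2030_L / BASE_2020_L
revised_multiple = REVISED_2030_L / BASE_2020_L

print(f"original forecast: {original_multiple:.1f}x the 2020 level")
print(f"revised forecast:  {revised_multiple:.1f}x the 2020 level")
```

Even the walked-back number represents more than a doubling of 2020 consumption in a decade.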
The regional breakdowns are where it gets specific. Microsoft's Phoenix-area data centers were originally forecast to consume 3.3 billion liters by 2030. The company revised that to 2 billion liters, even though its 2024 estimate for the same facilities was already 2.9 billion liters, above the new 2030 target. Phoenix. A city in a desert, drawing from a Colorado River system in sustained decline, where 46% of Microsoft's 2024 water withdrawals came from water-stressed areas.
Microsoft operates over 400 data centers globally. The company committed to being "water positive" by 2030 through 75 replenishment projects. A former water strategy director told the Times that "water took a back seat" internally. "Energy was more the focus because it was more expensive."
Google's 2024 environmental report disclosed data center water consumption exceeding 6 billion gallons in 2023. The company has been sending procurement teams to China to source liquid cooling equipment from manufacturers like Envicool, a clear signal that even the hyperscalers with the deepest engineering benches cannot build cooling infrastructure fast enough internally.
The geographic center of gravity for new data center construction is shifting toward the Great Lakes basin, and the water implications are enormous. Across eight Great Lakes states, 368 data centers currently operate. Another 738 are announced or under construction, a 200% expansion of the regional fleet.
Hyperscale data centers consume over one million gallons of water daily for cooling. Across the existing U.S. fleet, data center cooling uses approximately 48 million gallons consumptively per day. Factor in the water needed for electricity generation and the combined figure reaches 627 million gallons daily. The Great Lakes supply drinking water to more than 40 million people across the U.S. and Canada. A Joyce Foundation study found that nearly every Great Lakes state already faces at least one groundwater shortage location.
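Converting those daily fleet figures to annual totals gives a rough cross-check against the yearly numbers cited earlier; the daily and annual estimates come from different analyses, so they align only approximately:

```python
# Annualizing the daily U.S. fleet figures cited above.
# Illustrative arithmetic only; sources differ on exact totals.
DIRECT_GAL_PER_DAY = 48e6     # consumptive cooling water across the fleet
COMBINED_GAL_PER_DAY = 627e6  # cooling plus power-generation water

direct_annual_bgal = DIRECT_GAL_PER_DAY * 365 / 1e9
combined_annual_bgal = COMBINED_GAL_PER_DAY * 365 / 1e9

print(f"direct:   ~{direct_annual_bgal:.1f} billion gallons per year")
print(f"combined: ~{combined_annual_bgal:.1f} billion gallons per year")
```

The direct figure, roughly 17.5 billion gallons per year, tracks closely with the 17 billion gallons reported for 2023.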
Michigan is the epicenter of the expansion. Three Microsoft campuses are under development near Grand Rapids. An Oracle, OpenAI, and Related Digital project in Saline will draw 1,400 megawatts across 575 acres. A 2024 tax incentive package accelerated interest statewide, and several townships have responded by imposing 12-month development moratoriums.
The coordination problem is severe. Each project gets evaluated individually by local permitting authorities. There is no regional framework for assessing cumulative water impact across dozens of facilities sharing the same aquifers and watersheds. Municipal water use reporting has gaps that obscure how much data centers actually consume, and operators treat their cooling technology specifications as proprietary information.
Shaolei Ren, an associate professor of electrical engineering at UC Riverside, and Amy Luers, who leads sustainability science at Microsoft, published a joint analysis in IEEE Spectrum that reframed how the industry should account for water use. Their finding: indirect water consumption, the water used by power plants generating electricity for data centers, makes up approximately 87% of the total water footprint per AI query.
For GPT-3, generating a single text output of 150 to 300 words consumed 16.9 milliliters of water in total. Only 2.2 milliliters was onsite cooling water; the remaining 14.7 milliliters went to electricity generation at thermoelectric power plants upstream. Data centers typically evaporate about 80% of the water they draw directly. The other 20% is discharged to wastewater treatment.
The implication for cooling engineers is straightforward. Reducing onsite water use, the thing operators measure and report, addresses only 13% of the water problem. The other 87% depends on the energy source. Data centers powered by solar, wind, or nuclear consume almost zero indirect water. Data centers powered by coal or natural gas plants inherit massive water footprints that never appear in the facility's environmental report.
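The per-query split underlying that 87/13 division can be reproduced directly from the Ren-Luers figures. A minimal sketch (variable names are mine, not from the analysis):

```python
# Per-query water footprint for one GPT-3 text generation,
# using the milliliter figures from the IEEE Spectrum analysis.
ONSITE_ML = 2.2     # evaporated at the data center for cooling
INDIRECT_ML = 14.7  # consumed upstream by thermoelectric generation

total_ml = ONSITE_ML + INDIRECT_ML
indirect_share = INDIRECT_ML / total_ml
onsite_share = ONSITE_ML / total_ml

print(f"total:    {total_ml:.1f} mL per query")
print(f"indirect: {indirect_share:.0%}")
print(f"onsite:   {onsite_share:.0%}")
```

Swap the upstream power source for solar, wind, or nuclear and the 14.7 mL term collapses toward zero, which is the entire point of the reframing.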
AI ethicist Masheika Allgood put it more directly when discussing closed-loop cooling claims: "Closed loop just means that one of those loops never leaves the system." The second loop, the one that rejects heat to the atmosphere through evaporation, still loses water. Corporate sustainability disclosures that highlight closed-loop systems without disclosing the evaporative loop are, in her assessment, intentionally incomplete.
Sam Altman, at the India AI Impact Summit in February, called water consumption concerns about AI "completely untrue, totally insane" with "no connection to reality." He claimed that OpenAI no longer uses evaporative cooling in data centers. He compared the energy cost of training an AI model to the energy cost of raising a human being. "It takes like 20 years of life, and all the food you eat before that time."
Sridhar Vembu, co-founder and chief scientist of Zoho Corporation, was at the same summit. His response: "I do not want to see a world where we equate a piece of technology to a human being."
Altman's claim about evaporative cooling is narrowly true for OpenAI's newest facilities but misleading about the industry. Fifty-six percent of global data centers still use evaporative cooling. OpenAI's 800-acre facility in Abilene, Texas required 8 million gallons of water for initial operations. The broader industry water demand for data center cooling is projected to triple over the next 25 years, according to Xylem and Global Water Intelligence.
The political and media environment around data center water use has fundamentally shifted. Minnesota has established state-level clearinghouse requirements for data center projects. Illinois proposed suspending tax incentives to allow regulatory catch-up. Ohio tried the same in its budget, though legislators overrode the governor's veto. Multiple Michigan townships have imposed construction moratoriums.
For cooling vendors, the business case for zero-water and low-water cooling technologies just strengthened. Facilities that can demonstrate zero consumptive water use will face fewer permitting obstacles, shorter approval timelines, and less community opposition. Zero-water cooling pilots in Phoenix and Mt. Pleasant take on new urgency in this context.
The operators who treated water disclosure as an annual reporting exercise now face quarterly scrutiny from journalists, regulators, and communities. The cooling technology they choose determines whether they pass that scrutiny or become the next headline. That is a procurement factor that did not exist two years ago. It exists now.