U.S. data centers consumed 1.9% of total national electricity in 2018. By 2023, that figure had grown to 4.4%. The Department of Energy projects it reaching 6.7% to 12% of total U.S. consumption by 2028, driven by AI training and inference workloads running on GPU clusters that draw 80 to 130 kilowatts per rack. The cooling systems serving those racks represent a large and growing fraction of that electricity draw. A facility running at a PUE of 1.3 spends roughly 23 cents of every dollar of power on overhead, most of it cooling, rather than compute. The DOE's geothermal program is funding approaches that could cut that cooling power draw by using subsurface thermal resources in place of electrically driven refrigeration.
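The 23-cent figure falls directly out of the PUE definition. A minimal sketch of the arithmetic:

```python
# PUE = total facility power / IT power, so the non-IT ("overhead")
# share of every dollar of facility power is (PUE - 1) / PUE.

def overhead_share(pue: float) -> float:
    """Fraction of total facility power spent on non-IT overhead."""
    return (pue - 1.0) / pue

print(f"{overhead_share(1.3):.2f}")  # prints 0.23, i.e. ~23 cents per dollar
```

Note the overhead bucket also covers power distribution losses and lighting, but cooling dominates it in most facilities.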
Geothermal energy plants maintain a capacity factor of approximately 90%, meaning they produce power at close to nameplate capacity nearly continuously. That reliability profile matches data centers well. AI training clusters run 24 hours a day, 7 days a week, at near-full utilization during active training runs. An intermittent renewable source requires grid backup or storage to serve a continuous load; geothermal does not. Enhanced Geothermal Systems, or EGS, expand the geographic availability of geothermal power by enabling heat extraction in areas without natural hydrothermal resources, through engineered reservoir creation in hot dry rock formations. That expansion potentially brings geothermal power within reach of data center campuses in markets where grid power is constrained.
Cold Underground Thermal Energy Storage is the geothermal approach most directly applicable to data center cooling rather than data center power. The principle is straightforward: chill water when grid electricity is cheap and clean and ambient temperatures are low, typically at night and in winter, and store it in underground aquifer formations. During peak cooling demand, draw that stored cold water to meet facility cooling loads instead of running chillers at full capacity against peak electricity prices. The underground thermal mass holds the water at a near-constant cool temperature with minimal energy input, because the subsurface is thermally stable in a way that above-ground insulated tanks cannot be at the required volumes.
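The economics of that load shift can be sketched with a simple comparison. Every number below is a hypothetical assumption for illustration, not a figure from the text: the chiller COP, the peak and off-peak prices, the thermal load, and the 10% round-trip storage loss.

```python
# Illustrative Cold UTES load-shifting economics. All inputs are assumed.

COP = 4.0                 # chiller coefficient of performance (assumed)
PEAK_PRICE = 0.18         # $/kWh electricity at peak (assumed)
OFFPEAK_PRICE = 0.06      # $/kWh at night (assumed)
COOLING_LOAD_MWH = 100.0  # thermal MWh of cooling demand in the peak window

# Option 1: meet the load directly with chillers at peak prices.
elec_mwh = COOLING_LOAD_MWH / COP
cost_direct = elec_mwh * 1000 * PEAK_PRICE

# Option 2: charge the same thermal load into underground storage
# off-peak, paying a 10% round-trip thermal loss (assumed).
elec_mwh_stored = (COOLING_LOAD_MWH / 0.9) / COP
cost_shifted = elec_mwh_stored * 1000 * OFFPEAK_PRICE

print(f"direct: ${cost_direct:,.0f}  shifted: ${cost_shifted:,.0f}")
# prints: direct: $4,500  shifted: $1,667
```

The shift pays even after the storage loss because the price spread between peak and off-peak power is larger than the loss; the real-world spread varies by market.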
The scale required to make Cold UTES useful at a hyperscale data center is large. A facility with 50 megawatts of IT load running at PUE 1.3 draws roughly 15 megawatts of overhead power, most of it for cooling. Shifting even 30% of that peak cooling load to Cold UTES requires storing tens of millions of gallons of chilled water in a geologically suitable underground formation near the facility. The site selection constraints are real: not every data center campus sits above an aquifer with the right hydraulic and thermal properties for Cold UTES deployment. But in regions where those conditions align, the operational cost reduction can be material, particularly as grid electricity prices rise during peak summer cooling periods.
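A rough volume check supports the "tens of millions of gallons" claim. The inputs below are assumptions for illustration: a 4.5 MW thermal slice of the peak load (30% of 15 MW), an 8 K usable temperature swing on the stored water, and 8 peak hours per day over a 90-day cooling season.

```python
# Back-of-envelope Cold UTES sizing. All load and season figures assumed.

RHO = 1000.0        # kg/m^3, water density
CP = 4186.0         # J/(kg*K), specific heat of water
SHIFTED_MW = 4.5    # thermal MW shifted to storage (assumed)
DELTA_T = 8.0       # K, usable swing between stored and returned water (assumed)
HOURS = 8 * 90      # peak hours across a cooling season (assumed)

energy_j = SHIFTED_MW * 1e6 * HOURS * 3600      # joules of cold delivered
volume_m3 = energy_j / (RHO * CP * DELTA_T)     # required water volume
gallons = volume_m3 * 264.172                   # m^3 -> US gallons

print(f"{gallons / 1e6:.0f} million gallons")
# prints: 92 million gallons
```

Seasonal storage, not daily cycling, is what drives the volume into that range; a daily-cycled system reusing the same water would need roughly a ninetieth of it.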
The DOE is funding a project in Southwest Virginia that demonstrates a different application of subsurface thermal resources: cooling data centers with water stored in abandoned coal mines. The region sits above billions of gallons of cool underground water that has been accumulating in decommissioned mining tunnels for decades. That water holds a stable temperature set by the geothermal gradient, typically in the range of 10 to 15 degrees Celsius depending on depth, which is close to or below typical chilled water supply temperatures for data center cooling loops. Pumping that mine water to the surface, passing it through a heat exchanger that serves the data center cooling loop, and returning it to the mine formation closes the loop without consuming the water. The implications for former coal-country economies are also real: data center campuses cooled partly by their own underground geology represent a different kind of economic development story than the industrial decline these communities have been living through.
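The flow rate such a loop needs follows from the cooling load and the temperature rise across the heat exchanger. A minimal sketch, where the load and temperatures are assumptions rather than figures from the project:

```python
# Mine-water flow needed to carry a cooling load through a heat
# exchanger: Q = m_dot * cp * dT, solved for m_dot. Inputs assumed.

CP = 4186.0        # J/(kg*K), specific heat of water
LOAD_MW = 15.0     # thermal MW to reject (assumed facility cooling load)
T_SUPPLY = 12.0    # deg C, mine water at the exchanger inlet (assumed)
T_RETURN = 20.0    # deg C, water returned to the mine (assumed)

flow_kg_s = LOAD_MW * 1e6 / (CP * (T_RETURN - T_SUPPLY))
print(f"{flow_kg_s:.0f} kg/s of mine water")
# prints: 448 kg/s of mine water
```

Roughly 450 liters per second is a substantial but entirely conventional pumping duty, which is part of why the approach is attractive: the hard infrastructure already exists in the form of the flooded mine workings.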