Mistral AI just borrowed $830 million from seven banks to build a data center south of Paris packed with 13,800 Nvidia GB300 GPUs. The facility will draw 44 megawatts of power. It needs to be operational by end of June 2026. Three months from now.
Nobody is talking about how they plan to cool it.
The GB300 is Nvidia's Grace Blackwell Ultra architecture. These are not the 300-watt chips of two years ago. Each GB300 NVL72 rack can pull over 100 kilowatts. Pack 13,800 of those GPUs into a single 44MW envelope south of Paris, and you have a thermal density problem that no amount of raised-floor CRAC units can touch. This facility will require liquid cooling infrastructure from day one. Full stop.
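The density claim is easy to sanity-check. A minimal back-of-envelope sketch, assuming the NVL72 configuration of 72 GPUs per rack and a conservative 100 kW per rack (the actual per-rack draw and the split of the 44MW envelope between GPU racks, networking, and storage are assumptions, not figures from the announcement):

```python
import math

# Assumed figures: 72 GPUs per GB300 NVL72 rack, 100 kW per rack
# as a low-end estimate. Neither number comes from Mistral's filing.
gpu_count = 13_800
gpus_per_rack = 72
rack_power_kw = 100

racks = math.ceil(gpu_count / gpus_per_rack)
rack_load_mw = racks * rack_power_kw / 1000

print(f"racks: {racks}")                        # → 192
print(f"GPU rack IT load: {rack_load_mw} MW")   # → 19.2 MW
```

Even at the low-end 100 kW figure, roughly 192 racks concentrate about 19 MW of the 44MW envelope into a few rows of floor space, which is the density regime where air cooling stops being an option.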
The $830 million raise (roughly 722 million euros) is Mistral's first debt financing. The company is less than three years old, founded in April 2023. The consortium backing it reads like a European banking who's who: Bpifrance, BNP Paribas, Crédit Agricole CIB, HSBC, La Banque Postale, MUFG, and Natixis. Six of those seven are European institutions. That composition tells you something about the political undercurrent here.
CEO Arthur Mensch framed the investment as a matter of sovereignty. Scaling infrastructure within Europe, he said, is "critical" for "AI innovation and autonomy." Translation: Mistral does not want to rent GPU time from Amazon, Microsoft, or Google. They want their own iron. Their own facility. Their own thermal plant.
The site is in Bruyères-le-Châtel, a commune about 30 kilometers south of Paris. This is not some remote Nordic fjord with free ambient cooling. The Île-de-France region sees summer temperatures regularly clearing 35 degrees Celsius, with heat waves pushing past 40. You are cooling a 44MW GPU cluster in a climate that offers zero free thermodynamic lunch for three months of the year.
Let's do the math. 44MW of IT load. Assume a PUE of 1.2, which is aggressive for a new liquid-cooled facility but achievable. That puts total facility power around 52.8MW, with roughly 8.8MW dedicated to cooling and overhead. At PUE 1.3, you're looking at 57.2MW total and 13.2MW for cooling. Either way, the thermal rejection system for this site needs to move tens of megawatts of heat out of the building and into the atmosphere.
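That arithmetic can be checked in a few lines (the PUE values are the assumptions above, not measured figures for the site):

```python
# Facility power = IT load x PUE; cooling and overhead is the remainder.
def facility_power(it_load_mw: float, pue: float) -> tuple[float, float]:
    """Return (total facility MW, cooling/overhead MW)."""
    total = it_load_mw * pue
    return total, total - it_load_mw

for pue in (1.2, 1.3):
    total, overhead = facility_power(44.0, pue)
    print(f"PUE {pue}: total {total:.1f} MW, cooling/overhead {overhead:.1f} MW")
# PUE 1.2: total 52.8 MW, cooling/overhead 8.8 MW
# PUE 1.3: total 57.2 MW, cooling/overhead 13.2 MW
```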
Air cooling cannot do this. Not at GB300 densities. The chip-level thermal design power on these GPUs demands direct contact with liquid. Cold plate systems, rear-door heat exchangers, or full immersion. There is no air-cooled option for a facility running 13,800 Grace Blackwell Ultra GPUs in a footprint that needs to be live in 90 days.
The question is who is supplying the cooling hardware. And that question carries political weight.
Mistral's entire pitch to investors, to the French government, and to Europe broadly is sovereignty. Build AI models in Europe. Train them on European infrastructure. Keep the data, the compute, and the intellectual property on European soil, independent of American cloud hyperscalers.
That narrative has a thermal layer.
If you are building sovereign compute, are you sourcing sovereign thermal infrastructure? The three largest data center cooling vendors in the world are Vertiv (US), Schneider Electric (France), and CoolIT Systems (Canada). Schneider Electric is headquartered in Rueil-Malmaison. That is 40 kilometers from Bruyères-le-Châtel. France's largest cooling infrastructure company sits down the road from Mistral's new GPU cluster.
That is not a coincidence anyone will ignore.
Schneider's EcoStruxure IT platform and their liquid cooling partnerships give them a plausible path to being the primary thermal vendor for this facility. But the liquid cooling market for GPU-dense AI clusters is still dominated by US and Canadian players. Vertiv's Liebert XDU coolant distribution units. CoolIT's direct liquid cooling manifolds. GRC's single-phase immersion tanks. If Mistral and its banking consortium are serious about the sovereignty play, picking an American cooling vendor to thermally manage their sovereign AI cluster would be a conspicuous choice.
There are smaller European players in the liquid cooling space. LiquidCool Solutions in the UK. Asetek (now Vertiv-acquired, which complicates the narrative). Submer in Spain, doing immersion cooling. The European cooling supply chain exists, but it is thinner than the American one. The vendor Mistral selects will signal how deep the sovereignty argument actually runs.
The 44MW Paris facility is the beginning. Mistral has a broader 1.2 billion euro initiative to build compute capacity across Europe, including expansion in Sweden. The target: 200MW of total capacity by end of 2027.
That means the cooling procurement pipeline behind this announcement is four to five times larger than what the Paris site alone requires. Two hundred megawatts of GPU-dense AI training infrastructure across multiple European jurisdictions. Each site with its own climate, its own water regulations, its own grid constraints.
And Europe's regulatory environment for cooling is materially different from the US. The EU Water Framework Directive imposes stricter controls on water abstraction and thermal discharge than anything in American federal or state law. Evaporative cooling towers, the default heat rejection method for most US data centers, face permitting headwinds across the EU. Open-loop systems that draw from rivers or aquifers and return heated water are heavily regulated under the Directive's ecological status requirements.
This pushes European AI data centers toward closed-loop and zero-evaporative cooling architectures. Dry coolers. Adiabatic systems with water recapture. Direct liquid cooling loops that reject heat through air-cooled radiators rather than cooling towers. The technology choices Mistral makes for the Paris facility will set the template for every subsequent site in their European buildout.
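There is a physics floor under the regulatory point. An evaporative tower rejects heat by vaporizing water, so the latent heat of vaporization sets a minimum water consumption for the site. A rough sketch, assuming essentially all 44MW of IT load ends up as heat to reject (drift and blowdown losses would push real consumption higher):

```python
# Minimum evaporative water consumption to reject the site's heat load.
# Assumption: all 44 MW of IT load becomes heat rejected via evaporation.
HEAT_MW = 44.0
LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water

evap_kg_per_s = HEAT_MW * 1e6 / LATENT_HEAT_J_PER_KG
evap_m3_per_day = evap_kg_per_s * 3600 * 24 / 1000  # 1000 kg per m^3

print(f"~{evap_kg_per_s:.1f} kg/s evaporated")      # ~19.5 kg/s
print(f"~{evap_m3_per_day:,.0f} m^3/day minimum")   # ~1,700 m^3/day
```

Well over a thousand cubic meters of water per day, every day, is exactly the kind of abstraction load the Directive scrutinizes, and it is why dry coolers and closed loops become the default answer in Europe.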
For cooling vendors, Mistral's announcement represents a concentrated procurement event. One customer. Multiple sites. Hundreds of megawatts. All requiring liquid cooling infrastructure, all governed by European water and environmental regulations, all operating under a sovereignty narrative that creates preference (whether explicit or implicit) for European supply chains.
The Paris site alone needs cooling hardware installed and commissioned in under 90 days. That timeline eliminates custom-engineered solutions and favors vendors with modular, pre-fabricated cooling systems that can ship and deploy fast. Prefabricated CDU (coolant distribution unit) skids. Factory-built rear-door heat exchangers. Modular dry cooler arrays that bolt together on site.
Speed favors the incumbents who already have European manufacturing and logistics. Schneider has factories across France. Vertiv has European production facilities in Italy, Croatia, and the Czech Republic. CoolIT manufactures in Calgary but has distribution partnerships across Europe.
The 200MW target by 2027 changes the calculus. That volume justifies dedicated production lines, regional partnerships, and potentially co-development agreements between Mistral and its cooling suppliers. The sovereign AI thesis, if taken seriously at the infrastructure layer, could reshape which cooling companies win European AI buildouts for the next decade.
Mistral borrowed $830 million to buy GPUs and build walls around them. The thermal systems that keep those GPUs from melting will cost tens of millions more. Who gets that contract will tell you whether sovereign AI is a real infrastructure philosophy or just a fundraising pitch.