Microsoft Announces New AI Datacentres: Here's What You Must Know
As you probably know, Microsoft already operates a vast global footprint: more than 400 datacentres spread across 70 regions worldwide.
But what Microsoft has been working on is something we haven't seen before: a new type of datacentre, the AI datacentre.
Author: Niels Kroeze, IT Business Copywriter
Reading time: 4 minutes | Published: 26 September 2025
What is an AI data centre?
An AI (Artificial Intelligence) datacentre is a specialised facility built specifically for AI solutions, from training AI models to running them in production and more. Microsoft's AI datacentres support OpenAI, Microsoft AI, Copilot, and other leading AI workloads.
Why do we even need these AI datacentres?
We all use AI nowadays; who doesn't? Whether it's ChatGPT, Copilot, or whatever your daily go-to LLM is, we cannot deny the world is relying on these tools more and more. The extreme surge in usage over the past few years calls for facilities capable of handling massive compute, storage, and networking demands.
Say hello to AI Datacentres. These massive facilities represent tens of billions in investment while housing hundreds of thousands of advanced AI chips.
Designed to handle all the load and demands we feed them daily, these facilities connect seamlessly across servers to operate as one giant AI supercomputer, powering training and inference at previously unthinkable scales.
Meet Fairwater and Microsoft’s Global AI Datacentre Plans
The first of these, called "Fairwater", was unveiled by Microsoft on 18 September 2025.
This newest AI datacentre on US soil, located in Wisconsin, represents Microsoft's most advanced AI facility ever built.
“The new Fairwater AI datacenter in Wisconsin stands as a remarkable feat of engineering, covering 315 acres and housing three massive buildings with a combined 1.2 million square feet under roofs. Constructing this facility required 46.6 miles of deep foundation piles, 26.5 million pounds of structural steel, 120 miles of medium-voltage underground cable and 72.6 miles of mechanical piping.”
And Microsoft isn't planning to stop there; it is building several similar datacentres across the United States while also announcing plans in other regions.
How do AI datacentres differ from standard datacentres?
Unlike standard cloud datacentres, which run many smaller, separate workloads like websites, email, or business apps, this facility is designed to operate as a single massive AI supercomputer. A flat network links hundreds of thousands of the latest NVIDIA GPUs, delivering performance 10 times higher than the world’s fastest supercomputer and enabling AI training and inference at an unprecedented scale.
AI Infrastructure at Frontier Scale
Fairwater’s infrastructure is purpose-built for massive AI workloads. Servers packed with multiple GPUs, CPUs, memory, and storage are linked into racks, and racks are interconnected to act as a single supercomputer. Each GB200 rack holds 72 GPUs with ultra-high bandwidth and pooled memory, processing hundreds of thousands of tokens per second.
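To get a feel for the scale, the rack figure above lends itself to a quick back-of-the-envelope calculation. The sketch below is purely illustrative: the 72-GPUs-per-rack number comes from the article, but the cluster size used in the example is a hypothetical round figure, not an official Microsoft count.

```python
# Illustrative arithmetic only; GPUS_PER_RACK comes from the article's
# GB200 rack description, the cluster size below is a hypothetical example.
GPUS_PER_RACK = 72  # GPUs in one GB200 rack, per the article

def racks_needed(total_gpus: int) -> int:
    """Minimum number of GB200 racks required to house a given GPU count."""
    return -(-total_gpus // GPUS_PER_RACK)  # ceiling division

# For a hypothetical cluster of 200,000 GPUs ("hundreds of thousands"):
print(racks_needed(200_000))  # → 2778
```

In other words, a cluster in the "hundreds of thousands of GPUs" range implies thousands of these racks, all of which must behave as one machine, which is why the networking layer described next matters so much.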
Networking is designed to minimize latency: GPUs communicate across racks and pods at terabytes per second using NVLink, NVSwitch, InfiniBand, and Ethernet, while a two-story rack layout further reduces delays. This layered approach is what sets Azure apart, letting tens of thousands of GPUs work together as one global-scale AI machine.
By combining hardware, networking, and software into a tightly engineered system, Microsoft has built one of the most powerful AI supercomputers in the world, ready for frontier-scale AI training and inference.
An eye on sustainability
Traditional air cooling can’t handle the density of modern AI hardware, so Fairwater uses advanced liquid cooling. Cold liquid circulates directly through servers, extracting heat efficiently in a closed-loop system that reuses water continuously, with no waste.
This setup allows higher rack density and keeps the datacentre running efficiently even at peak loads. Fairwater features the second-largest water-cooled chiller plant on the planet, with 172 giant fans chilling and recirculating water back into the system. Over 90% of the facility uses this closed-loop design, while the remaining servers rely on air cooling, switching to water only on the hottest days, which reduces overall water usage.
Microsoft is also rolling out liquid cooling in existing datacentres using Heat Exchanger Units, maintaining zero operational water use while supporting demanding AI workloads.
Closing thoughts
Can we expect these AI datacentres to offload heavy AI workloads, freeing up capacity? Time will tell.
One thing is for sure: this purpose-built infrastructure, with its massive GPU clusters, will give businesses and developers the power they need to run large-scale AI around the globe.
Meanwhile, Microsoft is rapidly expanding its network, with more new datacentres planned globally. To keep up with updates and announcements on additional AI facilities, check the latest news about Microsoft datacentres here.