AI Data Centers by Workload Focus
Not every AI data center serves the same role. Some are built for frontier training, others for low-latency inference, sovereign national capacity, or AI-heavy cloud regions. Browse 344 tracked facilities by their primary disclosed AI workload focus.
Training
Facilities used for frontier model training, large GPU clusters, and pretraining workloads.
34 facilities, 35.5 GW
Inference
Facilities optimized for serving models, real-time AI, and enterprise inference workloads.
44 facilities, 6.9 GW
Research
Academic, national lab, and R&D systems focused on experimentation and scientific AI.
3 facilities, 0.1 GW
Sovereign AI
National and regional facilities built for domestic AI capability, public-sector workloads, and digital sovereignty.
5 facilities, 1.9 GW
General Cloud
Cloud regions and multipurpose platforms that blend AI capacity with broader cloud infrastructure.
15 facilities, 2.8 GW
HPC / Supercomputing
High-performance computing facilities supporting exascale, supercomputing, and hybrid AI-HPC workloads.
6 facilities, 0.2 GW
Mixed
Facilities with multiple public AI roles, such as training plus inference or cloud plus sovereign AI.
115 facilities, 46.0 GW
AI Focus Unknown
Facilities where the public AI workload focus has not been clearly disclosed.
122 facilities, 62.4 GW
About AI Focus Classification
AI focus is normalized from each facility's disclosed purpose in operator announcements and public reporting. Many sites span multiple roles, such as training plus inference, or sovereign AI plus cloud services. Those are grouped under Mixed so that pure-play training clusters can be distinguished from more general AI infrastructure. If no workload focus is clearly stated, the facility falls under AI Focus Unknown.
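The normalization rule above can be sketched in a few lines of code. This is a minimal illustration only, assuming each facility record carries a list of role labels parsed from operator disclosures; the function name, field shape, and label strings are hypothetical, not the tracker's actual implementation.

```python
# Known single-focus categories from the listing above (labels hypothetical).
KNOWN_ROLES = {"training", "inference", "research",
               "sovereign", "cloud", "hpc"}

def classify_ai_focus(disclosed_roles: list[str]) -> str:
    """Collapse a facility's disclosed workload roles into one focus label.

    - No clearly disclosed role  -> "AI Focus Unknown"
    - More than one public role  -> "Mixed"
    - Exactly one role           -> that role's label
    """
    roles = {r for r in disclosed_roles if r in KNOWN_ROLES}
    if not roles:
        return "AI Focus Unknown"  # nothing clearly stated in disclosures
    if len(roles) > 1:
        return "Mixed"             # e.g. training plus inference
    return roles.pop()             # single pure-play focus
```

For example, a facility announced for both frontier training and enterprise inference would classify as Mixed, while one with no stated AI purpose falls through to AI Focus Unknown.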