AI Workload Types

Separate training, inference, mixed-use, and general cloud facilities.

AI focus pages turn a broad infrastructure dataset into something more analytically useful by showing which facilities are oriented toward training clusters, inference fleets, or mixed workloads.

AI focus categories tracked

7

In this index

Facilities covered

295

86% of full index

Known capacity

130.1 GW

Total IT load

Largest AI focus

Mixed

149 facilities

How To Read This Taxonomy

This taxonomy is one lens on a live dataset covering 344 AI data centers across 64 countries and 202 tracked operators.

What does AI focus add beyond operator and geography?

It distinguishes facilities designed for frontier-model training from those aimed at inference, cloud spillover, or mixed workloads, which changes how readers interpret hardware and power demand.

When is this slice especially useful?

Use it when tracking where dense GPU clusters may emerge, where low-latency inference capacity is being built, or how providers balance AI-specific demand with general cloud infrastructure.

Related Lenses

Cross-check this slice with adjacent taxonomies to see how geography, operator concentration, status, and power sourcing interact.
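As a rough illustration of how this slicing works, the sketch below cross-tabulates a facility dataset by AI focus and status. The field names (`ai_focus`, `status`, `it_load_mw`) and the sample records are purely illustrative assumptions, not the index's actual schema or data.

```python
# Hypothetical sketch: cross-tabulating an AI facility dataset by
# AI focus and status. Field names and records are illustrative only.
from collections import defaultdict

facilities = [
    {"name": "Site A", "ai_focus": "Training", "status": "Operational", "it_load_mw": 300},
    {"name": "Site B", "ai_focus": "Inference", "status": "Planned", "it_load_mw": 80},
    {"name": "Site C", "ai_focus": "Mixed", "status": "Operational", "it_load_mw": 150},
    {"name": "Site D", "ai_focus": "Mixed", "status": "Planned", "it_load_mw": 500},
]

def crosstab(rows, row_key, col_key, value_key):
    """Sum value_key over rows grouped by the (row_key, col_key) pair."""
    table = defaultdict(float)
    for r in rows:
        table[(r[row_key], r[col_key])] += r[value_key]
    return dict(table)

by_focus_status = crosstab(facilities, "ai_focus", "status", "it_load_mw")
print(by_focus_status[("Mixed", "Planned")])  # planned Mixed capacity in MW
```

The same grouping applied to operator or country fields gives the adjacent lenses mentioned above.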

All AI focus categories
