AI could represent half of data center workloads by 2030
While AI has been rapidly gaining daily active users, it represented only about a quarter of all data center workloads in 2025, with training driving most of that demand. A significant shift is anticipated in 2027, however, when inference could overtake training as the dominant AI workload.
Training an AI model is a one-time or periodic investment; once the model is deployed, inference generates ongoing revenue through actual application usage. Every model deployment therefore creates sustained inference demand that grows with user adoption. This growth, however, depends on the emergence and rapid adoption of inference applications that do not yet exist at scale.
Unlike training, inference must be geographically distributed to reduce latency and serve users effectively, which will drive regional deployments and embedded systems at the edge.



