Abstract for: Feeding the Machine: Energy Demand of Data Centres and Artificial Intelligence

The rapid expansion of AI and cloud computing is turning data centres into one of the fastest-growing energy consumers worldwide. While past growth in digital infrastructure was moderated by efficiency gains, the computational intensity of AI may disrupt this pattern. This study examines how the interplay between technological progress, cost dynamics, and infrastructure constraints shapes the long-term trajectory of data centre expansion and energy consumption. A system dynamics model is developed to capture the feedback loops governing data centre growth, semiconductor scaling, and grid constraints. The model tracks four types of data centres—AI Training, AI Inference, Cloud Compute, and Data Services—and simulates how shifts in Process Node Size, Compute Power Efficiency, and Infrastructure Scaling influence energy demand, pricing, and market size over time. Findings reveal a structural shift in data centre composition, with AI workloads surpassing cloud computing in the 2040s. Efficiency gains partially offset rising energy demand but are insufficient to prevent an overall increase in power consumption. Grid constraints could become a bottleneck that limits data centre expansion. Market-driven technological improvements drive the transition to Leading Edge Process Nodes, but cost reductions slow as fabrication approaches physical limits. These results highlight the complex interaction between technology scaling, energy infrastructure, and AI adoption. The study suggests that AI-driven computing will not cause unchecked growth in energy demand but will reshape how and where energy is consumed. Key uncertainties include semiconductor advancements, grid expansion rates, and AI efficiency improvements. The findings are relevant for policymakers, energy planners, and AI developers seeking to balance computational needs with sustainable energy growth.