With data centres under mounting pressure to support AI without overwhelming the grid, Wannie Park, CEO/Founder of PADO, explores why workload orchestration may be the missing link.
Electricity, that humble yet essential resource, is having a moment. Quite literally, the world can’t get enough of it. The rapid expansion of data centre infrastructure to support the rise of AI has triggered a rush for power resources on a scale not seen since the early days of electrification.
This scramble for power is exemplified by initiatives such as Google’s partnership with TotalEnergies, which aims to power some of its data centres through solar capacity. Such partnerships underscore a growing industry view: meeting rising demand will require a major contribution from renewable energy. This is echoed by the International Energy Agency, which reports that renewables are the fastest-growing source of electricity for data centres, on track to meet nearly half of the growth in their demand.
However, several factors prevent Big Tech from fully utilising renewable resources. Chief among these is a challenge now made more acute by AI: the difficulty of effectively integrating and managing intermittent power sources. Removing this operational bottleneck could be an important factor in keeping pace with the escalating demands of the AI era.
Pitfalls of integrating renewable energy sources
We hardly need statistics to confirm what is already obvious in both our professional and personal lives: AI is rapidly becoming integrated into every facet of modern society. AI chatbots, which felt like miracles only a few years ago, have become commonplace. From the rise of ‘vibecoding’ to the deployment of agentic AI, these technologies are reshaping the economy – perhaps not as disruptively as the infamous Citrini report predicted, but with undeniable speed.
To respond to this demand, the European Commission launched the AI Continent Action Plan last spring, with a target of tripling EU data centre capacity within five to seven years. And yet, per IEA modelling, we can expect only a 70% increase in capacity by 2030, owing at least in part to grid congestion. The fact is that the power grid is already straining under the weight of this surging demand. While renewable energy holds immense promise, the challenges of integration remain a significant barrier.
The primary issue is that renewable power is non-dispatchable: it cannot be generated on demand. While there have been significant advances in solar storage, the technology remains heavily dependent on real-time weather conditions. Unpredictable load fluctuations and demand peaks require a level of stability that current solar integration in data centres is not yet equipped to provide. This intermittency has, until now, limited the role solar and other renewables can play in supporting an emerging AI-driven landscape.
Making orchestration more intelligent
To understand one possible response to this energy challenge, one must first understand how data centres currently manage workloads. The traditional model is ‘first-in, first-out’ – or, more colloquially, first-come, first-served.
In this scenario, jobs are processed in the order they arrive, regardless of their priority or specific energy requirements. This model neglects key differences between the kinds of requests that data centres receive. On the one hand, there are jobs that need to be completed immediately; on the other, there are batch workload requests that have a deadline but whose completion can be delayed.
Intelligent orchestration seeks to address this by dynamically scheduling these different kinds of jobs while also taking into account renewable energy availability and electricity cost. A system that schedules the most intensive workloads during cooler periods can achieve the same cooling outcome at a lower cost. Overhead is reduced while computing output stays steady.
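The scheduling logic described above can be sketched in a few lines. The example below is a minimal, hypothetical illustration, not any vendor’s actual scheduler: urgent jobs run immediately, while deferrable batch jobs are shifted to the hour with the highest forecast renewable share before their deadline. The job fields and forecast format are assumptions made for the sketch.

```python
def schedule(jobs, renewable_forecast):
    """Assign each job to an hour slot (0 = now).

    jobs: list of dicts with 'name', 'urgent' (bool), and 'deadline'
          (latest acceptable hour index for batch work).
    renewable_forecast: forecast renewable share per hour, e.g. [0.2, 0.6, ...].

    Urgent jobs run at hour 0; batch jobs run in the 'greenest' hour
    at or before their deadline.
    """
    plan = {}
    for job in jobs:
        if job["urgent"]:
            plan[job["name"]] = 0
        else:
            window = range(job["deadline"] + 1)
            plan[job["name"]] = max(window, key=lambda h: renewable_forecast[h])
    return plan

forecast = [0.2, 0.5, 0.8, 0.3]  # solar peaks at hour 2
jobs = [
    {"name": "inference", "urgent": True, "deadline": 0},
    {"name": "training-batch", "urgent": False, "deadline": 3},
]
print(schedule(jobs, forecast))  # {'inference': 0, 'training-batch': 2}
```

A production system would also weigh electricity price and thermal headroom, but the core idea is the same: only work that can wait is moved, and only within its deadline.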
Placement also plays a significant role here. Intelligent systems use clustered placement, in which jobs are packed strategically onto fewer servers, using each server’s capacity more fully and allowing the facility to devote more of its power to compute.
All of this can help to improve Power Usage Effectiveness (PUE), one of the key performance benchmarks for today’s data centre operators. PUE is the ratio of a facility’s total energy consumption to the energy consumed by its IT equipment, so it shows how much of the power drawn actually goes towards running servers; a PUE of 1.0 would mean zero overhead. The global average is around 1.58, but at many older data centres it hovers closer to 2.0.
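The metric itself is a simple ratio, which a short sketch makes concrete (the kWh figures below are illustrative, not from any real facility):

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """PUE = total facility energy / IT equipment energy.

    1.0 is the ideal: every kWh goes to compute. Higher values mean
    more overhead (cooling, power conversion, lighting).
    """
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,580 kWh to deliver 1,000 kWh of compute sits
# at roughly the global average:
print(pue(1580, 1000))  # 1.58

# An older facility with heavy cooling overhead:
print(pue(2000, 1000))  # 2.0
```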
Summer school provides an apt analogy here. Let’s say that 10% of classrooms are in use during an average summer school session. If you distributed those classrooms throughout the school – one per floor – you would have to cool every floor of the school. If you instead simply used every classroom on one floor of the building, you could cool only that floor. This is the benefit of clustered placement: fewer, more strategically packed servers mean less energy devoted to cooling and, in turn, lower cost.
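The classroom analogy maps directly onto a classic bin-packing heuristic. The sketch below is a simplified, hypothetical illustration of clustered placement using first-fit decreasing; real orchestrators also account for memory, network locality, and thermal constraints.

```python
def pack_jobs(job_loads, server_capacity):
    """First-fit decreasing bin packing.

    Place each job (largest first) on the first server with room,
    opening a new server only when none fits. Returns the number of
    active servers: fewer active servers means fewer zones to cool.
    """
    servers = []  # each entry is the remaining capacity of one server
    for load in sorted(job_loads, reverse=True):
        for i, remaining in enumerate(servers):
            if load <= remaining:
                servers[i] -= load
                break
        else:
            servers.append(server_capacity - load)
    return len(servers)

# Spread naively, ten jobs at 10% utilisation each could touch ten
# servers; packed, they fit on one -- the 'one cooled floor':
print(pack_jobs([10] * 10, 100))  # 1
```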
Notably, intelligent orchestration could address power needs across multiple interlocking fronts. It may enhance the viability of renewable energy while also supporting the continued use of legacy data centres. By helping operators support AI workloads and optimise cooling systems, this kind of infrastructure may allow older facilities to be adapted for changing demand.
Intelligent orchestration’s virtuous cycle
Finding ways to reliably and efficiently power data centres has become one of the central infrastructure challenges of our era. The growth of both the technology sector and the global economy depends on this stability, as does the long-term sustainability of our environment and power grids. The IEA also notes that yearly electricity consumption from data centres is anticipated to double by 2030 due to the growth of high-intensity AI workloads.
Industry stakeholders recognise that traditional orchestration methods are coming under pressure. Uptime cannot be sustained indefinitely using a conventional ‘first-in, first-out’ model, and the surging demands of AI will be harder to meet without improving the performance of legacy data centres. Intelligent orchestration is one possible path forward.
These challenges would be significant even without the added complexity of the regulatory landscape. New data centre construction often faces a thicket of slow-moving regulations that struggle to keep pace with the speed of AI development. In that environment, orchestration offers a potentially useful operational lever. By enabling smarter job scheduling, improved server utilisation, and more responsive energy management, it has the potential to improve GPU efficiency, reduce cooling costs, lower PUE, and support greater renewable energy integration.
Each of these benefits can reinforce the others, and data centres around the world are already exploring them. Data centres in Europe are doubling down on renewable energy, with further incentive from the Climate Neutral Data Centre Pact to prioritise sustainability as buildout continues. In contrast, the US is still in the earlier stages of this shift, with renewables accounting for less than a quarter of electricity in these facilities.
The effective deployment of intelligent orchestration could create a virtuous cycle, where one operational gain helps unlock the next.