Mei Dent, CTPO at TeamViewer, argues that the soaring power demands of AI will force organisations to rethink how they build, run, and regulate sustainable digital infrastructure.
Artificial intelligence has become the defining technological force of our time, but its rapid ascent comes with a cost that is growing harder to overlook. As organisations race to roll out increasingly sophisticated models, the world is edging closer to an AI energy crisis.
This isn’t an issue reserved for a handful of data-heavy industries. Its impact will extend across the global economy, forcing business leaders to confront the environmental reality behind the technology they are so eager to adopt.
Balancing AI ambition with climate commitment
Training and running modern AI systems demands extraordinary amounts of power, and that appetite shows no sign of slowing. Each leap forward in capability (larger, more complex models and more real-time applications) brings with it a need for more data centres and more infrastructure. The problem is that most of this infrastructure still relies on fossil fuels. Around 60% of today's data centre energy consumption is tied to non-renewable sources, an uncomfortable truth for companies that have publicly committed to reducing their environmental footprint.
This widening gap between technological ambition and sustainability goals is creating a moment of reckoning. Organisations will soon find themselves choosing between the scale of their AI deployments and the climate promises they have made to customers, investors and regulators. That tension, however, may be the very thing that pushes the industry into a new phase of energy innovation. Advances in renewable power, smarter cooling systems, and more efficient compute architectures are already emerging, and pressure from AI demand is likely to accelerate them.
The regional realities shaping AI’s future
Alongside these energy concerns, governments are rapidly reshaping the regulatory landscape. Many regions are introducing stricter rules on where data can be stored and processed, while also imposing tighter carbon-reduction requirements on local infrastructure. These shifts will influence not only how AI is deployed, but where it can be deployed at all.
As these policies settle into place, the AI experience will start to look different from one region to another. Access to clean energy, local emissions targets, and the resilience of energy grids will play a bigger role in determining the types of AI services that can operate in each market. Data sovereignty, once viewed as an administrative hurdle, will become a strategic factor that shapes AI product roadmaps.
Building energy-aware AI strategies
For any organisation planning to embed AI deeply into its operations, this is the moment to reassess long-term energy assumptions, rather than waiting for the market to produce turnkey solutions. Companies should scrutinise the sustainability commitments of their data centre partners and insist on transparent, measurable plans for reducing fossil-fuel reliance. They also need to consider how their AI architectures can flex with regional constraints instead of assuming uniform access to compute everywhere.
Ultimately, the AI energy crisis is already unfolding and influencing decisions across the technology landscape. But with pressure comes opportunity. If industry leaders respond to this challenge with urgency and creativity, this moment could inspire breakthroughs that transform AI from an energy burden into a catalyst for sustainable innovation. The organisations willing to confront these challenges today will be the ones best positioned to build AI systems that endure.
This article is part of our DCR Predicts 2026 series. Come back every week in January for more.


