
Is the data centre ‘shortage’ a myth – or the catalyst for a smarter new build‑out?

Daniele Viappiani
Portfolio Manager at GC1 Ventures

Daniele Viappiani, Portfolio Manager at GC1 Ventures, argues that soaring AI demand is spurring a radical reinvention of data centre design, power and location – not an impending capacity crunch.

In recent months, headlines have been dominated by a mounting concern: that the world might be running out of data centres. With artificial intelligence workloads ballooning, cloud computing becoming the digital backbone of everything from banking to biotech, and hyperscalers, the mega-sized cloud providers like Amazon, Microsoft, and Google, gobbling up compute capacity, some are beginning to wonder: are we approaching ‘peak data centre’?

It’s not the first time such fears have emerged. The idea of ‘peak oil’ or ‘peak food’ – the point at which demand outpaces global production – has haunted resource planning debates for decades. But history has shown us that the market is remarkably resilient. Just as the world adapted to energy crises and agricultural shocks, the data centre industry is already adjusting to unprecedented demand.

Here’s the reality: there is no looming, resource-constrained cliff for data centres. Instead, we’re witnessing a rapid evolution in design, strategic location selection, and infrastructure management. Far from hitting a wall, the industry is finding new and inventive ways to scale.

To be clear, data centres are not simple structures. These aren’t plug-and-play server closets. They’re among the most sophisticated facilities humans build, requiring a delicate interplay of stable electricity, advanced cooling systems, ultra-fast connectivity, and top-tier physical and cyber security. Their construction demands engineers, electricians, network specialists, architects, and supply chains that include everything from semiconductors to climate systems.

Yes, the complexity is daunting. But we’re not out of materials, expertise, or ideas. What’s happening isn’t a crunch – it’s a transformation.

The industry is bifurcating in a way that makes sense: on one end, we see the rise of vast hyperscale facilities, sprawling campuses capable of housing tens of thousands of servers. These are the data fortresses powering large language models, global cloud services, and enterprise applications at scale.

On the other end of the spectrum, modular and edge data centres are also taking off. These smaller, more agile facilities can be spun up in a matter of months, placed closer to end users to reduce latency, and tailored to specific workloads. They can even fit into repurposed buildings: abandoned malls, empty office towers, and derelict factories are finding new life as digital infrastructure. By diversifying form factors and deployment models, the industry is building resilience.

Although a data centre can in theory be built almost anywhere, siting one in practice is far more nuanced. Ideal locations balance access to stable power, low disaster risk, proximity to other infrastructure, and favourable climate conditions. Mild environments reduce cooling costs, while abundant water can support evaporative cooling systems.

Zoning and land use regulations, especially in urban or semi-urban areas, are a consistent bottleneck. And ironically, while these centres power the digital world, they remain fundamentally tied to the limitations of the physical one. Cities want the jobs, but often not the power draw or real estate burden.

Adding to the challenge is a scarcity of specialised labour. Engineers who understand the intricacies of network latency, redundancy planning, or liquid cooling systems are in short supply.

Even so, there’s slack in the system. Many data centres are intentionally over-provisioned to absorb peak usage and future demand. It’s not uncommon for facilities to operate at 30–50% utilisation. This buffer allows cloud providers to spin up new servers or workloads rapidly without breaking ground on a new site.

Moreover, hardware refresh cycles offer another lever. Swapping out ageing servers for newer, more efficient machines can dramatically boost capacity within the same physical footprint. So, while front-end build timelines are long, back-end scaling can be impressively quick.
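To make that concrete, here is a minimal back-of-envelope sketch in Python. The function name and every figure in it (a 100 MW campus, 40% utilisation, a 30% efficiency gain from a refresh) are purely illustrative assumptions, not data from any operator, but they show how slack capacity plus a hardware refresh can free substantial headroom before anyone breaks ground on a new site.

```python
# A back-of-envelope sketch with purely hypothetical numbers: how much extra
# capacity can existing slack plus a server refresh unlock at one site?

def effective_headroom(installed_mw, utilisation, refresh_efficiency_gain):
    """Rough estimate of additional usable capacity, in MW-equivalent.

    installed_mw            -- nameplate IT capacity of the facility
    utilisation             -- fraction of that capacity in use today (e.g. 0.40)
    refresh_efficiency_gain -- extra work per watt from newer servers (e.g. 0.30)
    """
    slack = installed_mw * (1 - utilisation)                 # over-provisioned buffer
    refresh_uplift = installed_mw * refresh_efficiency_gain  # same footprint, more compute
    return slack + refresh_uplift

# Illustrative only: a 100 MW campus at 40% utilisation, refreshed with servers
# roughly 30% more efficient, frees about 90 MW-equivalent of workload capacity.
print(effective_headroom(100, 0.40, 0.30))  # -> 90.0
```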

There’s no doubt that the current spike in demand is being driven by AI. Training large models like GPT-4 or Gemini eats up GPU cycles like candy. AI inference has also begun to place new, more sustained pressure on infrastructure.

But here’s a key point often missed in breathless headlines: AI workloads don’t scale linearly. As models get larger, gains don’t always follow. Diminishing returns on compute investment are a real factor. At some point, more data and more servers don’t necessarily yield better AI results.
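A toy calculation makes the diminishing-returns point visible. The power-law form and the constants below are assumptions chosen only for illustration, echoing the general shape of published scaling-law curves rather than the behaviour of any specific model.

```python
# A toy power-law curve (assumed shape and constants, not any real model's data)
# showing why each doubling of compute buys a smaller improvement than the last.

def loss(compute, a=10.0, b=0.1):
    """Hypothetical scaling-law-style loss that falls as a power law in compute."""
    return a * compute ** -b

previous = None
for c in [1, 2, 4, 8, 16, 32]:          # each step doubles the compute budget
    current = loss(c)
    gain = 0.0 if previous is None else previous - current
    print(f"compute x{c:>2}: loss {current:.3f}, improvement {gain:.3f}")
    previous = current
# The absolute improvement shrinks with every doubling: diminishing returns
# on compute investment.
```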

In other words, the exponential curve of AI demand may not be endless. While demand will likely continue to grow, especially from enterprise and consumer applications, it’s unlikely to do so infinitely.

Delivering reliable, high-capacity power to the right place, at the right time, is a real constraint, however. Local grids weren’t designed to support the demands of hyperscale data centres. In areas like Northern Virginia, home to one of the world’s largest data centre clusters, utilities are beginning to voice concerns around grid bottlenecks.

To sidestep these constraints, companies are getting creative. Some are considering co-locating with power generation by building data centres next to hydro plants, solar farms, or even nuclear facilities. Others are investing in on-site generation, including small modular nuclear reactors (SMRs), to guarantee clean, always-on power without grid dependency.

Renewables are also a huge part of the story. Solar and wind, once fringe, are now among the cheapest and fastest energy sources to deploy. Their modular nature allows them to scale alongside data centre needs. Pair that with battery storage or grid balancing tech, and you have a roadmap to greener, more distributed data infrastructure.

Hyperscalers are pushing the envelope on passive cooling systems, energy-efficient server racks, and closed-loop water systems. Some new builds are even experimenting with direct-to-chip liquid cooling and geothermal exchange systems. Meanwhile, AI itself is being used to improve data centre efficiency, for example by adjusting temperatures, optimising workloads, and even predicting maintenance needs.

Finally, while demand is high and infrastructure is playing catch-up in some regions, the idea that we’re nearing a data centre crunch isn’t backed by the facts.

The industry is innovating, scaling, and diversifying. From modular builds and real estate re-use to nuclear-powered campuses and AI-optimised cooling, we’re seeing the biggest shift in data infrastructure since the advent of cloud computing itself. We’re not running out; we’re evolving.
