Designing the edge the AI boom really needs

Niklas Lindqvist
Nordic General Manager at Onnec

Niklas Lindqvist, Nordic General Manager at Onnec, argues that only edge data centres designed holistically around power, cooling and cabling will keep pace with AI’s surging compute demands.

Artificial intelligence is no longer a futuristic concept; it’s an operational reality reshaping technical environments at an unprecedented pace. The rise of generative and agent-based AI has triggered a wave of excitement and urgency across industries, with companies racing to adapt their digital strategies. But while the software evolves rapidly, infrastructure may struggle to keep pace. The demand for compute power is skyrocketing, placing immense pressure on data centre capacity. 

Edge computing could be the crucial link in alleviating this growing strain, yet simply deploying more edge data centres isn’t enough. To meet the performance expectations of AI-driven services, characterised by ultra-low latency and massive bandwidth, these facilities must be designed with an integrated mindset. Power delivery, cooling, and cabling infrastructure must work holistically from the outset. Success lies in balancing innovation with operational resilience, and strategic design decisions will shape the long-term effectiveness of these environments.

The edge opportunity

AI’s impact on compute is accelerating, particularly as agentic models capable of decision-making become more embedded in critical applications like logistics and healthcare. McKinsey forecasts that global demand for data centre capacity could surge 22% annually between 2023 and 2030, reaching 219 GW, with AI expected to account for 70% of total demand by 2030.

Edge data centres are gaining traction as the backbone of AI infrastructure due to their proximity to end users and connected devices. By reducing latency and boosting bandwidth, these localised facilities are well-positioned to support the performance demands of emerging AI applications. Reflecting this momentum, the edge data centre market is forecast to reach $317 billion by 2026, more than double its value in 2020. However, to truly deliver on their promise in the AI era, these sites must be designed with a balanced focus on three foundational elements: power, cooling, and cabling.

These components are deeply interconnected. As AI workloads push electrical demands higher, the associated high-capacity cabling can create heat buildup that affects both performance and reliability. Addressing cable congestion will require deliberate planning and robust infrastructure. In many cases, liquid cooling will offer a more effective solution to managing thermal output than traditional air-based systems. Simultaneously, improving the efficiency of power delivery becomes critical, not just to support performance, but also to meet sustainability goals, reduce operating costs, and minimise stress on local energy grids.

Designing for demand: What really matters

Designing edge data centres fit for the AI era starts with a clear understanding of IT load. Since AI workloads can differ significantly depending on the models, applications, and usage patterns, operators must assess processing requirements, data throughput, and latency expectations with precision. This baseline understanding sets the direction for key design decisions around power capacity, thermal management, and network architecture.
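As a loose illustration of that baseline exercise, the sketch below turns an assumed rack count and per-rack draw into facility power and cooling figures. Every input value (rack count, kW per rack, PUE) is a hypothetical placeholder, not a figure from this article; real designs would substitute measured workload data.

```python
# Hypothetical back-of-envelope sizing for an edge site.
# All input figures are illustrative assumptions, not recommendations:
# replace them with your own workload measurements.

RACKS = 20          # assumed number of AI racks at the edge site
KW_PER_RACK = 40.0  # assumed power draw per high-density AI rack (kW)
PUE = 1.3           # assumed power usage effectiveness for the facility

it_load_kw = RACKS * KW_PER_RACK       # critical IT load the racks draw
facility_kw = it_load_kw * PUE         # total draw including overheads
overhead_kw = facility_kw - it_load_kw # cooling and other non-IT overhead

print(f"IT load:          {it_load_kw:.0f} kW")
print(f"Facility draw:    {facility_kw:.0f} kW")
print(f"Non-IT overhead:  {overhead_kw:.0f} kW")
```

Even a crude model like this makes the coupling visible: doubling rack density doubles the IT load, which in turn scales both the grid connection and the thermal load the cooling plant must reject.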

Once IT load is defined, the next critical decision is where to build. Established European data centre hubs – Frankfurt, London, Amsterdam, Paris, and Dublin (FLAP-D) – still offer strong connectivity and mature infrastructure. However, mounting power constraints in these regions are prompting operators to look elsewhere. Countries such as Spain and those across the Nordic region are emerging as attractive alternatives, offering abundant renewable energy and more favourable permitting and regulatory conditions.

Regulation itself is becoming a design driver and must be considered early in the planning process. Sustainability standards vary by region, but many – such as the UK’s Climate Change Agreement (CCA) Scheme for Data Centres – set clear expectations for energy efficiency. In parallel, policies around water consumption, energy reuse, and emissions reporting are tightening, influencing everything from cooling system choices to how facilities interact with local power grids.

Sustainability, meanwhile, is no longer a peripheral concern; it’s a core business imperative. Edge operators are increasingly adopting circular design strategies to reduce waste and improve efficiency. That might mean capturing and reusing heat for nearby buildings or using recycled water sources in cooling processes. These measures help meet environmental targets and can drive long-term cost savings.

Standardisation also plays a powerful role in future-ready data centre design. Creating modular, repeatable blueprints for power, cooling, and cabling can accelerate deployment, reduce complexity, and enhance operational consistency. Industry frameworks like the Open Compute Project (OCP) are advancing this approach, making it easier to scale infrastructure while maintaining technical uniformity.

Lastly, cabling must be prioritised from the outset. The density and thermal intensity of AI infrastructure demand robust, well-planned cabling systems. Underinvestment here can result in signal degradation, overheating, and costly infrastructure rework. To avoid these risks, operators should work closely with suppliers, adhere to proven standards, and ensure that cabling infrastructure is tightly integrated with power and cooling plans. Done right, this investment will help ensure long-term performance, resilience, and scalability.

Building for the long haul

The strength of an edge deployment is defined by the ecosystem behind it. Choosing the right partners, from design consultants to hardware vendors, can mean the difference between a smooth, scalable build and one plagued by inefficiencies. Some organisations may even find that AI tools themselves can assist in managing complex design variables. But shortcuts are rarely sustainable. Long-term value stems from a strategic vision that prioritises durability, compliance, and performance at every stage.

In an AI-driven future, infrastructure is the enabler. Smart, interconnected edge data centres are not just a response to rising demand; they are the foundation of tomorrow’s digital economy.
