Alistair Barnes, Head of Mechanical Engineering at Colt Data Centre Services, explains why air still matters in the AI era, and how selective hybrid deployments could help control cost and improve efficiency.
Worldwide spending on artificial intelligence (AI) is forecast to reach $2.52 trillion in 2026, according to Gartner, representing a 44% year-on-year increase. In addition, it is estimated that 70% of global data centre capacity will be driven by AI workloads by 2030.
This rapid growth is creating new challenges for data centre operators. They are now required to rethink traditional approaches to cooling, as high-performance computing components such as GPUs generate far more heat than conventional IT equipment. If thermal output is not managed properly, operators could face a range of issues, from degradation of existing hardware and uneven load distribution to soaring energy costs.
It is clear that data centres designed for AI platforms must be supported by effective cooling systems if they are to run reliably and efficiently. Let’s take a look at some of the cooling technologies designed to tackle this challenge head-on.
The capacity limits of traditional air cooling
At the heart of this challenge is the limitation of traditional air-cooling methods. Conventional air-cooling systems were designed for far lower rack densities, while today’s infrastructure packs much more power into each rack, with AI-focused deployments exceeding 100 kW per rack.
With total data centre power demand projected to increase by 165% by 2030, these power densities will only continue to rise, widening the gap between thermal output and what air-only cooling can realistically handle.
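To put those densities in perspective, the short Python sketch below estimates the airflow needed to carry away a rack’s heat load using the basic relationship between heat, mass flow and temperature rise. The air properties and the 12 K supply-to-return temperature rise are illustrative assumptions, not design figures for any particular facility.

```python
# Back-of-envelope estimate of the airflow needed to remove a rack's heat load.
# Air properties and the allowable temperature rise are illustrative assumptions.

AIR_DENSITY = 1.2         # kg/m^3, roughly sea level at ~25 C
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)
CFM_PER_M3S = 2118.88     # cubic feet per minute in one m^3/s

def airflow_for_heat_load(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to carry away heat_load_w
    with a supply-to-return temperature rise of delta_t_k."""
    return heat_load_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_k)

for rack_kw in (10, 30, 100):
    m3s = airflow_for_heat_load(rack_kw * 1000, delta_t_k=12)
    print(f"{rack_kw:>3} kW rack: {m3s:5.2f} m^3/s (~{m3s * CFM_PER_M3S:,.0f} CFM)")
```

On these assumptions, a 100 kW rack needs roughly ten times the airflow of a 10 kW rack, which is where fan power, noise and floor-space constraints start to make air-only designs impractical.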
A closer look at liquid cooling
Liquid cooling has become a powerful way to address these thermal challenges. Instead of relying on air, these systems use water or specialised coolants to draw heat directly away from high-temperature components. Because liquids have a much higher heat capacity than air, they can remove heat more effectively, enable heat reuse, and reduce the load on cooling systems.
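The difference is easy to illustrate. The sketch below compares the volumetric flow of air and water required to remove the same heat load, using textbook fluid properties and an assumed 10 K coolant temperature rise; the 100 kW rack is a hypothetical example.

```python
# Rough comparison of the coolant flow needed to move the same heat load with
# air versus water. Fluid properties and the 10 K temperature rise are
# illustrative assumptions.

def flow_m3s(heat_load_w, density, specific_heat, delta_t_k):
    """Volumetric flow (m^3/s) required to absorb heat_load_w at delta_t_k."""
    return heat_load_w / (density * specific_heat * delta_t_k)

HEAT_LOAD_W = 100_000  # one hypothetical 100 kW AI rack
DELTA_T_K = 10

air_flow = flow_m3s(HEAT_LOAD_W, density=1.2, specific_heat=1005, delta_t_k=DELTA_T_K)
water_flow = flow_m3s(HEAT_LOAD_W, density=997, specific_heat=4186, delta_t_k=DELTA_T_K)

print(f"Air:   {air_flow:.2f} m^3/s")
print(f"Water: {water_flow * 1000:.1f} L/s (~{air_flow / water_flow:,.0f}x less volume)")
```

The result, roughly a few litres of water per second versus several cubic metres of air per second, is simply a consequence of water’s far higher volumetric heat capacity.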
A range of liquid-cooling approaches is now used in data centres. Some solutions circulate coolant through the rack to draw heat away from servers, while others rely on cold plates mounted directly on heat-intensive components. More advanced systems even immerse entire servers in a thermally conductive, non-electrically conductive fluid to deliver cooling across all components.
By removing heat directly at the source, liquid cooling enables data centres to run higher-density workloads more efficiently – an increasingly important requirement for AI services.
The benefits of a hybrid cooling strategy
While liquid cooling offers major advantages, it cannot fully replace air cooling. Even with liquid technologies deployed, some heat still radiates into the surrounding environment. This means a degree of air cooling remains necessary to maintain stable conditions throughout the data hall.
For many operators, the most effective approach is a hybrid cooling strategy that blends liquid and air cooling. The ratio of liquid to air will differ from customer to customer, but this combination can deliver strong thermal performance, improve power usage effectiveness (PUE), and help reduce overall energy consumption across the facility.
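As a rough illustration of the PUE effect, the sketch below compares an air-only facility with a hybrid one. The cooling-overhead fractions and the ancillary load are assumptions chosen purely for illustration, not measured figures for any real site.

```python
# Minimal sketch of how power usage effectiveness (PUE) responds to a lower
# cooling overhead. Overhead fractions are illustrative assumptions only.

def pue(it_power_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """PUE = total facility power / IT power."""
    return (it_power_kw + cooling_kw + other_overhead_kw) / it_power_kw

IT_POWER_KW = 1_000      # hypothetical IT load
OTHER_OVERHEAD_KW = 80   # power distribution losses, lighting, etc.

# Assumed cooling power as a fraction of IT load for each strategy.
scenarios = {"air only": 0.35, "hybrid air + liquid": 0.18}

for name, cooling_fraction in scenarios.items():
    value = pue(IT_POWER_KW, IT_POWER_KW * cooling_fraction, OTHER_OVERHEAD_KW)
    print(f"{name:>20}: PUE ~ {value:.2f}")
```

Under these assumptions the hybrid facility lands at a noticeably lower PUE simply because less energy is spent moving air, which is the mechanism behind the efficiency gains described above.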
A major advantage of hybrid cooling is its flexibility. This approach can accommodate a wide spectrum of rack densities as requirements change. Lower-power racks can continue to rely on air cooling, while liquid cooling can be applied to high-density AI racks. With the right set-up, this selective deployment allows operators to control costs and cut energy use by applying liquid cooling only where it is needed most.
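The sketch below illustrates that selective approach: liquid cooling is applied only to racks above an assumed density threshold, and total cooling power is estimated for a hypothetical rack mix. The fleet composition, threshold and per-kW overheads are all illustrative assumptions.

```python
# Illustrative sketch of "liquid only where it is needed most": racks above a
# density threshold get liquid cooling, the rest stay on air. All figures are
# assumptions for illustration only.

AIR_COOLING_OVERHEAD = 0.35     # cooling kW per kW of IT load, air-cooled racks
LIQUID_COOLING_OVERHEAD = 0.15  # cooling kW per kW of IT load, liquid-cooled racks
LIQUID_THRESHOLD_KW = 40        # racks at or above this density get liquid cooling

# Hypothetical rack fleet: (rack density in kW, number of racks)
fleet = [(8, 200), (20, 60), (60, 20), (100, 10)]

def cooling_power_kw(fleet, threshold_kw):
    total = 0.0
    for density_kw, count in fleet:
        overhead = (LIQUID_COOLING_OVERHEAD if density_kw >= threshold_kw
                    else AIR_COOLING_OVERHEAD)
        total += density_kw * count * overhead
    return total

all_air = cooling_power_kw(fleet, threshold_kw=float("inf"))
hybrid = cooling_power_kw(fleet, threshold_kw=LIQUID_THRESHOLD_KW)
print(f"All-air cooling power: {all_air:,.0f} kW")
print(f"Selective hybrid:      {hybrid:,.0f} kW ({1 - hybrid / all_air:.0%} lower)")
```

Even in this simplified model, targeting liquid cooling at the small number of high-density racks accounts for a meaningful share of the cooling energy saved, without the cost of retrofitting the whole hall.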
Future-ready cooling for the AI-driven era
Effective cooling is now essential for data centres supporting the surge in AI workloads. Traditional air cooling alone can no longer meet the thermal demands of high-density infrastructure. Fortunately, emerging technologies such as liquid and hybrid cooling provide the performance and efficiency needed to keep pace with these requirements.
With smart planning and the right mix of cooling architectures, data centres can future-proof their operations, maintain performance, and stay ahead in an increasingly AI-driven era.

