Optimising costs in data centres is a constant, significant challenge for today’s IT managers, says Rittal’s Clive Partridge. Benefitting from the free cooling available in nature is clearly a wise option. Energy usage is one of the costs to be optimised, particularly the energy required for the technology used to cool computer servers. Some very high-tech equipment developed in recent years has increased the efficiency of coolers. However, natural sources of cooling are also worth considering, including cold water and cool ambient air. So, if this is a route you’re considering, what questions do you need to ask to get started?
The phrase 'free cooling' should perhaps be qualified. In the context of climate control technology, it does not mean an IT cooling system that is completely free of charge. Instead, it means reducing the use of compressor-based refrigerating machines as far as possible – ideally to the point where energy is only required for the free cooler’s fans and for any pumps needed in the cold water circuit.
The efficiency of the system therefore depends largely on the relevant climatic conditions on site. A data centre in Northern Scandinavia will operate much more cheaply than one in southern Europe. But how does free cooling work?
Free cooling uses convection to remove heat from the medium to be cooled, usually a water-glycol mixture, via the ambient air. The free cooler is installed outdoors and typically contains a lamellar heat exchanger, or something comparable, through which the warmed water-glycol flows to release its heat. The larger the contact surface of the lamellae, the more efficient the system. Air flow can be increased using additional fans, boosting the cooling output while keeping the energy consumed for cooling minimal. However, the inlet temperature this achieves will only ever be slightly above that of the ambient air; climate control technicians use around +3°C above ambient as a guideline figure for design purposes.
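As a back-of-the-envelope sketch, not a design tool, the guideline above can be expressed as: achievable water inlet temperature ≈ ambient temperature + roughly 3 K of approach. The function names below are illustrative assumptions, not part of any real product or library.

```python
# Illustrative sketch of the free-cooling design rule quoted in the text:
# the water inlet temperature a free cooler can deliver sits roughly 3 K
# above the ambient air temperature. Names here are hypothetical.

APPROACH_K = 3.0  # guideline approach above ambient, in kelvin

def free_cooling_inlet_temp(ambient_c: float, approach_k: float = APPROACH_K) -> float:
    """Return the lowest water inlet temperature (in °C) free cooling can reach."""
    return ambient_c + approach_k

def free_cooling_sufficient(ambient_c: float, required_inlet_c: float) -> bool:
    """True if free cooling alone can meet the required inlet temperature."""
    return free_cooling_inlet_temp(ambient_c) <= required_inlet_c

# Example: a cold-water system designed for a 20 °C inlet temperature
print(free_cooling_sufficient(12.0, 20.0))  # ambient 12 °C -> True
print(free_cooling_sufficient(19.0, 20.0))  # ambient 19 °C -> False
```

This is why siting matters so much: the colder the local climate, the more hours per year the free cooler alone can hold the required inlet temperature without a compressor.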
Advantages and disadvantages
In free cooling, a distinction is made between direct and indirect methods. Direct free cooling uses the cooling medium as directly as possible to remove the heat generated by the data centre. For example, large data centre operators with uniform environments use the outside air - they literally blow outside air directly into the data centre.
A good example is the Yahoo self-cooling data centre in New York State, near the border with Canada. The buildings were erected at right angles to the prevailing wind direction and fitted with a roof extension running the entire length, similar to a cockscomb – hence its nickname, the henhouse. Cold air flows into the building via slats in the side walls, while the warm air is dissipated via the roof. Ideally, the only additional energy this solution requires is for fans to help move the air.
As easy as this sounds in principle, there are disadvantages. The intake air needs to be purified using filter units. It is also necessary to mitigate weather-related temperature fluctuations. For example, if the outside temperature is too low, a mixer can blend warm waste air from the data centre into the intake; conversely, if the outside temperature is too high, a refrigeration compressor must be used. A further challenge is humidity, which changes due to factors such as rain. Air that is too moist or too dry can shorten the service life of IT components. Finally, the ducts which draw in the fresh air are usually very large, so additional built-in protection against rodents and insects is needed.
Adiabatic cooling is a complementary technology which improves the efficiency of direct free cooling. Before intake air reaches the heat exchanger, water is sprayed into it. The water droplets evaporate immediately and this transition from liquid to gaseous state results in the water extracting heat from the surrounding air. This makes it possible to lower a cooling system’s inlet temperature.
But using water droplets brings with it a risk of bacterial contamination, notably Legionella. This necessitates regular cleaning, a high water flow and shielding from sunlight. Overall, adiabatic cooling systems offer great potential for energy optimisation, but they need precise planning and expert oversight. Operators consuming large amounts of water also need to monitor that consumption. The Green Grid has defined the water usage effectiveness (WUE) metric for data centres: the annual water consumption divided by the annual energy consumption of the active IT components, expressed in litres per kilowatt hour (l/kWh). This can be used in tandem with other consumption values to optimise IT running costs.
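The WUE definition above is a straightforward ratio. As a minimal sketch, with made-up example figures that are not from the article:

```python
def wue(annual_water_litres: float, annual_it_energy_kwh: float) -> float:
    """Water Usage Effectiveness as defined by The Green Grid:
    litres of water consumed per kilowatt-hour of energy used by
    the active IT components, in l/kWh."""
    return annual_water_litres / annual_it_energy_kwh

# Hypothetical site: 15 million litres of water per year against
# 8,760 MWh (i.e. 8,760,000 kWh) of annual IT energy consumption.
print(round(wue(15_000_000, 8_760_000), 2))  # -> 1.71 l/kWh
```

Tracked alongside power usage effectiveness (PUE), a figure like this lets an operator see whether an adiabatic system's water bill is eroding the energy savings it delivers.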
Indirect free cooling
Operators in northern latitudes who need cooling for a medium-sized IT infrastructure of up to around 200 kW will typically opt for an indirect cooling system. This applies in particular to SMEs, which rarely have the resources to pay for major cooling systems. In an indirect system, the outside air cools a heat transfer fluid (such as water), which then carries the cooling energy into the data centre – a given volume of water can transport heat up to 4,000 times better than air. As no outside air is blown into the data centre, fewer filter systems are needed and no outside humidity is brought into the building. However, there needs to be at least one air/water heat exchanger, as well as pumps in the cold water system connected to mains electricity. Many prefer this solution because it is clean, stable and predictable: it compensates highly effectively for fluctuating weather conditions and seasonal temperature changes.
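The "up to 4,000 times" figure reflects how much more heat a given volume of water can carry than the same volume of air, which is the product of density and specific heat capacity. A rough sanity check, using standard textbook property values assumed for illustration:

```python
# Rough check of water's advantage as a heat-transfer medium, comparing
# volumetric heat capacity (density * specific heat). Property values are
# standard textbook figures at roughly room temperature, assumed here
# for illustration only.

WATER_DENSITY = 1000.0   # kg/m^3
WATER_CP      = 4186.0   # J/(kg*K)
AIR_DENSITY   = 1.2      # kg/m^3
AIR_CP        = 1005.0   # J/(kg*K)

water_vol_heat = WATER_DENSITY * WATER_CP  # J/(m^3*K)
air_vol_heat   = AIR_DENSITY * AIR_CP      # J/(m^3*K)

ratio = water_vol_heat / air_vol_heat
# Roughly 3,500 -- the same order of magnitude as the article's
# "up to 4,000 times" figure.
print(round(ratio))
```

The practical consequence is that pipework carrying chilled water can be far smaller than the air ducts a direct free cooling system needs for the same cooling load.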
Lefdal Mine Datacenter is a cloud data centre built in a decommissioned mine on the Norwegian coast. The developers used seawater from the adjacent fjord as a cooling medium. The cooling water is drawn from a depth where the water temperature is a near-constant 8°C and then fed into a heat exchanger system’s primary circuit. The secondary circuit then supplies the cooling into the mine at the required temperature. As the weather and temperature conditions are extremely constant, the operators have very stable control over the thermodynamic system. To protect against corrosion, the system uses titanium-coated surfaces within the primary cooling circuit.
Some systems, such as Lefdal’s, require a minimum data centre load to warm cooling water that would otherwise be too cold. So, when designing a cooling system, start by evaluating the minimum load required for operation. A cooling concept should always be designed and engineered specifically to match the need. Suppliers such as Rittal use up-to-date weather data, for example, to calculate the temperatures available for free cooling at each of the relevant sites. Other significant parameters include humidity and dew point. The internationally recognised industrial association ASHRAE provides guidance across these parameters: it defines the conditions under which an IT environment can operate reliably, making it possible, for instance, to run a server at an ambient temperature of 25°C or higher.
Cooling concepts used in large data centres, such as those at Facebook, Google and other hyperscalers, cannot easily be purchased and adapted. These are customised solutions that take into account the IT infrastructure, system utilisation and ambient temperatures. Users looking for reliability will choose a sealed cooling solution they can control themselves, covering all the cooling circuit’s parameters. Only by removing uncertainties, such as the weather, from the equation can operators achieve stable and, above all, failsafe infrastructures.
Clive Partridge is Product Manager for IT infrastructure at Rittal