
Save your energy

Michael Akinla, TSE manager EMEA at Panduit, outlines how a well-designed white space can pay dividends when it comes to energy savings and efficiency.

Competition in the data centre market continues to intensify in the build and operation of both colocation and owned sites. The momentum to improve energy efficiency has been helped by developments driven by hyperscale operators (high performance, simplified specification), customer requirements (reducing PUE, reducing cost per kW utilisation) and manufacturers’ equipment warranty parameters (higher operating temperatures).

High energy use in the technology suites of data centres has long been an accepted weakness in many operators’ strategies. The long-standing policy of over-specifying cooling systems was based on the belief that equipment should be operated at a temperature staff were comfortable working in.

Today, operators and most customers understand that with energy costs for cooling systems outpacing energy used in the technology suites themselves, it is time to design for performance and efficiency.

A well-designed white space with a monitored and controllable cooling/environmental system can use far less energy. In many cases, the latest developments in thermal planning, monitoring and cooling optimisation are saving hundreds of thousands of pounds in energy costs, as well as pre-empting problems and providing a more resilient and reliable data centre.

Data has become an increasingly valuable corporate asset, and the requirement to develop systems that guarantee data availability and delivery has led more board-level IT decisions toward standards-based solutions.

Cooling strategies

International standards such as ASHRAE TC 9.9, ETSI EN 300 and EN 50600-2-3 are driving acceptance of best practice in technology suites and data centre environments. ASHRAE TC 9.9 provides a framework for compliance and for determining suitable environments for Information Technology Equipment (ITE).

These industry guidelines provide detailed technical information that enables data centre operators to implement cooling strategies that optimise equipment operation at carefully monitored and controlled airflow, temperature, humidity and other significant variables.

  1. Cabinet/rack level – Measure inlet temperature and relative humidity (RH) at the bottom, middle and top of each cabinet, maintaining the recommended (18-27°C) as well as the allowable (15-32°C) thermal ranges (a minimal check along these lines is sketched after this list).
  2. Containment level (in addition to 1) – With a cold aisle containment system, the hot aisle temperature can be in the region of 50°C; instrument and monitor the outlet temperature at the top of the rack and cabinet. When using a hot aisle containment system, temperatures across the room can be monitored.
  3. Data hall level (in addition to 1 and/or 2) – Humidity and temperature need to be monitored near each CRAC/CRAH at the supply and return. Relative humidity is recommended at 60% RH and allowable at 20%-80% RH.
  4. Airflow management and cooling system control – An airflow management and cooling system control strategy should be implemented. With good airflow management, server temperature rise can be up to 15°C; with an inlet temperature of 27°C the hot aisle can reach 55°C.
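As a minimal illustration of the rack-level check in point 1, the Python sketch below classifies inlet readings against the recommended and allowable ranges quoted above; the sensor positions and readings are hypothetical values, not live site data.

```python
# Minimal sketch: classify rack inlet temperatures against the ASHRAE
# recommended and allowable ranges quoted above. Sensor positions and
# readings are illustrative, not measurements from a live site.

RECOMMENDED_C = (18.0, 27.0)  # recommended inlet range
ALLOWABLE_C = (15.0, 32.0)    # allowable inlet range

def classify_inlet(temp_c: float) -> str:
    """Return which thermal envelope a rack inlet reading falls into."""
    if RECOMMENDED_C[0] <= temp_c <= RECOMMENDED_C[1]:
        return "recommended"
    if ALLOWABLE_C[0] <= temp_c <= ALLOWABLE_C[1]:
        return "allowable"
    return "out of range"

# One reading each at the bottom, middle and top of a cabinet.
for position, temp in [("bottom", 19.5), ("middle", 24.0), ("top", 30.5)]:
    print(f"{position}: {temp}°C -> {classify_inlet(temp)}")
```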
An example of the saving attainable by using inlet ducting

A current data centre client has designed out mechanical refrigeration from its technology suites, utilising instead an N+1 Indirect Evaporative Cooling (IEC) system, which provides highly efficient climate control while offering a resilient back-up capability in the unlikely scenario of a unit failure.

This system design also incorporates free-cooling technologies, resulting in increased reliability, higher energy efficiency, increased sustainability and lower operating costs. The site is compliant with the ASHRAE Thermal Guidelines (2011 and 2015) and is the only European data centre working with the Open Compute Project’s data centre program to standardise data centre designs. This cooling system is designed for a temperate climate; regional variation will require modification to ensure maximum efficiency is gained.

Essential equipment

Each data centre technology suite has a capacity of up to 2.2MW, whether shared or dedicated to individual customers. Each offers a single-span open hall utilising hot aisle containment enclosures, such as Panduit’s Net-Contain system, allowing higher-density racks of up to 40kW to be optimised using industry-leading cooling and monitoring technologies.
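As a rough illustration of scale (assuming full 40kW density throughout, which a real mixed-density hall would not reach), a 2.2MW suite equates to around 55 such racks: 2,200kW ÷ 40kW = 55.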

Energy efficient data centre cabinet systems allow higher data centre temperature set points and can reduce cooling system energy consumption by up to 40%. The phrase ‘look after the pennies and the pounds look after themselves’ is never more relevant than in this situation: sealing small air leaks in the cabinets and enclosures maintains the separation between hot and cold air streams, and this leads to large savings in cooling energy costs.
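To put that 40% figure in context, the back-of-envelope sketch below works through the arithmetic. The suite load, cooling fraction and electricity tariff are assumptions chosen purely for illustration, not measured figures from any site.

```python
# Back-of-envelope estimate of the cooling cost saving. Every input
# here is an illustrative assumption, not a measured value.
it_load_kw = 2200          # a fully utilised 2.2MW technology suite
cooling_fraction = 0.30    # assume cooling draws 30% of the IT load
tariff_gbp_per_kwh = 0.15  # assumed electricity price
saving_fraction = 0.40     # the "up to 40%" reduction quoted above

cooling_kw = it_load_kw * cooling_fraction
annual_cost_gbp = cooling_kw * 24 * 365 * tariff_gbp_per_kwh
print(f"Annual cooling energy cost: £{annual_cost_gbp:,.0f}")
print(f"Potential saving at 40%: £{annual_cost_gbp * saving_fraction:,.0f}")
```

On these assumptions the annual saving lands in the hundreds of thousands of pounds, consistent with the figures cited earlier.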

In this situation, the regulated cool air from the Indirect Evaporative Cooling (IEC) system is diffused into the technology suite. Utilising hot aisle containment allows the operator to manage airflows across devices, such as server racks, effectively: cool air is drawn in through the front of the enclosures and cabinets and across the hot equipment, and the hot exhaust air is then directed up and away through exhaust ducts into the ceiling space and recycled to the IEC system for heat transfer.

Today’s white space processing equipment has higher operating temperatures, which allows a warmer white space operational temperature, meaning that less energy is needed to equalise the ‘Air Inlet’ temperature. Device inlet temperatures between 18-27°C and 20-80% relative humidity (RH) will usually meet the manufacturer’s operational criteria. What becomes increasingly important is the capability to monitor and control the recommended environmental range, including temperature and relative humidity, and to maintain an allowable environmental envelope where the systems are operating at optimum performance.
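A minimal sketch of such an envelope check, assuming per-sensor temperature and RH readings (the sensor IDs and values are hypothetical):

```python
# Illustrative check of readings against the 18-27°C / 20-80% RH
# envelope quoted above. Sensor IDs and values are hypothetical.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    temp_c: float
    rh_pct: float

def within_envelope(r: Reading) -> bool:
    """True if both temperature and RH sit inside the quoted envelope."""
    return 18.0 <= r.temp_c <= 27.0 and 20.0 <= r.rh_pct <= 80.0

for r in [Reading("rack-03-top", 26.1, 55.0),
          Reading("rack-07-mid", 29.4, 48.0)]:  # temp outside envelope
    status = "OK" if within_envelope(r) else "ALERT"
    print(f"{r.sensor_id}: {r.temp_c}°C / {r.rh_pct}% RH -> {status}")
```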

Image of Hot Aisle Enclosure Systems (Panduit Net-Contain System)
Effective environmental management

Monitoring systems, such as SynapSense, provide various levels of data, so it is important to understand the level of granularity your needs require. Once airflow management is optimised, the automated system, often wireless and increasingly running on a mesh network to maximise its capability, should offer active control to mitigate the temperature control risks associated with fan failures, maintenance schedules, relocations, changes in IT load, and software patches and failures.

The chosen solution should offer an advanced wireless sensor mesh network, where sensing devices, gateways, routers, server platforms and a comprehensive software platform provide connection and monitoring across the entire technology suite. The system needs to integrate data sets from every key piece of equipment to provide management with a highly versatile tool for analysis and intelligent trend gathering.
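As a sketch of what that integration might look like, the snippet below folds per-sensor readings into rolling trend values; the data feed is simulated and the interface is a generic stand-in, not the actual API of SynapSense or any other platform.

```python
# Sketch of folding readings from many sensors into rolling trend
# values. A real deployment would pull samples from the monitoring
# platform's own gateways and API; here the feed is simulated.
from collections import deque
from statistics import mean

WINDOW = 12  # rolling window of recent samples per sensor
history: dict[str, deque] = {}

def record(sensor_id: str, temp_c: float) -> float:
    """Store a reading and return the sensor's rolling-average trend."""
    buf = history.setdefault(sensor_id, deque(maxlen=WINDOW))
    buf.append(temp_c)
    return mean(buf)

# Simulated samples arriving from two halls (values are illustrative).
for sensor, temp in [("hall1/rack01", 24.2), ("hall1/rack01", 24.9),
                     ("hall2/crah03", 18.7), ("hall1/rack01", 25.6)]:
    print(f"{sensor}: latest {temp}°C, trend {record(sensor, temp):.1f}°C")
```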

Conclusion

Data centres are an increasingly important hub within the digital economy. Many older sites, with legacy technology, expensive cooling equipment and minimal monitoring and analysis capabilities, are becoming inefficient to the point where it is change (upgrade) or die (lose your clients to higher-performance, more efficient sites).

All data centres are different, whether in construction, region, or the availability and price of energy. As such, they require individual solutions to achieve the most effective position within the market. Today, the market is evolving faster than ever, but the constant remains: the data centre must be more efficient and offer 100% uptime.
