
Green IT and the Edge: Can you really have one without the other?

Are green IT and edge computing a match made in heaven we never even considered? David Craig, CEO at Iceotope explores why getting the edge ‘right’ can’t happen without green IT.

IT equipment is fragile: exposing it to the elements creates unacceptable risk. Liquid cooling should be a key consideration for everyone implementing edge data centres, and ASHRAE TC 9.9’s latest technical bulletin, ‘Considerations for reliable operation of edge data centres’, should be essential reading as edge goes global.

Many conversations today assume that edge data centre networks will be built from mass-produced, commoditised units: the same systems popping up in shops, factories, transport hubs, streets and on rooftops in cities across the world. In practice, new requirements will ensure edge data centres come in many different form factors, right-sized to their use and location.

What won’t change, in most cases, is the heart of the machine: the IT equipment that sits inside these various edge data centre designs. It will comprise the same mass-produced processor, memory, storage and networking components that currently populate the world’s environmentally controlled data centres.

The reasons for this are scale and economics. Creating hardened IT equipment for every type of edge data centre and every type of environment, as the telecoms industry did decades ago, is simply not economical at this time.

While many edge data centre designs will utilise the same IT gear operating in IT cupboards, server rooms and hyperscale data centres, this equipment, and the data it processes and stores, will become crucial to essential local activities, such as autonomous vehicles and traffic control, where an outage at a single site could have serious consequences.

Edge data centres are subject to a variety of external influences that have been eliminated in enterprise and data centre environments. It is time to prioritise best practice over what is commonly used and readily available.

Standard IT equipment is about to be rolled out at an unprecedented scale. However, unlike industrial equipment or telecoms gear, IT equipment is generally not hardened or built to specific standards for use in uncontrolled or semi-controlled environments.

As standard equipment is placed in non-standard environments, the risk of unacceptable failure rates must be addressed now, at the design stage.

Right-sized, not one size

ASHRAE TC 9.9 has taken time to evaluate and consider the design and operation of edge data centres in its latest technical bulletin, which provides a detailed dissection of the hazards to their reliable operation.

Technical Committee 9.9 is the committee that established the temperature guidelines for efficient data centre operations in 2004. These have become the de facto standard for data centre design across the globe, and the committee continues to monitor and update them regularly.

The latest bulletin begins by outlining some of the new demand drivers, such as IoT and AI data applications and workloads such as remote learning, 5G and telemedicine. Its conclusion is that edge data centre form factors will not be standardised.

There will be some uniform mass-produced units, but there will also be a range of different enclosure types. These will scale from small street-furniture-style enclosures to large containers that may be deployed in semi-controlled environments such as factories or warehouses.

There will also be small brick-built buildings which could accommodate mantrap access and environmental controls, providing environments closer to technology hall clean spaces.

Whatever form factor is chosen, there is no escaping the fact that the edge is bringing IT equipment out of the protected, environmentally controlled data centre technical white space and into challenging and harsh real-world scenarios.

Getting under the skin of edge  

ASHRAE’s considerations include IT equipment warranties and maintenance regimes, covering temperature, humidity, pollution and condensation.

Many of these edge data centres will be protected by a single skin, which risks exposing sensitive IT components to the elements whenever access is required. The TC 9.9 bulletin asks, under its scope and problem statement: “Will any of the following occurrences impact the IT equipment performance, reliability, or manufacturer’s warranty?”

• Cold day: What would happen to the inlet airstream to the IT equipment if the door was opened on a very cold day (e.g., –5°C/23°F), below the specified limit of the IT equipment?

• Air pollution: What if opening a door allowed high levels of air pollution into the interior, potentially initiating corrosion?

• Dust: What if the data centre was located in a dusty climate, where air coming in through an open door bypassed the filtration system?

• High humidity: What if servicing needed to be done on a rainy day when relative humidity was close to 95% RH, i.e., above the IT equipment’s specified limit?
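These ‘what if’ questions translate naturally into threshold checks against the IT equipment’s specified environmental envelope. The sketch below is a minimal illustration, assuming hypothetical limits (the 5–35°C inlet range and 85% RH ceiling are illustrative values, not figures from the bulletin) and a single sensor reading taken at the enclosure inlet:

```python
from dataclasses import dataclass

@dataclass
class EnvelopeLimits:
    """Illustrative environmental limits for a piece of IT equipment.
    These numbers are hypothetical -- real values come from the
    manufacturer's specification sheet."""
    min_inlet_c: float = 5.0    # lower allowable inlet temperature, degC
    max_inlet_c: float = 35.0   # upper allowable inlet temperature, degC
    max_rh_pct: float = 85.0    # maximum relative humidity, % RH

def check_envelope(inlet_c, rh_pct, limits=EnvelopeLimits()):
    """Return a list of excursions for a single sensor reading."""
    excursions = []
    if inlet_c < limits.min_inlet_c:
        # The 'cold day' scenario: a door opened at, say, -5 degC
        excursions.append(f"inlet {inlet_c} degC below {limits.min_inlet_c} degC limit")
    if inlet_c > limits.max_inlet_c:
        excursions.append(f"inlet {inlet_c} degC above {limits.max_inlet_c} degC limit")
    if rh_pct > limits.max_rh_pct:
        # The 'high humidity' scenario: servicing at ~95% RH
        excursions.append(f"humidity {rh_pct}% RH above {limits.max_rh_pct}% RH limit")
    return excursions

# Example: a door opened for servicing on a cold, wet day
print(check_envelope(inlet_c=-5.0, rh_pct=95.0))
```

Excursion logs of this kind are exactly the evidence a manufacturer might weigh when assessing a warranty claim, which is why the bulletin raises these questions at the design stage.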

The bulletin offers excellent advice for the design stage on right-sizing your different edge units for variations in application and for differing servicing and maintenance approaches.

ASHRAE does not tell IT equipment makers how to design components, processors, memory or hard drives. It is simply addressing edge in the context of standard IT equipment being deployed in non-standard environments. These are new scenarios which are harsh by nature and where IT equipment must be protected from the possibility of soaring failure rates.  

You can’t control the weather

Currently, too many edge designs still incorporate standard cooling techniques, treating sites as though they were controlled environments and using forced air blown across hot equipment to keep it cool.

This is a risky way to proceed: it requires mechanical fans, which fail, and it builds inefficient and expensive energy use into the model at a time when sustainability and efficiency are central requirements.
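To put the efficiency point in concrete terms, cooling overhead can be expressed through PUE, where overhead energy equals IT load multiplied by (PUE − 1). The figures below are illustrative assumptions for the sake of the arithmetic, not measurements from any specific deployment:

```python
def annual_overhead_kwh(it_load_kw, pue):
    """Annual non-IT (mostly cooling) energy, from the PUE definition:
    total facility energy = IT energy x PUE, so overhead = IT x (PUE - 1)."""
    hours_per_year = 8760
    return it_load_kw * (pue - 1.0) * hours_per_year

# Illustrative assumptions, not measurements: a 10 kW edge unit,
# air cooled at PUE 1.6 versus chassis-level liquid cooling at PUE 1.1.
air = annual_overhead_kwh(10, 1.6)      # 52,560 kWh per year
liquid = annual_overhead_kwh(10, 1.1)   #  8,760 kWh per year
print(f"air: {air:,.0f} kWh, liquid: {liquid:,.0f} kWh, saved: {air - liquid:,.0f} kWh")
```

Multiplied across the hundreds or thousands of units in an edge network, even a modest per-site difference of this kind becomes a significant sustainability factor.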

Efforts to avoid unacceptable failure rates and to maximise effectiveness must be made at the earliest stages of development.

It is possible to design your edge data centre so that units are cooled by a technology which keeps the IT equipment impervious to heat, dust, humidity and other contaminants.

A chassis-level liquid cooling solution is energy efficient, fully sealed and operates reliably for years without regular, intrusive, high-touch service and maintenance regimes.

Liquid-cooled designs can cool even the latest dense compute requirements of HPC racks, while eliminating almost all water use in the cooling process.

The technology offers the capability to capture and reuse heat, dependent on location and requirement, whilst providing a level of reliability at the edge that is currently only available in controlled data centre environments.

The huge number of units to be managed in any edge network means regular intrusive maintenance is simply not practical or economic, not least because the act of servicing itself introduces heat, cold, pollutants and humidity variations that may void the warranty conditions of sensitive equipment.

Edge data centres by their nature will be exposed to the elements. However, that does not mean that the IT equipment operating within cannot be protected. Chassis level liquid cooling solutions directly address many of the challenges and considerations raised by ASHRAE’s TC 9.9 bulletin. 
