
Keeping the edge

Pascal Holt

Director of Marketing at Iceotope Technologies Limited

Edge computing is changing the way we process data. The need to handle, manipulate, communicate, store and retrieve data efficiently is steadily moving processing capacity closer to the user than ever before.

In fact, investment in edge computing facilities is expected to grow five-fold, reaching $11 billion by 2026. The data centre is no longer the centre point of data.

Industries of all types – from healthcare and agriculture to retail and beyond – are driving this trend. Any device in any location that is gathering, analysing and acting on data is a component of edge infrastructure. IDC estimates the number of connected devices will reach 55.7 billion by 2025. Gartner predicts that by 2023, more than 50% of enterprise-generated data will be created and processed outside the data centre or cloud. Those figures don’t even account for the fact that enterprises are still adjusting to a more distributed workforce, with long-term work-from-home practices brought about by the pandemic.

Artificial intelligence (AI) is one key application increasing the requirement for digital infrastructure and processing power. AI, according to Google CEO Sundar Pichai, will have a more profound effect on civilisation than electricity or fire. Similar to the industries driving the push to the edge, AI applications and technology are being deployed in a wide variety of use cases to ensure the reliable delivery and continuity of services based upon data outputs.

AI applications are driving the edge

The recently announced study using AI to diagnose dementia in patients in as little as a day is one such case. While the trial was initially conducted in hospitals and memory care facilities, the next phase of the study will be to test it in clinical settings alongside conventional ways of diagnosing dementia. Dr Laura Phipps at Alzheimer’s Research UK noted that the AI systems were “drawing on the insights from huge datasets to help doctors make more informed decisions about diagnosis, treatment and care.” Instead of relying on interpretation of scans, machine learning models will help lead to more accurate diagnoses.

Currently, AI workloads like this generate large data sets that require complex calculations and data processing. As a result, they need to leverage high-power density GPUs which demand a high level of resiliency and processing power, particularly at the edge. In the near future, as AI becomes ubiquitous in all applications, it will be commonly running on standard, low-cost commercial server platforms.

The challenge is that edge computing loads are usually required to operate reliably in locations not built specifically for IT equipment. Taking servers out of protected, environmentally controlled data centre technical white space can affect reliability, efficiency, monitoring and service operations. Placing such loads in harsh environments also exposes them to humidity, high temperatures, emissions, airborne particles, vibration from industrial machinery, corrosion and more.

Cooling technology solving challenges of high-density technology

Today, technology is being called upon to solve the very challenges technology has created. Precision chassis-level immersion cooling, for example, provides an ideal environment for IT equipment to be installed and operated successfully in diverse locations. Traditional cooling techniques use forced air to cool equipment, putting it at risk in harsh environments. A fully sealed precision immersion cooling solution delivers reliable server operation by isolating sensitive electronic components and circuits from harmful gaseous and particulate contaminants.

Precision immersion can also extend the operating life cycle of hardware by reducing service and maintenance call-outs. The plug and play nature of a sealed chassis enables a consistent service and support model. Servers can be monitored and managed remotely. A technician who can replace a module at the data centre campus can just as easily make the same replacement in a remote location. When swapping out the chassis as a complete module, the service call is simplified and eliminates exposure to environmental elements on-site, de-risking service operations.

Sound pollution is another major consideration with edge computing. Depending on the environment, IT equipment noise needs to be kept to a minimum. Precision cooling eliminates the need for server fans and significantly reduces the HVAC equipment required, enabling near-silent server operation. The noise level becomes more comfortable for non-IT tenants sharing the building, as well as for the IT teams moving about the space. It also helps draw attention away from the equipment, deterring theft and vandalism.

Finally, for an industry with well-publicised ambitions to provide carbon net-zero operations, sustainability at the edge is a critical component to any data centre strategy. Advanced liquid cooling solutions are capable of achieving 1.03 PUE or below. Precision cooling captures >95% of server heat inside the chassis, significantly reducing energy costs and emissions associated with server cooling. Water consumption is negligible as little to no mechanical chilling is required.
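To make the efficiency claim concrete, the sketch below shows how PUE (Power Usage Effectiveness) is calculated: total facility energy divided by IT equipment energy, with 1.0 as the theoretical ideal. The energy figures here are illustrative assumptions, not measured data from any vendor.

```python
# Illustrative PUE calculation: PUE = total facility energy / IT energy.
# A PUE of 1.0 would mean every watt goes to IT equipment; the figures
# below are hypothetical examples, not measured site data.

def pue(it_energy_kwh: float, cooling_energy_kwh: float,
        other_overhead_kwh: float = 0.0) -> float:
    """Return PUE given IT load and facility overheads (same units)."""
    total = it_energy_kwh + cooling_energy_kwh + other_overhead_kwh
    return total / it_energy_kwh

# Air-cooled site where cooling overhead is ~40% of the IT load:
print(round(pue(100.0, 40.0), 2))   # 1.4

# Liquid-cooled site with minimal cooling overhead:
print(round(pue(100.0, 3.0), 2))    # 1.03
```

The lower the cooling overhead relative to the IT load, the closer PUE approaches 1.0, which is why capturing most of the server heat at the chassis matters for both energy costs and emissions.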

As our demand for data increases, how and where we process data will continue to evolve. Edge computing is just beginning to demonstrate its impact. As the expanding network of data centres continues to grow, access to power, cooling and connectivity will become even more important. The industry must turn to technology to mitigate challenges being caused by technology. Precision immersion liquid cooling is fast becoming one of those critical solutions for delivering reliability and energy efficiency while responding to increased processing and storage requirements.
