Where is the edge headed?

While edge computing has many definitions, deploying IT systems at the edge is a fast-growing trend with considerable market impact (>$100 billion).

The pendulum is swinging back hard and fast – from mainframes to client/server to the cloud, and now back to the edge. Edge computing will never completely displace the cloud or corporate data centres, but it will significantly change how IT teams think about and architect their systems. To understand why, let us look at the market, definitions of the edge and near-term predictions for edge computing.

Why defining ‘the edge’ became cloudy

Data production at the edge continues to grow exponentially, and applications increasingly must run locally, close to that data. Enormous volumes of data are being created at the edge across industries from energy and manufacturing to retail, healthcare, first responders and ‘smart cities’, generated by cameras, digital sensors, POS systems and a host of other IoT devices. To improve customer experiences, efficiencies and profits, this data must be processed and analysed where it is created, which for these industries is the edge. That is a challenge for organisations and enterprises relying on cloud strategies and architectures: cost, latency and reliability issues abound in cloud implementations, and regulatory constraints on data location and movement continue to grow.

With a steady stream of Twitter chatter from vendors, analysts and media, it is no wonder there is confusion around defining the edge. Business and IT leaders have spent the last decade thinking about and implementing ‘cloud first’ strategies, and some hesitate to make edge infrastructure investments. Despite the discord, there are things we know to be true about the edge and things that will continue to be debated.

The truths about the edge

According to Gartner, “edge computing is part of a distributed computing topology where information processing is located close to the edge, where things and people produce or consume that information.” In practice, the edge is any location outside the data centre or cloud where an organisation needs to run applications locally, minimising the need to send data to a remote data centre for processing. The edge could be an oil rig in the Atlantic Ocean, a manufacturer’s 10 operations sites or a retail chain’s 10,000 locations.

Most edge sites still need connectivity to the cloud or a corporate data centre to access certain services. Even so, many organisations avoid deploying compute and storage at the edge, only to find that depending on cloud services is expensive, rules out real-time application processing and leaves applications exposed to downtime whenever internet connectivity fails. Some cloud enthusiasts are realising they are spending unnecessarily with a ‘cloud first’ strategy.

However, managing, using and protecting data at the edge is difficult, because most tools are designed for the data centre or cloud. IT managers struggle to classify which edge data must remain on site, which should be archived in the cloud, which should be backed up to a data centre and which can be deleted altogether. These challenges push decision-makers towards expensive cloud-led application, data and management options rather than flexible, future-proofed strategies that address the specific characteristics and uses of edge data.
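As a rough illustration of that classification problem, the sketch below encodes one hypothetical tiering policy in Python. Every category, threshold and field name is invented for the example; real policies are shaped by regulation, application requirements and data-residency rules.

```python
# A purely illustrative tiering policy for edge data: keep it local, archive it
# to the cloud, back it up to a data centre, or delete it. All categories,
# thresholds and field names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class EdgeRecord:
    source: str          # e.g. "camera", "pos", "sensor"
    created: datetime
    regulated: bool      # subject to data-location/movement constraints
    hot: bool            # still needed by local, real-time applications

def placement(record: EdgeRecord) -> str:
    """Decide where this record should live."""
    age = datetime.now(timezone.utc) - record.created
    if record.hot:
        return "keep-at-edge"            # real-time processing must stay local
    if record.regulated:
        return "backup-to-data-centre"   # movement constrained; keep a protected on-premises copy
    if age > timedelta(days=30):
        return "delete"                  # hypothetical retention window has expired
    return "archive-to-cloud"            # cold, unregulated data can move off-site

print(placement(EdgeRecord("pos", datetime.now(timezone.utc), regulated=False, hot=True)))
```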

What the edge is not 

Edge and IoT are frequently mistaken for each other, which clouds the distinction between them. IoT is better understood as a subset of the edge, one class of edge use case. The same goes for the conflation of cloud and edge: some business leaders mistakenly assume that deploying an edge site calls for the same strategy they use for the cloud.

Colocation facilities (colos), which businesses rent to house servers and other computing hardware, are not the edge. Colos can hold hundreds or thousands of servers and petabytes of storage; they are really remote data centres, though their owners and providers sometimes promote them as ‘edge data centres’ to leverage the trendy term. Yet if a retailer tried to use one of these ‘edge data centres’, it would get the same result as running applications in the cloud – with the same problems described above.

Problems with the edge

The edge presents challenges for IT departments that must build infrastructure to run applications, and to store and analyse data, at these small and sometimes remote sites. Most infrastructure hardware and software was designed with the cloud and large data centres in mind, and it translates poorly to the edge: providing a 100% uptime environment for all local applications with that class of equipment is too costly and complex.

Edge sites are small and lack the space, power and cooling needed for data centre-class hardware, so most look for small-footprint servers and storage – typically some type of hyperconverged solution. Historically, it has been difficult and expensive to deploy edge infrastructure that delivers 100% uptime and the performance needed to run all local applications; however, many vendors have now solved this problem, and most integrators and VARs have the tools to help end users do just that.

Another major problem with edge computing is complexity. Edge sites often have no IT staff on hand to perform maintenance or management, yet edge systems are typically critical enterprise infrastructure. Organisations must acknowledge this complexity and choose solutions that are simple to install and operate, and that allow every edge site to be managed remotely from a single pane of glass.
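To make the ‘single pane of glass’ idea concrete, the short Python sketch below polls a health endpoint at each edge site from one central console. The site names, URLs and the /health endpoint are hypothetical; real management platforms expose far richer APIs than this.

```python
# Illustrative 'single pane of glass' check: one central script polling a
# health endpoint at every edge site. Site names, URLs and the /health
# endpoint are hypothetical.
import requests

EDGE_SITES = {
    "store-0001": "https://edge-0001.example.com/health",
    "store-0002": "https://edge-0002.example.com/health",
    "rig-atlantic-07": "https://rig-atlantic-07.example.com/health",
}

def poll_sites() -> dict:
    """Return a simple status string per site so one operator can watch every location."""
    status = {}
    for site, url in EDGE_SITES.items():
        try:
            resp = requests.get(url, timeout=5)
            status[site] = "ok" if resp.status_code == 200 else f"degraded (HTTP {resp.status_code})"
        except requests.RequestException as exc:
            status[site] = f"unreachable ({type(exc).__name__})"
    return status

if __name__ == "__main__":
    for site, state in poll_sites().items():
        print(f"{site}: {state}")
```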

What will the edge look like in the future?

In the next five years, we expect to see these edge problems addressed through industry innovation in the way applications and data are deployed and managed. Organisations today want simple-to-use hardware, software and management tools designed specifically for the edge, so they can run their applications and analyse their data in real time; because no such purpose-built tools yet exist in the market, enterprises are forced to pay for more comprehensive systems that are overkill for the job.

Edge applications will be increasingly container-dependent, which can save an organisation money by eliminating the expensive ‘hypervisor tax’ of licensing and running a virtualisation layer. Data services will also need to improve. Today, data is created, stored and usually backed up at the edge, with some of it sent to the cloud or a data centre for further processing and analysis. As organisations’ need to act on data at the edge continues to grow, storage systems must evolve to support that requirement.
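As a minimal sketch of what container-dependent edge deployment can look like, the Python snippet below uses the Docker SDK to start a containerised workload directly on an edge node, with no virtual machine underneath. The registry, image name, port and volume path are placeholders, and a single-node Docker runtime is assumed.

```python
# Minimal sketch of a container-first edge deployment using the Docker SDK for
# Python (pip install docker). Assumes a single edge node running a container
# runtime; the registry, image, port and volume path are placeholders.
import docker

client = docker.from_env()  # connect to the local container runtime

# Run a hypothetical local analytics service as a container - no hypervisor,
# just the container runtime on the edge node's operating system.
container = client.containers.run(
    image="registry.example.com/pos-analytics:1.4",              # placeholder image
    name="pos-analytics",
    detach=True,                                                  # run in the background
    restart_policy={"Name": "always"},                            # survive reboots at an unstaffed site
    ports={"8080/tcp": 8080},                                     # expose the local API/dashboard
    volumes={"/var/edge/data": {"bind": "/data", "mode": "rw"}},  # keep local data local
)
print(container.name, container.status)
```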

In the next 10 years, the edge will hold more data and processing than the cloud and will become the predominant location where data and applications are managed and processed. Innovation will happen faster at the edge than in the cloud. This shift will drive changes in server design, storage capabilities and management software, with products becoming smaller, faster and easier to manage. The cloud won’t go away, but it will not be used heavily for edge computing use cases.

The right definition of edge today

Though the debate around the latest or truest definition of the edge will likely continue, one might argue that the right definition for today is this: the edge is anywhere outside the data centre or cloud where an organisation needs to run applications locally to avoid the cost, latency and reliability risks of doing everything in the cloud.
