The 5G paradox

David Keegan

CEO of DataQube
Image credit: DataQube

The Internet of Things and the ‘smart’ phenomenon are redefining the data centre landscape.

Machines are churning out data in almost inconceivable volumes, and that data needs to be dealt with at the source if the embedded tech and the interconnected devices and applications that rely on it are to be workable in the real world.

To give this some perspective, a single driverless car (according to Intel) could generate up to 5TB of data every hour. Factor in the cameras and sensors needed to make such cars road safe, not to mention the roadside infrastructure and comms networks needed for operability and safety, and this figure could easily be an underestimate.
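A quick back-of-envelope calculation shows how fast these figures compound. The 5TB/hour rate is from the article; the duty cycle and fleet size below are assumed values purely for illustration:

```python
# Back-of-envelope check of the per-car figure cited above.
# TB_PER_HOUR comes from the article; the other values are assumptions.
TB_PER_HOUR = 5          # per-car generation rate cited above
HOURS_PER_DAY = 8        # assumed daily operating time
CARS = 1000              # assumed small urban fleet

per_car_daily_tb = TB_PER_HOUR * HOURS_PER_DAY
fleet_daily_pb = per_car_daily_tb * CARS / 1000   # 1 PB = 1000 TB

print(per_car_daily_tb)  # 40 TB per car per day
print(fleet_daily_pb)    # 40 PB per day for the whole fleet
```

Even a modest fleet on conservative assumptions produces volumes that make backhauling everything to a centralised facility impractical, which is the case for processing at the source.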

Layered on top of this intensified data generation is 5G, poised to be the catalyst for all things smart, not to mention innovations in AI and machine learning. Because of its ultra-high-speed, ultra-low-latency capabilities, this fifth-generation network is driving change across the industry: data centre infrastructures in their existing format need a total rethink in terms of connectivity and physical location if they are to keep pace with this data explosion.

The race is on to move to the edge

The drive for data centres to re-evaluate their data-handling capabilities, which has been building for some time, has been given a turbocharge by Covid-19. Such is the prediction for change that Gartner estimates 75% of enterprise-generated data will be created and processed outside centralised facilities by 2025. The global market for edge data centres, on the strength of these smaller, locally sited facilities, is also expected to more than triple, from $4 billion in 2017 to $13.5 billion by 2024.

In tandem, investment in 5G infrastructure is also on the up, with the GSMA predicting that operators will allocate more than 80% of their CAPEX to building 5G networks within the same timeframe. 5G and edge computing are clearly transforming the way we harness and use data, but given that they fulfil very different requirements in the overall 'data processing ecosystem', the rapid growth of both raises the question: what would happen if they acted as one?

Firstly, this 'marriage' would deliver an augmented end-user experience and enhanced interconnectivity between devices. Secondly, and far more significantly, combining 5G and edge computing delivers unprecedented access to large pools of data by drastically reducing latency and streamlining service delivery at the edge.

Both capabilities are instrumental to the widescale deployment of IoT and automation. The theory is great; the reality is somewhat different, because edge data centre facilities up to the job are in short supply and the rollout of 5G is not happening as quickly as everyone might have hoped. 5G may well promise to be the holy grail for all things IoT, but edge computing and IoT are already in widespread use. They have been for some time, and they work perfectly well on existing 4G networks, so why all the hype?

5G has been built for machines

To fully understand the true impact 5G is set to have on digitisation, you need to understand the key difference between this next-generation network and its predecessors. Until now, all mobile networks have been designed to meet the needs of people. 5G has been designed with machines in mind. 

The latency of existing 4G networks is around 200 milliseconds, not far off the 250 milliseconds it takes humans to react to visual stimuli. 5G latency is significantly lower, at just 1 millisecond, and 5G also promises delivery speeds of up to 10Gbps. These differentiators may be of little consequence to commercial telecoms services, but this high-speed, low-latency performance will allow machines to achieve near-seamless communication.
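To make the latency gap concrete, consider how far a vehicle travels before the network can even respond. The latency figures are the nominal values quoted above; the vehicle speed is an assumed example, not from the article:

```python
# Distance travelled during one network round trip, using the nominal
# latency figures quoted above. Vehicle speed is an assumed example.
latency_4g_ms = 200      # quoted 4G latency
latency_5g_ms = 1        # quoted 5G latency

speed_kmh = 100                              # assumed vehicle speed
speed_m_per_ms = speed_kmh * 1000 / 3_600_000  # ~0.0278 m per millisecond

print(round(latency_4g_ms * speed_m_per_ms, 2))   # ~5.56 m travelled on 4G
print(round(latency_5g_ms * speed_m_per_ms, 3))   # ~0.028 m travelled on 5G
```

Several metres of "blind" travel per round trip is the difference between a usable and an unusable control loop for a machine, which is why these numbers matter far more to automation than to a person streaming video.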

The technology/usability paradox

While 5G delivers extremely high data rates, the downside is that this new network is shaking up the telco and data centre industries in their entirety. 5G frequency bands have much shorter propagation ranges than their 4G counterparts. This in turn requires significantly more telco equipment above ground to overcome line-of-sight propagation challenges, and more fibre cabling below ground to enable seamless connectivity to the cloud.

Before aspirations of an interconnected world can be achieved, existing telco sites need to incorporate cloud computing into their existing infrastructures to facilitate interconnectivity between IoT devices, other edge data centres and centralised facilities, as well as to manage the associated backhaul. 

And the CAPEX needed to make this happen is proving a major stumbling block for mobile network operators. Their 3G and 4G spectrum investments did not reap the rewards they expected, so justifying the upfront sums to the different stakeholders is challenging. The funds needed are so significant that the GSMA does not expect 5G coverage to be universal until at least 2028.

Existing edge data centre setups are neither viable nor affordable 

These connectivity barriers are a major quandary for regular data centres. Not only must they find a viable means of decentralising their data processing without compromising performance, security, or data fidelity, they also have the interconnectivity aspect to contend with. 

The only options currently available are containerised data centres, purpose-built data centres or micro data centres, all of which involve deployment times of 18 months or more and require huge upfront investment. Worse still, their install sites are limited, because these types of facility are unsuitable for tall buildings, underground locations, motorway verges, railway sidings and the like.

An alternative approach to data processing and edge computing is needed, and unless the major players can find cost-effective, viable means of delivering HPC at the edge within acceptable timeframes, government aspirations for the 'smart' revolution will remain little more than a pipe dream.
