With legacy data centres closing their doors, here's how to stay ahead

Dean Clark
Chief Technology Officer at GFT

With ageing facilities closing faster than ever, Dean Clark, Chief Technology Officer at GFT, argues that those who turn the upheaval into data-agile, cloud-powered advantage will pull ahead.

Whilst data centres continue to play a key role in today's technology landscape, legacy facilities raise serious issues. Their infrastructure is often outdated, leading to poor energy efficiency and high running costs, and they are simply not fit to meet the demands of modern computing. As a result, some organisations are upgrading rather than retiring them: Vodafone, for example, recently unveiled a data centre modernisation strategy to enable high-performance computing whilst maximising its legacy infrastructure.

However, modernising legacy data centres can be challenging and expensive, so many are shutting down instead. Singaporean telco Singtel, for example, announced last year the closure of five of its legacy data centres in Singapore as it pivots towards more sustainable, AI-focused facilities.

In this context, pressure on organisations is rising: they often need to move their data at short notice once a closure notice lands. It's a headache, but it also highlights the importance of data agility: the ability to capture, process and use data quickly and smoothly. In an era where the tech landscape can change overnight, agile data empowers businesses to adapt quickly, innovate and stay ahead of the curve. It's no longer a nice-to-have competitive advantage; it's essential for survival.

Why it's time to look beyond ageing data centres

ESG standards have become front of mind for the tech industry, so legacy data centres that run too hot, consume too much energy and leave a significant mark on the environment have no place in the landscape. What's more, legacy data centres make processes less efficient: their siloed setups and patchy data quality slow decisions and, above all, inflate costs compared to what we expect from today's cloud-native thinking and real-time analytics.

Upfront hardware spending locks you into guesses about demand, and scaling up means waiting for boxes to arrive. Security is also a challenge. Cloud providers refresh their defences constantly, whereas on-prem teams struggle to keep pace. Add worries about data residency and rising energy bills, and the case for change writes itself.

The key steps for a successful transition to the cloud

Cloud providers offer elastic capacity, granular cost control and built-in security, and many organisations are moving to the cloud as a result. But this transition can’t be done overnight. You still need a plan.

The first step in any cloud migration is to design a clear cloud strategy aligned with your business goals. Triage workloads by impact, move the critical stuff first, and pick the right mix of public, private or multicloud. Multicloud is becoming more popular within the industry because it avoids lock-in and lets you cherry-pick each provider's strengths.
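
To make the triage idea concrete, here is a minimal sketch of how a migration team might score and order workloads. The attributes, scoring rule and placement heuristic are illustrative assumptions, not a prescribed methodology.

```python
# A minimal sketch of workload triage for migration planning.
# The attributes, scoring rule and placement heuristic below are
# illustrative assumptions, not a prescribed methodology.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    business_impact: int   # 1 (low) to 5 (critical)
    migration_effort: int  # 1 (lift-and-shift) to 5 (full rewrite)
    data_sensitivity: int  # 1 (public) to 5 (regulated)

def triage(workloads: list[Workload]) -> list[Workload]:
    """Order workloads so high-impact, low-effort moves come first."""
    return sorted(workloads, key=lambda w: (-w.business_impact, w.migration_effort))

def placement(w: Workload) -> str:
    """A crude rule of thumb: keep the most sensitive data private."""
    return "private cloud" if w.data_sensitivity >= 4 else "public cloud"

portfolio = [
    Workload("payments-api", business_impact=5, migration_effort=2, data_sensitivity=5),
    Workload("marketing-site", business_impact=2, migration_effort=1, data_sensitivity=1),
    Workload("reporting-batch", business_impact=4, migration_effort=4, data_sensitivity=3),
]

for w in triage(portfolio):
    print(f"{w.name}: move to {placement(w)}")
```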

In parallel, it's key to ensure strong governance for security, compliance and data protection. And once you've reached this step, keep tuning performance and costs: the job isn't 'done' the day you flip the switch. Monitor on an ongoing basis to check that everything is functioning as expected.
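
As one illustration of what ongoing cost monitoring can look like, the sketch below polls daily spend via AWS Cost Explorer, assuming an AWS estate and the boto3 SDK; the alert threshold is a hypothetical example, and a real programme would track performance metrics alongside cost.

```python
# A minimal sketch of ongoing cost monitoring, assuming an AWS estate
# and the boto3 SDK; the alert threshold is a hypothetical example.
import boto3

COST_ALERT_THRESHOLD = 500.0  # hypothetical daily budget in USD

def daily_costs(start: str, end: str) -> list[tuple[str, float]]:
    """Fetch daily unblended cost between two ISO dates (YYYY-MM-DD)."""
    ce = boto3.client("ce")
    response = ce.get_cost_and_usage(
        TimePeriod={"Start": start, "End": end},
        Granularity="DAILY",
        Metrics=["UnblendedCost"],
    )
    return [
        (day["TimePeriod"]["Start"], float(day["Total"]["UnblendedCost"]["Amount"]))
        for day in response["ResultsByTime"]
    ]

for date, cost in daily_costs("2025-01-01", "2025-01-08"):
    if cost > COST_ALERT_THRESHOLD:
        print(f"ALERT: {date} spend ${cost:.2f} exceeds budget")
```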

Tech alone won’t cut it – invest in people and process too

When transferring your organisation's data, applications and workloads to a cloud environment, you should not rely on advanced tech alone to do the job; people are also core to any cloud strategy. Investing in upskilling your teams, redesigning workflows or bringing in external partners for consultancy is key to making the most of cloud technology and, ultimately, to making processes quicker and more efficient. Make portability a design rule, especially if you still rely on mainframes or COBOL apps. Executive backing is crucial: cloud programmes stall fast without a senior sponsor clearing roadblocks.
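
As a sketch of what portability as a design rule can mean in practice, the example below has application code depend on a provider-neutral interface rather than any single vendor's SDK. The class and method names are illustrative, not a standard API.

```python
# A minimal sketch of portability as a design rule: application code
# depends on an interface, not one provider's SDK. The class and
# method names are illustrative, not a standard API.
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Provider-neutral contract the application depends on."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class S3Store(ObjectStore):
    """AWS-backed implementation (boto3 calls elided for brevity)."""
    def put(self, key: str, data: bytes) -> None:
        raise NotImplementedError("wire up boto3 here")
    def get(self, key: str) -> bytes:
        raise NotImplementedError("wire up boto3 here")

class InMemoryStore(ObjectStore):
    """Test double; also a stand-in for any other provider."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

def archive_statement(store: ObjectStore, account: str, pdf: bytes) -> None:
    """Business logic sees only the interface, so swapping clouds means
    swapping one adapter, not rewriting the application."""
    store.put(f"statements/{account}.pdf", pdf)
```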

This need to migrate to maximise performance through cloud computing is especially true for banks, which are competing with cloud-native challengers that launch features in weeks whilst traditional institutions remain chained to mainframes. Moving core workloads to the cloud isn't just a good idea; it's the bare minimum.

Take your time with your cloud migration

A rushed transition to the cloud can expose vulnerabilities, especially in complex environments. Cloud migration touches every layer of the business: apps, data stores, ops models and governance, so misjudging the timing can lead to overruns, downtime and budget blow-outs. Instead of treating the shift like a business emergency, approach it as a full-scale business transformation.

The importance of data agility

Data agility is key to faster decision-making, flexible operations and innovation, and it can be achieved by moving to modern cloud platforms, as Romanian neobank Salt Bank has shown. Salt Bank launched in under a year on Engine by Starling's cloud-native core; bypassing banks' traditional infrastructure enabled it to move quickly and put customers' needs first.

For a similar outcome, businesses must invest in modern data platforms and a culture that treats data as a strategic asset. Real-time streaming should deliver insights in the moment, not days later. And with the market constantly moving, both strategy and tech must be reviewed continuously, not just once after the cloud migration.
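
As a rough illustration of in-the-moment insight, the sketch below consumes events from a stream and flags them as they arrive, assuming a Kafka cluster and the confluent-kafka Python client; the broker address, the "transactions" topic and the threshold are hypothetical examples.

```python
# A minimal sketch of acting on data in the moment, assuming a Kafka
# cluster and the confluent-kafka client; the broker address, topic
# name and threshold are hypothetical examples.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "realtime-insights",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["transactions"])  # hypothetical topic

LARGE_PAYMENT = 10_000  # illustrative threshold

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for the next event
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Act on the event as it happens, not in tomorrow's batch report.
        if event.get("amount", 0) > LARGE_PAYMENT:
            print(f"Large payment flagged in real time: {event}")
finally:
    consumer.close()
```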

In conclusion, whilst data centre closures create challenges for organisations, they also present a great opportunity to adapt and innovate to stay ahead. Navigate the transition wisely, and you'll come out faster, leaner and much more ready to face the future.
