The cloud is virtually ubiquitous today, with Gartner predicting that public cloud spend will reach nearly $600 billion globally this year and grow by another 21% in 2024. At the same time, hybrid and multi-cloud environments are becoming commonplace, with data and workloads residing in different locations.
On the face of it, this should provide organisations with greater flexibility and resiliency, but being able to move data between different clouds is often more complex and expensive than expected. Organisations often face huge exit fees, and the integrations, security, tools, and skills needed to manage data differ for each cloud provider. Because of these costs and complexities, many organisations can find themselves locked into cloud services.
This has caught the eye of regulators in the UK, who are starting to examine the practices of the major cloud providers that contribute to ‘lock-in’.
The problem with lock-in
Ofcom, the UK regulator for communications services, identifies three issues with the behaviour of the major cloud providers. The first is the egress fees they charge to transfer data to another provider, which can be significantly higher than those charged by other cloud service providers (CSPs). The second is also financial: discounts are often structured to incentivise customers to use a single hyperscaler, even if better alternatives exist. The third is technical restrictions that make interoperability between cloud platforms challenging, with each hyperscaler having its own machines, security posture, tooling and configurations. All of these require a unique set of skills to manage.
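To make the egress-fee problem concrete, a rough back-of-the-envelope calculation is useful. The per-gigabyte rate below is an illustrative assumption, not any provider's actual pricing, which varies by region, volume tier and destination:

```python
# Illustrative only: egress pricing varies by provider, region, and tier.
# The $0.09/GB flat rate below is an assumption, not a quote from any provider.
EGRESS_RATE_PER_GB = 0.09  # hypothetical rate in USD


def egress_cost(terabytes: float, rate_per_gb: float = EGRESS_RATE_PER_GB) -> float:
    """Rough cost of transferring `terabytes` of data out of a cloud."""
    return terabytes * 1000 * rate_per_gb


# Moving a modest 500 TB data estate out at this rate:
print(f"${egress_cost(500):,.0f}")  # prints "$45,000"
```

Even at a flat illustrative rate, a one-off migration of a mid-sized data estate runs to tens of thousands of dollars, which helps explain why the fee structure itself acts as a switching barrier.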
This is more than an abstract regulatory issue. Vendor lock-in could lead to public cloud customers facing large price hikes, without alternative, cost-effective options to take their data and business elsewhere. Organisations may also find over time that their provider no longer offers the capabilities their business needs, but they’re still forced to stay.
At the same time, if too many critical service providers like banks or telcos rely on a small pool of cloud providers, they – and their customers – risk being exposed to potentially crippling outages.
As a result, organisations aren’t shying away from cloud, but they are becoming wary about what data and workloads are stored there. Recent research reveals that nine in ten IT decision-makers (92%) plan to migrate more data to the cloud over the next three years. While only 4% have repatriated data back on-premises in the last 12 months, 76% plan to repatriate some data back to on-premises environments in the next 36 months. More than half (54%) say this is down to fears over vendor lock-in.
Freedom is everywhere, except in the cloud
Think back to how difficult switching banks or energy, internet and mobile providers used to be. It was more trouble than it was worth, fraught with delays and endless red tape. As a result, regulators stepped in, industries modernised, and it became much easier to move between service providers, giving consumers more flexibility and choice. The same should apply to public cloud, with customers having the same freedom when choosing and switching between service providers.
However, even if regulators decide to take action, it’s unlikely to unleash a flurry of activity that upends the status quo. Why? Because any regulatory action will most likely focus on egress fees and committed spend discounts. There may be efforts to force more consistency between cloud environments – but ultimately, swapping cloud providers takes time, resources, and specialised skills.
As such, in the hybrid and multi-cloud age, organisations need greater flexibility to move data between clouds. A unified data platform can help by providing a layer of abstraction that makes it easier to securely move data from cloud to cloud, or from on-premises environments to any cloud. Having the choice to switch cloud providers is one thing, but having the ability to do so is another.
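The abstraction-layer idea can be sketched in a few lines. This is a minimal, hypothetical illustration: the `ObjectStore` interface and the backends are stand-ins for whatever provider-specific storage APIs a real unified data platform would wrap, and a production platform would also handle security, replication and cataloguing:

```python
# Minimal sketch of a provider-agnostic storage layer (hypothetical names).
# Application code depends only on the ObjectStore interface, so swapping
# cloud backends does not require rewriting the migration logic.
from typing import Protocol


class ObjectStore(Protocol):
    def read(self, key: str) -> bytes: ...
    def write(self, key: str, data: bytes) -> None: ...


class InMemoryStore:
    """Stand-in for any provider-specific backend (object storage, blobs, etc.)."""

    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def read(self, key: str) -> bytes:
        return self._objects[key]

    def write(self, key: str, data: bytes) -> None:
        self._objects[key] = data


def migrate(source: ObjectStore, target: ObjectStore, keys: list[str]) -> None:
    """Copy each object from one backend to another via the shared interface."""
    for key in keys:
        target.write(key, source.read(key))
```

The design point is structural: because `migrate` is written against the interface rather than any one provider's SDK, moving data from one cloud to another becomes a configuration choice rather than a re-engineering project.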