
Taking data analytics to the cloud


In 2017, The Economist published an article titled ‘The world’s most valuable resource is no longer oil, but data’. In truth, firms in the financial services sector, especially in capital markets, have known this for many years.

The digitisation of financial services arguably began with the ‘Big Bang’ of 1986, when the London Stock Exchange switched to an automated quotation system, replacing the trading floor. Since then, technology has continued to disrupt and transform the sector – the introduction of High-Frequency Trading (HFT) and cryptocurrencies are two obvious examples, but there are many more.

Data, more specifically the analysis of data, has been fundamental to this transformation. Financial services firms were among the first companies to realise that faster access to deeper, richer insights could give them a significant competitive advantage over their peers, especially in areas such as HFT.

However, and rather inevitably, the focus on analytics – and on ever more sophisticated, automated algorithms using Machine Learning to generate maximum alpha and minimise risk – has created what can best be described as an ‘arms race’ in the sector. To remain agile and competitive, banks must constantly update their offering, developing new strategies that advance positions and protect customers. For that to happen, they need to re-think their approach to data analytics.

The time-series continuum

Banks can’t outperform their competitors using stale data. Yet many organisations still work to timescales of days and weeks when developing, validating, and launching new algos and models, when they need to be thinking in seconds and minutes.

This might seem a strange statement after lauding the sector in the opening paragraphs for being at the vanguard of data and analytics. But the truth is many existing data management and analytics platforms simply can’t deliver the speed of insight required on the huge volumes of data being created.

Why is this? Arguably a combination of legacy systems – inflexible on-prem data centres and applications that cannot easily be scaled to deliver the required storage and compute – and an approach to analytics that puts too much focus on historical, or ‘big’, data.

The key is to think of all data (especially financial data, though this applies to many other industries) as time-series: data with a specific value at a fixed point in time. The real-time context is critical because the business value of that data begins to perish from the moment it is created.
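As a minimal sketch of this idea – the symbol, prices, and timestamps below are entirely hypothetical – a time-series view simply attaches a timestamp to every recorded value, and the freshest insight is always the most recent observation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Tick:
    """One time-series observation: a value fixed at a point in time."""
    symbol: str
    price: float
    ts: datetime

# Hypothetical trade prices: each value is only meaningful with its timestamp.
ticks = [
    Tick("VOD.L", 72.10, datetime(2023, 5, 1, 8, 0, 0, tzinfo=timezone.utc)),
    Tick("VOD.L", 72.14, datetime(2023, 5, 1, 8, 0, 1, tzinfo=timezone.utc)),
]

# The most valuable observation is the newest one; its value decays from here.
latest = max(ticks, key=lambda t: t.ts)
print(latest.symbol, latest.price)
```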

By treating data analytics as a continuous insights ‘engine’ rather than a batch process (even if that batch runs hourly or daily), there are never any ‘gaps’ in an organisation’s understanding of how the business is operating. The result is a continuous, recorded stream of truth that enables the rapid innovation and development of products and services.
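The difference can be sketched with a hypothetical running average: a batch job only produces an answer once the whole window has closed, while a continuous engine folds each new observation in as it arrives, so an up-to-date insight exists after every single event:

```python
class RunningMean:
    """Incrementally maintained mean: updated per event, no batch recompute."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value: float) -> float:
        self.count += 1
        self.total += value
        return self.total / self.count  # fresh insight after every event

stream = [101.0, 102.0, 99.0, 104.0]  # hypothetical prices arriving one by one

engine = RunningMean()
for price in stream:
    current = engine.update(price)  # continuous: no gap between data and insight

# Batch equivalent: the same final answer, but only once the window closes.
batch = sum(stream) / len(stream)
assert abs(current - batch) < 1e-9
```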

Continuous analytics in the cloud

All financial organisations have a cloud strategy. The benefits of scale, flexibility, and cost savings are well understood. The strategic focus should now shift to ensuring cloud architecture is optimised to deliver the continuous analytics that are fundamental to ongoing operational efficiency and commercial success.

There are three key considerations:

1: Access to data sources: The more market, trade, order, and reference data that can feed into an analytics platform, the better the quality of the insights that can be extracted. However, financial organisations have struggled to manage the sheer volumes, formats, and locations of that data. Continuous analytics depends on the ability to instantly connect data sources, regardless of location, and build data pipelines to run analytics over. How easy is it to build these pipelines? Are they resilient, and can they scale?

2: Optimised for coding: How quickly can quants, data analysts, and data scientists extract value from the data rather than spend time managing and processing it? Can they use preferred languages such as SQL and Python to develop and validate new algorithms in timescales that deliver a competitive advantage? Are there in-built Machine Learning interfaces or easy access to relevant microservices? Can reports be built quickly and visualised in a way that delivers immediate value?

3: Upgrade path: Most, if not all, banks will, as part of their cloud strategy, be looking at how to re-architect and eventually rebuild their data management and analytics systems in the cloud. What does that journey look like? How easy will it be to move data and optimise costs? What is the path to re-architecting, and in some cases rewriting, applications? How well integrated will it be with their cloud vendor’s wider ecosystem of services and support?
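On the second consideration, the kind of question a quant wants answered quickly is something like a per-symbol volume-weighted average price (VWAP). A minimal sketch in plain Python – the trade records are hypothetical, and in practice this would run in SQL or a dataframe library over a live feed – shows the shape of the work:

```python
from collections import defaultdict

# Hypothetical trade records: (symbol, price, size).
trades = [
    ("VOD.L", 72.10, 1000),
    ("VOD.L", 72.20, 3000),
    ("BARC.L", 154.50, 500),
    ("BARC.L", 154.10, 1500),
]

# VWAP per symbol = total notional traded / total volume traded.
notional = defaultdict(float)
volume = defaultdict(float)
for symbol, price, size in trades:
    notional[symbol] += price * size
    volume[symbol] += size

vwap = {symbol: notional[symbol] / volume[symbol] for symbol in notional}
print(vwap)
```

The analytical question fits in a few lines; the platform’s job is to make sure the time between posing it and answering it – over live, full-scale data – is measured in seconds, not days.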

Ultimately, we see most, if not all, banks moving the vast majority of their data management and analytics requirements to the cloud. There will still be some processes requiring ultra-low latency that can only be delivered on-prem, but otherwise the cloud offers myriad benefits. However, not all clouds – just like not all real-time analytics platforms – are the same, and banks need to give careful consideration to how their existing data sources, applications, and processes can be migrated, all the while prioritising the need for a continuous stream of analytics and insights.

James Corcoran
Head of Growth at KX



Get the Data Centre Review Newsletter direct to your inbox.