
Five must-haves for real-time analytics using time series data

Image: Adobe Stock / Connect world

The data problem has not appeared overnight. It has been growing for years, and not just in the technology sector.

When Thomas Jefferson was president, he was receiving around 150 letters per month. One hundred years later, Theodore Roosevelt needed a dedicated staff of around 50 to handle the increased volume. By Harry Truman’s time, it was arriving at a rate of three truckloads per day.

Current volumes are around a massive 65,000 letters per week – not to mention the emails, tweets, and social media posts that go along with them – and that’s the data volume challenge every business is now facing.    

But as well as increasing in volume, data has become more valuable, and the decisions based on it more critical – just think of predictive healthcare, anomaly detection, predictive maintenance and operational equipment efficiency, and pre- and post-trade analytics. And that’s before businesses consider what machine learning may reveal in terms of trends or patterns, which, depending on how quickly they’re found, could either yield opportunity or spell disaster for unprepared teams.

Yet many companies, across all industry sectors, are not getting enough of these insights. They are too focused on solving technical problems around their data at the expense of uncovering valuable information with it, and the main reason is that their time series database and real-time analytics software aren’t up to the demands being placed on them. Here are five must-haves for any modern real-time analytics engine:

Time series optimisation    

Most data today is time series based, generated by processes and machines rather than humans. Any analytics database should be optimised for its specific characteristics: data that arrives fast, is append-only, and is time-stamped. It should be able to quickly correlate diverse data sets, perform in-line calculations, execute fast reads, and store data efficiently.
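To make that concrete, here is a minimal sketch using pandas (the article does not name a specific tool, so the library, column names, and values are purely illustrative): time-stamped, append-only records are bucketed for an in-line calculation, and two data sets are correlated with an as-of join.

```python
import pandas as pd

# Illustrative time-stamped, append-only data: trades and quotes.
trades = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-02 09:30:01", "2024-01-02 09:30:07",
                                 "2024-01-02 09:31:03"]),
    "price": [100.1, 100.3, 100.2],
    "qty": [200, 150, 300],
})
quotes = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-02 09:30:00", "2024-01-02 09:31:00"]),
    "bid": [100.0, 100.15],
})

# In-line calculation over one-minute time buckets: volume-weighted average price.
per_minute = (trades.assign(notional=trades.price * trades.qty)
                    .groupby(pd.Grouper(key="timestamp", freq="1min"))
                    .agg(volume=("qty", "sum"), notional=("notional", "sum")))
per_minute["vwap"] = per_minute["notional"] / per_minute["volume"]

# Correlating diverse data sets: match each trade with the latest quote at or
# before its timestamp (an "as-of" join, a staple of time series analytics).
enriched = pd.merge_asof(trades.sort_values("timestamp"),
                         quotes.sort_values("timestamp"),
                         on="timestamp")

print(per_minute[["volume", "vwap"]])
print(enriched)
```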

Openness in a connected world

Alongside this, the data landscape of most large, modern enterprises is broad. This means any analytics engine has to interface with a wide variety of messaging protocols and support a range of data formats, along with inter-process communication (IPC) and REST APIs for quick, easy connectivity to multiple sources. It should also cater for reference data, such as sensor or bond IDs, that enables it to add context and meaning to streaming data sets, making it possible to combine them in advanced analytics and share them as actionable insights across the enterprise.
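As a rough illustration of the reference-data point, the sketch below (with hypothetical sensor IDs, field names, and thresholds) enriches a JSON payload, as it might arrive over a REST endpoint or message bus, with a reference table keyed on sensor ID, turning a raw reading into a context-rich, actionable record.

```python
import json

# Reference data: sensor IDs mapped to context (site, unit of measure, alarm limit).
REFERENCE = {
    "S-1001": {"site": "Plant A", "unit": "degC", "alarm_above": 85.0},
    "S-1002": {"site": "Plant B", "unit": "bar",  "alarm_above": 6.5},
}

def enrich(payload: str) -> dict:
    """Join a streaming reading with its reference data and flag breaches."""
    reading = json.loads(payload)
    ref = REFERENCE.get(reading["sensor_id"], {})
    return {
        **reading,
        **ref,
        "alarm": ref.get("alarm_above") is not None
                 and reading["value"] > ref["alarm_above"],
    }

# A message as it might arrive from an upstream protocol adapter.
msg = '{"sensor_id": "S-1001", "timestamp": "2024-01-02T09:30:00Z", "value": 91.2}'
print(enrich(msg))  # reading plus site, unit, and an alarm flag
```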

Historical data taking the lead    

By combining real-time data for immediacy with historical data for context, companies can make faster and better in-the-moment responses to events as they happen, and eliminate the development and maintenance overhead of replicating queries and analytics across separate systems. This ability to rapidly process vast quantities of data using fewer computing resources is also well suited to machine learning initiatives, not to mention reducing TCO and helping businesses hit sustainability targets.
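The sketch below captures the principle rather than any particular product’s API: a live reading is scored against a baseline computed from historical values, so the in-the-moment response is informed by context. The window, threshold, and values are illustrative.

```python
import statistics

# Recent historical values for the metric being monitored (illustrative).
history = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 20.4]

def is_anomalous(value: float, window: list, threshold: float = 3.0) -> bool:
    """Flag a live reading that deviates sharply from the historical baseline."""
    mean = statistics.fmean(window)
    stdev = statistics.stdev(window)
    z = abs(value - mean) / stdev if stdev else 0.0
    return z > threshold

print(is_anomalous(27.5, history))  # True  -> out of line with history, act now
print(is_anomalous(20.2, history))  # False -> within the normal range
```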

Easy, and early, adoption

Look for analytics software built with microservices that enable developers and data scientists to quickly ingest, transform, and publish valuable insights from datasets without the need to develop complex access, tracking, and location mechanisms. Complications like data tiering, ageing, archiving, and migration can take up valuable time and resources that would be better spent extracting actionable insights. Native integration with the major cloud vendors, available as a fully managed service, should also be an important consideration for businesses seeking an easy adoption process.
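As a rough illustration of that pattern, the sketch below splits a pipeline into small ingest, transform, and publish steps (all function names and fields are hypothetical); in practice each step would run as its own service behind a queue or API, so teams can concentrate on the insight rather than the plumbing.

```python
from typing import Iterable, List

def ingest(raw_records: Iterable[dict]) -> List[dict]:
    """Accept records from any upstream source (REST, message bus, file drop)."""
    return [r for r in raw_records if "value" in r]  # drop malformed records

def transform(records: List[dict]) -> List[dict]:
    """Derive the insight of interest; here, a simple threshold flag."""
    return [{**r, "breach": r["value"] > 100} for r in records]

def publish(records: List[dict]) -> None:
    """Hand results to downstream consumers (dashboards, alerts, other services)."""
    for r in records:
        print(r)

publish(transform(ingest([{"id": "a", "value": 120}, {"id": "b"}])))
```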

Proven in production

Time series databases have been around for a long time. However, the ever-growing volume, velocity, and variety of data, and the need to generate rapid insights and actions from it, mean many technologies are not proven in the field. Look for software with robust use cases and clear examples of return on investment.

Data has evolved, and businesses need to recognise this and evolve too. It is now an asset with its own C-level owners, allowing businesses to automate decisions in fields including trading, production, and networking. It also costs more to acquire and manage, yet it can contribute greater economic value to businesses than ever before. Nor are these the only ways data has changed: it affects markets faster than ever, and structured and unstructured data are increasingly fused together.

Businesses can reap many benefits from continuous, context-rich, analytics-driven insights, so these changes must be recognised and activities adjusted accordingly. Real-time analytics using time series data can deliver better business decisions, enable enterprises to react faster to market changes, increase customer satisfaction, and improve the bottom line – provided they have the right technology in place.

James Corcoran
Head of Growth at KX
