It is down to organisations to protect their own data in the cloud and prevent unauthorised data access, writes Sergei Serdyuk, VP of Product Management at NAKIVO.
Organisations are driven to utilise cloud technologies for the considerable flexibility and scalability that the cloud affords, allowing businesses to easily adjust their usage to match changing requirements. Cloud platforms also bring cost savings by reducing hardware and maintenance investments, while the flexible pricing model of cloud services takes their cost effectiveness a step further. The cloud enables collaboration and data sharing with ease, while well-executed cloud migration has the potential to increase revenue and profitability. With all these benefits, it is easy to assume that once your data is in the cloud, it is automatically safe and sound. Unfortunately, that is not the case.
The shared responsibility model
There are at least three major reasons why companies need to protect their data in cloud environments.
Firstly, it is crucial to know that data protection is usually not the cloud provider’s responsibility. The provider might handle the infrastructure, but data protection falls under the remit of the customer. The second reason is the risk of violating data privacy and retention regulations. If proper measures to comply with those regulations are not in place, organisations could find themselves on the wrong side of the law. The third reason is the risk of common data loss threats such as ransomware attacks, where data can be held hostage and victims are often forced to pay a hefty ransom to get their data back.
Cloud service providers do not usually promise to protect your data. Most providers operate on the Shared Responsibility Model, which is often misunderstood. This model explicitly states that the cloud provider is only responsible for maintaining the availability and security of the infrastructure. This means it is up to customers to protect their own data and prevent unauthorised access. Therefore, assuming the cloud provider will handle the protection and recovery of one’s digital assets could be a costly mistake.
The need for data protection in the cloud is also enforced by industry-specific regulations. There are also government policies that restrict the storing of personal information off-site. For example, in the healthcare and financial services industries, there are strict laws requiring organisations to not only protect sensitive data like patient records and financial transactions, but also store them for extended periods of time. The problem is that cloud service providers do not always offer the degree of control and visibility organisations need in order to comply with such specific regulations. This means the cloud can carry a high risk of compliance violations, which could have legal and financial consequences. Although major providers usually have strong security measures in place, data housed in the cloud is still vulnerable to threats like data leaks and ransomware attacks. Recently, a specialised ransomware type known as Ransomcloud has emerged, which specifically targets data in cloud platforms.
Given the current state of data protection laws and strict uptime requirements, even a fairly low risk of data loss is unacceptable. At the same time, the high accessibility that cloud platforms provide has its drawbacks. For instance, it leaves the organisation vulnerable to scenarios such as a disgruntled employee gaining access to company data and deleting files, which may only be recoverable from backups. The same can happen accidentally, as human error remains one of the most common causes of business data loss.
Core components of a reliable cloud data protection strategy
Data classification
Different types of data require different approaches to protection. Your strategy should classify data according to its type, sensitivity and the business impact if it is stolen, altered or destroyed. With data classification in place, you can prioritise efforts and focus on protecting critical data that requires fast recovery in the event of a disaster, while data that is not critical but important enough to retain can be directed to more economical storage in the cloud. A simple sketch of this idea follows.
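As a minimal illustration, a classification can be expressed as a mapping from data categories to their sensitivity and criticality, which then drives protection priority. The categories, attribute names and values below are assumptions made for the sake of the sketch, not a prescribed scheme.

```python
# Illustrative data classification; the categories and attributes are assumptions for this sketch
CLASSIFICATION = {
    "patient_records": {"sensitivity": "confidential", "business_critical": True},
    "financial_transactions": {"sensitivity": "confidential", "business_critical": True},
    "marketing_assets": {"sensitivity": "internal", "business_critical": False},
}

def protection_priority(category: str) -> str:
    """Critical, sensitive data gets the highest protection priority and the fastest recovery target."""
    entry = CLASSIFICATION[category]
    return "high" if entry["business_critical"] and entry["sensitivity"] == "confidential" else "standard"
```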
Data backups
Once the above priorities are set, the procedures that support data backups should be at the centre of the strategy. These procedures are integral to avoiding data loss, as well as being a requirement for compliance. Backups should be performed regularly, as a disaster can happen at any time. Once the appropriate storage has been configured for backup workloads, the next step is to assign tiers to backup data based on its importance. Critical backup data usually goes to the highest tier, while data that is not critical to business operations goes to less expensive storage, simplifying complex infrastructures and streamlining workflows. One possible implementation of tiering is sketched below.
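If backups land in object storage such as Amazon S3, one way to apply tiering is to map each tier to a storage class at upload time. The tier names and bucket below are hypothetical, and the mapping is only a sketch of the idea rather than a recommended configuration.

```python
import boto3

# Hypothetical mapping of backup tiers to S3 storage classes; adjust to your own cost and recovery needs
TIER_TO_STORAGE_CLASS = {
    "critical": "STANDARD",       # fastest recovery, highest cost
    "important": "STANDARD_IA",   # infrequently accessed, lower cost
    "archive": "GLACIER",         # long-term retention, lowest cost
}

def upload_backup(path: str, tier: str, bucket: str = "example-backup-bucket") -> None:
    """Upload a backup file to the storage class that matches its tier."""
    s3 = boto3.client("s3")
    with open(path, "rb") as f:
        s3.put_object(Bucket=bucket, Key=f"{tier}/{path}", Body=f, StorageClass=TIER_TO_STORAGE_CLASS[tier])
```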
Immutability
It is not only cloud data that is vulnerable – backup data is also not automatically protected. This issue can be addressed by applying immutability to backups, effectively preventing any changes to backup data.
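In object storage, immutability is typically achieved with write-once-read-many settings. As a minimal sketch, assuming backups go to an Amazon S3 bucket created with Object Lock enabled, each backup object can be locked against changes until a retention date; the bucket name, object key and retention period below are illustrative only.

```python
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3")

# The bucket must have been created with Object Lock enabled for these settings to apply
with open("vm01-2024-06-01.bak", "rb") as f:
    s3.put_object(
        Bucket="example-immutable-backups",
        Key="vm01/2024-06-01.bak",
        Body=f,
        ObjectLockMode="COMPLIANCE",  # retention cannot be shortened or removed, even by the root account
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=30),
    )
```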
Encryption
Encryption is a crucial and often overlooked aspect of any cloud data protection strategy. Allowing your cloud service provider to encrypt your data is like locking your house up and giving someone else the keys. A rule of thumb is to encrypt data before it leaves your premises to be stored in the cloud. By doing so, you maintain complete control over the encryption keys and, ultimately, the security of your data.
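A minimal sketch of this client-side approach, using the Python cryptography library’s Fernet scheme, is shown below. The file names are hypothetical, and in practice the key would live in your own key management system rather than alongside the data.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # keep this key on premises, e.g. in your own key management system
cipher = Fernet(key)

# Encrypt the file locally; only the ciphertext ever leaves your premises
with open("quarterly-report.csv", "rb") as src:
    ciphertext = cipher.encrypt(src.read())

with open("quarterly-report.csv.enc", "wb") as dst:
    dst.write(ciphertext)  # this encrypted copy is what gets uploaded to the cloud
```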
The zero trust model
Insider threats can be as severe as external ones, with employees misusing their permissions to leak or delete critical data. Traditionally, companies have relied on authentication with passwords alone, which can be stolen or easily guessed. The zero trust model does not automatically trust any user or device; instead, it establishes reliable verification mechanisms. Role-based access control grants staff only the access levels their roles require, while two-factor authentication acts as an additional safeguard against unauthorised access, making data much more secure.
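As a small illustration of the deny-by-default principle behind role-based access control, a permission check can grant nothing unless a role explicitly allows it. The roles and actions here are made up for the example.

```python
# Hypothetical role-to-permission mapping; anything not listed is denied by default
PERMISSIONS = {
    "admin": {"read", "write", "delete"},
    "backup_operator": {"read", "write"},
    "analyst": {"read"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the role explicitly grants the requested action."""
    return action in PERMISSIONS.get(role, set())

assert is_allowed("analyst", "read")
assert not is_allowed("analyst", "delete")  # analysts cannot delete; unknown roles get no access at all
```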
Data recovery
Part of the data protection strategy should include robust measures for data recovery in case of accidents or disasters. A dedicated site for disaster recovery should be established, with recovery procedures configured for multiple scenarios. This ensures there is a predefined plan, with resources in place, to recover data and resume operations.
The next step is to make sure data recovery utilises available resources efficiently. As the amount of data organisations generate and store grows rapidly, ensuring the data protection process remains cost effective and efficient is vital.
Finally, data recovery test runs should be carried out to identify any bottlenecks or issues, and ensure defined objectives are met. The worst time to find a weak spot in defence is during an attack or a system failure.
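One simple element of such a test run is verifying that restored data actually matches the source, for example by comparing checksums. The file paths below are placeholders for whatever a test restore produces.

```python
import hashlib

def sha256(path: str) -> str:
    """Compute a file's SHA-256 checksum in chunks so large backups do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# After a test restore, confirm that the restored copy is identical to the original
if sha256("original/db.dump") != sha256("restored/db.dump"):
    raise RuntimeError("Recovery test failed: restored data does not match the source")
```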
Hybrid cloud: a balanced approach
While it is not advisable to place all bets on the public cloud, it need not be abandoned either. A hybrid cloud architecture offers a balanced approach to cloud adoption. This model combines the benefits of private and public clouds to deliver a blend of security, visibility, performance and cost effectiveness.
Critical data can be stored in the private cloud, ensuring higher levels of control and security, while less sensitive data can be kept in public clouds. This creates a layer of redundancy by distributing data across multiple locations – an important safeguard against data loss. Hybrid cloud storage also scales seamlessly according to need, and it reduces overheads by balancing the consumption of on-premises and cloud storage resources, resulting in long-term cost savings.