Article by Blancco enterprise and cloud erasure solutions VP Fredrik Forslund
“Data is the new oil” is a phrase we have come to hear all too often. Are they comparable, though? Oil is scarce, whilst data is anything but. However, when you start to dig into how these commodities are used by businesses, there are undeniable similarities.
Both oil and data are highly valuable. Data is one of the biggest assets, if not the biggest, available to today's businesses. Organisations that use it effectively can see improvements in efficiency and quality of service, and greater customer retention. But hoarding data and storing it in silos, in an oil-tycoon-like manner, has become common practice for enterprises. By locking this data away in vast warehouses, enterprises are buckling under the pressure of managing huge quantities of it, and face the damaging consequences that could result from a possible data spill.
Furthermore, with digital transformation driving a move to the cloud, organisations are under pressure to pay attention to their data in a way they haven't needed to before. Failure to prepare properly ahead of a cloud migration could see enterprises risk eye-watering fines for breaching regulatory standards, with potentially catastrophic effects on bottom lines.
So, before embarking on this journey to the cloud, what steps should organisations take to make sure it's a smooth one?
Get under the hood of your data
The fact is that most organisations do not truly understand the data in their possession. It’s vital to regularly review what exists in order to make key decisions about the data that does and doesn’t hold value to the business. This becomes especially important when migrating to the cloud, as enterprises need to be aware of exactly what is being transferred. It presents an opportunity for the enterprise to employ proper methods of data classification; categorising and organising data to ensure its most effective and efficient use.
Classification of data encourages enterprises to make decisions about what data should be saved, and what data should be sanitised. Data sanitisation is a crucial process as it involves deliberately, permanently and irreversibly removing data from a memory device, making it unrecoverable.
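To make the classification step concrete, here is a minimal sketch of how an automated first pass might look. The categories mirror those in the Databerg report cited below; the record fields, thresholds and keyword rules are illustrative assumptions, not any particular product's logic.

```python
# A hedged sketch of a first-pass data classification, deciding which
# records are candidates for sanitisation. Fields and thresholds are
# hypothetical; real classification policies would be far richer.
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    BUSINESS_CRITICAL = "business-critical"
    ROT = "redundant/obsolete/trivial"
    UNCLASSIFIED = "unclassified (dark)"

@dataclass
class Record:
    name: str
    last_accessed_days: int   # days since last access
    tagged_critical: bool     # explicitly tagged by its owner

def classify(record: Record) -> Category:
    # Records explicitly marked critical are always kept.
    if record.tagged_critical:
        return Category.BUSINESS_CRITICAL
    # Anything untouched for over two years is a sanitisation candidate.
    if record.last_accessed_days > 365 * 2:
        return Category.ROT
    # Everything else needs human review before migration.
    return Category.UNCLASSIFIED

records = [
    Record("q4-financials.xlsx", 10, True),
    Record("old-backup-2015.tar", 3000, False),
    Record("meeting-notes.txt", 90, False),
]
for r in records:
    print(f"{r.name} -> {classify(r).value}")
```

Even a crude pass like this gives an enterprise a defensible starting inventory: everything landing in the ROT bucket can be queued for sanitisation, while the dark-data bucket becomes the review backlog.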
Why is this so important? Proper understanding of your data ensures it is always accounted for, supporting compliance with regulations such as the GDPR and mitigating risk. Furthermore, sanitising and erasing unused data will ultimately save money: if you have less data to store, you'll have less to pay for. In fact, the Global Databerg report, published by Veritas Technologies, found that only 15% of the average company's data is considered business-critical, 33% is redundant, obsolete or trivial (ROT) and 52% is unclassified (dark) and holds no known value.
Another practice that helps in reaching a conclusion on what data to move is a strong understanding of your data's lifecycle. Enterprises can orchestrate a step-by-step process of internal screening to build consensus on the perceived longevity and future use of data. Close, continued analysis of the value of data will give an idea of its lifecycle and ultimately contribute to cost savings.
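The lifecycle review described above can be sketched as a simple periodic check: each dataset carries a creation date and a retention horizon, and anything past its horizon is flagged for sanitisation rather than migrated. The dataset names and retention periods here are purely illustrative assumptions.

```python
# A minimal sketch of a periodic data lifecycle review. Datasets past
# their (hypothetical) retention horizon are flagged for sanitisation
# instead of being migrated to the cloud.
from datetime import date, timedelta

def review(datasets, today=None):
    """datasets: iterable of (name, created, retention_days) tuples."""
    today = today or date.today()
    migrate, sanitise = [], []
    for name, created, retention_days in datasets:
        if today - created > timedelta(days=retention_days):
            sanitise.append(name)   # past its useful life
        else:
            migrate.append(name)    # still within its lifecycle
    return migrate, sanitise

migrate, sanitise = review(
    [
        ("customer-orders", date(2024, 1, 1), 3650),  # keep ~10 years
        ("temp-export", date(2018, 6, 1), 365),       # keep 1 year
    ],
    today=date(2024, 6, 1),
)
print("migrate:", migrate)
print("sanitise:", sanitise)
```

Running this review on a schedule, rather than once before migration, is what turns lifecycle analysis into ongoing cost savings: storage that would otherwise be paid for indefinitely is reclaimed as data ages out.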
Be prepared for your exit
It's important to think about where the journey might end before embarking on a cloud migration. The cloud market is competitive and changes rapidly. It's common for enterprises to restructure and shift providers, and at that point it's crucial to have your data house in order.
Can you ensure that all sensitive data will be properly erased when switching providers? Does your cloud provider have the means to sanitise that data? Do you have a full record and audit trail of all data that was stored with that provider? These are critical questions to be addressed when planning a cloud exit strategy.
If all goes well, a company may never have to implement its data exit strategy. But it's important to be prepared, and this contingency provides an additional security measure.
Don’t forget the hardware
Sometimes companies are so enthusiastic about a move to the cloud that they jump in feet first with a full transition. More typically, though, companies take a hybrid approach to cloud services, storing data both on premises and off. In some instances this is the best option, as it provides full flexibility.
However, it's crucial to think about the redundant hardware you'll be left with on site. It's just as important to handle this hardware with the same thorough and rigorous processes you have in place for handling data.
Data sanitisation is critical in this instance, to prevent any duplication of data on old systems and to mitigate the risk that decommissioned machines can represent. It is also crucial that any device or server that is retired from use is accounted for, to reduce the risk of a security breach if a piece of hardware is lost or stolen.
Keeping an audit trail of when data moves from your private legacy infrastructure into the cloud is one of the best ways to maintain a review process and stay on top of your data. Data should be accounted for at every step of its journey, and constantly monitoring the data lifecycle will help you continually understand its value to the organisation. These processes are basic hygiene factors that should be built into a business's data security strategy from the moment a piece of data is acquired, especially with the advent of stricter regulatory policies such as the GDPR.
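An audit trail like the one described above can be sketched as an append-only log of data movement events. This is a minimal, assumed design, not a description of any specific product: each entry is hash-chained to the previous one so that after-the-fact edits are detectable, and a production system would persist and protect the log rather than keep it in memory.

```python
# A hedged sketch of an append-only, hash-chained audit trail for data
# movements (e.g. legacy infrastructure -> cloud, or -> sanitised).
# In-memory only; a real system would persist and tamper-protect this.
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    def __init__(self):
        self.entries = []

    def record(self, asset, source, destination):
        entry = {
            "asset": asset,
            "source": source,
            "destination": destination,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        # Chain each entry to the previous one's hash so any edit to an
        # earlier entry invalidates everything recorded after it.
        prev_hash = self.entries[-1]["hash"] if self.entries else ""
        payload = prev_hash + json.dumps(entry, sort_keys=True)
        entry["hash"] = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append(entry)
        return entry

trail = AuditTrail()
trail.record("customer-db", "on-prem-rack-7", "cloud-region-eu-1")
trail.record("hr-archive", "on-prem-rack-7", "sanitised")
print(f"{len(trail.entries)} movement events recorded")
```

The same log doubles as the exit-strategy record discussed earlier: when switching providers, it answers the question of exactly which assets were stored where, and which have already been sanitised.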
Adopting these best practices can save enterprises from unforeseen doom. Taking the necessary care of your data is as vital as checking the oil pressure at a refinery: if you're not careful, there will be too much data to reasonably control and your silos will burst.