
SwiftStack's 4 steps to moving enterprise data to the cloud

16 Aug 2017

Public cloud infrastructure is increasingly the ‘go-to’ storage option for organisations of all sizes.

However, SwiftStack says organisations with hundreds of terabytes or petabytes of data find the shift to cloud more complex, disruptive, and inflexible than is often presumed.

While the business value of cloud storage is obvious, large data volumes can present significant challenges for migration, compatibility and agility.

“Pricing based on consumption, elastic scalability, improved collaboration, and other key advantages of the public cloud are attainable goals, but those with large data volumes must be mindful of their unique environment,” says Joe Arnold, SwiftStack president and chief product officer.

“Fortunately, these organisations will also find that the right cloud data management tactics and tools will unleash more value from that data, and respond as business needs and workloads evolve.”

SwiftStack has provided a list detailing how to transition even petabyte-scale data to cloud environments in just four steps:

1. ‘Drift and Shift’ to cloud-native storage

According to SwiftStack, data that is not yet in the cloud is stored in silos, each with its own data access protocols, which can make it extremely complicated to ‘lift and shift’ to the public cloud.

SwiftStack recommends a slight modification of the term to ‘drift and shift’, where organisations move storage to a cloud-native format running on on-premises hardware. The data remains where it is, so this step is low cost and low risk, and can be done over time, delivering the business benefits of cloud storage while keeping the data ready to move to the public cloud when the time is right.

2. Automate operations

By using data management software with built-in automation, driven by policies that IT sets and controls, organisations can make it possible for even a single administrator to manage a multi-petabyte hybrid cloud infrastructure.

Define the service objectives for protection, synchronization, location, access, capacity usage, etc., and let the software control the placement of data and its delivery to applications. SwiftStack stresses the importance of this step, because as the business demands evolve, so can the policies controlled by IT.
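The idea of policy-driven placement can be sketched in a few lines. This is an illustrative example only, not SwiftStack's actual software; the policy predicates and target names are hypothetical.

```python
# Illustrative sketch of policy-driven data placement (hypothetical policies
# and targets): IT defines service objectives declaratively, and software --
# not an administrator -- decides where each object lives. As business
# demands evolve, only the POLICIES list needs to change.

POLICIES = [
    # (predicate on object attributes, placement target) -- first match wins
    (lambda obj: obj["sensitivity"] == "regulated",  "on-premises"),
    (lambda obj: obj["age_days"] > 365,              "public-cloud-archive"),
    (lambda obj: True,                               "on-premises-replicated"),  # default
]

def place(obj):
    """Return the placement target of the first policy whose predicate matches."""
    for predicate, target in POLICIES:
        if predicate(obj):
            return target

print(place({"age_days": 400, "sensitivity": "public"}))     # public-cloud-archive
print(place({"age_days": 10,  "sensitivity": "regulated"}))  # on-premises
print(place({"age_days": 10,  "sensitivity": "public"}))     # on-premises-replicated
```

Because the policies are ordered, regulated data stays on-premises even once it ages past a year; reordering or editing the list is all IT needs to do when requirements change.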

3. Stay flexible

According to SwiftStack, all major public cloud providers (Amazon, Google, Microsoft, and Rackspace, for example) use object storage platforms for long-term retention and governance of end users’ data.

There are enough differences and enough proprietary technology between them that moving a petabyte, or even part of one, from one provider to another may not be practical. IT can stay in full control with flexible data management across all locations and clouds, achieved through cross-cloud platform compatibility.

4. Metadata mastery

SwiftStack asserts that legacy storage such as SAN and NAS systems simply wasn’t built with metadata in mind. Cloud-native storage, by contrast, retains metadata with the object data itself, rather than in a separate database that only its own application can read.

Cloud storage is the ideal medium to take advantage of metadata as it enables users to harness, organise and analyse metadata associated with petabytes of business data – something that would have been unthinkable just a few years ago.
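The contrast with file systems can be made concrete with a small sketch. This is a toy model of an object store, not any vendor's API; the function names and metadata keys are hypothetical, but the principle matches real object storage, where user-defined metadata is stored and returned alongside each object.

```python
# Illustrative sketch: in object storage, user metadata travels with each
# object, so any client can organise and query data by its metadata without
# a separate, application-specific database. Names here are hypothetical.

objects = {}  # key -> (data, metadata), stored side by side as in an object store

def put_object(key, data, metadata):
    """Store the object's bytes together with its key-value metadata."""
    objects[key] = (data, dict(metadata))

def find_by_metadata(**criteria):
    """Return the keys of objects whose metadata matches every criterion."""
    return [key for key, (_, meta) in objects.items()
            if all(meta.get(k) == v for k, v in criteria.items())]

put_object("scan-001.dcm", b"...", {"project": "trial-7", "modality": "MRI"})
put_object("scan-002.dcm", b"...", {"project": "trial-7", "modality": "CT"})
put_object("report-009.pdf", b"...", {"project": "trial-9", "modality": "MRI"})

print(find_by_metadata(project="trial-7", modality="MRI"))  # ['scan-001.dcm']
```

On a SAN or NAS, answering the same question typically means walking a directory tree or maintaining an external index; with metadata co-located with the objects, the query works across petabytes of data in place.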
