
Solving the storage dilemma: Is open source the key?

Business IT is facing storage growth that’s exceeding even the highest estimates, and there’s no sign of it slowing down anytime soon. Unstructured data in the form of audio, video, digital images and sensor data now makes up a large and growing majority of business data, and it presents a new set of challenges that call for a different approach to storage.

For CIOs, the ideal solution is a storage system that provides greater flexibility and choice, along with the ability to identify unstructured data so it can be categorised, utilised and managed automatically throughout its lifecycle.

One answer to the storage problem is software-defined storage (SDS), which separates the physical storage hardware (the data plane) from the data storage management logic, or ‘intelligence’ (the control plane). Because SDS needs no proprietary hardware components, it is a cost-effective option for enterprises: IT can run it on robust, flexible, off-the-shelf commodity hardware.
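That separation is visible to applications as well. Open source SDS platforms such as Ceph, the technology behind SUSE Enterprise Storage, expose standard interfaces; Ceph’s RADOS Gateway, for instance, speaks an S3-compatible object API. The minimal sketch below makes the point that client code addresses the software layer, not the hardware underneath it; the endpoint URL, bucket name and credentials are hypothetical placeholders.

```python
import boto3  # AWS SDK for Python; it also talks to any S3-compatible endpoint

# Hypothetical on-premise SDS endpoint with placeholder credentials.
sds = boto3.client(
    "s3",
    endpoint_url="http://ceph-rgw.internal:7480",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# The same calls work whether the bytes land on commodity disks behind a
# Ceph gateway or on a public cloud's storage fleet.
sds.create_bucket(Bucket="sensor-archive")
sds.put_object(
    Bucket="sensor-archive",
    Key="2019/cam01/frame-0001.jpg",
    Body=b"placeholder image bytes",
)
```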

A research paper by SUSE entitled Managing the Data Explosion Challenge with Open Source Storage found that rising storage costs consistently rank at the top of business concerns across the industry, but data growth is only one part of a more complex equation. The greatest ongoing cost for IT usually lies in system support and management.

For these reasons, open source storage is an excellent option: it is highly customisable and scalable, backed by a strong community, and built on high-quality code. If you’re thinking of making open source storage part of your strategy, consider the four points below.

1. Maintain multiple storage vendors 
It can be tempting to lock in a vendor’s short- and medium-term pricing by placing all of your business with them, and to enjoy the operational simplicity of a single set of storage tools and processes. But you could be playing poker with your storage budget, gambling that your vendor partner will not punish you with price hikes later.

2. Pay attention to the Cloud war
Amazon currently has a clear lead in adoption over Microsoft Azure and Google Compute. Nevertheless, Amazon is widely understood to be playing a ‘long game’ of profit tomorrow, not today. Many organisations therefore keep a foot in Azure or Google Compute even while they have a leg in AWS, because there must be an exit plan. That insurance comes at a price, and the price is operational complexity. It can be particularly high in storage, where newer pricing models can charge for how much data you move down the wire rather than how much you own.
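To see how transfer-based pricing bites, consider a back-of-the-envelope sketch. The rates below are illustrative assumptions, not any provider’s current price list.

```python
# Back-of-the-envelope comparison of at-rest cost versus egress cost.
# Both rates are illustrative assumptions, not a provider's price list.
STORAGE_USD_PER_GB_MONTH = 0.023   # assumed at-rest rate
EGRESS_USD_PER_GB = 0.09           # assumed data-transfer-out rate

data_gb = 50_000                   # 50 TB held in the cloud

monthly_at_rest = data_gb * STORAGE_USD_PER_GB_MONTH   # $1,150 per month
one_full_read_out = data_gb * EGRESS_USD_PER_GB        # $4,500 one-off

print(f"Storing {data_gb:,} GB: ${monthly_at_rest:,.0f} per month")
print(f"Moving it all out once: ${one_full_read_out:,.0f}")
```

On these assumed rates, reading your own 50 TB back just once costs roughly four months of storage, which is why an exit plan has to be budgeted for rather than improvised.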

3. Maintain and expand your skill sets to avoid lock-in
It’s tempting to reduce complexity by standardising on a small set of suppliers. The upside is simplicity: a single approach to storage makes staff easy to train (and makes some roles unnecessary in a cloud scenario), and arguably lets you get on with that ‘core business’ proprietary vendors love to tell you about: serving customers. The downside is that if you don’t know how to exit AWS for Azure without crippling operations, don’t know what it costs to repatriate your data, and have nowhere to put that data when you do, you are locked in and at the mercy of your suppliers.
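Repatriation need not be exotic if the target speaks a standard API. The sketch below copies objects from a cloud bucket to an on-premise, S3-compatible endpoint such as a Ceph RADOS Gateway; the endpoint URL, bucket name and credentials are hypothetical, and the target bucket is assumed to already exist.

```python
import boto3

# Source: the cloud bucket (credentials taken from the environment).
cloud = boto3.client("s3")

# Target: a hypothetical on-premise, S3-compatible endpoint (e.g. a Ceph
# RADOS Gateway); endpoint and credentials are illustrative placeholders.
onprem = boto3.client(
    "s3",
    endpoint_url="http://ceph-rgw.internal:7480",
    aws_access_key_id="ONPREM_KEY",
    aws_secret_access_key="ONPREM_SECRET",
)

# Walk the cloud bucket page by page and copy each object across.
# Reading whole objects into memory is fine for a sketch; large objects
# would call for streaming or multipart copies.
paginator = cloud.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="prod-archive"):
    for obj in page.get("Contents", []):
        body = cloud.get_object(Bucket="prod-archive", Key=obj["Key"])["Body"].read()
        onprem.put_object(Bucket="prod-archive", Key=obj["Key"], Body=body)
```

Keeping a path like this tested, and keeping staff who understand it, is exactly the skill set that preserves your negotiating position.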

4. Use open source software-defined storage — or pay more 
If you use only cloud or only proprietary software, your software and hardware costs will always be greater than they need to be. Open source brings cost savings from the move to commodity hardware and eliminates proprietary software licence costs. Proprietary storage vendors will tell you, rightly, that some of that cost can reappear as skilled headcount, consultancy and support. But without skilled headcount, how will you maintain the capability to switch cloud providers, and how will you assess which vendors to use?

Research your options thoroughly and consider open source software-defined storage: it can give organisations a highly scalable platform that drastically reduces storage costs, in both capital and operational expenditure, while making the management of your storage environment simpler and more adaptable.

Article by SUSE APAC chief technologist Peter Lees
