
Solving the storage dilemma: Is open source the key?

Business IT is facing storage growth that’s exceeding even the highest estimates, and there’s no sign of it slowing down anytime soon. Unstructured data in the form of audio, video, digital images and sensor data now makes up an increasingly large majority of business data and presents a new set of challenges that call for a different approach to storage.

For CIOs, the ideal solution is a storage system that provides greater flexibility and choice, along with the capability to better identify unstructured data so that it can be categorised, utilised and managed automatically throughout its lifecycle.

One answer to the storage issue is software-defined storage (SDS), which separates the physical storage hardware (the data plane) from the data storage management logic or 'intelligence' (the control plane). Because SDS needs no proprietary hardware components, it is a cost-effective option for enterprises: IT can use off-the-shelf, low-cost commodity hardware that is robust and flexible.
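To make the data plane/control plane split concrete, the toy sketch below keeps all placement and replication decisions in a 'control plane' object, while the 'data plane' is nothing more than interchangeable commodity nodes. The class names and the hash-based placement policy are hypothetical illustrations, not the API of any particular SDS product.

```python
# Toy illustration of the SDS split: a 'control plane' that decides where data
# goes, and interchangeable 'data plane' backends on commodity hardware.
# All class names and the placement policy are hypothetical, for illustration only.

from dataclasses import dataclass, field
import hashlib


@dataclass
class CommodityNode:
    """A data-plane target: any off-the-shelf box exposing raw capacity."""
    name: str
    capacity_gb: int
    used_gb: float = 0.0
    objects: dict = field(default_factory=dict)

    def put(self, key: str, blob: bytes) -> None:
        self.objects[key] = blob
        self.used_gb += len(blob) / (1024 ** 3)


class ControlPlane:
    """Management logic kept separate from the hardware it drives."""

    def __init__(self, nodes: list[CommodityNode], replicas: int = 2):
        self.nodes = nodes
        self.replicas = replicas

    def place(self, key: str, blob: bytes) -> list[str]:
        # Deterministic hash-based choice of a primary node, then replicate to
        # the least-utilised remaining nodes; the policy lives here, not in hardware.
        start = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(self.nodes)
        primary = self.nodes[start]
        others = sorted(
            (n for n in self.nodes if n is not primary),
            key=lambda n: n.used_gb / n.capacity_gb,
        )
        targets = [primary] + others[: self.replicas - 1]
        for node in targets:
            node.put(key, blob)
        return [n.name for n in targets]


if __name__ == "__main__":
    cluster = ControlPlane([CommodityNode(f"node-{i}", 4000) for i in range(4)])
    print(cluster.place("invoice-2024.pdf", b"\x00" * 1024))
```

Swapping a node for a bigger or cheaper box, or changing the replication policy, touches only one side of that split, which is the operational flexibility the article describes.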

A research paper by SUSE entitled Managing the Data Explosion Challenge with Open Source Storage found that rising storage costs consistently rank at the top of business concerns across the industry, but data growth is only one part of a more complex equation. The greatest ongoing cost for IT usually lies in system support and management.

For these reasons, open source storage is an excellent option: it is highly customisable and scalable, backed by a strong community, and built on high-quality code. If you're thinking of making open source storage part of your strategy, consider the four points below.

1. Maintain multiple storage vendors
While it can be tempting to lock in a vendor's short- and medium-term pricing by placing all of your business with them, and to gain operational simplicity from a single set of storage tools and processes, you could be playing poker with your storage budget and gambling that your vendor partners will not punish you with price hikes later.

2. Pay attention to the cloud war
Amazon unquestionably has the lead in adoption over Microsoft Azure and Google Compute. Nevertheless, everyone knows that Amazon is playing a 'long game' of profit tomorrow, not today. Hence, many have a foot in Azure or Google Compute even while they have a leg in AWS, because there must be an exit plan. But this comes with a price, and that price is operational complexity. It can be particularly high in the world of storage, where the new pricing models can be about how much data you move down the wire rather than how much you own (see the back-of-envelope sketch after this list).

3. Maintain and expand your skill sets to avoid lock-in
It's tempting to reduce complexity by standardising on a small set of suppliers. The upside is simplicity: one approach to storage makes staff easy to train (some roles may no longer be necessary in a cloud scenario) and, arguably, lets you get on with that 'core business' of serving customers that proprietary vendors love to tell you about. The downside is that if you don't know how to exit AWS for Azure without crippling operations, don't know how much it would cost to repatriate your data, and have nowhere to put that data when you do, you are locked in and at the mercy of your suppliers.

4. Use open source software-defined storage, or pay more
If you use only cloud or only proprietary software, your software and hardware costs will always be greater than they need to be. This is a simple fact. Open source means cost savings from moving to commodity hardware and the total elimination of proprietary software costs. Proprietary storage vendors will tell you, rightly, that cost can reappear as skilled headcount, consultancy and support. But then, if you don't have skilled headcount, how are you going to maintain your capability to switch cloud providers, and how are you going to assess which vendors to use?
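As a back-of-envelope illustration of the point in item 2, the snippet below estimates what a one-off exit of a modest data set could cost under an egress-priced model. The 50 TB volume and the $0.09-per-GB rate are assumptions chosen purely for illustration; real rates vary by provider, region and tier.

```python
# Rough estimate of a one-off data egress bill under a pay-per-GB-moved model.
# Both figures below are illustrative assumptions, not any provider's price list.
DATA_TB = 50                 # hypothetical amount of data to repatriate
RATE_USD_PER_GB = 0.09       # hypothetical flat egress rate

cost_usd = DATA_TB * 1024 * RATE_USD_PER_GB
print(f"Moving {DATA_TB} TB out at ${RATE_USD_PER_GB}/GB costs about ${cost_usd:,.0f}")
# -> Moving 50 TB out at $0.09/GB costs about $4,608
```

The absolute number matters less than the habit of running it before committing data to a provider: if you cannot estimate the exit bill, you cannot credibly plan an exit.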

Research your options thoroughly and consider open source software-defined storage: it can provide organisations with a highly scalable solution that drastically reduces storage costs, in both capital and operational expenditure, while providing greater adaptability and simplicity in managing your storage environment.

Article by SUSE APAC chief technologist Peter Lees
