I recently sat in on a webinar on 2017 data center predictions presented by a trio of analysts at the research firm IDC and was struck by a recurring theme: data center infrastructure management (DCIM) tools will be increasingly important in the year ahead.
For the webinar, “IDC FutureScape: Worldwide Datacenter 2017 Predictions,” the analysts discussed four of their top 10 data center predictions for 2017. They included this one:
Pay-as-you-go/use models will account for 50% of on-premises and off-premises physical IT and data center asset spending by 2018, strengthening business and IT partnerships
Susan Middleton, Research Director with IDC’s Technology Financing Strategies group, said IT business models are changing rapidly and that increasing complexity of options will drive IT teams to establish partnerships with providers that understand their industry.
Companies will make the case for using pay-as-you-go services based on agility: the ability to deliver IT services faster to meet business needs. Her advice was to look for providers that have proven service level agreements (SLAs) in your vertical industry.
DCIM comes into play on the SLA front. To meet their SLAs, data center providers need to ensure their facilities are highly reliable and available. DCIM tools can be a big help in that regard, with monitoring capabilities that alert providers to issues before they become catastrophic problems.
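To make the "alert before catastrophe" idea concrete, here is a minimal sketch of the kind of threshold-based classification a DCIM monitoring tool performs. The sensor name and the temperature limits are illustrative assumptions, not values from any specific product:

```python
# Hypothetical thresholds for rack inlet temperature (degrees Celsius).
# A real DCIM tool would make these configurable per facility.
WARNING_C = 27.0   # early alert, while there is still time to react
CRITICAL_C = 32.0  # reading indicates an imminent problem

def check_inlet_temp(sensor_id: str, temp_c: float) -> str:
    """Classify a single rack inlet temperature reading."""
    if temp_c >= CRITICAL_C:
        return f"CRITICAL: {sensor_id} at {temp_c:.1f} C"
    if temp_c >= WARNING_C:
        return f"WARNING: {sensor_id} at {temp_c:.1f} C - investigate now"
    return f"OK: {sensor_id} at {temp_c:.1f} C"

print(check_inlet_temp("rack-12-inlet", 28.4))
```

The value of the warning tier is exactly the point made above: the operator hears about the issue while it is still a warning, not after it has become an outage.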
Rick Villars, IDC’s Vice President for Data Center & Cloud, correctly pointed out that some organizations will continue to own and operate their own data centers.
Making the business case for that will increasingly be a matter of getting maximum use out of the data center space, which he said will lead to the use of smarter data center technologies and strategies.
He didn’t say “DCIM,” but that’s what I heard, given that DCIM tools do indeed help companies maximize the use of their data center space.
Another prediction the analysts discussed was this one:
In 2017, only 20% of enterprises will deploy software defined data centers on schedule because capacity constraints in critical facilities delay transformation efforts
On this topic, IDC Research Director Jennifer Cooke did talk about DCIM and the need for “smarter” data centers.
Converged data center architectures put heavy pressure on data center power and cooling systems. Her advice was to explore smart data center technology to ensure you have a proper foundation in place.
She correctly noted that this doesn’t necessarily mean buying all new infrastructure. You can instrument much of what you already have with sensors and monitoring tools, which will help you determine whether you have the proper power and cooling in place to support the type of converged infrastructure that software-defined data centers depend on.
As Villars noted, we hear lots of talk about the Internet of Things and smarter cities and roads but, “We should be talking about smarter data centers; that is a critical part of this conversation.”
I noted two additional predictions on the top 10 list that were not discussed during the webinar but that nonetheless point to the need for DCIM.
If converged infrastructure puts pressure on data center power and cooling, then hyper-converged infrastructure puts even more pressure on them.
One of those predictions holds that 30% of businesses will suffer failures due to mismatches between power delivery and IT workload profiles, and failure to use a quality DCIM tool is what will lead to those failures. With DCIM, enterprises can accurately predict how much load a given row or rack can handle from a power and cooling perspective and proceed accordingly.
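The underlying capacity math is simple; what DCIM adds is doing it continuously, per rack, against live sensor data. As a hedged sketch of that calculation, with a hypothetical 80% continuous-load derating and made-up wattage figures rather than numbers from any real facility:

```python
# Illustrative capacity check a DCIM tool automates per rack.
# Circuit size, derating factor, and loads below are assumptions.
def rack_headroom_kw(circuit_kw: float, derate: float, installed_kw: float) -> float:
    """Usable power remaining on a rack: the circuit capacity,
    derated for continuous load, minus equipment already installed."""
    return circuit_kw * derate - installed_kw

def fits(headroom_kw: float, new_load_kw: float) -> bool:
    """Can a new piece of equipment be added without overloading the rack?"""
    return new_load_kw <= headroom_kw

# e.g. a 10 kW circuit derated to 80%, with 6 kW of gear installed
headroom = rack_headroom_kw(10.0, 0.80, 6.0)   # 2.0 kW remaining
print(fits(headroom, 1.5))   # a 1.5 kW server fits
print(fits(headroom, 2.5))   # a 2.5 kW server would overload the rack
```

A workload/power mismatch of the kind IDC warns about is precisely the case where equipment is added without this check, or where the derating assumption is wrong for the actual duty cycle.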
It’s clear that DCIM will play an increasingly important role in 2017.
Article by Paul Desmond, Schneider Electric Data Center Blog