
DCIM’s improvement to workflow processes is key to eliminating data center challenges

29 May 2017

Until recently, there’s been a numbing inevitability about survey results from research into the data center sector.

Finding number one: all downtime is the result of human error.

Finding number two: uptime is more important than efficiency – the penalty for not being able to keep the lights on is greater than the cost of keeping the lights on.

And then, earlier this year, IDC’s Research Director, Jennifer Cooke, shared with us her research amongst around 400 enterprise and service provider data center operators in a report titled “Datacenter Facilities Infrastructure Management and Operations Survey, January 2017.”

In my opinion it’s a really insightful study, indicating, among other things, that the majority of downtime is now being caused by system error.

But for me, the most interesting aspect of the research is how it unpacks the impact of a range of data center problems on the business itself.

In particular, IDC reports that there is a clear line of delineation between data centers which use dynamic management tools to provide real-time visibility of infrastructure and those using static methods. In short, the former experience fewer problems.

The main problems emerging include slower equipment deployment times and the inability to meet deadlines.

Underlying causes such as the lack of a holistic view of data center resources (e.g., power and cooling capacity), and a lack of coordination between IT and facilities organizations, clearly have a negative impact. Unless addressed, this is likely to escalate as the demand for IT services grows.

So, it’s no great surprise that improving internal processes and investing in software are perceived to be the highest priorities in terms of overcoming data center challenges.

With process change pivotal, a top benefit of DCIM is improving workflow. The idea of using software for dynamic management of resources and following this through with internal process improvements is a key step forward.

I hope you’ll forgive this commercial message from our sponsor, but the recent introduction of StruxureWare Data Center Operation 8.1 includes new features that recognize this requirement.

As such, this new version enables data center operators to design, manage and execute workflow tasks more efficiently and gain greater control over the data center environment.

New features include workflow template creation, which simplifies workflow management while providing more control over tracking moves, adds, and changes across all data center assets in order to improve system performance.

Additionally, integration of third-party IT services such as BMC Remedy enables the sharing of relevant information across systems from multiple vendors, giving a complete view of data center performance and availability to boost reliability and efficiency.

Naturally we’ve considered the needs of users and provided an adaptive user interface so that the system can be accessed via any desktop, tablet or smartphone device.

Whether on the data center floor or in the NOC, this makes it simpler to manage everyday tasks such as tracking status updates, managing project timelines, and setting task owners.

Bridging the gap between IT and facilities management has long been seen as an elusive holy grail.

However, in meeting the need to deploy equipment moves, adds, and changes in a transparent and accountable manner, the workflow benefits integral to the DCIM package provide management with a sustainable solution to current and future data center challenges.

Article by Henrik Leerberg, Schneider Electric Data Center Blog
