
7 tech trends that define next-generation data center monitoring

15 Jun 17

Data center monitoring methods have undergone a dramatic change over the last several years.

Consider data center power and cooling infrastructure monitoring as an example. Such monitoring used to be an afterthought, a "nice to have"; today these systems handle roughly three times more data points than they did ten years ago.

Today, automated monitoring has moved front and center as too few human resources are available to manage the large quantities of information coming in. According to IDC, by the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet. 

Older remote monitoring systems were never designed to support this amount of data and the associated alarms that get generated, let alone extract value from that data.

These older systems were desktop-based, limited in data output, and largely reactive (i.e. they depended on humans to interpret what was wrong). New-generation digital remote monitoring tools address these limitations.

Today, 7 technology trends are influencing the way data centers are monitored:

Embedded sensing devices – The number of sensors embedded in data center power and cooling equipment has increased 300% over the last ten years, while sensor prices have dropped and quality has risen. These sensors record performance data at a much more granular level, which shortens the payback period on a monitoring investment.

Cybersecurity – Any solution that gathers, centralizes, or distributes data now needs a cybersecurity layer. Vendors that follow a Secure Development Lifecycle (SDL) process are better positioned to protect you: products that are coded, tested, verified, and validated using secure methods are better defended against cyberattack.

Cloud computing – Cloud computing enables digital remote monitoring services. IT services such as predictive analytics and machine learning can run on a cloud computing platform to further increase the value of data center monitoring.

Big data analytics – These high-level analytics convert the large volumes of real-time data gathered across many devices into information useful for driving decisions. One example is applying algorithms to component performance data (from a UPS or CRAC unit, say) to predict the optimal time to perform maintenance, an approach known as predictive or condition-based maintenance. This lowers the cost of maintenance through fewer service calls.
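The condition-based maintenance idea above can be sketched in a few lines. The following is an illustrative example only, not a vendor's algorithm: it fits a linear trend to hypothetical UPS battery internal-resistance readings (rising resistance is a common ageing indicator) and estimates when the trend will cross a service threshold. The function name, sample values, and threshold are all assumptions for the sketch.

```python
# Sketch of condition-based maintenance: fit a least-squares line to
# battery internal-resistance samples and estimate when the fitted
# trend crosses a service threshold. All figures are illustrative.

def days_until_threshold(days, readings, threshold):
    """Return the estimated number of days from the last sample until
    the linear trend reaches `threshold`, or None if the trend is flat
    or improving."""
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(readings) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, readings))
    slope /= sum((x - mean_x) ** 2 for x in days)
    if slope <= 0:
        return None  # resistance not rising; no service date to predict
    intercept = mean_y - slope * mean_x
    crossing_day = (threshold - intercept) / slope
    return crossing_day - days[-1]

# Monthly internal-resistance samples (milliohms), trending upward:
sample_days = [0, 30, 60, 90, 120]
milliohms = [4.0, 4.2, 4.4, 4.6, 4.8]
remaining = days_until_threshold(sample_days, milliohms, threshold=6.0)
print(f"Schedule battery service in ~{remaining:.0f} days")
```

A real analytics platform would of course use far richer models and many more signals, but the principle is the same: maintenance is scheduled from measured condition rather than a fixed calendar.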

Mobile computing – Mobile devices enable managers to move between locations without being disconnected from daily operations. The whole point of monitoring data centers is to identify and address a state change before downtime occurs. Mobility enables faster response times.

Machine learning – Machine learning improves on the data analytics model by building on the results of previous learning. In a data center, "near misses" that almost caused downtime are key learning opportunities: understanding and documenting why these incidents occurred reduces the risk of future errors.
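As a minimal illustration of learning from documented incidents, the sketch below classifies a new operating condition against labelled past events using a nearest-neighbour rule. The feature choice (inlet temperature and UPS load), the labels, and the data are assumptions invented for the example; a production system would use far more features and a trained model.

```python
# Illustrative sketch, not a production model: classify a new reading
# against labelled historical events, where past "near misses" act as
# the training signal.
import math

# (inlet temp degC, UPS load %) -> label from the incident log (assumed data)
history = [
    ((22.0, 55.0), "normal"),
    ((23.0, 60.0), "normal"),
    ((31.0, 92.0), "near-miss"),  # cooling fault nearly caused downtime
    ((30.0, 88.0), "near-miss"),
]

def classify(event, k=3):
    """Majority label among the k historical events nearest to `event`."""
    ranked = sorted(history, key=lambda h: math.dist(h[0], event))
    labels = [label for _, label in ranked[:k]]
    return max(set(labels), key=labels.count)

print(classify((29.5, 90.0)))  # condition resembles past near misses
```

The point of the sketch is the feedback loop: every documented incident enlarges `history`, so the system gets better at flagging risky conditions before they become outages.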

Automation for labor efficiency – The business notion of “doing more with less” is now supported through digital remote monitoring. No longer do technicians need to physically inspect racks to determine if temperatures are too high or if other conditions are abnormal.
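The automation described above often starts with simple rules: streamed sensor readings are checked against thresholds so that no one has to walk the aisles with a thermometer. The metric names and limits below are illustrative assumptions, not a vendor API.

```python
# Hypothetical alerting rule for remote monitoring: flag any rack
# reading that exceeds its configured threshold. Names and limits
# are assumptions for this sketch.

THRESHOLDS = {"inlet_temp_c": 27.0, "humidity_pct": 80.0}

def check_rack(rack_id, readings):
    """Return an alert string for each reading above its threshold."""
    return [
        f"ALERT {rack_id}: {metric}={value} exceeds {THRESHOLDS[metric]}"
        for metric, value in readings.items()
        if metric in THRESHOLDS and value > THRESHOLDS[metric]
    ]

alerts = check_rack("rack-07", {"inlet_temp_c": 29.5, "humidity_pct": 45.0})
print(alerts)
```

In practice such rules run continuously against every monitored rack, and the alerts feed the mobile and machine-learning layers described in the other trends.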

Data centers are becoming more reliable and efficient through digital remote monitoring and leaner maintenance operations. These monitoring services help extract the value of new big data and machine learning technologies. Modern platforms must be designed to take advantage of the data constantly generated by data center physical infrastructure.

Article by Victor Avelar, Schneider Electric Data Center Blog Network
