
Opinion: A clear definition of AI for colocation providers

20 Jul 18

Article by Schneider Electric Data Center Science Center senior research analyst Patrick Donovan

If you keep track of industry trends at all, then I bet your newsfeed has been filled with exciting stories and bold predictions about artificial intelligence (AI), machine learning (ML), and neural networks.

With hyperbolic headlines such as “How Artificial Intelligence Will Self-manage the Data Center” and “Is 2018 When Machines Take Over?”, I’m sure many people are mentally rolling their eyes as they click to the next story.

Any new trend, of course, brings hype, confusion, and misleading claims. And companies sometimes want to grab onto it and claim it as their own before things are fully baked. But this doesn’t mean there isn’t substance behind all the talk.

I believe in the power of AI to make data centers better. Here in the Data Center Science Center, we believe AI’s biggest impact for colo data centers will be on reliability and less so on efficiency.

Also, we believe it will take a bit longer than some might think before AI’s value is really felt; there are some key challenges to overcome before this trend really hits its stride. I will address this in my next blog on this subject.

What is AI?

Artificial Intelligence (AI) and Machine Learning (ML) are two terms often used interchangeably, as if they were synonyms.

AI refers generally to the concept that a machine or system can be “smart” in carrying out tasks and operations based on programming and the input of data about itself or its environment.

ML, on the other hand, is an approach or method for making a machine or system more intelligent… to enable it to be more autonomous and self-adjusting as conditions change. ML is fundamentally the ability of a machine or system to automatically learn and improve its operation or functions without human input. ML can be thought of as the current state-of-the-art approach to imbuing a machine with AI.
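As a rough illustration of what “learning from data” means, consider the minimal sketch below, which fits a simple model to a hypothetical sensor history rather than following a hard-coded rule. The data, variable names, and scenario are invented purely for illustration.

```python
import numpy as np

# Hypothetical history: IT load (kW) vs. observed cooling power draw (kW).
# These numbers are made up purely to illustrate the idea of learning.
it_load_kw = np.array([120, 150, 180, 210, 240, 270])
cooling_kw = np.array([42, 51, 63, 70, 81, 90])

# "Learning": derive the relationship from the data itself instead of
# hard-coding it; refitting on new data adjusts the model automatically.
slope, intercept = np.polyfit(it_load_kw, cooling_kw, deg=1)

# Use the learned model to estimate cooling draw at a new load level.
predicted = slope * 200 + intercept
print(f"Estimated cooling draw at 200 kW IT load: {predicted:.1f} kW")
```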

One technique for implementing ML that is said to be driving much of the current advancement in AI is Deep Learning (DL), a much more compute-intensive form of ML.

Deep Learning, also known as deep structured learning or hierarchical learning, involves the algorithmic analysis of vast numbers of data points over multiple levels where the output of one level is fed to the next in a successive fashion.

This layered framework is often referred to as an artificial “neural network” because of its intended resemblance to neural networks in human brains. This approach reduces error and speeds up the learning process.
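To make the layered idea concrete, here is a minimal, illustrative sketch of a tiny feed-forward network in Python. It is not any particular product’s implementation; the layer sizes and input are arbitrary, and the weights are random and untrained.

```python
import numpy as np

def relu(x):
    # Simple non-linearity applied at each layer
    return np.maximum(0.0, x)

def forward(x, layers):
    # Pass an input vector through successive layers: the output of one
    # layer becomes the input to the next, which is the "hierarchical"
    # structure that gives deep learning its name.
    for weights, bias in layers:
        x = relu(weights @ x + bias)
    return x

rng = np.random.default_rng(0)

# Three small layers with random, untrained parameters (for shape only)
layers = [
    (rng.normal(size=(8, 4)), np.zeros(8)),
    (rng.normal(size=(8, 8)), np.zeros(8)),
    (rng.normal(size=(2, 8)), np.zeros(2)),
]

sensor_reading = rng.normal(size=4)  # e.g., four hypothetical sensor values
print(forward(sensor_reading, layers))
```

Training such a network means adjusting those weights so the final layer’s output matches known answers, which is where the heavy, GPU-friendly computation comes in.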

Artificial intelligence represents a very broad spectrum of capabilities. Mechanical system controls using PLCs and automation servers, for example, have existed for years and are a form of artificial intelligence.

But that’s not what most people are talking about when they use the term today. Today it’s about using machine learning algorithms, deep learning, and neural networks to automate operations in increasingly self-sufficient, reliable, efficient, and adaptive ways, even as the environment changes in real time.

Accelerated Technical Support Strengthens Foundation for AI

These methods of developing AI – machine learning, deep learning, neural networks, etc. – have been around for years, but technology limitations held back their progress. In the last few years, IoT, Big Data, and the availability of graphics processing units (GPUs) have greatly accelerated AI’s development and application.

More advanced artificial intelligence depends, in part, on massive amounts of related/correlated data from which algorithms are developed and then used to enable machines to learn and make decisions.

The amount of device and environmental data has exploded thanks to decreasing costs for sensors, network connectivity, storage, and bandwidth. The growth of Big Data analysis means this rich data can be handled and its value extracted more quickly, and with fewer resources, than in the past.

Furthermore, processing this data in real-time with a low error rate requires powerful parallel processing capabilities that are now possible with today’s GPUs. These trends have created a technological foundation for applying AI in data centers.

Current machine learning-based AI methods are very strong at doing two fundamental things:

  1. Recognizing patterns in very large and well-labeled datasets – image recognition and natural language processing, for example.
  2. Automating current processes and services that require data in decision making – as in services, maintenance, and hardware replacement (see the sketch after this list).
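A toy example of the second point, assuming labeled history exists: the sketch below trains a classifier on past sensor readings to flag hardware that is likely to need replacement. All figures, thresholds, and field names are invented for illustration, and the model is a generic scikit-learn classifier, not any vendor’s method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical labeled history per UPS battery string:
# [internal temp (°C), ripple current (A), age (months)],
# with label 1 = string was later replaced due to degradation.
readings = np.array([
    [25, 1.1, 12], [26, 1.0, 18], [31, 2.4, 40],
    [24, 0.9,  8], [33, 2.9, 44], [27, 1.3, 22],
    [35, 3.1, 48], [25, 1.2, 15],
])
replaced = np.array([0, 0, 1, 0, 1, 0, 1, 0])

# Learn the pattern that precedes replacement from the labeled examples.
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(readings, replaced)

# Score a new battery string and flag it for proactive maintenance.
new_reading = np.array([[32, 2.6, 41]])
print("Replacement risk:", model.predict_proba(new_reading)[0][1])
```

Real deployments would need far more data and validation, but the structure (labeled history in, risk score out) is the same.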

But there are some key challenges related to data which must be dealt with before machine learning-based AI is widely developed in the industry and adopted by colocation providers.

AI Will Increase Performance for Colocation Providers

Nonetheless, I’m confident the industry will solve these challenges, and in many ways, they are being addressed now. Colocation data centers, of course, strive to be efficient with resources and quick to deploy new capacity all without compromising availability for their tenants. Data analytics and increasingly AI will become tools for providers to incrementally improve their performance on these points.
