
Speak like a data center geek: Virtualization

09 Dec 2016

Our blog series, “How to Speak Like a Data Center Geek,” works to demystify the complexities of the data center, and this time we’re taking on virtualization, a topic that has stayed on the cutting edge for more than five decades, ever since virtualization technology was first applied to software in the 1960s.

Virtualization is, in some sense, about illusion, though not the kind that involves, um, spitting out live frogs. It can create what the O’Reilly Media book “Virtualization: A Manager’s Guide” called “the artificial view that many computers are a single computing resource or that a single machine is really many individual computers.”

Or: “It can make a single large storage resource appear to be many smaller ones or make many smaller storage devices appear to be a single device.”

The goals of virtualization include:

  • Higher performance levels
  • Improved scalability and agility
  • Better reliability/availability
  • A unified security and management domain

Whatever the goal, odds are virtualization technology is at work in your data center right now. In this “Data Center Geek” entry, we’ll look at a few different layers of virtualization.

Virtualization

First, we start with a baseline definition. Virtualization is a way to abstract applications and their underlying components from the hardware supporting them and present a logical or virtual view of these resources. This logical view may be strikingly different from the physical view.

Consider a virtually partitioned hard drive, for example. Physically, it’s plainly just one hard drive. But virtualization lets us impose a logical division that creates two separate drives, each operating independently, making more flexible use of the same physical device.
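To make that logical-versus-physical distinction concrete, here’s a minimal sketch in Python. The names (PhysicalDisk, LogicalDrive) are ours, invented for illustration; real partitioning happens in firmware and operating systems, but the address translation at the heart of it looks much like this:

```python
class PhysicalDisk:
    """One physical device: a single flat run of bytes."""
    def __init__(self, size: int):
        self.blocks = bytearray(size)

class LogicalDrive:
    """A virtual view: a slice of the physical disk that behaves
    like a drive of its own."""
    def __init__(self, disk: PhysicalDisk, offset: int, size: int):
        self.disk, self.offset, self.size = disk, offset, size

    def write(self, address: int, data: bytes) -> None:
        if address + len(data) > self.size:
            raise IOError("write past end of logical drive")
        start = self.offset + address  # logical -> physical address
        self.disk.blocks[start:start + len(data)] = data

    def read(self, address: int, length: int) -> bytes:
        start = self.offset + address
        return bytes(self.disk.blocks[start:start + length])

# One physical disk, two logical drives that know nothing of each other.
disk = PhysicalDisk(1024)
c_drive = LogicalDrive(disk, offset=0, size=512)
d_drive = LogicalDrive(disk, offset=512, size=512)
c_drive.write(0, b"hello")
d_drive.write(0, b"world")
print(c_drive.read(0, 5), d_drive.read(0, 5))  # b'hello' b'world'
```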

Access virtualization

This layer allows individuals to work from wherever they are, using whatever network media and whatever endpoint devices are available. Access virtualization technology makes it possible for nearly any type of device to access nearly any type of application without forcing the individual or the application to know too much about the underlying technology.
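As a loose illustration of that separation, here’s a toy sketch in plain Python with made-up rendering rules: the application produces one device-neutral answer, and a small gateway adapts it to whichever endpoint asked, so neither side needs to know about the other’s technology.

```python
def application() -> dict:
    # The app knows nothing about phones, browsers, or terminals.
    return {"title": "Quarterly report", "status": "ready"}

def render_for(device: str, payload: dict) -> str:
    # The gateway owns all of the device-specific knowledge.
    if device == "phone":
        return f"{payload['title']}: {payload['status']}"
    if device == "browser":
        return f"<h1>{payload['title']}</h1><p>{payload['status']}</p>"
    return str(payload)  # fallback for any other endpoint

for device in ("phone", "browser", "terminal"):
    print(render_for(device, application()))
```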

Application virtualization

This technology works above the operating system, making it possible for applications to be encapsulated and to execute on older or newer operating systems with which they would otherwise be incompatible. Some forms of this technology allow applications to be “streamed” down to remote systems, executed there and then removed. This approach can increase security and prevent data loss.
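Here’s a rough sketch of that “stream, execute, remove” pattern using only Python’s standard library. The streamed “app” is just a string standing in for a real package fetched over the network:

```python
import pathlib
import runpy
import shutil
import tempfile

# Stand-in for application code streamed down from a server.
streamed_app = 'print("running inside a temporary sandbox")\n'

workdir = pathlib.Path(tempfile.mkdtemp(prefix="streamed-app-"))
try:
    app_file = workdir / "app.py"
    app_file.write_text(streamed_app)
    runpy.run_path(str(app_file))   # execute the streamed application
finally:
    shutil.rmtree(workdir)          # nothing is left on the endpoint
```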

Processing virtualization

This technology is the current media darling. This layer hides the physical hardware configuration from system services, operating systems or applications. One type makes it possible for a single system to appear to be many, so it can support many independent workloads; a second type makes it possible for many systems to be viewed as a single computing resource.
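For a feel of the first type, here’s a toy scheduler in Python that time-slices two “guest” workloads on one host. A real hypervisor relies on CPU hardware support rather than generators, but the shape of the illusion is the same:

```python
def guest(name: str, steps: int):
    """A guest workload; each yield hands the one physical CPU back."""
    for i in range(steps):
        print(f"{name}: step {i}")
        yield

def hypervisor(guests):
    # Round-robin over every guest until all have finished, so each
    # workload behaves as if it had a machine of its own.
    while guests:
        for g in list(guests):
            try:
                next(g)
            except StopIteration:
                guests.remove(g)

hypervisor([guest("vm-a", 3), guest("vm-b", 2)])
```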

Network virtualization

This layer can hide the actual hardware configuration from systems, making it possible for many groups of systems to share a single, high-performance network while each group believes it has a network all to itself. See? Illusion.

Network virtualization can also enhance network performance, using system memory to provide caching and system processors to compress traffic or eliminate redundant data.
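A VLAN-flavored toy in Python shows the trick: one shared switch carries everyone’s frames, but a segment tag decides who can see them. The class and names here are invented for illustration, not a real networking API:

```python
from collections import defaultdict

class VirtualSwitch:
    def __init__(self):
        # segment ID -> {host name: inbox of received frames}
        self.segments = defaultdict(dict)

    def attach(self, segment: int, host: str) -> None:
        self.segments[segment][host] = []

    def send(self, segment: int, frame: str) -> None:
        # The frame crosses the shared hardware, but only hosts in
        # the same segment ever see it.
        for inbox in self.segments[segment].values():
            inbox.append(frame)

switch = VirtualSwitch()
switch.attach(10, "finance-1")
switch.attach(20, "engineering-1")
switch.send(10, "payroll data")
print(switch.segments[10]["finance-1"])      # ['payroll data']
print(switch.segments[20]["engineering-1"])  # [] -- isolated
```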

Storage virtualization

Like the network virtualization layer, this layer hides where storage systems are and what type of device is actually storing applications and data. It allows many systems to share the same storage devices without knowing that others are also accessing them. This technology also makes it possible to take a snapshot of a live system so that it can be backed up without hindering online or transactional applications.
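Here’s a bare-bones, copy-on-write-style sketch of that snapshot idea in Python, an illustration of the concept rather than any particular vendor’s implementation: the snapshot freezes the blocks as they were, while new writes go to a fresh mapping, so the live system keeps running.

```python
class Volume:
    def __init__(self):
        self.blocks = {}  # block number -> data

    def write(self, block: int, data: bytes) -> None:
        # Copy on write: build a new mapping instead of mutating the
        # old one, so existing snapshots are never disturbed.
        self.blocks = {**self.blocks, block: data}

    def snapshot(self) -> dict:
        return self.blocks  # frozen view of the volume at this instant

vol = Volume()
vol.write(0, b"version 1")
snap = vol.snapshot()        # back this up at leisure
vol.write(0, b"version 2")   # live writes continue unhindered
print(snap[0], vol.blocks[0])  # b'version 1' b'version 2'
```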

Article by Jim Poole, Equinix blog network
