
Is it time to let the X-factor determine data center temperature?

02 May 2017

It’s been a couple of years now since a group of experts from major server hardware vendors wrote a paper about the “x-factor,” a way to quantify server reliability at different data center temperatures.

The idea was to help companies make more informed, business-driven decisions about how to operate their data centers.

I’m wondering whether companies are now comfortable enough to control their data center temperatures around this idea of the x-factor, rather than simply holding them at a fixed setpoint all year round.

First, a little background on how the x-factor works. The idea is that servers are sensitive to temperature: they fail more quickly at higher temperatures than at lower ones.

Several years ago, ASHRAE TC 9.9 published the third edition of its Thermal Guidelines for Data Processing Environments, which outlined server reliability rates at various temperatures.

This is where the x-factor comes in. It’s a way to measure the relative expected server reliability at different temperatures.

The TC 9.9 group used a data center operating temperature of 68°F as its baseline; this temperature represents an x-factor of 1.00. If the temperature goes higher, the x-factor goes up; when it goes lower, the x-factor goes down.

For example, at 59°F the x-factor is 0.72, meaning there is a 28% lower probability of server failure if the data center were operated constantly at that temperature vs. operating at 68°F. 

At the other extreme, at 113°F the x-factor is 1.76, meaning there’s a 76% higher probability of failure vs. operating at 68°F.

It’s important to note that the 76% figure is relative to what the server failure rate would normally be; it doesn’t mean there’s a 76% chance of failure.

Typical annual server failure rates are quite low, around 2% to 4%. Even using the higher 4% figure, according to ASHRAE TC 9.9 calculations, operating at 113°F continuously would only raise the actual failure rate by about three percentage points, to roughly 7% annually.
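To make that arithmetic concrete, here is a minimal Python sketch using only the figures quoted above (the helper function itself is just illustrative arithmetic, not anything from the ASHRAE guidelines):

```python
# Scale a baseline annual failure rate by a relative x-factor.
# The baseline and x-factor values below are the figures quoted in
# this article; the function is illustrative arithmetic only.

def projected_failure_rate(baseline_rate: float, x_factor: float) -> float:
    return baseline_rate * x_factor

BASELINE = 0.04  # 4% annual failure rate, the upper end of "typical"

print(projected_failure_rate(BASELINE, 1.00))  # 0.04   -> 4% at 68°F
print(projected_failure_rate(BASELINE, 0.72))  # 0.0288 -> ~2.9% at 59°F
print(projected_failure_rate(BASELINE, 1.76))  # 0.0704 -> ~7% at 113°F
```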

Here’s where things get interesting.

As it turns out, operating at a lower temperature for part of the year can offset the times when you operate at higher temperatures. This can have a profound impact on data centers that use ambient outside air for cooling at least part of the time.

Let your data center get very cool in the winter months and you can operate it at higher temperatures in the summer, so long as your annual x-factor stays at whatever level you’re comfortable with given your risk profile.
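As a rough sketch of how that offsetting works, imagine binning the year’s operating hours by temperature and weighting each bin by its x-factor. In the Python below, the 0.72 and 1.00 values are the ones quoted above; the 86°F bin’s x-factor is a made-up placeholder, not a published ASHRAE number:

```python
# Time-weighted annual x-factor: hours spent in each temperature
# bin, weighted by that bin's x-factor. Real values would come from
# the ASHRAE TC 9.9 table and the site's own temperature history.

hours_by_bin = {59: 3000, 68: 3760, 86: 2000}  # °F -> hours (sums to 8760)

x_factor_by_bin = {
    59: 0.72,   # quoted in the article
    68: 1.00,   # baseline, quoted in the article
    86: 1.24,   # hypothetical placeholder for illustration
}

total_hours = sum(hours_by_bin.values())
annual_x = sum(h * x_factor_by_bin[t] for t, h in hours_by_bin.items()) / total_hours

print(f"Time-weighted annual x-factor: {annual_x:.2f}")  # ~0.96
```

In this toy example, the cool winter hours pull the weighted average down to roughly 0.96, slightly better than running at 68°F all year, even though a chunk of the year was spent well above the baseline.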

In fact, you could let your x-factor decision dictate the temperature at which your data center runs at different times.

Using an intelligent cooling system and a data center infrastructure management (DCIM) platform, it’s certainly possible to operate in this fashion – and perhaps realize significant gains in energy efficiency.
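To sketch what “letting the x-factor dictate the temperature” might look like as a DCIM policy, the toy logic below tracks an annual x-factor budget and picks the warmest setpoint that stays within it. Everything here is a hypothetical illustration: the 77°F and 86°F x-factor values are placeholders, and real control logic would be far more careful:

```python
# Budget-driven setpoint policy (hypothetical sketch): choose the
# warmest setpoint whose x-factor keeps the projected annual average
# within a chosen budget.

HOURS_PER_YEAR = 8760
X_TABLE = {59: 0.72, 68: 1.00, 77: 1.12, 86: 1.24}  # °F -> x-factor
BUDGET = 1.00  # target annual average x-factor (a risk-profile choice)

def next_setpoint(hours_elapsed: float, x_accumulated: float) -> int:
    """Warmest setpoint that keeps the year's average x-factor in budget."""
    hours_left = HOURS_PER_YEAR - hours_elapsed
    if hours_left <= 0:
        return min(X_TABLE)  # year over; default to the coolest setpoint
    # Average x-factor the remaining hours can carry without overshooting
    allowance = (BUDGET * HOURS_PER_YEAR - x_accumulated) / hours_left
    candidates = [t for t, x in X_TABLE.items() if x <= allowance]
    return max(candidates) if candidates else min(X_TABLE)

# After 4,000 cool winter hours at x = 0.72, the budget permits
# running warmer for a while:
print(next_setpoint(4000, 4000 * 0.72))  # 77
```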

The question is, are companies ready to take that kind of leap?

Is the kind of reliability data we have from ASHRAE TC 9.9 enough to convince you that this approach would work?

The data is highly focused on servers; it does not address other gear such as networking equipment and storage systems to the same extent.

Is that a deal-breaker? 

Article by John Niemann, Schneider Electric Data Center Blog
