
Is it time to let the X-factor determine data center temperature?

02 May 17

It’s been a couple of years now since a group of experts from major server hardware vendors wrote a paper about the “x-factor,” a way to quantify server reliability at different data center temperatures.

The idea was to help companies make more informed, business-driven decisions about how to operate their data centers.

I’m wondering whether companies are now comfortable enough to control their data center temperatures around the x-factor, rather than simply holding a fixed setpoint all year round.

First, a little background on how the x-factor works. The idea is that servers are sensitive to temperature; they fail more quickly at higher temperatures than lower temperatures.

Several years ago, ASHRAE TC 9.9 published the third edition of its Thermal Guidelines for Data Processing Environments, which outlined server reliability rates at various temperatures.

This is where the x-factor comes in. It’s a way to measure the relative expected server reliability at different temperatures.

The TC 9.9 group used a data center operating temperature of 68°F as its baseline; this temperature represents an x-factor of 1.00. If the temperature goes higher, the x-factor goes up; when it goes lower, the x-factor goes down.

For example, at 59°F the x-factor is 0.72, meaning there is a 28% lower probability of server failure if the data center were operated constantly at that temperature vs. operating at 68°F. 

At the other extreme, at 113°F the x-factor is 1.76, meaning there’s a 76% higher probability of failure vs. operating at 68°F.

It’s important to note that the 76% figure is relative to what the server failure rate would normally be; it doesn’t mean there’s a 76% chance of failure.

Typical annual server failure rates are quite low, around 2% to 4%. Even using the higher 4% figure, according to ASHRAE TC 9.9 calculations, operating at 113°F continuously would raise the actual server failure rate by only about three percentage points, to roughly 7% annually.
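The arithmetic is simple enough to sketch in a few lines. This is only an illustration using the three data points cited in this article; the baseline 4% failure rate is an assumption, and the full x-factor table lives in ASHRAE TC 9.9’s Thermal Guidelines.

```python
# Sketch of the x-factor arithmetic described above. The x-factor is a
# multiplier on the baseline annual failure rate (1.00 at 68°F).

BASELINE_FAILURE_RATE = 0.04   # assumed 4% annual server failure rate

# (temperature °F -> x-factor) points quoted in this article
X_FACTOR_POINTS = {59: 0.72, 68: 1.00, 113: 1.76}

def absolute_failure_rate(x_factor, baseline=BASELINE_FAILURE_RATE):
    """Annual failure probability implied by a given x-factor."""
    return baseline * x_factor

for temp_f in sorted(X_FACTOR_POINTS):
    xf = X_FACTOR_POINTS[temp_f]
    rate = absolute_failure_rate(xf)
    print(f"{temp_f}°F: x-factor {xf:.2f} -> ~{rate:.1%} annual failure rate")
```

Running this reproduces the article’s numbers: 0.72 × 4% is about 2.9%, and 1.76 × 4% is about 7% annually.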

Here’s where things get interesting.

As it turns out, operating at a lower temperature for part of the year can offset the times when you operate at higher temperatures. That can have a profound impact on data centers that use ambient outside air for cooling at least part of the time.

Let your data center run very cool in the winter months and you can operate it at higher temperatures in summer, so long as your annual x-factor stays within whatever limit you’re comfortable with, given your risk profile.

In fact, you could let your x-factor decision dictate the temperature at which your data center runs at different times.
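One way to operationalize that decision is a time-weighted annual x-factor: average the x-factor over the hours spent at each temperature and compare it to your budget. The sketch below assumes linear interpolation between the three points quoted in this article, and the monthly temperature profile is purely hypothetical, not from any real site.

```python
# Time-weighted annual x-factor for a free-cooled data center.
# X-factor points are the three quoted in this article; intermediate
# temperatures are linearly interpolated (an assumption, not the
# published ASHRAE curve).

POINTS = [(59, 0.72), (68, 1.00), (113, 1.76)]  # (°F, x-factor)

def x_factor(temp_f):
    """Interpolate the x-factor for a temperature, clamped at the ends."""
    (t0, x0), *rest = POINTS
    if temp_f <= t0:
        return x0
    for t1, x1 in rest:
        if temp_f <= t1:
            return x0 + (x1 - x0) * (temp_f - t0) / (t1 - t0)
        t0, x0 = t1, x1
    return x0  # above the last point, hold the last value

# Hypothetical monthly supply-air temperatures (°F): cool winters,
# warm summers
monthly_temps = [59, 60, 62, 66, 72, 80, 85, 84, 76, 68, 62, 59]

annual_x = sum(x_factor(t) for t in monthly_temps) / len(monthly_temps)
print(f"Time-weighted annual x-factor: {annual_x:.2f}")
```

With this (made-up) profile, the cool months pull the annual average back under 1.00 even though several summer months run well above 68°F, which is exactly the offsetting effect described above.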

Using an intelligent cooling system and a data center infrastructure management (DCIM) platform, it’s certainly possible to operate in this fashion – and perhaps realize significant gains in energy efficiency.

The question is, are companies ready to take that kind of leap?

Is the kind of reliability data we have from ASHRAE TC 9.9 enough to convince you that this approach would work?

The data is highly focused on servers; it does not address other gear such as networking equipment and storage systems to the same extent.

Is that a deal-breaker? 

Article by John Niemann, Schneider Electric Data Center Blog
