
Greening the data centre

01 Jun 2011

In today’s 24x7 world of information availability, on-demand services and round-the-clock commerce sites, companies are increasingly adding high-performance servers, storage and other equipment to their data centres to satisfy user and customer demand. As a result, companies find they need more and more power to run and cool this equipment. At the same time, the cost of electricity is on the rise, and many companies are trying to be good corporate citizens by becoming green – or at least greener. The combination of these factors is forcing many enterprises to evaluate their data centre power consumption and find ways to become more energy-efficient.

Several trends are significantly driving up data centre power requirements. Firstly, most companies need more computing power to run their web sites or business and financial applications, for which servers must often run round the clock. Secondly, newer computers use higher-performing processors that consume more electricity. And thirdly, there is a trend to physically consolidate servers by moving to high-density rack and blade servers, packing more processing power into smaller spaces within data centres.

The result is that power usage in corporate data centres is shooting through the roof. In 2000, data centres typically required, on average, 1 kilowatt (kW) per rack. Six years later, the average per rack was up to 6.8 kW, and today it is significantly higher again. The amount of electricity needed to cool the equipment in these racks has risen in similar fashion. If nothing changes, power and cooling issues (and costs) are likely only to get worse, as both power requirements and the price of electricity are expected to keep rising.

Practical steps

Faced with growing power requirements to run and cool data centre equipment, companies are looking for ways to reduce electrical usage and costs.
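As a back-of-the-envelope illustration of what the rack-density growth above means for the electricity bill, the sketch below costs a year of round-the-clock operation at the per-rack figures quoted; the tariff is an assumed placeholder for illustration, not a figure from this article.

```python
# Rough annual electricity cost of powering (not cooling) one rack, 24x7.
# Rack draws from the article: ~1 kW per rack in 2000, ~6.8 kW by 2006.
# The tariff is an assumed illustrative value, not a figure from the article.

HOURS_PER_YEAR = 24 * 365   # round-the-clock operation
TARIFF = 0.10               # assumed $/kWh, illustration only

def annual_cost(rack_kw: float, tariff: float = TARIFF) -> float:
    """Annual electricity cost of running one rack continuously."""
    return rack_kw * HOURS_PER_YEAR * tariff

for year, kw in [(2000, 1.0), (2006, 6.8)]:
    print(f"{year}: {kw} kW/rack -> ${annual_cost(kw):,.0f} per year")
```

Even at this modest assumed tariff, the jump from 1 kW to 6.8 kW per rack multiplies the running cost nearly sevenfold before cooling is even counted.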
To figure out where to focus energy-saving attention, one must understand what contributes to power consumption. Studies have shown that up to 50% of data centre energy is consumed by IT equipment, and another 35 to 40% goes to cooling. Given that IT equipment is the biggest energy consumer, it makes sense to look at the equipment itself to reduce power usage. Yet that has rarely been the case: most companies do not even know how much power their equipment is drawing.

Consolidation

A popular approach to reducing data centre power consumption is simply to use fewer servers, and that is exactly what many companies are doing today through server consolidation and virtualisation projects. Virtualisation extends the benefits of physical consolidation, allowing applications to run on virtual machines – several virtual servers on one physical box – which consume computing resources according to each application’s needs. This allows for even more efficient use of a server’s capabilities. Consolidation and virtualisation can produce significant results: in some cases, companies have been able to realise a 10-to-1 reduction in the number of servers they require, with a corresponding cut in power consumption.

Air flow management

Another area of focus is optimising air flow within the data centre. In the past, data centre racks were typically arranged to all face the same direction. But because most equipment manufactured today is designed to draw air through the front and exhaust it from the rear, there is a more efficient way to set up racks: the hot-aisle/cold-aisle arrangement. This approach arranges racks front-to-front, so the cooling air rising into the cold aisle is pulled through the front of the racks on both sides of the aisle and exhausted at the back into the hot aisle. Only cold aisles have perforated tiles, and floor-mounted cooling is placed at the end of the hot aisles – not parallel to the row of racks.
Parallel placement can cause air from the hot aisle to be drawn across the top of the racks and mix with the cold air, leaving equipment at the top of racks insufficiently cooled and reducing overall energy efficiency.

Air flow management of another sort should also be taken into account. The high number of servers in many data centre racks often means there are many power and Ethernet cables running through any single rack or under the floor of a raised-floor data centre. In some cases, these cables obstruct air flow, preventing heat from being removed or cool air from circulating. IT managers should therefore check that cabling is not obstructing air flow.

Economisers

In many parts of the country, winter provides an opportunity to augment traditional data centre cooling, as outside air can be used to help cool data centres. Accomplishing this requires the use of economiser systems, which come in two types. Air-side economisers allow outside air to enter the data centre to aid in cooling. Fluid-side economisers are commonly incorporated into a chilled-water or glycol-based cooling system. An overseas study on building control systems found that, on average, the normalised heating and cooling Energy Use Intensity of buildings with economisers was 13% lower than that of buildings without them.

Supplemental cooling

While raised-floor cooling has proven itself an effective approach to data centre environmental management, as rack densities exceed 5 kW and load diversity across the room increases, supplemental cooling should be evaluated for its impact on cooling system performance and efficiency. At higher densities, equipment at the bottom of the rack may consume so much cold air that what remains is insufficient to cool equipment at the top of the rack.
The height of the raised floor places a physical limit on the volume of air that can be distributed into the room, so adding more room air conditioners may not solve the problem. Rising rack densities and high room diversity can instead be addressed with pumped-refrigerant cooling infrastructure, which supports cooling modules placed directly above or alongside high-density racks to supplement the air coming up through the floor. This has a number of advantages, including increased cooling system scalability, greater flexibility and improved energy efficiency.

Two factors contribute to the improved energy efficiency: the location of the cooling modules and the fluid used to transport the heat. Higher-density applications require fluid-based cooling to effectively remove the high concentrations of heat being generated, and from an efficiency perspective refrigerant performs better than water for high-density cooling. A two-phase refrigerant such as R134a is the most effective: it can be pumped as a liquid, but converts to gas when it reaches the air. This phase change contributes to greater system efficiency – R134a is approximately 700% more effective at moving heat than water, which, coincidentally, is itself about 700% more effective than air. And because a leak releases gas rather than liquid, it also ensures that expensive IT equipment is not damaged in the event of a refrigerant leak.

Traditional floor-mounted cooling systems with under-floor air delivery will continue to play an essential role in data centre environmental management. It is recommended that traditional systems be configured to deliver the required cooling for the first 50-100 Watts per square foot of heat load, as well as meet the room’s full humidification and filtration requirements; supplemental cooling can then be deployed for greater densities.

Going Green

A good way to start the ‘greening process’ is with a thorough analysis of your own server room or data centre.
Specialist data centre infrastructure providers can audit your specific environment – often at little or no cost to the business – and make practical recommendations on how to save money and lessen environmental impact by reducing energy consumption.

It is equally important to find a consultant that offers a holistic approach to greening the data centre. Improving your business’s efficiency while compromising the availability or security of your business systems will only cause complications down the line. A partner that understands how your business works, and can deliver solutions that address your entire ecosystem of hardware, software and infrastructure, will ultimately prove a better companion as you embark on the road to greener pastures.
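As a rough, illustrative reading of the heat-transport figures quoted in the supplemental-cooling discussion – treating “700% more effective” as roughly a 7x factor per step, which is an interpretation rather than a precise engineering figure – the chain from air to water to pumped refrigerant can be sketched as:

```python
# Illustrative only: relative heat-transport effectiveness, normalised to air,
# using the ~7x steps quoted in the article. The exact multiplier is an
# assumed interpretation of "700% more effective", not an engineering datum.

FACTOR = 7.0  # assumed multiplier per step: air -> water -> R134a refrigerant

effectiveness = {"air": 1.0}
effectiveness["water"] = effectiveness["air"] * FACTOR
effectiveness["R134a"] = effectiveness["water"] * FACTOR

for medium, eff in effectiveness.items():
    print(f"{medium:>6}: {eff:g}x air")
```

On this reading, pumped two-phase refrigerant moves heat on the order of 49x more effectively than plain air circulation, which is why it scales to rack densities that under-floor air delivery alone cannot handle.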
