Adiabatic cooling and energy saving in a data centre
12 February 2019
Figure 2: Indirect free-air adiabatic cooling (main image).
The expansion of the data centre industry and its associated energy demands are of great concern in Ireland and elsewhere. The energy consumption of the cooling process can be up to 40 per cent of the total used.
With the support of SEAI’s Excellence in Energy Efficient Design (EXEED) programme, SmartPower investigated the use of an innovative adiabatic cooling system for the Cork Internet Exchange (CIX).
This system uses no chillers and relies simply on the ambient air for the cooling load. The energy savings are expected to be 75 per cent of that used in a conventional chilled water-cooling system.
The traditional approach used in most data centres today is a dedicated Chilled Water (CHW) system, with chillers serving as the main refrigeration element and pumps for flow distribution.
The system consists of a number of air-cooled chillers, a CHW network with primary and secondary pumps, and Computer Room Air Conditioning (CRAC) units.
Many of these installations have free-cooling coils installed on the chillers that use the ambient air for cooling and avoid the compressor operating for a considerable part of the year.
The purpose of the CHW distribution system is to deliver the required cooling to a number of elements on site, including the supporting UPS and switch rooms but, most importantly, the data hall ring that feeds the CRAC units serving the IT cabinet heat loads.
The CRAC units will operate in full recirculation mode inside a sealed environment by removing heat from the room return air and rejecting this heat into the CHW system. This traditional concept with CRACs, chillers and pumps is demonstrated schematically in Figure 1.
Adiabatic or evaporative cooling is a physical phenomenon in which evaporation of a liquid, typically into the surrounding air, cools an object or a liquid in contact with it. When considering water evaporating into air, the wet bulb (WB) temperature, as compared to the air’s dry bulb (DB) temperature, is a measure of the potential for evaporative cooling.
The greater the difference between the two temperatures, the greater the evaporative cooling effect. Every litre of water evaporated absorbs about 2,500 kilojoules (kJ) of energy, and this energy is taken directly from the air, cooling it to close to the WB temperature.
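The relationship described above can be turned into a back-of-envelope estimate of water use. The sketch below assumes a specific heat for air and example DB/WB temperatures that are not from the article; only the 2,500 kJ per litre figure is quoted from the text.

```python
# Illustrative estimate of evaporative cooling water use.
# The latent heat figure is the article's ~2,500 kJ per litre;
# the specific heat of air and the example temperatures are assumptions.

LATENT_HEAT = 2500.0   # kJ per kg of water evaporated (from the article)
CP_AIR = 1.005         # kJ/(kg.K), specific heat of dry air (assumed)

def water_per_kg_air(dry_bulb_c, wet_bulb_c, effectiveness=0.9):
    """Water (kg) evaporated per kg of air to cool it most of the way
    from its dry bulb temperature towards its wet bulb temperature."""
    delta_t = (dry_bulb_c - wet_bulb_c) * effectiveness
    heat_removed = CP_AIR * delta_t          # kJ removed per kg of air
    return heat_removed / LATENT_HEAT        # kg of water per kg of air

# Example: a warm summer day at 28 deg C DB / 18 deg C WB (assumed values)
w = water_per_kg_air(28.0, 18.0)
print(f"{w * 1000:.1f} g of water evaporated per kg of air cooled")
```

The small result, a few grams of water per kilogram of air, illustrates why the wet bulb depression, rather than the water supply, is usually the limiting factor.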
The adiabatic system can then be combined with a cross-flow heat exchanger, as shown in Figure 2 (main image). It is designed to first lower the DB temperature of the incoming air, increasing the effectiveness of the heat exchanger in cooling the warm return air from the data hall down to the desired supply temperature.
In this way the adiabatic element is only required when the ambient temperature rises above 21°C and most of the cooling is done by sensible heat transfer from the outside air in the heat exchanger.
The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends an internal maximum data suite temperature of 27°C. A further advantage of this scheme is that the data hall remains a fully sealed environment, isolated from the external ambient air, so there is no moisture or pollutant infiltration.
Also, the system can be built up in modules and applied in many configurations, such as the rooftop installation shown in Figure 3, but also in raised-floor formats with hot/cold aisle containment. It can therefore have a considerable advantage in space requirements over alternative cooling methods.
A water distribution and storage system is required, so appropriate Legionella pneumophila controls must be put in place for the evaporative cooling section of the system.
The data centre is a complex environment that is designed to house IT equipment. Utility power entering the data centre has to pass through a number of stages of voltage transformation, distribution and cleaning before finally being delivered to the IT equipment.
Most of the power used within the facility is converted to heat, requiring significant cooling system capacity which draws an additional load in a traditional, recirculating air data centre.
There are also a number of ancillary support systems in the data centre such as lighting, generator pre-heaters, fire suppression systems as well as human occupied areas which also require electrical power.
Cooling system energy consumption
Up to 40 per cent of the energy used in a data centre can go to the cooling systems, so it is not surprising that the external temperature has an impact on the energy efficiency of the facility.
This is due to a number of causes and varies in impact between different designs of facility and geographic location. Ireland’s climate on the whole is considered very favourable for the location of data centres as there are no extremes of heat or humidity.
The ratio of the total input power to the IT power in a data centre is known as the Power Usage Effectiveness (PUE). Values in Ireland are typically in the range of 1.5–1.7 for traditionally cooled data centres.
At Cork Internet Exchange (CIX) at Holyhill, Cork, the use of adiabatic cooling in a new €6 million extension is expected to lower the overall PUE to less than 1.4.
All the cooling energy is supplied by the ambient air with no chillers used. The system is designed and supplied by an Irish company, EDPAC, based in Carrigaline, Co Cork.
The assembled modules installed include many innovative features, such as the use of the attic space for the inlet air plenum and the unique placement of the air-to-air heat exchangers to permit ease of air circulation and containment of the hot and cold aisles.
The expected electrical demand for a 400kW cooling module is an average of 12kW throughout the year (see Figure 4). The energy savings are expected to be 75 per cent of that which would otherwise be used in a traditionally cooled data centre.
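The two figures quoted above, 400kW of cooling for an average 12kW of electricity, can be cross-checked with a short calculation. The traditional comparison load is derived from the quoted 75 per cent saving, and continuous year-round operation is assumed.

```python
# Back-of-envelope check of the stated savings: a 400 kW cooling
# module drawing an average of 12 kW electrical, versus a traditional
# system implied by the quoted 75% saving. Continuous year-round
# operation is an assumption for the annual figure.

HOURS_PER_YEAR = 8760

cooling_load_kw = 400.0
adiabatic_avg_kw = 12.0
traditional_avg_kw = adiabatic_avg_kw / (1 - 0.75)    # implied 48 kW

effective_cop = cooling_load_kw / adiabatic_avg_kw    # cooling per kW in
annual_saving_kwh = (traditional_avg_kw - adiabatic_avg_kw) * HOURS_PER_YEAR

print(f"Effective COP:  {effective_cop:.0f}")
print(f"Annual saving:  {annual_saving_kwh:,.0f} kWh")
```

An effective coefficient of performance above 30 is far beyond what a compressor-based chiller can achieve, which is the core of the energy argument for the adiabatic approach.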
The author acknowledges the assistance and support of Jerry Sweeney, managing director, CIX, and Noel Lynch, managing director, EDPAC. We are also indebted to SEAI, which supported the installation with an EXEED grant.
Author: Donal Deering CEng MIEI, senior engineer, SmartPower