OPTIMISATION OF OUTDOOR AIR COOLING SYSTEM FOR MODULAR DATA CENTRES

Jayantha Siriwardana1, Wong Yew Wah2, Toh Kok Chuan1, and Wen Yonggang3
{1Energy Research Institute, 2School of Mechanical and Aerospace Engineering, and 3School of Computer Engineering}, Nanyang Technological University, Singapore

ABSTRACT

This paper presents the initial results of optimising the use of outside air for cooling modularized data centres (MDCs) under tropical climates. MDCs with outside air cooling have been in use in temperate climates; the concept was test-bedded in Singapore for the first time at Nanyang Technological University (NTU). The initial operation data showed that the data centre can potentially be operated using only outside air, or a mixture of outside air with warm return air, for 73% of the time. Here, we analyse the cooling system performance of the MDC test bed with the aid of energy models and standard weather files to evaluate a possible cooling mode control algorithm for the air-handling unit (AHU). The results show that enthalpy based cooling mode control has the potential to maintain consistently high energy efficiency of an outside air cooled MDC at low IT loads.

Keywords: Modular Data Centres (MDCs), outside air cooling, energy efficiency, TRNSYS

INTRODUCTION

Data centres, which house and maintain back-end information and communication technology (ICT) systems, represent a large share of the energy footprint of many buildings in our cities. Modern ICT equipment, such as the high performance servers used in data centres, releases much more heat per unit area, making the development of new cooling technologies a necessity for future applications. In the hot and humid tropical climate of Singapore, provision of adequate and efficient cooling for data centres is challenging and, according to the National Environment Agency (NEA), cooling currently represents about 50% of their energy cost. Therefore, it is important to look into energy efficient data centre cooling technologies for tropical climates in order to achieve the future sustainability of our modern cities.

Outside air cooling of data centres has been used in countries where temperate climate conditions prevail. Recent studies have been published on the potential of using economizer cycles for building ventilation and cooling [1, 2, 3], data centre cooling [4, 5, 6, 7], and energy analysis of such cycles in building ventilation [8].


Figure 1 Modular data centre test bed at NTU campus

It has also been shown that the use of economizer cycles can save on energy bills and help to maintain a low total cost of ownership [9, 10]. Lee and Chen [11] reported that data centres in Singapore have only a marginal potential for energy savings using outside air for cooling under current ASHRAE recommendations.

The remainder of the paper is organised as follows. First, we introduce the modular data centre and its cooling system. Next, we present an analysis of the cooling modes currently employed and discuss the possibilities of enhancing the cooling performance under both high and low ICT loads.

MODULAR DATA CENTRE

Scalable modular design is one of the key practices for maintaining high energy efficiency throughout the lifetime of data centres [12]. Modularized data centres are one such technology that can be used to expand the data centre resources of an organization as it grows. They have a number of advantages, such as quick deployment, high density, and an integrated, single-supplier IT and infrastructure solution. They are also used to cater for regularly changing demands, as in cloud computing business scenarios. The modular data centre test bed shown in Figure 1 was built on the NTU campus as a collaboration between NTU and Toshiba Asia Pacific Pte Ltd under the Green Data Centre Innovation Challenge by the Infocomm Development Authority (IDA) of Singapore. It is equipped with an outside air cooling system

supported by a mechanical cooling system, described below, to fulfil its cooling demand. The objective of this project was to demonstrate the viability of this outside air cooling technology in a tropical climate and to analyse its performance.

ICT Module

Figure 3 Functional schematic of air conditioner unit (outside air filter, mixing chamber, cooling coil, flow controllers, warm return air from the data centre, and exhaust of warm air)

Figure 2 Schematic floor plan of modular data centre

The modular data centre test bed consists of an ICT module and an air conditioner module, as shown in Figure 1. The footprint of the data centre is 9m by 3m, similar to the size of a standard 20-foot container. As illustrated in Figure 2, the ICT module houses 10 standard 45U racks. The hot aisle is physically separated from the cold aisle with a wall to eliminate airflow inefficiencies inside. Table 1 gives a summary of the ICT module specifications.

Table 1 Specifications of ICT module

Modular data centre footprint            9m x 3m
Number of server racks                   10
Maximum ICT load per rack                9kW
Maximum ICT load of data centre          90kW
Supply air temperature at cold aisle     28~32°C
Return air temperature at hot aisle      37~39°C

Air Conditioner Module

The air conditioner module is placed on top of the ICT module, as indicated in Figure 1. This module is designed to primarily use outside air to meet the cooling demand of the ICT module. Direct expansion cooling coils are installed to supplement the cooling where outside air conditions are unfavourable, or to meet the entire cooling load when closed loop operation is required. The schematic diagram in Figure 3 shows the concept of using

outside air for cooling. To attain the desired conditions, outside air is sometimes mixed with warm return air, dehumidified, and mechanically cooled, as explained below.

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) is the primary organization that sets standards for data centre operating environments. In 2011, ASHRAE updated its published guidelines for temperature and humidity in data centre environments [13]. The envelope indicated by ① in Figure 4 requires the strictest temperature and humidity control. Most mission critical data centres follow this guideline with temperatures at the lower end of the envelope, to allow lead-time for any cooling system failure as well as for inefficient airflows, if any. Generally, envelope ① is sufficient for providing outside air for data centre cooling in temperate climate conditions. In the tropical climate of Singapore, however, the conditions tend to hover in the area indicated by the dotted circle in Figure 4. In order to realize outside air cooling under a tropical climate, the air conditioner module of the modular data centre test bed uses an extended envelope, indicated by ② in Figure 4. This raises the dew point (DP) upper limit of the desired air condition from 17°C to 24°C, with no changes to the temperature or relative humidity (RH) conditions.
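Where hourly dew point data is not directly available, it can be estimated from dry-bulb temperature and relative humidity. The sketch below uses the standard Magnus approximation; it is an illustrative calculation, not the test bed controller's actual implementation, and the function name is ours.

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (°C) from dry-bulb temperature and relative
    humidity using the Magnus formula (valid roughly for 0-60 °C)."""
    b, c = 17.62, 243.12  # Magnus coefficients
    gamma = math.log(rh_percent / 100.0) + (b * temp_c) / (c + temp_c)
    return c * gamma / (b - gamma)

# Example: a hot, humid afternoon of 32 °C at 70% RH
print(round(dew_point_c(32.0, 70.0), 1))  # ~25.8 °C, above the 24 °C limit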

Table 2 Operation modes of the air handler of the air conditioner module, with the outside air temperature (T), relative humidity (RH), and dew point (DP) conditions for each mode

AHU MODE   DESCRIPTION                                          T        RH      DP
M1         Cooling with outside air only                        < 32°C   < 80%   < 24°C
M2         Cooling with outside air and mixing hot return air   < 32°C   > 80%   < 24°C
M3         Cooling outside air with compressor                  > 32°C   –       –
M4         Cooling with compressor and mixing hot return air    < 32°C   –       > 24°C
M5         Cooling return air with compressor                   Recirculation mode for control

Figure 4 ASHRAE thermal guidelines compared with the conditions maintained within the modular data centre

Table 2 shows the air handling unit (AHU) operation modes of the air conditioner module. Depending on the condition of the outside air, one of five modes is selected by the AHU controller to deliver the desired air condition within the ICT module. Cooling modes M1 and M2 rely entirely upon outside air, while cooling modes M3 and M4 are supplemented by mechanical cooling. Cooling mode M5 is not used under normal operating conditions and is reserved for emergencies and controlled experiments. Note that the conditions within the ICT module are maintained so that the DP does not exceed 24°C, to avoid possible condensation within ICT equipment and consequent hardware damage.
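As an illustration of the selection logic in Table 2, the following sketch expresses the controller's decision over the outside air state. It is a minimal reading of Table 2 in Python, not the vendor controller's code; mode M5 is omitted since it is reserved for emergencies, and the function and type names are ours.

```python
from enum import Enum

class Mode(Enum):
    M1 = "outside air only"
    M2 = "outside air mixed with hot return air"
    M3 = "outside air cooled by compressor"
    M4 = "compressor cooling with hot return air mixing"

def select_ahu_mode(temp_c: float, rh_percent: float, dew_point: float) -> Mode:
    """Select the AHU cooling mode from the outside air condition (Table 2)."""
    if temp_c > 32.0:
        return Mode.M3      # outside air too hot: compressor cooling
    if dew_point > 24.0:
        return Mode.M4      # too humid: compressor plus return air mixing
    if rh_percent > 80.0:
        return Mode.M2      # cool but humid: mix in warm return air
    return Mode.M1          # favourable outside air: free cooling only
```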

As illustrated in Figure 5, the efficiency of a data centre can be improved by reducing the energy used in the cooling and power supply architecture.

ANALYSIS OF COOLING MODES

Figure 6 shows the measured energy consumption data of the modular data centre for two months, December 2012 and January 2013. It was operated at around 60% ICT load (≈ 60kW), and the daily averaged pPUE was within the range of [1.2, 1.57] with an overall average of 1.31. The average PUE for a Singapore data centre is 2.07 according to the National Environment Agency [15]. Given the pPUE values mentioned, the modular data centre can be expected to achieve PUE values well below the Singapore average.

Figure 5 Definition of PUE (energy from utility delivered via UPS and PDU to the ICT equipment, plus the cooling system)

Figure 6 Daily measured energy consumption and calculated daily mean pPUE of modular data centre for Dec'12 and Jan'13

Quantifying Energy Efficiency

With outside air cooling, substantial energy savings can be gained since the mechanical cooling system is not used. The most common measure used to quantify the energy efficiency of a data centre is the Power Usage Effectiveness (PUE) [14] metric, defined as the ratio of the total energy drawn by a data centre to the energy used by the ICT equipment in the facility:

PUE = (total facility energy) / (ICT equipment energy)    (1)

In our case, the modular data centre does not have an uninterruptible power supply (UPS) system in place. This is common practice with modular data centres, since they are designed to rely on a reliable power supply system from a base station. Therefore, in order to differentiate our energy efficiency figures from those of data centres with UPS systems within their perimeter of calculation, we use the term partial PUE (pPUE) [14] to refer to our energy efficiency calculations.
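A minimal sketch of the pPUE calculation follows. Within the test bed's measurement boundary, total facility energy is simply the ICT energy plus the cooling energy (fans and DX chillers); the figures in the usage example are made up for illustration.

```python
def daily_ppue(ict_kwh: float, cooling_kwh: float) -> float:
    """Partial PUE per Eq. (1): total metered energy over ICT energy.
    UPS losses are excluded because the test bed has no UPS."""
    return (ict_kwh + cooling_kwh) / ict_kwh

# Made-up example: 1440 kWh/day of ICT energy (60 kW average load) plus
# 450 kWh/day of cooling energy gives a daily pPUE of about 1.31.
print(round(daily_ppue(1440.0, 450.0), 2))
```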


Figure 7 Schematic of modular data centre energy simulation with TRNSYS

Energy Model of Modular Data Centre

Due to the high operating temperatures, the cooling system relies primarily upon outside air, with supplementary conditioning using hot return air and cooling coils. With high ICT loads close to 90kW (close to 100% ICT utilization), this may be very effective, since the temperature of return air from the ICT module is close to 40°C, well above the outside air conditions in Singapore, making it effective to reject the return air to the environment and draw in outside air for cooling. However, when the data centre is not running at its full capacity, this situation may not necessarily hold.

To analyse the effectiveness of the outside air based cooling system, we developed an energy model of the modular data centre that enables us to simulate and predict energy efficiency levels at different ICT loads. The modular data centre energy model was implemented in TRNSYS [16], a simulation program primarily used in the fields of renewable energy engineering and building simulation. Figure 7 shows a schematic diagram of how the energy requirements are calculated using TRNSYS. The ICT module was modelled in TRNBuild, the tool associated with TRNSYS for modelling building energy demand, incorporating passive heat gains from ambient and internal ICT loads. Typical meteorological year (TMY) hourly weather data for Singapore was used in this case. The hourly AHU operation mode of the air conditioner module can be calculated using the temperature, humidity, and dew point data in the TMY weather data.

The energy requirement of the cooling system changes depending on the AHU operation mode. For modes M1 and M2, the only energy requirement is the fan and no mechanical cooling is involved, whereas modes M3 and M4 involve mechanical cooling and dehumidification, so the energy requirement of the DX chillers should also be considered in addition to the fan. The simulator can be used to predict the air conditioner module energy requirement under different ICT loads, and thereby calculate the pPUE. In this study, we investigated three ICT load set points, namely 30kW, 60kW, and 90kW, and calculated hourly spot pPUE values. Figure 8 shows a plot of the calculated pPUE values along with the AHU operation mode for the months of December and January.

Energy Efficiency Analysis of Cooling Modes
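The per-hour bookkeeping behind Figure 7 can be sketched as below. The TRNSYS model computes the actual loads; this Python sketch only illustrates the mode-dependent accounting, reusing the Mode and select_ahu_mode definitions from the earlier listing, with the fan and chiller energies supplied externally as assumed inputs.

```python
from dataclasses import dataclass

@dataclass
class WeatherHour:
    temp_c: float
    rh_percent: float
    dew_point_c: float

def hourly_spot_ppue(ict_kw: float, wx: WeatherHour,
                     fan_kwh: float, chiller_kwh: float) -> float:
    """Illustrative per-hour accounting (cf. Figure 7): modes M1/M2 cost only
    fan energy, while modes M3/M4 additionally cost DX chiller energy.
    fan_kwh and chiller_kwh are assumed, externally computed figures."""
    mode = select_ahu_mode(wx.temp_c, wx.rh_percent, wx.dew_point_c)
    cooling_kwh = fan_kwh
    if mode in (Mode.M3, Mode.M4):
        cooling_kwh += chiller_kwh
    ict_kwh = ict_kw * 1.0  # one hour of ICT load
    return (ict_kwh + cooling_kwh) / ict_kwh
```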

From Figure 8, it is evident that M1 and M2 are the most efficient AHU modes, since no mechanical cooling is involved. However, when mechanical cooling is required, as in the case of mode M4, the pPUE increases sharply, especially at low ICT loads. The range of pPUE variation for an ICT load of 60kW is similar to that of Figure 6. This resemblance serves as validation of the simulation for performing further analysis. We also used TMY weather data to calculate the expected availability of each AHU mode in a typical year. Table 3 summarizes this, along with the expected pPUE values for each mode at each ICT load. Mode M3 gives the worst pPUE, while M4 is not far behind. It is also important to note that mode M4 can be expected for 38.6% of the time in a typical year. If the modular data centre is operated at low ICT loads, this will result in worse than expected energy efficiency performance.
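The availability figures in Table 3 amount to counting how often each mode would be selected over the 8760 hours of the TMY file. A sketch, again reusing select_ahu_mode from the earlier listing and assuming an iterable of hourly WeatherHour records:

```python
from collections import Counter

def mode_availability(tmy_hours) -> dict:
    """Fraction of a typical year spent in each AHU mode, given an iterable
    of 8760 hourly WeatherHour records from the TMY file."""
    counts = Counter(select_ahu_mode(h.temp_c, h.rh_percent, h.dew_point_c)
                     for h in tmy_hours)
    total = sum(counts.values())
    return {mode.name: counts[mode] / total for mode in Mode}
```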


Figure 8 Simulated spot pPUE based on hourly standard Singapore weather data for the months of December and January

Table 3 Expected availability of each cooling mode in a typical year and expected averaged pPUE

AHU MODE                      AVERAGE        PARTIAL PUE
                              AVAILABILITY   ICT LOAD = 30kW   ICT LOAD = 60kW   ICT LOAD = 90kW
M1                            19.7%          1.3               1.2               1.1
M2                            40.5%          1.3               1.2               1.1
M3                            1.2%           1.7               1.45              1.3
M4                            38.6%          1.53              1.3               1.2
Annual averaged partial PUE                  1.3935            1.2416            1.141
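The annual averaged pPUE rows in Table 3 are consistent with an availability-weighted mean of the per-mode pPUE values. A quick arithmetic check at the 30kW load point:

```python
availability = {"M1": 0.197, "M2": 0.405, "M3": 0.012, "M4": 0.386}
ppue_30kw = {"M1": 1.3, "M2": 1.3, "M3": 1.7, "M4": 1.53}

annual_avg = sum(availability[m] * ppue_30kw[m] for m in availability)
print(round(annual_avg, 4))  # 1.3936, matching Table 3's 1.3935 to rounding
```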

The reason for the higher pPUE values at low ICT loads is twofold. Firstly, when the demand for cooling decreases at low ICT loads, the chillers tend to operate in their low efficiency range. When the chillers are required under this condition due to unfavourable outside air conditions, a higher proportion of energy is required for the cooling system, causing the pPUE value to spike. Secondly, at low ICT loads, recirculating hot return air may be more energy efficient than cooling and dehumidifying outside air with the cooling coils. In contrast to data centres in permanent establishments such as buildings and warehouses, modular data centres are often used to supplement short-term ICT load increases and add-as-you-grow strategies. Therefore, it is important for modular data centres to be energy efficient over the whole spectrum of ICT loads.

Enthalpy Based Cooling Mode Improvement

To improve the energy efficiency of the modular data centre at low ICT loads, several measures can be taken:
1. Use cooling coils along with a mechanical cooling system with a wide-band high coefficient of performance (COP)
2. Reduce the resistance in the air delivery ducts

3. Improve the cooling modes and selection algorithm to take the hot return air condition into account

Figure 9 Enthalpy based cooling mode selection and modification

In this paper, we investigate the third measure with a simulated performance analysis. As shown in the psychrometric chart in Figure 9, the allowable supply air condition for the ICT module is depicted by the area covering regions ① and ②. Thus, the highest allowable enthalpy for the supply air becomes 81kJ/kg.


Figure 10 Simulated spot pPUE with modified cooling modes M3 and M4, based on hourly standard Singapore weather data for the months of December and January

Table 4 Enthalpy based improvement of cooling modes M3 and M4

OUTSIDE AIR ENTHALPY   RETURN AIR ENTHALPY   AIR SELECTION
< 81kJ/kg              < 81kJ/kg             Return air
< 81kJ/kg              > 81kJ/kg             Outside air
> 81kJ/kg              < 81kJ/kg             Return air
> 81kJ/kg              > 81kJ/kg             Use the air with less enthalpy

We can use the 81kJ/kg equi-enthalpy line shown in Figure 9 as the guide to improve the energy efficiency of cooling modes M3 and M4. We can still use the logic in Table 2 to determine the AHU operation mode. If mode M3 or M4 is selected, we can then consult Table 4 to further select whether to use outside air or hot return air, based on their enthalpy levels. The advantage of using hot return air where possible is that it has already been filtered and dehumidified in accordance with the ASHRAE standards. Therefore, sensible cooling of hot return air may be more energy efficient than dehumidifying and cooling outside air. Moreover, thorough filtering is not required, in contrast to the case with outside air.

We simulated the energy efficiency of the modular data centre with the above modifications to AHU operation modes M3 and M4. Figure 10 shows the simulated pPUE values for 30kW, 60kW, and 90kW ICT loading for the months of December and January of a typical meteorological year. When compared with the simulation results of the existing system in Figure 8, the pPUE values for mode M4 are not as high as before. Specifically, at low ICT loads, the spikes in the pPUE values are now similar to those at high ICT loads.
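A sketch of the enthalpy based selection in Table 4 is given below. It uses the standard moist air enthalpy approximation h ≈ 1.006·T + w·(2501 + 1.86·T) kJ/kg dry air, with T in °C and w the humidity ratio in kg/kg; the function names are ours, and the humidity ratio is assumed to come from the unit's psychrometric measurements.

```python
def moist_air_enthalpy(temp_c: float, humidity_ratio: float) -> float:
    """Specific enthalpy of moist air in kJ/kg dry air, using the standard
    approximation h = 1.006*T + w*(2501 + 1.86*T)."""
    return 1.006 * temp_c + humidity_ratio * (2501.0 + 1.86 * temp_c)

def select_air_source(h_outside: float, h_return: float,
                      limit_kj_per_kg: float = 81.0) -> str:
    """Enthalpy based air selection for modes M3 and M4 (Table 4)."""
    if h_outside < limit_kj_per_kg and h_return < limit_kj_per_kg:
        return "return air"   # preferred: already filtered and dehumidified
    if h_outside < limit_kj_per_kg:
        return "outside air"  # return air is above the supply enthalpy limit
    if h_return < limit_kj_per_kg:
        return "return air"
    # both above the limit: condition whichever has the lower enthalpy
    return "return air" if h_return <= h_outside else "outside air"

# Example: 30 °C air at a humidity ratio of 0.020 kg/kg
print(round(moist_air_enthalpy(30.0, 0.020), 1))  # ~81.3 kJ/kg
```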

Table 5 Average pPUE for simulated modular data centre energy consumption for the months of December and January

ICT LOAD   AVG. pPUE (USUAL MODES)   AVG. pPUE (REVISED MODES M3, M4)
30 kW      1.53                      1.39
60 kW      1.28                      1.25
90 kW      1.2                       1.2

One can also observe that the range of pPUE for all ICT loadings is now within the range of the measured pPUE data in Figure 6. This is significant because it indicates the possibility of operating the modular data centre at high energy efficiency in all ICT loading scenarios. Table 5 shows the expected pPUE for the two simulated months before and after modes M3 and M4 were modified to take account of the enthalpy of outside air and hot return air. Even in the averaged figures, the improvement is significant at low ICT loads, while at high ICT loads there is no significant improvement. This is understandable, since high ICT loads result in hot return air with an enthalpy of more than 81kJ/kg, making it more energy efficient to cool outside air after discarding the return air.

CONCLUSION

In this study, we analysed the outside air cooling system of the modular data centre located at NTU. An energy simulator of the modular data centre was implemented using TRNSYS, and its performance was validated against measured operation data at 60% ICT loading. We then simulated the energy performance of the proposed enthalpy based cooling mode improvement and compared the pPUE figures

with those of the simulated existing AHU operation modes. Our results indicate that the proposed enthalpy based improvement, selecting between outside air and hot return air, is effective in maintaining the high energy efficiency of the modular data centre at all ICT loads. As for further research, we are currently studying the energy efficiency implications of lowering the current cold aisle temperature set point. This may reduce the burden on server fans as well as enable us to maintain the supply air condition close to the ASHRAE recommended range.

ACKNOWLEDGEMENT

This research was carried out as a collaborative project between Nanyang Technological University and Toshiba Asia Pacific Pte Ltd under Green Data Centre Innovation Challenge funding from the Infocomm Development Authority (IDA) of Singapore. The authors would like to acknowledge their support in this work.

REFERENCES

[1] H. Bulut, M. A. Aktacir, "Determination of free cooling potential: a case study for Istanbul, Turkey," Applied Energy, Vol. 88, pp. 680 – 689, 2011.
[2] C. Inard, J. Pfafferott, C. Ghiaus, "Free-running temperature and potential for free cooling by ventilation: a case study," Energy and Buildings, Vol. 43, pp. 2705 – 2711, 2011.
[3] I. M. Budaiwi, "Energy performance of the economizer cycle under three climatic conditions in Saudi Arabia," International Journal of Ambient Energy, Vol. 22, pp. 83 – 94, 2001.
[4] J. Dai, D. Das, M. Pecht, "Prognostics-based risk mitigation for telecom equipment under free air cooling conditions," Applied Energy, Vol. 99, pp. 423 – 429, 2012.
[5] Y. Udagawa, S. Waragai, M. Yanagi, W. Fukumitsu, "Study on free cooling systems for data centers in Japan," 32nd International Telecommunications Energy Conference, pp. 1 – 5, 2010.
[6] V. Sorell, "OA economizers for data centers," ASHRAE Journal, Vol. 49, pp. 32 – 37, 2007.
[7] J. Siriwardana, S. Jayasekara, S. K. Halgamuge, "Potential of air-side economizers for data center cooling: A case study for key Australian cities," Applied Energy, Vol. 104, pp. 207 – 219, 2013.
[8] Y. Yao, L. Wang, "Energy analysis on VAV system with different air-side economizers in China," Energy and Buildings, Vol. 42, pp. 1220 – 1230, 2010.
[9] C. M. Scofield, T. S. Weaver, "Using wet-bulb economizers: data center cooling," ASHRAE Journal, Vol. 50, pp. 52 – 58, 2008.

[10] A. Shehabi, S. Ganguly, L. A. Gundel, A. Horvath, T. W. Kirchstetter, M. M. Lunden, et al., "Can combining economizers with improved filtration save energy and protect equipment in data centers?," Building and Environment, Vol. 45, pp. 718 – 726, 2010.
[11] K.-P. Lee, H.-L. Chen, "Analysis of energy saving potential of air-side free cooling for data centers in worldwide climate zones," Energy and Buildings, Vol. 64, pp. 103 – 112, 2013.
[12] K. C. Toh, K. J. Tseng, S. K. Panda, S. E. Lee, "Green Data Centre Technology Primer," Nanyang Technological University and National University of Singapore, 2012.
[13] ASHRAE whitepaper, "2011 Thermal guidelines for data processing environments – expanded data center classes and usage guidance," Technical Committee (TC) 9.9, ASHRAE, 2011.
[14] Y. Joshi, P. Kumar (Eds.), "Energy Efficient Thermal Management of Data Centers," Springer US, 2012.
[15] M. Salim, R. Pe, "Data Centre Energy Efficiency Benchmarking," National Environment Agency (NEA), Singapore, 2012.
[16] TRNSYS simulation software official website, http://sel.me.wisc.edu/trnsys/
