Liquid Cooling For Data Centre Servers Passes Intel Tests

Dunking servers in oil doesn’t hurt them, Intel says

Intel has given the thumbs up to the idea of cooling servers by dunking them in mineral oil, after a one-year trial of a system from Green Revolution Cooling (GRC).

The GRC system tested by the giant chip maker, known as CarnotJet, immerses the servers in mineral oil, which is designed to remove heat from server electronics more efficiently than traditional air cooling. Green Revolution Cooling claims its system can reduce the energy used to cool a data centre by as much as 95 percent, which can halve the total energy consumed in some cases.

Data centre power and cooling costs are significant concerns to organisations, and liquid cooling has been widely tipped as a good way to reduce these.

In 2009, IBM predicted that all servers would be water cooled by 2019, as liquids remove heat more efficiently, and in a form that can be used for other purposes: the hot water used to cool an IBM supercomputer at the Swiss Federal Institute of Technology in Zurich is used to warm nearby buildings, for instance.

However, specialist server-cooling companies such as GRC tend to use oil rather than water, for its better thermal properties. LiquidCool (formerly Hardcore Computing) is another immersion advocate, backed by Sun co-founder Scott McNealy.

UK firm Iceotope also uses liquid coolant, but in specially redesigned blades, through which a non-oil coolant (3M’s Novec) flows. The Iceotope system uses normal vertical racks, but adds a separate coolant circulation system. The fluid doesn’t need pumping, however, Iceotope points out, as it moves by convection.

Mike Patterson, senior power and thermal architect at Intel, told technology news site GigaOm that GRC’s technology seems safe for servers and their components, and that it appears to do the job of reducing the energy wasted in removing heat from the machines.

According to Patterson, Intel found that server racks cooled by traditional air technologies operate with a Power Usage Effectiveness (PUE) rating of 1.6, while the PUE rating for Intel servers submerged in Green Revolution’s GreenDEF mineral oil coolant was between 1.02 and 1.03.

PUE measures the energy efficiency of a data centre: the total power used (including cooling) divided by the power delivered to the IT kit. If less power is used in cooling, the ratio gets closer to the ideal value of 1. To get close to 1 through traditional air-cooling technologies would cost a lot of money and require significant innovation, Patterson said.
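The PUE arithmetic can be sketched in a few lines. Note the kilowatt figures below are hypothetical, chosen only so that the resulting ratios match the 1.6 and roughly 1.02–1.03 values reported from Intel’s test:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_kw

# Air-cooled rack (hypothetical): 100 kW delivered to the IT kit,
# plus 60 kW of cooling and other overhead.
print(round(pue(160.0, 100.0), 2))   # 1.6

# Oil-immersion rack (hypothetical): same IT load, only ~2.5 kW overhead.
print(round(pue(102.5, 100.0), 3))   # 1.025
```

A PUE of 1.6 means that for every watt reaching the servers, a further 0.6 watts is spent on cooling and other facility overhead; at 1.02–1.03 that overhead all but disappears.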

He also noted that being immersed in the mineral oil did not damage the server or its various components, from hard drives to processors. After the yearlong immersion test ended, Intel sent the systems to its labs for analysis, which “came back with a thumbs-up that a year in the oil bath had no ill effects on anything they can see,” Patterson told GigaOm.

Power and cooling costs have increasingly become key issues for organisations, particularly those with massive, highly dense data centres running huge numbers of workloads. Data centre infrastructure vendors and chip makers alike are continuously looking for new ways to increase the efficiency of their products.

Running data centres hotter results in less energy waste, as less work is done trying to remove heat. Outside of liquid cooling, the state of the art is to use outside air for cooling, without running expensive and energy-hungry air-conditioning units.

In the CarnotJet System, normal server racks are used, but they are installed horizontally, with the blades hanging down into a bath of GRC’s GreenDEF mineral oil, which absorbs heat. The oil is pumped away, cooled and filtered, before returning to the bath holding the rack. GRC offers server racks of 10U (17.5 inches), 42U (73.5 inches) and 60U (105 inches).

Despite the positive result, Intel has not committed to adopting GRC technology in its data centres. According to Patterson, the company is still in the evaluation phase, seeing how housing the systems in the coolant impacts performance and other metrics.

He also said that the idea of immersing data centre systems in coolant might find faster acceptance among C-level executives, who are the ones paying the bills, than with data centre administrators, who may be more worried about housing their servers in mineral oil.

Jeffrey Burt of eweek.com contributed to this article.

Are you a cool green tech guru? Try our quiz!