GRC’s data centre cooling system saves energy costs and keeps things simple – by dunking standard blades vertically in a bath of coolant
The idea of liquid cooling for servers has gained some ground, with a US company claiming a customer win for a system that simply immerses server blades in a tank of coolant.
Green Revolution Cooling (GRC) launched at the SC09 show in November, at the same time as UK company Iceotope launched a liquid-cooled server system, but GRC says its system is radically simpler, cheaper and easier to maintain. The company showed a horizontal 42U rack filled with 37 servers and two switches which, it says, required only 40 watts of cooling power for 5,000 watts of server power.
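To put those demo figures in perspective, here is a minimal back-of-envelope sketch. The 40 W and 5,000 W numbers come from GRC's claim above; the PUE-style ratio counts only cooling as overhead and is an illustrative assumption, not a figure GRC published.

```python
# Back-of-envelope check of GRC's demo figures:
# 40 W of cooling power against 5,000 W of server load.
server_power_w = 5000    # IT load of the demo rack (from the article)
cooling_power_w = 40     # claimed cooling power draw (from the article)

# Cooling overhead as a fraction of the IT load.
overhead = cooling_power_w / server_power_w
print(f"Cooling overhead: {overhead:.1%}")      # 0.8%

# A crude PUE-style ratio, counting only cooling as non-IT power.
partial_pue = (server_power_w + cooling_power_w) / server_power_w
print(f"Cooling-only PUE: {partial_pue:.3f}")   # 1.008
```

For comparison, conventional air-cooled data centres routinely spend a large fraction of the IT load again on cooling, so a cooling-only ratio this close to 1.0 is the substance of GRC's efficiency claim.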
Immersion – the simpler option?
The GRC system takes normal server blades and, after a “60 second” modification, installs them vertically in a tank which is filled with GRC’s own GreenDEF coolant. This cuts a data centre’s cooling requirements to a tiny fraction of the usual level, and does away with the need for air conditioning, since all the heat is carried away directly by the cooling fluid.
Liquid cooling has been widely used in high-end supercomputers, and IBM has predicted that all servers could be cooled by liquid within ten years. However most such cooling systems have been complex to maintain, or else have kept the liquid well clear of the electronics, losing the full benefit of liquid cooling. For instance, some server racks have liquid cooling through the front panels.
GRC’s system suspends the electronics directly in the liquid, but despite this, the company says the system is easy to maintain. A video on the GRC site shows a technician lifting out one of the blades, allowing it to drain over the tank, and removing and replacing a RAM stick – the video itself (below) lasts only a minute.
“We gave literally a hundred maintenance demonstrations at SC09 and people were very surprised at how easy it was to service,” said Mark Tlapak, co-president of GRC.
By contrast, Iceotope turns each blade into a sealed unit containing coolant, and runs a water-cooling circuit through an otherwise normal-looking data-centre chassis to cool the blades, creating a system that is the same size and shape as a normal server chassis. Tlapak calls this approach “beautiful but costly” – an over-complex and proprietary means to achieve cooling.
“GR Cooling accepts virtually any server from any OEM, while Iceotope seems to be a custom solution where a motherboard is attached to an Iceotope design which requires a customized redesign of the OEM server,” said Tlapak.
Cooling loops are complex
“Iceotope’s system is a case within a case within a rack, making it very difficult to work on individual servers,” said Tlapak in an email to eWEEK Europe. Making a change to an Iceotope system, he said, requires opening the outer container, disconnecting the water circuit and removing the inner container, opening the sealed blade and draining the coolant – and then reversing the whole process to put the system back together.
GRC’s coolant, by contrast, drains off very quickly, leaving the server dry and ready for “near-instant access,” he said. “Iceotope puts a motherboard in a box, while our servers all sit in a non-sealed, coolant-filled 42U rack where servers are easily removed and capable of being worked on in 60 seconds.”
The water cooling loop in the Iceotope system is an extra expense and complexity, as the heat has to pass through an extra material before it is removed – and the loop also increases the possibility of leaks, said Tlapak.
Responding to these comments, Peter Hopton, the inventor behind Iceotope, dismissed GRC as “fishtank manufacturers” with only a year’s development behind it, compared to “several years” for Iceotope. GRC tells us it has been developing its system since 2007.