The most efficient liquid cooling systems use advanced engineering. Peter Judge finds that basic tech can be just as effective
HP has used warm-water cooling to make its new Apollo 8000 more efficient – and it’s taken a straightforward engineering path to get there.
The high-performance computing (HPC) system was created for the US National Renewable Energy Laboratory (NREL), which says it has achieved a PUE of 1.06 and receives a steady supply of 40°C water for heating its buildings.
Heat pipes and plumbing
That’s unusual, but now that the system has been launched (at the HP Discover event in Las Vegas) and is a product on HP’s books, it’s possible to have a look inside the machine. And the biggest surprise is that it uses highly conventional technology.
Apollo, in short, is not rocket science.
The water-cooling community knows that to get maximum density, we will eventually be running special liquids directly across chips, through micro-channels. Universities are researching this, and IBM is doing it in the SuperMUC machine at Germany’s Leibniz Supercomputing Centre.
That’s high-end technology: the IBM-built SuperMUC offers 3 petaflops, uses 40 percent less power than a comparable air-cooled machine, and provides hot water for the surrounding district.
HP’s Apollo, on the other hand, achieves similar goals, but uses technology that is present inside ordinary laptops and household heating systems.
The difficulty with liquid cooling is getting the coolant to the chips. It needs to be safely contained, and also easy to disconnect if the electronics need to be changed or upgraded. Systems that circulate liquid through the actual computer tend to be cumbersome.
Keep it simple
HP’s answer is to use heat pipes in the actual computer modules. These are sealed metal tubes containing a working fluid (normally alcohol). The fluid evaporates at the hot end, condenses at the cold end, and the condensate returns to the hot end to repeat the cycle. In our picture, these are the copper tubes inside the module – with some spare aluminium ones alongside.
Heat pipes are clean and efficient because of evaporation (a phase change is the best heat removal mechanism). They are also cheap and already in mass production, because they are in widespread use inside laptops.
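The efficiency of that phase change can be seen with a back-of-envelope calculation. The sketch below compares the heat absorbed by evaporating a gram of working fluid with the heat absorbed by merely warming a gram of liquid; the physical constants are standard textbook values, not figures from HP or the article.

```python
# Back-of-envelope comparison: latent heat (phase change) vs sensible heat.
# Constants are approximate textbook values, not from the article.

L_ETHANOL = 846_000   # J/kg, latent heat of vaporization of ethanol (approx.)
CP_WATER = 4_186      # J/(kg*K), specific heat of liquid water

def latent_heat_removed(mass_kg: float, latent_j_per_kg: float) -> float:
    """Heat absorbed by evaporating mass_kg of working fluid."""
    return mass_kg * latent_j_per_kg

def sensible_heat_removed(mass_kg: float, delta_t_k: float, cp: float = CP_WATER) -> float:
    """Heat absorbed by warming the same mass of liquid by delta_t_k kelvin."""
    return mass_kg * cp * delta_t_k

mass = 0.001  # one gram of fluid
evaporation = latent_heat_removed(mass, L_ETHANOL)  # ~846 J
warming = sensible_heat_removed(mass, 10.0)         # ~41.9 J for a 10 K rise

print(f"Evaporating 1 g of ethanol absorbs ~{evaporation:.0f} J")
print(f"Warming 1 g of water by 10 K absorbs ~{warming:.1f} J")
print(f"Phase change moves roughly {evaporation / warming:.0f}x more heat per gram")
```

Roughly a twenty-fold difference per gram of fluid, which is why a slim sealed tube can move laptop- or server-class heat loads without any pump.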
HP uses the pipes to take the heat from the electronics to the side of the computing modules. Metal bars there are in contact with the water circulation system in the rack. This creates what HP calls a “dry disconnect” system: no liquid flows between module and rack. The electronics can be slid out at any time, without having to turn off or unplug the water.
The technology is pretty basic, but it works. For NREL, the hot water is part of its contract with HP. If the machine is ever idle (apparently a very unlikely event), it would be kept running for the heat it provides.
That may sound weird, but computers turn energy into heat just as effectively as any other heating system. PUE is all about putting as much of that energy as possible through the processors first, but all the energy ends up as heat one way or another.
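The PUE figure itself is just a ratio: total facility power divided by the power that reaches the IT equipment. A minimal sketch, using illustrative load figures I have assumed (not numbers from NREL), shows how small the overhead must be to hit the reported 1.06:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power.

    A PUE of 1.0 would mean every watt goes to the IT equipment;
    the excess is cooling, power distribution and other overhead.
    """
    return total_facility_kw / it_equipment_kw

# Illustrative figures (assumed for this example): a 1,000 kW IT load
# with only 60 kW of cooling and distribution overhead.
it_load_kw = 1000.0
overhead_kw = 60.0

print(f"PUE = {pue(it_load_kw + overhead_kw, it_load_kw):.2f}")  # PUE = 1.06
```

In other words, a PUE of 1.06 means only about six percent of the facility’s power is spent on anything other than computing – and with warm-water cooling, even the 100 percent that does go through the processors comes back out as usable heat.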
So Apollo 8000 is clever and straightforward, and well-engineered. Another step forward for water cooling in data centers!
A version of this article appeared on Green Data Center News.