Data Centre Cooling – It’s Not Rocket Science

All it takes to get an efficient data centre is to apply well-known technologies better, says Mike West – the man behind Europe’s most efficient data centre

There’s no rocket science in cooling data centres, but very efficient ones are still a rarity. So when a record-breaking centre was set up in Surrey, we were very keen to speak to the man behind it.

The Petroleum Geo-Services (PGS) data centre in Weybridge, Surrey, appears to be the first in Europe to have an annual efficiency score (PUE) of 1.2.


PUE is the amount of energy put into the system divided by the amount that reaches the servers – and most of today’s data centres have a score greater than two, which means less than half the power input reaches the IT kit. By contrast, a PUE of 1.2 means only one fifth of a watt goes on overheads for every watt at the servers – and Google got gasps of surprise when it announced last year that some of its centres had achieved that figure.
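To make the arithmetic concrete, here is a minimal sketch in Python (the figures are illustrative, not PGS’s or Google’s actual meter readings) showing how a PUE score translates into overhead per watt delivered to the servers.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """PUE = total power drawn by the facility / power reaching the IT kit."""
    return total_facility_kw / it_load_kw

# Illustrative figures only
it_load_kw = 1000.0      # power reaching the servers
facility_kw = 1200.0     # power drawn from the grid

score = pue(facility_kw, it_load_kw)
overhead_per_it_watt = score - 1.0   # overhead watts per watt at the servers

print(f"PUE = {score:.2f}")                                   # 1.20
print(f"Overhead per IT watt = {overhead_per_it_watt:.2f} W") # 0.20
# A PUE above 2.0 would mean less than half the input power reaches the IT kit.
```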

That’s a feather in the cap for Keysource, the company that built the PGS centre, but it’s important not to oversimplify the discussion, said Mike West, Keysource’s managing director. A centre’s PUE depends a lot on the outside temperature, and should be quoted as an annual figure based on a full year of temperature fluctuations.

“The important factor is the annualised PUE in kWh,” said West. “Air conditioning is where the biggest gains can be made. Losses from UPS inefficiencies, and standby power, are all linear and predictable, but cooling is the area of biggest opportunity.” Because cooling depends on outside temperatures and other factors, it’s the one where extra work can get the biggest gains. “Nuances around the specialist mechanical and electrical plant can have a dramatic effect on the outcome of the facility from an efficiency and performance point of view,” he said.
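West’s point about annualising can be illustrated with a rough sketch (the monthly kWh figures below are hypothetical, not Keysource data): the annualised PUE is the ratio of total energy over the year, so cool months with plenty of free cooling pull the figure down in proportion to the energy they account for.

```python
# Hypothetical monthly energy figures (kWh), for illustration only.
monthly_energy = [
    # (facility_kwh, it_kwh)
    (95_000, 82_000),    # a cool month: mostly free cooling
    (110_000, 84_000),   # a warm month: chillers running more often
    # ... the remaining months would follow the same pattern
]

facility_total = sum(f for f, _ in monthly_energy)
it_total = sum(i for _, i in monthly_energy)

# Annualised PUE is the ratio of total annual energy,
# not an average of monthly PUE snapshots.
annualised_pue = facility_total / it_total
print(f"Annualised PUE = {annualised_pue:.2f}")
```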

No secret sauce

The big surprise is that, despite having a fancy name – Ecofris – Keysource’s data centre design has no “secret sauce”. Rival data centre builder Imtech ascribed the success of its Common Rail Cooling design to multi-storey architecture, but West says Keysource’s Ecofris involves no major break with earlier technologies – it just pushes them further than they have normally been pushed before.

“The biggest issue is the high density hardware,” he said. Bladecentres can pack more processing power into a smaller space, but that raises the amount of heat that needs to be dissipated. The PGS data centre has around 16kW per rack position.
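Since virtually all of the electrical power drawn by the racks ends up as heat, rack density converts directly into cooling load. A small illustration (the rack count is a placeholder, not the PGS layout):

```python
# Placeholder rack count, not the PGS layout.
kw_per_rack = 16.0
racks_per_row = 10

# Practically all electrical power drawn by the racks ends up as heat.
row_cooling_load_kw = kw_per_rack * racks_per_row
print(f"Cooling load for one row: {row_cooling_load_kw:.0f} kW")  # 160 kW
```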

The only way to get a low PUE is to cut down the amount of active cooling that needs to be done and use “free cooling”, instead of turning on mechanical chillers that burn power and push the PUE up. PGS only needs to turn on its chillers when the ambient temperature is above 24°C. Most free cooling systems so far have needed a temperature of around five degrees – so they can only be used for maybe 1,000 hours a year.
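The gain from raising the chiller cut-in threshold can be estimated from hourly ambient temperatures for the site. The sketch below is an assumption-laden illustration (the temperature list is a placeholder, not Weybridge weather data), counting how many hours a year free cooling would suffice at a 5°C versus a 24°C threshold.

```python
from typing import Iterable

def free_cooling_hours(hourly_temps_c: Iterable[float], threshold_c: float) -> int:
    """Count the hours in which the ambient temperature stays at or below
    the chiller cut-in threshold, i.e. hours where free cooling suffices."""
    return sum(1 for t in hourly_temps_c if t <= threshold_c)

# A full year would have 8,760 hourly readings for the site; these few
# values are placeholders, not Weybridge weather data.
hourly_temps = [3.0, 7.5, 12.0, 18.0, 22.0, 26.0]

print("Free-cooling hours at a 5C threshold: ", free_cooling_hours(hourly_temps, 5.0))
print("Free-cooling hours at a 24C threshold:", free_cooling_hours(hourly_temps, 24.0))
```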