New Cooling Saves 45 Percent Of Energy At Data Centre


Oil explorers PGS get an efficient data centre. Is that so they burn less of the fuel they find?

A new cooling system in a UK data centre has achieved a record efficiency score and 45 percent energy savings – using only well-established technology.

The Petroleum Geo-Services (PGS) centre in Weybridge, Surrey, has a PUE efficiency score of 1.2, as defined by the Green Grid. This means that for every watt oil exploration company PGS gives its servers, only another 0.2W is needed for cooling, lighting and other overheads. It should save PGS 15.8GWh a year, and reduce the company's carbon footprint by 6,800 tonnes of CO2.
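The arithmetic behind the PUE figure can be sketched in a few lines. The IT load of 1.8MW and the PUE values come from the article; the comparison against a typical PUE of 2.2 is an illustrative calculation, not a figure PGS has published.

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# A PUE of 1.2 means 0.2W of overhead per watt delivered to the servers.

def pue(total_kw: float, it_kw: float) -> float:
    return total_kw / it_kw

it_load_kw = 1800.0                    # PGS site IT load: 1.8 MW
total_kw = it_load_kw * 1.2            # 2160 kW drawn overall at PUE 1.2
overhead_kw = total_kw - it_load_kw    # 360 kW for cooling, lighting, etc.

# For comparison, a typical centre at PUE 2.2 running the same IT load:
typical_total_kw = it_load_kw * 2.2    # 3960 kW, i.e. 1800 kW of overhead
```

At the same IT load, the overhead drops from 1800kW to 360kW, which is where the headline energy saving comes from.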

A PUE of 1.2 is a significant achievement given that most of today's data centres achieve only around 2.2, according to Mike West, managing director of Keysource, the company that built the centre. Although the centre uses a Keysource design called "Ecofris", there is actually no special ingredient, said West.

“To achieve that kind of PUE, you must rely on free cooling,” he said. This uses the ambient temperature instead of turning on chillers, and normally only works when the air temperature is very low. Ecofris is exceptional in allowing free cooling when the ambient air temperature is as warm as 24C, so chillers need to be turned on rarely, said West. “On UK climate figures for 2007, the site should need chillers for only 87 hours in the whole year.”

Cooling air enters the server room through a wall and is sucked through the server racks, which draw about 20kW per rack position and a total of 1.8MW for the IT load. In general the air comes in at 22C and is ducted out from the "hot aisle" between the server racks at 32C. "The efficiency is high, because we have complete separation between hot air and cold air," said West. "The only way cold air can get back to heat exchangers is by passing through the racks and getting heated up."
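Those temperatures fix how much air the design has to move: with a 10C rise from inlet to hot aisle, the required airflow follows from the standard heat-transfer relation. The 20kW-per-rack and 1.8MW figures are from the article; the air-property constants are standard values, not Keysource's.

```python
# Airflow needed to carry away rack heat at a given temperature rise,
# using Q = m_dot * cp * delta_T. Constants are typical air properties.

CP_AIR = 1005.0    # specific heat of air, J/(kg*K)
RHO_AIR = 1.2      # approximate air density near room temperature, kg/m^3

def airflow_m3_per_s(power_w: float, delta_t_k: float) -> float:
    """Volumetric airflow required to absorb power_w at a delta_t_k rise."""
    mass_flow_kg_s = power_w / (CP_AIR * delta_t_k)
    return mass_flow_kg_s / RHO_AIR

per_rack = airflow_m3_per_s(20_000, 10)     # one 20 kW rack, 22C -> 32C
whole_site = airflow_m3_per_s(1_800_000, 10)  # the full 1.8 MW IT load
```

That works out to roughly 1.7 cubic metres of air per second for each rack, around 150 for the whole room, which is why the complete hot/cold separation West describes matters: any cold air that bypasses the racks is fan power wasted.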

Commenting on the Common Rail Cooling System, launched by Imtech last month, West said: “Imtech’s claim of a PUE of 1.4 is creditable, but it appears they are still in the process of building their first centres.”

The figure of 1.2 is significant, as Google said it had surpassed it in some data centres last year, a claim that created much debate and excitement. "As far as we know, no one has come close to this in Europe," said West.

There's no rocket science in Ecofris, said West, just good process engineering, some of it sourced outside the IT industry from sectors such as pharmaceuticals: "We don't buy much plant from traditional data centre plant suppliers."

The system also uses computing, in the form of CFD (computational fluid dynamics) software, to model the air flow.

Read more in our interview with Mike West.
