If We Cool Our Data Centres With Liquid, What Do We Do Next?

Peter Judge

Well done Intel for getting a PUE of 1.02, says Peter Judge. Now let’s look beyond PUE as our measure of efficiency

Liquid cooling has got a stamp of approval from Intel, producing stunning thermal efficiency for data centres. This is great, but it should simply shift the Green IT movement’s attention to the next source of wasted energy.

Intel tested systems from Green Revolution Cooling (GRC), and found that dunking servers in a bath of mineral oil for a year did not do them any harm at all. In fact, the experience made them more efficient, as liquid removes heat better than air does.

Hot oil that stops chips frying

Intel quoted a PUE (power usage effectiveness) figure of 1.02 to 1.03 for the GRC system, and compared this with 1.6 for “traditional server racks”. It’s not clear what they were comparing their set-up with, as many data centres based on traditional server racks claim PUE figures much better than 1.6.

It’s also not clear what was included in the PUE for the GRC kit, as proper PUE figures refer to a whole data centre, not an isolated rack.

But those are quibbles. A PUE of 1.02 is a very good result. For every 1 watt of power delivered to the server, no more than 0.02 of a watt was used to power the GRC CarnotJet system which cooled it (PUE is simply the ratio of the total power used to the power delivered to the IT kit).
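To make the arithmetic explicit, here is a minimal sketch in Python. The kilowatt figures are invented purely to illustrate how a PUE of 1.02 comes about; they are not Intel’s measurements.

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_kw

# Invented example: 100 kW reaching the servers, 2 kW running the cooling
it_load = 100.0
cooling_overhead = 2.0
print(pue(it_load + cooling_overhead, it_load))  # 1.02
```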

You may be asking why Intel isn’t pressing ahead and moving its data centres over to oil cooling straight away, if the results are that good. Well, for one thing, the GRC system involves installing racks horizontally, which would mean completely refitting the data centre floors, and would greatly reduce the number of servers per square metre, a factor that matters as much as power consumption in almost any data centre.

Liquid cooling advocates are welcoming the result though, and seeing it as another step towards their long-term victory. With this in our armoury, it really should be possible to all but eliminate energy waste from cooling in our data centres.

If energy is sorted, let’s cut wasted cycles

But that really should not be the end of the story. Even if it took no energy to cool them down, and all the energy reached the processors, no one could say data centres were superbly efficient. What happens to all the processor cycles that energy produces? How many are wasted by systems which are idling?

In the end, a PUE result this close to the ideal of 1 is actually a sign that we may ultimately need a new measure of efficiency. The BCS Data Centre Specialist Group has put forward a suggestion as to what that could be: FVER, or Fixed to Variable Energy Ratio.

FVER is actually very similar to PUE. Where

PUE  =  total facility power / IT power

the definition of FVER uses energy, and divides the total energy used by the variable energy – i.e. the energy used when the data centre is doing actual work.

FVER  =  total energy / variable energy

The switch from power to energy is neither here nor there, as both PUE and FVER are defined as an average over the year. The real difference is in replacing total IT power with the “variable” part, the bit where actual work is done. Most of the energy included in that “IT power” element is actually not used for anything useful.
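As a rough sketch of how the two ratios differ (again in Python, with invented annual figures rather than anything from the BCS or Romonet), note that because total energy is just fixed plus variable energy, FVER can also be read as 1 + fixed/variable:

```python
def fver(total_energy: float, variable_energy: float) -> float:
    """Fixed to Variable Energy Ratio: total annual energy divided by the
    'variable' energy that actually tracks useful work."""
    return total_energy / variable_energy

# Invented annual figures: a site drawing 900 MWh regardless of load,
# and only 100 MWh that varies with the work being done.
fixed_mwh = 900.0
variable_mwh = 100.0

total_mwh = fixed_mwh + variable_mwh
print(fver(total_mwh, variable_mwh))   # 10.0
print(1 + fixed_mwh / variable_mwh)    # same thing: 10.0
```

A number that high would expose exactly the sort of constant idle draw the metric is designed to highlight.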

“Why does your data centre use ninety-something percent of its full ‘useful work’ power to do nothing?” asks Liam Newcombe of Romonet, who started the FVER campaign. “Finding another 0.01 off your PUE is not the big opportunity any more; taking a big bite out of the IT power is.”

He goes on: “The main problem we are trying to focus attention on is the constant power draw from the IT equipment which is the same in an enterprise data centre at 3am on Sunday as it is at 3pm on Monday despite the huge difference in output. Nobody would run a factory or any other industrial process this inefficiently so why do data centres get away with it?”

So even if liquid cooling solves the problem of heat in data centres, there will still be plenty more Green IT work to be done.
