Data Centre Efficiency – There Is No Magic Measure

At the moment, the holy grail of data centre regulators is a single productivity metric which measures how much ‘useful’ IT work a data centre does for each kilowatt-hour (kWh) of energy it consumes. The idea is that data centres – like household fridges or freezers – can then be given a simple mark on a sliding scale which shows how energy efficient they are – or are not.

Such an initiative will have significant implications for the market because big customers like the US Federal Government will require their suppliers to achieve a minimum score in whichever metric is selected.

Metrics could damage the industry

However, for me it’s neither possible nor desirable to have an all-encompassing metric that indicates how much ‘useful IT work’ a data centre achieves. Whilst I understand that regulators want to regulate, and I’m all for cutting costs and improving energy efficiency in the data centre, this isn’t how to do it. Not only is it a flawed idea, it could have a dramatic, damaging and long-lasting impact on the data centre market as a whole.

Firstly, why is it not possible? The reason it will never be possible to develop a useful single metric is that every business’s definition of ‘useful’ is different and subjective. In fact, people with different responsibilities inside a single business often have different opinions on which data centre services are ‘useful’. An enterprise data centre provides a broad range of software services to the business, each of which will have a very different value to the business units that receive them. Some of these services will have a very direct and measurable value; others less so.

How do you measure value?

For example, let’s consider the way a credit card business might use its data centre services. A company like this will require a range of services – such as call handling, payment services, account queries and back-up services – from its enterprise data centre. However, it’s very unlikely that the business’s CFO will view all these services as equally important or valuable. Each service will be viewed differently depending on how critical it is to business activities.

The business will be far more likely to place a higher business value on its ability to process payments than it will on, say, the ability to provide a customer with an account balance. It’s common sense that a CFO would expect a different weighting of reliability, productivity and availability across different operations and this means they are likely to score differently in ‘useful IT work / kWh’.
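To illustrate that weighting problem with a short sketch (all figures and weightings here are invented for illustration, not taken from any real business), the same services consuming the same energy can produce very different ‘useful IT work / kWh’ scores depending on who assigns the value:

```python
# Hypothetical figures: two people in the same business weight the same
# data centre services differently, so the "useful work / kWh" score
# diverges even though the energy consumed is identical.

services = {
    # service name: (transactions handled, kWh consumed)
    "payments":        (1_000_000, 500),
    "account_balance": (1_000_000, 500),
}

def useful_work_per_kwh(weights):
    """Weighted transactions per kWh under a given value weighting."""
    work = sum(weights[name] * tx for name, (tx, _) in services.items())
    energy = sum(kwh for _, kwh in services.values())
    return work / energy

# A CFO who values payment processing ten times more than balance queries:
cfo_view = useful_work_per_kwh({"payments": 1.0, "account_balance": 0.1})
# An operations view that treats every transaction as equal:
ops_view = useful_work_per_kwh({"payments": 1.0, "account_balance": 1.0})

# Same data centre, same kWh – two different "efficiency" scores.
```

The point of the sketch is that neither score is wrong; each is correct for the person whose definition of ‘useful’ it encodes, which is exactly why a single regulated number cannot capture both.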

However much regulators might want it to be, IT is not an undifferentiated utility service like water or electricity. Each kWh from an outlet or litre of water from a tap is interchangeable and has the same value. The same is not true of IT services.

Email is not substitutable for high-frequency trading, and ‘IT units’ cannot be used interchangeably between different businesses and different corporate environments, despite some of the marketing claims for ‘cloud’ services. However, this is exactly how data centres are being treated by regulators looking for a single metric that can be applied in the same way to all IT users. The bottom line is that we can only have a sensible discussion about IT ‘productivity’, and the resulting efficiency, in the context of the business value these services deliver.

Why bother?

So, a single metric might not be possible – but is it even desirable? I believe not. If we allow the regulators to pursue their goal, we will most likely see data centre operators aiming to meet the minimum standard of the metric rather than lowering their overall emissions.

To explain this point further, we must look to other sectors which have had similarly flawed regulation imposed on them – such as the automotive industry. In this sector, regulators have created a minimum average efficiency for each manufacturer across all cars sold. Of course, the aim here is to encourage manufacturers to make cars more efficient – but often, this isn’t what happens at all. What happens instead is that some manufacturers have continued to sell their ‘premium’ models as before but have also introduced one very low-emission vehicle into their range, which brings their average energy efficiency ‘score’ in line with regulation.
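The fleet-average arithmetic behind that behaviour is easy to sketch. The CO2 figures below are invented, and a simple unweighted per-model average stands in for the sales-weighted averages real regulations use; the gaming mechanism is the same either way:

```python
# Hypothetical CO2 figures (g/km) for a manufacturer's range.
# Adding one very low-emission model drags the fleet average under the
# target without changing the premium models at all.

premium_models = [250, 230, 220]   # unchanged "premium" engines

def fleet_average(models):
    """Simplified unweighted fleet average (real schemes weight by sales)."""
    return sum(models) / len(models)

target = 180.0
before = fleet_average(premium_models)          # roughly 233 g/km: over target
after  = fleet_average(premium_models + [10])   # plus one tiny low-emission model

assert before > target   # non-compliant range...
assert after <= target   # ...made compliant by a single halo model
```

Nothing about the three high-emission models improved, yet the regulated number now says the manufacturer is efficient – which is the behaviour the article argues a data centre metric would reproduce.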

If we consider the data centre market, it is easy to see how similarly misconceived metrics will drive similar behaviour. If I operate a financial services data centre full of high-frequency trading platforms and I need to meet the target ‘efficiency’, it’s unlikely that I’ll change how I deliver my HFT service, whether it is inefficient or not. Instead I will probably buy some other operator who delivers commodity, low-impact services with very low overheads. Alternatively, I might choose to run dummy “test trades” on my platform all night and all weekend to keep my utilisation up and meet the target “efficiency” that way.
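The dummy-load trick can be sketched the same way (power figures and hours are again invented): padding idle hours with make-work improves a utilisation-based score while total energy consumed goes up, not down:

```python
# Hypothetical server: a utilisation-based "efficiency" metric rewards
# busy machines, so filling idle hours with dummy "test trades" improves
# the reported score while increasing the energy actually consumed.

IDLE_KW, BUSY_KW = 0.2, 0.4   # assumed power draw at idle / under load

def weekly_report(busy_hours, total_hours=168):
    """Return (utilisation, kWh consumed) for one week."""
    idle_hours = total_hours - busy_hours
    energy = busy_hours * BUSY_KW + idle_hours * IDLE_KW
    return busy_hours / total_hours, energy

honest = weekly_report(busy_hours=50)    # real trading load only
gamed  = weekly_report(busy_hours=168)   # dummy trades all night and weekend

assert gamed[0] > honest[0]   # utilisation score improves...
assert gamed[1] > honest[1]   # ...but weekly kWh consumption rises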

A single magic metric for data centres will almost inevitably both be unfair and fail to deliver the goal of increased energy efficiency, however we choose to measure it. The quest for a single catch-all metric of IT efficiency is much like the quest for the perfect meal: entirely subjective to the person making the assessment, and neither comparable nor transferable to anyone else.

Liam Newcombe is CTO at data centre predictive modelling software company Romonet, and is also technical secretary for the BCS Data Centre Specialist Group


TechWeekEurope Staff

Comments

  • Liam,

    Excellent piece you've written here. I especially like your analogy and explanation of the automotive fuel efficiency standard.

    While I could argue that basic measurements of anything have yet to become commonplace (at least in the US), it is encouraging to see the notion of "value" come more and more to the fore in these discussions.

    We see a number of data center metrics beginning to include some notion of purpose to the Business (eg, CADE, FVER, and so on). However, the method of measuring this value is very much a work in progress. I'm going to be speaking soon at DCD about the notion of "useful work," in this very context.
