The recession may be cutting our emissions. It’s better if we can apply efficiencies while still growing, says Peter Judge
A report this week says that data centre energy use is not growing as fast as had been feared. The bad news is that the slowdown is probably only temporary, and there is no evidence that it’s been brought about by better efficiency.
Worse news is another survey suggesting that – while new services can be built in the best possible manner – there may be limits to how much efficiency can be achieved with existing services.
Between 2000 and 2005, the amount of power used in data centres doubled, and it was expected to double again in the second half of the decade, according to a 2007 report by the Environmental Protection Agency (EPA).
But the growth in the second half of the decade turned out to be much less than in the first, according to a report commissioned by the New York Times. Worldwide, the energy used by data centres rose by only 56 percent over those five years, and in the US by only 36 percent, said the report's author, Jonathan Koomey, a professor at Stanford University.
Green IT experts can’t take the credit
What happened? The second half of the decade was the period when efficient data centres grew as an idea – could it be that the movement made that much difference, that quickly?
Well, no, says Koomey. It wasn’t the Green IT movement that restricted energy growth. It was the economic downturn.
I think that’s a bit gloomy. The period did see explosive growth in social networking (and in data centres for government and big corporations). I’ve not got all the figures Koomey has, but I reckon efficiency measures and consolidation in data centres must have had some impact on the overall figure. If nothing else, the economic downturn and the cost of energy will have made companies look at their bills and find how to reduce them.
Some efficiency has been achieved during that time by consolidation of data centres. That’s been going on in all sectors – and most people expect that to continue.
That’s where this week’s second report is a disappointment. Consolidation is going ahead in all sectors, but a study carried out for Juniper looked at US Federal government IT and found that the process may hit a limit because of complexity.
As services from different organisations and departments are consolidated, they are brought together at sites that end up running an enormous number of different software packages and services. Government operations are naturally diverse, of course, but I’m sure similar things are happening in the private sector too.
In 60 percent of US government data centres there are more than 20 operating systems, says the study – which may surprise anyone unaware that so many OSs are still extant. The report also says that 48 percent of sites have more than 20 management software applications in use. That figure is truly sad: all those management applications – each one of which will have been sold as the single panacea for all IT ills – end up managing one or two small applications, consuming resources and making the IT manager’s job more complex.
The end result? “Federal data centre consolidation will stall when the cost of managing the complexity approaches the savings captured from consolidation,” said Brian Roach, vice president, Federal sales, Juniper Networks. And you can bet the same thing applies to a lot of private sector consolidation, and to things over here in the UK.
You can see where Roach is going, of course. He’s about to make a pitch for how Juniper can help simplify things, similar to the short-lived pitch from HP last year about cutting “Information Gridlock” (this year, apparently, HP is all about the “Instant-On Enterprise”).
What I get from this is a reminder that cutting energy use through consolidation requires serious systems work. It’s still preferable to the kind of energy reduction produced by the recession – that generally involves someone going out of business.