Windows Server 2012 Is Cloud-Augmented, Not Cloud-Hosted

Eric is a veteran British tech journalist, currently editing ChannelBiz for NetMediaEurope, with expertise in security, the channel and Britain’s startup culture through his TechBritannia initiative

Windows Server 2012 does not change the traditional model for Windows servers – it just extends it with cloud delivery, says Microsoft’s Wayne Mayer

The release of Windows Server 2012 has brought many changes within the operating system. Many of these are under-the-hood incremental improvements in operations and efficiency, but the main additions – support for virtualisation and cloud services – are worth a mention.

Windows Server 2012 has features supporting cloud infrastructures, making it easier to move to a pure or hybridised cloud architecture. Talking to TechWeekEurope, Wayne Mayer, Windows Server product manager at Microsoft UK, stressed that this is not a switch to delivering servers in the cloud, but the standard server model developed over two decades – augmented with cloud deployment capabilities.

Cloud support

“It is a platform that allows services across a cloud,” Mayer said. “When you think about a consistent platform across what we have on premise today – those traditional IT departments that have their on-premise infrastructure. Today, these departments may want to burst to the cloud, say a Windows Azure environment, when they need to scale out, or they may want to scale out to a local cloud service provider. It’s not ‘cloud-based’ as in cloud hosted, I want to make that clear, it is one single platform that takes us across multiple infrastructures to give customers a choice of how they want to go ahead and consume different pieces of technology.”

The ability to support and tie together established in-house server applications and new cloud services is essential to keep Windows Server central to many companies’ IT provision. Microsoft accepts that the future will see hybrid combinations, said Mayer, with cloud services playing an important role in relieving the load on IT departments and creating a more efficient working environment.

“In the near future it’s all about hybrid IT,” he said, “and it is really about offering that choice for IT departments as to how they want to consume those services. Commodity services can be put up into the cloud – things they maybe don’t want to run and manage themselves. Messaging or emails handled through Office 365 is a perfect example of delivering that. While email is really important to business, some businesses typically struggle in providing large amounts of storage for their end users or maybe they struggle to provide the mobility their users require, whereas Office 365 enables that automatically.”

The key element tying these remotely hosted services to existing facilities is Active Directory, which provides single sign-on. Using their usual credentials, an employee is immediately connected to all the services they require in a blended environment that hides the complexity of binding numerous services together.

Windows Server 2012 also tackles the problem of combining disparate storage “pools” into a single virtual entity. To support this, there is a new virtual hard disk format, VHDX, the X signalling the extensions made to the existing Virtual Hard Disk (VHD) format used by Microsoft’s Hyper-V server virtualisation.

“VHDX is designed to better handle current and future workloads and address the technical demands of an enterprise,” said Mayer. “The migration is fairly straightforward because you can continue to use the old format, but you can go ahead and use the new VHDX, which has a maximum file size of 64TB, whereas with the old you were looking at 2TB. We want to encourage users to move forward and there are efficiencies to gain out of that.”
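The practical effect of that size jump can be illustrated with a short sketch. This uses only the two limits quoted above (2TB for legacy VHD, 64TB for VHDX); it is an illustration, not a Hyper-V API, and the helper function is hypothetical:

```python
# Illustrative check of which virtual disk formats can hold a disk of a
# given size, using the limits quoted in the article: 2 TB for legacy
# VHD, 64 TB for VHDX. No real Hyper-V tooling is involved.

VHD_MAX_TB = 2
VHDX_MAX_TB = 64

def formats_that_fit(disk_tb: float) -> list[str]:
    """Return the virtual disk formats able to hold a disk of this size."""
    fits = []
    if disk_tb <= VHD_MAX_TB:
        fits.append("VHD")
    if disk_tb <= VHDX_MAX_TB:
        fits.append("VHDX")
    return fits

print(formats_that_fit(1.5))  # ['VHD', 'VHDX'] -- fits either format
print(formats_that_fit(10))   # ['VHDX'] -- beyond the old 2 TB ceiling
```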

With such a large storage resource, the associated tools for managing the environment become a crucial issue. Chris Losch, enterprise infrastructure architect at the London Borough of Newham, a Windows Server 2012 beta site, explained, “There are a number of tools, one of which enables you to see what kind of data de-duplication numbers you can expect. We ran it on our EDRMS [electronic document and records management system] data source, which is a 1.93TB store, and we saw a 27 percent saving which equated to 524GB. We have seen higher savings in things like source repositories of around 44 percent and in one particular area we have seen up to 65 percent, which was a generic document store that wasn’t part of the DMS system. Between these two areas there was a 1TB saving – which is not insignificant.”
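The arithmetic behind those figures is straightforward to sanity-check. The sketch below uses only the numbers quoted above (the helper is hypothetical, and the small gap against the quoted 524GB suggests the percentage was rounded):

```python
# Sanity-check the deduplication saving quoted above: a 1.93 TB store
# at a 27 percent saving. Figures come from the article; no real
# deduplication tool is invoked here.

def dedup_saving_gb(store_tb: float, saving_ratio: float) -> float:
    """Space reclaimed in GB, treating 1 TB as 1024 GB."""
    return store_tb * 1024 * saving_ratio

edrms = dedup_saving_gb(1.93, 0.27)
print(f"EDRMS saving: {edrms:.0f} GB")  # EDRMS saving: 534 GB
```

That comes out at roughly 534GB, in the same ballpark as the 524GB quoted; the exact figure depends on how the 27 percent was rounded.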

Virtual licensing

The most interesting changes in Windows Server 2012 surround the Hyper-V virtualisation engine. The hypervisor has been improved to allow up to 8,000 virtual machines (VMs) per cluster. Mayer admitted, “I think the number of customers that would have 8,000 VMs per cluster is relatively small but what it does show is that in Hyper-V version 3 there has been a massive amount of focus and scope around building out enterprise-scale performance to be able to support those really large workloads that we have within the business environment today.”

The two editions of Windows Server 2012, Standard and Datacenter, differ only in how virtualisation licensing works. Standard has moved to a per-processor scheme: the basic unit is a two-processor server, which requires a single licence allowing two VMs to run on that machine. The next step up is two licences to run four VMs – the minimum required for a four-processor server.
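Expressed as arithmetic, the licence count scales with processor pairs and the VM allowance with licences. A minimal sketch of the scheme as described above (the helper functions are hypothetical, not a Microsoft licensing tool):

```python
import math

# Sketch of the Standard Edition per-processor licensing arithmetic
# described in the article: one licence covers up to two physical
# processors and permits two VMs. Illustration only.

def standard_licences_needed(processors: int) -> int:
    """Each Standard licence covers up to two physical processors."""
    return math.ceil(processors / 2)

def vms_permitted(licences: int) -> int:
    """Each Standard licence permits two virtual machines."""
    return licences * 2

print(standard_licences_needed(2), vms_permitted(1))  # 1 licence, 2 VMs
print(standard_licences_needed(4), vms_permitted(2))  # 2 licences, 4 VMs
```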

Perhaps the most significant development is the unlimited licence in the Datacenter Edition. This removes the constraints that have concerned virtual machine managers who want the freedom to spin up virtualised application servers but have had to monitor VM licensing limits.

This could be a deal-maker in local government sites where neighbouring authorities are experimenting with shared resources. By pooling the hardware requirements, large savings can be delivered which appeal to councils looking for reductions in their budgets.

Newham is partnering Havering Council in one such rationalisation but Losch wants to allow Havering a degree of autonomy in how their share of the datacentre is managed. Newham has developed templates for VM configurations which Havering can access and deploy.

“We manage the templates so when they [Havering] want to provision a new version of Windows Server with, for example, an enterprise level Datacentre version, we build the template for them and make it available. The application support staff can just request a [virtual] server and start putting their applications on top of it – so we are effectively running Infrastructure as a Service as a cloud offering.”

Naturally, Newham has to manage the virtual estate through System Center 2012, but licensing compliance is not a concern thanks to its use of Datacenter Edition.

The change could also present a challenge to market leader VMware. With Hyper-V instances effectively unlimited and free, the virtualisation specialist could come under gradually growing pressure as Hyper-V displaces its products. The position could mirror the way Microsoft Windows demolished Novell’s NetWare supremacy in local area networking over a five-year period during the 1990s.


Microsoft is attempting to further this potential incursion. “We recognise that customers have an investment in terms of learning,” said Mayer. “They may have several people who are experts in VMware. We have tools to help them migrate, and we’ve got free online training for Hyper-V and VMware professionals to help transfer those skills.”