Database Virtualisation: The Next Big Thing?
IT managers are well aware of the cost savings that can be achieved by virtualising servers, but now experts say they should focus on virtualising in-house databases.
With virtualisation, Yuhanna said, the real key is to make the best use of existing resources. “It’s about reusing resources and transparent scale-out – common policies, improving SLAs, and centralised management and mixed workload optimisation,” he said.
The ability to mix workloads on the same database is a significant benefit of database virtualisation.
“Traditionally, you had a warehouse and a transactional system,” Yuhanna explained. “You need access to the same data by several different applications.”
Yuhanna added that virtualisation allows you to decouple the application from the data.
“It’s being able to unify data and make it sharable,” Yuhanna said, describing the benefits of database virtualisation. “More than 30 percent of data in organisations is duplicated because they have to use those databases by multiple applications.” This makes keeping data consistent a problem, and it means companies are maintaining many databases that hold duplicate copies of what is supposed to be the same data. Unifying that data into a single, consistent version for users has become a critical requirement for organisations.
Yuhanna noted that database virtualisation is closely tied to data virtualisation and federated data. “We are seeing more trends to heterogeneous data,” he said. “You federate the data into a common meta layer. Forrester calls this the information fabric. Below the data layer is the database virtualisation layer.”
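To make the federation idea concrete, here is a minimal sketch of a common meta layer sitting over heterogeneous sources. The class and method names are invented for illustration; they do not come from Forrester or any product mentioned here.

```python
# Hypothetical sketch: a federation ("meta") layer presents one query
# interface over heterogeneous backends, so applications never talk
# to the physical databases directly.

class Backend:
    """One physical data source, e.g. a warehouse or an OLTP database."""
    def __init__(self, name, rows):
        self.name = name
        self.rows = rows  # list of dicts, standing in for real tables

    def query(self, predicate):
        return [r for r in self.rows if predicate(r)]

class FederationLayer:
    """The common meta layer: applications query it, not the backends."""
    def __init__(self, backends):
        self.backends = backends

    def query(self, predicate):
        # Fan the query out to every source and merge the results.
        results = []
        for b in self.backends:
            results.extend(b.query(predicate))
        return results

warehouse = Backend("warehouse", [{"id": 1, "region": "EU"}])
oltp = Backend("oltp", [{"id": 2, "region": "US"}])
fabric = FederationLayer([warehouse, oltp])

# One call returns rows from both sources through a single interface.
print(fabric.query(lambda r: r["id"] > 0))
```

In a real information fabric the predicate would be a SQL query rewritten per source, but the shape is the same: one logical layer, many physical stores beneath it.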
Of course, the idea of database virtualisation is nice, but the real question is how it works in the real world. Greg Asta, director of software development for Omnigon Communications, has the street-level view. Asta is developing a virtualised database application for an agency of the US government. While he’s not allowed to say which agency or to describe the application (it’s classified), he was willing to talk about why he’s using database virtualisation.
Asta said that Omnigon is using Xkoto’s Gridscale database virtualisation system to provide the capabilities he needs for this project. “We use it for a combination of high availability as well as active-active replication and multimaster replication,” he said. “There’s a strong desire for clients who are investing in multiple server locations to not have an active-passive setup.”
In the case of his client’s database, “I can run it off of multiple servers at the same time,” Asta said. “We use virtualisation to maintain data repositories. Virtualisation gives us a highly available infrastructure. We can do rollouts without taking the system down and have outages without going down. We can refer to a virtual database instead of any n-number of databases, which helps development and makes for a more simple architecture.”
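Asta's "one virtual database instead of n databases" point can be sketched as follows. This is not Gridscale's actual mechanism, just an illustrative model of active-active replication behind a single handle; all names are invented.

```python
# Hypothetical sketch: the application holds one virtual handle while
# a router applies writes to every healthy replica and serves reads
# from any of them, so one node can go down (e.g. during a rollout)
# without interrupting the application.

import random

class Replica:
    def __init__(self, name):
        self.name = name
        self.up = True
        self.data = {}

class VirtualDatabase:
    def __init__(self, replicas):
        self.replicas = replicas

    def write(self, key, value):
        # Active-active: the write is applied on every healthy replica.
        for r in self.replicas:
            if r.up:
                r.data[key] = value

    def read(self, key):
        # Any healthy replica can answer the read.
        healthy = [r for r in self.replicas if r.up]
        return random.choice(healthy).data[key]

db = VirtualDatabase([Replica("dc1"), Replica("dc2"), Replica("dc3")])
db.write("session", "alice")
db.replicas[0].up = False   # rolling upgrade takes dc1 offline
print(db.read("session"))   # still answers from dc2 or dc3
```

The application code never changes when replicas are added, removed, or taken down, which is the architectural simplification Asta describes.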
The key requirement for Asta’s client is availability. “If the main data centre goes off the grid, because the database is virtualised, they can continue their session. In the ideal scenario, they wouldn’t notice,” he said.
Right now, only a few companies, including Xkoto and Xeround, provide database virtualisation solutions. The 451 Group’s Aslett said Continuent also makes products that provide database virtualisation capabilities, although the company doesn’t make that claim explicitly. Enterprise Strategy Group’s Babineau said he expects all of the major database players to enter the market in the near future.
Contributing Analyst Wayne Rash can be reached at wrash@eweek.com