“Virtualization is a technology that will change the way enterprises handle their IT environments. From storage, to systems, to applications, to desktops, it promises to make business more agile, more secure, more efficient, more available, and more productive. It is hard to underestimate how important it will be.”
— Andi Mann, Research Director, Enterprise Management Associates
Virtualization has been around in computing since the mainframe days of the late ’60s. Those of us who are old enough to remember punch cards (carrying boxes of them around was a great way of getting exercise) might remember the IBM 360 mainframe system and the CP/CMS time-sharing operating system, which simulated the effect of each user having a full, standalone IBM mainframe at their fingertips. Each user’s “virtual machine” was fully independent of those belonging to other users, so if you ran an application that crashed “your” machine, other users weren’t affected.
PCs changed this paradigm in the ’80s, and eventually gave users physical machines that today are far more powerful than the mainframes of the ’60s and ’70s. But as desktop PCs began to proliferate, so did servers in the back rooms of most businesses. Soon you’d have two domain controllers, a mail server running Microsoft Exchange, a couple of file servers, a database server, a Web server for your intranet, and so on. Larger companies might have dozens or even hundreds of servers, some running multiple roles such as AD, DNS, and DHCP.
Managing all these separate boxes can be a headache, and restoring them from backup after a disaster can involve costly downtime for your business. But even worse from a business standpoint is that many of them are underutilized. How does virtualization for x86/x64 platforms solve these issues?
In a production environment, having a server that averages only 5 percent CPU utilization doesn’t make sense. A typical example would be a DHCP server in an enterprise environment that leases addresses to several thousand clients. One solution to such underutilization is to consolidate several roles on one box. For example, instead of just using the box as a DHCP server, you could also use it as a DNS server, file server, and print server. The problem is that as more roles are installed on a box, the uncertainty in their peak usage requirements increases, making it difficult to ensure that the machine doesn’t become a bottleneck. In addition, the attack surface of the machine increases because more ports have to be open so that it can listen for client requests for all these services. Patching also becomes more complicated when an update for one of the running services needs to be applied; if the update causes a secondary issue, several essential network services could go down instead of just one.
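The trade-off above is really a capacity-planning calculation. Here is a minimal sketch in Python of the kind of arithmetic involved; the function name, the 4x peak factor, and the other figures are illustrative assumptions, not measurements from any real deployment:

```python
# Back-of-the-envelope consolidation check. All numbers are
# illustrative assumptions, not measurements from a real network.

def fits_on_host(avg_utilizations, host_multiple=8.0,
                 peak_factor=4.0, headroom=0.8):
    """Return True if the combined projected peak load of the
    candidate workloads fits on the consolidation host.

    avg_utilizations -- each workload's average CPU use as a
        fraction of its original server (0.05 means 5 percent)
    host_multiple    -- assumed CPU capacity of the new host,
        in old-server-equivalents
    peak_factor      -- assumed ratio of peak load to average load
    headroom         -- fraction of the host the consolidated
        workloads may consume at peak; the rest is safety margin
    """
    projected_peak = sum(u * peak_factor for u in avg_utilizations)
    return projected_peak <= host_multiple * headroom

# Ten roles averaging 5 percent each: projected peak is
# 10 * 0.05 * 4 = 2.0 old-server-equivalents, against usable
# capacity of 8 * 0.8 = 6.4, so they fit comfortably.
print(fits_on_host([0.05] * 10))  # True
```

The point of the `peak_factor` knob is exactly the uncertainty the text describes: the more roles you stack on one box, the less confident you can be that their peaks won’t coincide.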
Using virtualization, however, you can consolidate multiple server roles as separate virtual machines running on a single physical machine. This approach lets you reduce “server sprawl” and maximize the utilization of your current hardware, and each role can run in its own isolated virtual environment for greater security and easier management. And by consolidating multiple (possibly dozens of) virtual machines onto enterprise-class server hardware that has fault-tolerant RAID hardware and hot-swappable components, you can reduce downtime and make the most efficient use of your hardware. The process of migrating server roles from separate physical boxes onto virtual machines is known as server consolidation, and this is probably the number one driver behind the growing popularity of virtualization in enterprise environments. After all, budgets are limited nowadays!
Being able to ensure business continuity in the event of a disaster is another big driver toward virtualization. Restoring a critical server role from tape backup when one of your boxes starts emitting smoke can be a long and painful process, especially when your CEO is standing over you wringing his hands waiting for you to finish. Having hot-spare servers waiting in the closet is, of course, a great solution, but it costs money, both in terms of the extra hardware and the licensing costs.
That’s another reason why virtualization is so compelling. Because guest operating systems, which run inside virtual machines (VMs), are generally independent of the hardware on which the host operating system runs, you can easily restore a backed-up virtual server to a system that has different hardware than the original system that died. And using virtual machines, you can reduce both scheduled and unscheduled downtime by simplifying the restore process to ensure the availability of essential services for your network.
Testing new platforms is a lot easier today because of virtualization. I can easily run a half dozen virtual machines on a single low-end server, and I can even set up a routed network without having to learn IOS by enabling IP routing on a virtual Microsoft Windows XP machine with two virtual NICs (on XP, this is done by setting the IPEnableRouter value in the registry). Architects can benefit from virtualization by being able to create virtual test networks on a single server that closely mimic the complexity of large enterprise environments. Developers benefit too by being able to test their applications in isolated environments, where they can roll back their virtual machines when needed instead of having to install everything from scratch. The whole IT life cycle becomes easier to manage because virtualization reduces the time it takes to move new software from a development environment to test and then production.
Another popular use of virtualization today is to ensure application compatibility.
Suppose you upgrade the version of Windows you have running on your desktop and find that a critical application won’t run properly on the new version. You can run the program in application compatibility mode, using the Application Compatibility Toolkit to shim the application so that it works on the new platform. Or you can contact the vendor for an updated version of the application. Another alternative, however, is virtualization: install Microsoft Virtual PC 2007 on each desktop computer where the user needs to run the problem application, install the old version of Windows as a guest OS, and then run the application from there.