Need of Virtualization

Virtualization: Force driving cloud computing

We have lately been hearing a lot about virtualization whenever cloud computing comes up. Although many of us have used virtual machines in the past, for quite different purposes, most of us are not really sure what virtualization is or why it is so beneficial!

As usual, I am posting a link to a relevant YouTube video here to get you readers started:

The contents of the video are fairly complete, but I might as well expand on the same. Most businesses use a combination of application servers, web servers, image servers, document servers, audio and video servers, and, not to forget, database servers.

Although contemporary web usage trends may suggest that all of the above-mentioned hardware infrastructure is busy almost all the time, this is largely a myth, and more precisely, an ill-founded belief! Even if 75% of the hardware appears to be in use at any point of time, based on the average number of server requests recorded, the servers are still largely under-utilized. It's a bit of a challenge to present this convincingly, but I shall nevertheless give it a try!

What appears active to us is largely superficial. A server typically takes only about 1-10 ms to service each request, and if that estimate is flawed, it errs on the slow side! Given this extremely short service time, the time a server machine is kept up and running is far greater than the time it actually spends servicing requests. This clearly demonstrates that a significant amount of energy is wasted per server just keeping it up and ever-ready to service requests upon their arrival. I must reiterate that the cumulative energy wasted is actually pretty high, considering that we use not one server for each purpose, but a number of them for different purposes.
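The arithmetic behind this claim is simple enough to sketch. The request rate and the 5 ms service time below are illustrative assumptions on my part, not measurements from any real server:

```python
# Back-of-the-envelope server utilization estimate.
# Both figures are assumed, purely for illustration.
SERVICE_TIME_MS = 5      # assumed time to service one request (in the 1-10 ms range)
REQUESTS_PER_SEC = 20    # assumed average load on one server

busy_ms_per_sec = SERVICE_TIME_MS * REQUESTS_PER_SEC  # ms of real work per second
utilization = busy_ms_per_sec / 1000.0                # fraction of each second busy

print(f"Busy {busy_ms_per_sec} ms out of every 1000 ms "
      f"-> {utilization:.0%} utilization")
```

Even at twenty requests per second, the machine is doing useful work for only 100 ms of every second; the remaining 90% of the time it sits powered on, idle.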

What we must remember here is that any effort to maximize server utilization is limited by the number of incoming requests. So even if you have done your best to ensure that a server spends a good fraction of its time servicing requests, utilization can rise only as high as the request rate allows. How, then, do we eliminate this wastage and thereby maximize profits? The answer lies with virtualization.

Virtualization essentially means creating multiple logical instances of software or hardware on a single physical hardware resource. The technique simulates the available hardware and gives every application running on top of it the impression that it is the sole owner of the resource; the details of the virtual, simulated environment are kept hidden from the application. As the video illustrates, organizations may use this technique to do away with many of their physical servers and map their functions onto one robust physical server.

The advantages are reduced maintenance cost and reduced energy wastage, which is not very surprising. With fewer physical servers, you have fewer machines to maintain, so maintenance becomes much easier and cheaper too. As for energy conservation, it is fairly implicit: the amount of energy wasted is a function of the number of physical servers, which is clearly much lower in a virtualized environment. And as far as desktop virtualization is concerned, as the video points out, updates may now reach users much sooner, since a single software update propagates not to one client machine but to several instances at once.

Now, I am not extending the scope of this post to include the technical minutiae. This post is only targeted at enlightening beginners as to why exactly we need virtualization. The working details will be covered in a subsequent post, which is due shortly :-P
