Cloud Computing | Feature
Breaking Through the Cloud Cover: Virtualization and the Cloud Are NOT the Same Thing
A brief overview illustrating the differences between virtualization and cloud computing.
The terms cloud computing and virtualization are often--albeit mistakenly--used interchangeably, which perpetuates the idea that they refer to the same thing. They do not. If you're confused, you're not alone. Many people don't know the differences between the two, so we've put together a concise overview to help keep them straight.
Why the confusion?
Simply put, the two share similarities, and they are occasionally used for similar purposes. Numerous articles and blog posts aim to help consumers decide which better suits their needs--virtualization or cloud computing. But they aren't mutually exclusive! You don't have to choose one over the other, and in many instances, virtualization hosted on the cloud, or a cloud that provides virtualization, might be a user's best bet. The chicken or the egg, anyone?
At the forefront of the confusion are the attributes that virtualization and cloud computing share.
Both virtualization and cloud computing provide solutions that can:
- maximize use of computing resources
- add flexibility, agility, and scalability to management of computing resources
- manage multiple computers from a single source
- reduce the number of machines needed in a data center, thereby reducing:
  - required space
  - cooling systems
  - staffing needs
  - maintenance and hardware costs
But how--and why--they do these things is where the differences lie.
In its most basic definition, virtualization allows you to do more with less. Most computers aren't used to their full capacity. In fact, far from it. Virtualization offers a way to combine the requirements of multiple machines into one, making use of unused computing resources.
Virtualization comes in many flavors: storage, desktop, memory, operating system, server. For example, server virtualization consolidates several independent, physical servers onto one machine. In a way, it's like a mirrored fun house, tricking the system into believing there are more servers than there really are.
Virtual servers use less hardware, space, and power, and require fewer cooling systems, in turn reducing hardware and administrative costs. Then, down the line, when the need for more server capacity arises, users can carve a new virtual server out of the existing hardware rather than going out and buying another machine. And that machine is used far more efficiently than when each server occupied separate hardware.
The same principle can be applied to storage, desktop, memory, and operating system virtualization--the system fools the computer into handling multiple users' needs on one piece of hardware--a useful feature for school districts or companies that need to manage multiple machines from one location or provide identical software or services to multiple users.
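The consolidation arithmetic behind server virtualization can be sketched in a few lines of Python. This is a toy model with made-up utilization figures, not any particular hypervisor's API: each workload is assumed to need only a fraction of one host's capacity, and a simple first-fit packing shows how few physical hosts are actually required.

```python
# Toy model: consolidating underused physical servers onto fewer hosts.
# The utilization figures below are illustrative assumptions, not measurements.

def hosts_needed(server_loads, host_capacity=1.0):
    """Greedy first-fit packing of per-server loads onto virtualization hosts.

    server_loads: fraction of one host's capacity each workload needs (0..1).
    Returns the number of physical hosts required after consolidation.
    """
    hosts = []  # remaining capacity of each host already in use
    for load in sorted(server_loads, reverse=True):  # place big loads first
        for i, free in enumerate(hosts):
            if load <= free:
                hosts[i] -= load  # workload fits on an existing host
                break
        else:
            hosts.append(host_capacity - load)  # bring a new host online
    return len(hosts)

# Ten physical servers, each idling at roughly 15% utilization...
loads = [0.15] * 10
# ...fit on two virtualized hosts instead of ten separate machines.
print(hosts_needed(loads))  # -> 2
```

Real capacity planners weigh memory, I/O, and peak (not average) demand, but the principle is the same: the idle headroom on separate machines becomes usable capacity once it is pooled.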
Cloud computing, too, has its variations: the type of service being provided, such as software as a service (SaaS), infrastructure as a service (IaaS), or platform as a service (PaaS). The same is true with the management and delivery systems: public, community, hybrid, or private. At its core, cloud computing can be defined as a resource for managing and delivering a service over the internet.
The quickly expanding reliance on cloud computing is in harmony with the growing impact mobile technology has on how people manage their lives. Devices like iPhones and iPads were never designed to run a full desktop operating system or to store large applications locally. By necessity, apps that could run on a remote server and be accessed over the internet became the way mobile devices could mimic the functionality of desktop computers, solidifying the value of cloud-based services. That same technology has found its way back to desktop computers (not unlike the hardwired terminals that connected users to early mainframes), and now the cloud can deliver applications over the internet without installing the software on the individual PC.
But the cloud, with its utility-based, on-demand model, has the added benefit of elastic scalability. Users can quickly and seamlessly gain more computing resources when needed and then drop down to fewer resources when they aren't. And they are only charged for what they use. Computing resources in a public cloud are shared among all of its users; with a private cloud, an enterprise has exclusive use of computing resources managed and held at a remote location, or the entire system can live on-site, within the company's own firewall.
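That utility-style billing model can be sketched with a toy calculation. The hourly demand figures and the per-instance-hour rate below are illustrative assumptions, not any provider's actual pricing:

```python
# Toy metered-billing model for an elastic workload.
# Instance counts and the hourly rate are illustrative assumptions.

RATE_PER_INSTANCE_HOUR = 0.10  # hypothetical price in dollars

def metered_cost(instances_per_hour, rate=RATE_PER_INSTANCE_HOUR):
    """Charge only for the capacity actually used each hour."""
    return sum(instances_per_hour) * rate

# A day's demand: quiet overnight, a spike during business hours.
demand = [2] * 8 + [10] * 8 + [2] * 8    # instances needed, hour by hour

elastic = metered_cost(demand)            # scale up and down with demand
fixed = metered_cost([max(demand)] * 24)  # provision for the peak all day

print(f"elastic: ${elastic:.2f}, peak-provisioned: ${fixed:.2f}")
# -> elastic: $11.20, peak-provisioned: $24.00
```

The elastic user pays for 112 instance-hours; someone who owns enough hardware to cover the peak effectively pays for 240, whether it is busy or idle. That gap is the economic case for on-demand scaling.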
It's more than likely that virtualization is already in place on the servers used by most public cloud providers--although the end user would never notice or care. A private cloud, on the other hand, almost certainly needs to be virtualized so that a company can manage its resource pool more easily. Both virtualization and cloud computing offer a more manageable system for the end user, but perhaps it's by working in tandem that they reach their fullest potential.
Forget the chicken and the egg. It's more like chicken soup with rice. You can have the soup, or you can have the rice, but they're both a little better when you have them together.