With a management consulting background, I have seen so many fads sweep through business communities both here in the US and abroad that I can be as skeptical as the next person when it comes to the latest and greatest thing grabbing so much attention. I approached virtualization with a squinted eye, did a lot of homework, and came to the conclusion that, although virtualization is not right for everyone, it has substantial merit and is far beyond being the latest gimmick. This particular technological breakthrough has received global endorsements from the true movers and shakers in the IT world who certainly do not agree on everything – HP, Microsoft, IBM, Dell, EMC, not to mention major analyst firms. What are the factors that are driving virtualization to the forefront?
1. Hardware underutilization
Processing power has been roughly doubling every 18 months, and there are no signs of that changing any time soon. In fact, I predict that 25 years from now, the processing power we take for granted will seem like science fiction compared to today's, just as today's machines would have seemed in 1990, when I bought my first computer. There isn't a CPU from 1990 that could crunch through the applications running on my machine today. And yet we now routinely run our software without tapping into much of that CPU power at all.
We are utilizing only a fraction of a processor's full capacity (Microsoft estimates this fraction to be 10 to 15 percent). This means that computing power is being wasted, especially in data centers, where 85 to 90 percent of server capacity goes unused. Despite this waste, the servers continue to require power and air conditioning. That translates into a waste of corporate funds.
This problem is compounded by the fact that traditional computing has held IT to a "one application, one server" model, forcing the purchase of a new machine for each new application even while the machines already in place sit underutilized.
Virtualization technology has broken through the “one application, one server” barrier and made it possible to consolidate multiple applications onto one machine. This is a sound and cost-effective solution to the problem.
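The arithmetic behind consolidation is simple enough to sketch. Here is a minimal back-of-the-envelope estimate; the 12 percent average utilization is in line with the Microsoft estimate above, while the server count and the 20 percent headroom are purely illustrative assumptions:

```python
def consolidation_ratio(avg_utilization: float, target_utilization: float) -> int:
    """How many lightly loaded workloads fit on one virtualized host."""
    return int(target_utilization // avg_utilization)

# Assumptions (illustrative only): 40 physical servers, each averaging
# 12% CPU utilization; leave 20% headroom on each consolidated host.
physical_servers = 40
ratio = consolidation_ratio(avg_utilization=0.12, target_utilization=0.80)
hosts_needed = -(-physical_servers // ratio)   # ceiling division

print(f"{ratio} guests per host, {hosts_needed} hosts instead of {physical_servers}")
```

With those assumptions, six guests fit on each host and the fleet shrinks from 40 machines to 7. Real consolidation planning would also weigh memory, I/O, and peak (not average) load, but the CPU math alone makes the case.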
2. Server sprawl causing data centers to run out of space
A whole heck of a lot of servers have been purchased and put to use since 1990, and the pace has escalated dramatically since 2000. Why? An explosion of data, and with it a need for new ways to store it all. Bernard Golden, MBA, states, "In 2003 the world's computer users created and stored 5 exabytes (each exabyte is 1 million terabytes) of new data. A recent study by the Enterprise Strategy Group predicted that governments and corporations will store over 25 exabytes of data by the year 2010." Let me break that down even further: 1 exabyte = 1 million terabytes; 1 terabyte = 1,024 gigabytes; 1 gigabyte = 1,024 megabytes; 1 megabyte = 1,048,576 bytes; 1 byte = 8 bits = 1 keystroke.
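Those conversions are easy to verify by walking the unit ladder, where each step up is a factor of 1,024 (note that "1 exabyte = 1 million terabytes" is the rounded form; the exact binary figure is 1,048,576):

```python
# Walk the unit ladder from the breakdown above: bytes -> KB -> MB -> ... -> EB,
# multiplying by 1,024 at each step.
bytes_per_mb = 1024 ** 2      # 1,048,576 bytes, as stated above
bytes_per_tb = 1024 ** 4
bytes_per_eb = 1024 ** 6

tb_per_eb = bytes_per_eb // bytes_per_tb
print(f"{tb_per_eb:,} TB per EB")          # 1,048,576 -- roughly "1 million"
print(f"{25 * bytes_per_eb:,} bytes in 25 exabytes")
```

However you slice it, 25 exabytes is an almost incomprehensible number of keystrokes.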
The Internet has certainly contributed to the problem. In September 2009 there were 1,733,993,741 Internet users worldwide, representing 380.3% growth between 2000 and 2009. Today more than 25% of the population of our planet uses the Internet, and that figure will only continue to expand.
Virtualization technology is a sound and cost-effective solution to this problem. It makes it possible to host multiple guest systems on a single physical server, which saves a company money because it no longer needs as much real estate, or as many machines, to house its information. That is a HUGE benefit.
3. Significantly rising energy costs
The cost of electrical power to run a business has increased dramatically over the years, and a company's IT infrastructure contributes greatly to that number. Add up the electricity required to power a computer at each workstation, multiple servers, appliances, and peripherals, plus the air conditioning needed to keep those servers cool. Then multiply by the number of servers the "one application, one server" rule demands. It is relatively elementary math to see the impact of that infrastructure on a company's energy bill.
Virtualization technology offers a sound and cost-effective solution to this problem. Heck, Pacific Gas & Electric has even introduced a virtualization rebate program.
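To put rough numbers on that elementary math, here is a sketch with entirely hypothetical figures: 500 W draw per server, a 1.5x overhead factor for cooling, $0.12 per kWh, and illustrative before/after server counts. None of these are measurements from a real data center:

```python
def annual_energy_cost(servers, watts_each=500, cooling_factor=1.5, usd_per_kwh=0.12):
    """Yearly electricity cost for a server fleet, including cooling overhead."""
    kwh_per_year = servers * watts_each * cooling_factor * 24 * 365 / 1000
    return kwh_per_year * usd_per_kwh

before = annual_energy_cost(servers=40)   # one app per physical server
after = annual_energy_cost(servers=7)     # hypothetical post-consolidation count
print(f"${before:,.0f} -> ${after:,.0f} per year")
```

Under these assumptions the bill drops from roughly $31,500 a year to about $5,500, and the savings scale with the fleet: the fewer physical boxes you run, the less you pay twice, once to power them and once to cool them.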
4. The rising costs in administering IT systems
Even one computer requires someone to look after it, even if that someone is just the user. A computer doesn't take care of itself. It has to be started, sometimes restarted, made secure, defragmented, optimized, cleaned, and so on. In a network of computers, someone is going to be forced to wear an additional hat called IT Administrator, handling whatever limited technical expertise allows and calling in an expert for everything else. A tiny company usually operates at a break-and-fix level with fingers crossed, hoping not to spend anything just to keep all those machines running. But if that tiny company wants to survive for a period of years, it must expand, and expansion requires expanding that patchwork quilt called an IT infrastructure. At some point a single-hatted IT Administrator must be hired, then another and another.
System administration is a labor-intensive job. I once consulted for the US office of a software company. The office had 90 employees in total, 4 of whom were dedicated to system administration. Their work consisted of monitoring hardware status, replacing defective hardware components, installing OS and application software, installing software patches, updating software applications, monitoring critical server resources such as memory and disk use, and backing up server data to other storage media for security and redundancy purposes.
System administrators do not come for a song either. The four IT people working in that US office cost the company around $250,000 per year. So at a certain point of expansion, the cost for all this administration becomes significant.
And while virtualization technology will never eliminate the need for system administration, it can lower its cost by reducing the hardware-related share of the work: fewer physical machines means less hardware to monitor, patch, and repair, even though the guest OSes in a virtualized environment still need the same care. That translates into fewer IT personnel required, and better system administration overall.
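One way to see where the savings come from is to split admin work into a hardware-related share, which shrinks with the physical machine count, and a per-OS share, which does not. A sketch with hypothetical hour figures and server counts (nothing here comes from the 90-person office above):

```python
def admin_hours(physical, guests, hw_hours_per_box=20, os_hours_per_instance=20):
    """Yearly admin workload: hardware care (racking, firmware, failed parts)
    per physical box, plus OS-level care (patching, backups) per instance."""
    return physical * hw_hours_per_box + guests * os_hours_per_instance

before = admin_hours(physical=40, guests=40)   # one OS per machine
after = admin_hours(physical=7, guests=40)     # same 40 OSes on 7 hosts
print(before, after)   # 1600 940
```

The per-OS work is unchanged, so the savings are real but bounded, which is exactly why virtualization reduces the administration bill rather than eliminating it.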