March Madness: 3 Common Ways Data Centers Are Being Mismanaged
March Madness is the widely used shorthand for the NCAA Division I Men's Basketball Tournament, an annual event that begins every March and ends in early April. It is one of the most-watched sporting events each year; the 2015 edition drew almost 20 million viewers during the 10:45 p.m. to 11:00 p.m. block of a game between the University of Notre Dame and the University of Kentucky.
The "madness" in the name is not primarily about the scope or even the intensity of the event, but rather the unpredictability of its outcomes. Upsets, relative to the betting line or the seedings (i.e., rankings) of the teams, are so common that many websites sponsor contests with extravagant prizes for perfect brackets, safe in the knowledge that correctly picking all 63 games is all but mathematically impossible.
The volatility of postseason basketball provides a good opportunity to talk about another kind of madness: inefficient stewardship and mismanagement of data centers. If missed free throws and turnovers are the key ingredients of March Madness upsets, then failure to take advantage of data center virtualization is behind many of the biggest problems in these facilities. Let's look at three of them, all of which can be solved via virtualization:
1. Too much overhead for hardware and software
"Virtualization trims costs by separating the application, service, or operating system from the underlying hardware."
Events such as unplanned downtime - which can cost an organization more than $100,000 per hour, according to the Rand Group - are especially problematic for data center operators because the normal expenses of managing a facility are already so high. These costs come from:
- Software licensing
- Manual backup systems
- Hardware upgrades
- Paying for unused capacity
Virtualization trims these costs by separating the application, service, or operating system from the underlying hardware, emulating that hardware so that multiple virtual machines can share a single physical host. Up-to-date hardware and software become less pressing concerns, since existing resources can be more fully utilized.
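The savings from fuller utilization can be sketched with some back-of-the-envelope arithmetic. Every figure below (fleet size, per-server cost, consolidation ratio) is an assumption chosen for illustration, not data from this article:

```python
import math

# Illustrative consolidation estimate. All figures are assumptions
# for the sketch, not real pricing or capacity data.
PHYSICAL_SERVERS = 40           # assumed size of the current dedicated fleet
ANNUAL_COST_PER_SERVER = 4_000  # assumed yearly power/licensing/upkeep, USD
CONSOLIDATION_RATIO = 4         # assumed: one virtualized host absorbs ~4 servers

# Underutilized dedicated servers collapse onto far fewer virtualized hosts.
hosts_needed = math.ceil(PHYSICAL_SERVERS / CONSOLIDATION_RATIO)
annual_savings = (PHYSICAL_SERVERS - hosts_needed) * ANNUAL_COST_PER_SERVER

print(hosts_needed, annual_savings)  # 10 hosts, $120,000 saved per year
```

Even with a modest consolidation ratio, most of the fleet's hardware, licensing, and upkeep costs fall away; the same logic works with whatever numbers fit your own environment.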
2. Difficulty in scaling or sharing IT services
"Virtualized data centers have many advantages in terms of cost and scalability."
Traditional data centers are not built in a modular style, which makes it difficult to expand their footprints as business requirements evolve. This shortcoming also makes it hard for organizations to share applications such as virtual desktops and unified communications solutions across sites.
Enter data center virtualization. Apps and services can be managed from a central platform, while security protections can be applied consistently across virtual machines. The result is highly scalable IT operations that lend themselves to easy collaboration across offices. Employees also get the flexibility to access virtualized programs from anywhere with an IP network connection, without needing specialized hardware to run them.
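The "apply security protections consistently from a central platform" idea can be sketched as below. The VM registry, office names, and `firewall_rules` field are hypothetical stand-ins, not any particular vendor's API:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    name: str
    office: str
    firewall_rules: list = field(default_factory=list)  # hypothetical policy store

# Hypothetical central registry of VMs spread across offices.
fleet = [
    VirtualMachine("desktop-01", "Chicago"),
    VirtualMachine("ucaas-01", "Denver"),
    VirtualMachine("desktop-02", "Denver"),
]

def apply_policy(vms, rule):
    """Push one security rule to every VM, regardless of which office hosts it."""
    for vm in vms:
        if rule not in vm.firewall_rules:
            vm.firewall_rules.append(rule)

# One change made centrally reaches every machine in every office.
apply_policy(fleet, "deny inbound 0.0.0.0/0 except vpn")
```

The point of the sketch is the shape of the operation: because every VM is reachable from one management plane, a policy change is a single loop rather than a per-site, per-box chore.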
3. Lack of data portability
What if a server overheats, fails, and takes its stored data with it? Or what if you need to make a backup but have trouble getting a server to boot? Virtual machines (VMs) mitigate both of these problems.
With data center virtualization, VM images can be easily backed up and moved between servers and facilities. VM snapshots also ensure that you have access to the most up-to-date data, for easier version control.
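A minimal sketch of snapshotting and migration, assuming toy `Host` and VM-state structures (real hypervisors expose far richer APIs; nothing here is a real product's interface):

```python
import copy
from datetime import datetime, timezone

class Host:
    """A physical server running virtual machines (name -> VM state)."""
    def __init__(self, name):
        self.name = name
        self.vms = {}

def snapshot(vm_state):
    """Capture a point-in-time copy of a VM's state, suitable for backup."""
    return {"taken": datetime.now(timezone.utc).isoformat(),
            "state": copy.deepcopy(vm_state)}

def migrate(vm_name, source, destination):
    """Move a VM from one host to another; its data travels with it."""
    destination.vms[vm_name] = source.vms.pop(vm_name)

primary = Host("rack-a")
standby = Host("rack-b")
primary.vms["db-vm"] = {"disks": ["orders.db"], "memory_mb": 4096}

backup = snapshot(primary.vms["db-vm"])  # off-host copy survives a failure
migrate("db-vm", primary, standby)       # evacuate ahead of maintenance
```

Because the VM is just portable state rather than a particular box, the overheating-server and unbootable-server scenarios above both reduce to restoring a snapshot or migrating the workload elsewhere.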
Learn the ropes of data center virtualization
Many technologies are involved in the virtualization of a data center, from platforms such as Microsoft Windows Server to various VMware products and cloud computing services (e.g., Microsoft Azure and Amazon Web Services). IT pros can reliably advance their careers by mastering at least a few of these competencies. New Horizons Learning Group provides a wide range of courses in Windows Server, Azure and the other building blocks of effective data center virtualization.