The oft-mentioned upside of virtualized data centers is the energy, time and money saved. With the technology, however, also come security and planning issues.
Although the technology required to virtualize data centers has been around for more than 40 years, 2007 will go down in history as the end of the beginning: the year the technology vaulted into mainstream consciousness and hurtled to the top of every CIO's "must-do" list.
Now that everyone has caught on to the benefits this technology can deliver (chiefly, the ability to reduce an organization's data center footprint by a factor of six, eight or 12), the focus is turning to making sure all these efficiencies and cost savings don't come at the expense of data availability and security.
"We've jumped over the chasm without even looking down below," Don Norbeck, director of product development at SunGard, said in an interview with InternetNews.com. "Luckily, we'll probably land on the other side. But there's still plenty of risk."
These risks, from an organizational standpoint, start long before the first virtualization application is downloaded.
Just like any other software installation, committing to a virtualization project requires not only an appreciation for the technological vulnerabilities inherent in any operating system (bugs, malware and weak access controls) but also a fundamental understanding of exactly which applications and systems are used the most, which are the most critical to operations, when they're used, and how to orchestrate the workload of all these applications running on both physical and virtual servers.
For those charged with managing and maintaining one or more data centers, the temptation to simply initiate a straight-line consolidation (taking the workloads running on 100 servers and cramming them onto 10 or 15) is alluring. Every company wants to get greener, lower energy consumption, reduce the size of its data centers and gain the ability to shift workloads with a simple click of a button.
"Now that everything is in the pool, people just have to push a button," Norbeck said. "And they'll keep pushing the button until the button doesn't work anymore. Without proper planning and provisioning, you're back to where you were before. How do you audit it? You push a button and then wait to see who starts complaining."
In other words, just as all servers are not created equal, neither are the applications running in a corporate data center.
And the stakes are increasing.
On Dec. 11, SAP announced it will begin supporting its virtualized enterprise resource planning (ERP) software running on VMware and 64-bit Windows, Linux and Solaris platforms. It's a sign of the virtualized times.
After all, it's one thing to have the corporate e-mail system or some back-end storage system running on a virtualized machine, but companies are now virtualizing their most critical applications. Workloads can spike unpredictably. Power outages occur. One malicious bit of code has the potential to infiltrate multiple applications. Patching, more than ever, becomes a preoccupation.
And no data center is an island unto itself. It's dependent on multiple vendors to make it all work: security, storage, applications, operating systems and networking equipment.
"Virtualization, as with any emerging technology, will be the target of new security threats," Neil MacDonald, an analyst at Gartner, said in a research report published earlier this year. "Many organizations mistakenly assume that their approach for securing virtual machines will be the same as securing any OS and thus plan to apply their existing configuration guidelines, standards and tools. While this is a start, simply applying the technologies and best practices for securing physical servers won't provide sufficient protections for VMs."
MacDonald said that through 2009, 60 percent of production virtual machines will be less secure than their physical counterparts.
At the architecture level, it starts with the hypervisor, which is basically a stripped-down version of a Windows, Linux or Solaris operating system.
"You have 10 workloads and you merge them onto one," MacDonald said. "That's a very attractive target for a bad guy. Now, if I compromise just one thin layer, I get all 10 machines."
Nand Mulchandani, senior director of security product management and marketing at VMware, deals with the security implications of his company's industry-leading software all day long. Not surprisingly, he thinks most of the security concerns raised by the media and some security experts are overblown.
"Virtualization in some sense looks like a titanic shift in computing," he said in an interview with InternetNews.com. "But frankly, from a security and technology standpoint, it's not as radical as it's been portrayed."
That said, Mulchandani said VMware will be making some significant security-related product announcements in the first half of 2008.
"It's a temporal issue," he said. "The thinking is that anything new and different is going to have problems. Everyone is looking for a big Achilles' heel that no one is talking about. We have to roll with the punches. Tomorrow there will be another new thing in the industry that everyone will call insecure. That's life."
For VMware, Microsoft, Virtual Iron, XenSource and now Oracle, the focus will shift from functionality to security as the virtualization software industry matures.
Misconfiguration and mismanagement (the propensity to leave insecure default passwords in place, for example) have been the scourge of operating systems since their inception. And while virtualization vendors continue to strip down the core operating system in the hypervisor, there's no such thing as a foolproof virtualization project.
"From a threat profile, the most important thing customers should worry about is hardening their platforms," Mulchandani said. "Locking down your platform is something most people in the Windows or Linux world are used to doing. Securing the system and the code is purely and primarily on us."
Because large corporations and most small and midsize firms didn't have the benefit of a crystal ball, applications and the operating systems running them grew in a staggered, chaotic fashion and can't always be configured, provisioned or moved around in a tidy, virtualization-friendly box.
But that probably won't dissuade companies from eventually embracing virtualization in their data centers.
"Where money is involved and efficiency is involved, people tend to overlook whatever minor queasiness they might have," Mulchandani said.
This article was originally published on InternetNews.