SPEC rolls out virtualization benchmark rival to VMware's

Nonprofit's benchmark could prove an easier, more flexible alternative to VMmark -- but it's not cheap

So you've rolled up your sleeves and virtualized your data center or server room to wring more performance out of your existing hardware -- and hopefully save a few bucks on power and cooling in the process. The challenge, though, is figuring out whether your virtualized environment is performing as efficiently as it could. While some planners have cobbled together homegrown test methodologies for virtualization workloads, the standard-bearer has been VMware's VMmark benchmark.

For some IT admins, though, VMmark hasn't proven as practical or easy to run as they might have liked. Plus, it can be tough to get past the fact that though VMmark is billed as vendor-neutral, it comes from a vendor with a vested interest in scoring well in virtualization tests. But now there's a new benchmark on the block that might prove a more desirable alternative to VMmark: SPEC has unveiled SPECvirt_sc2010, designed to assess the performance as well as the power efficiency of various types of workloads in a virtualized environment.


The benchmark, according to SPEC, measures the end-to-end performance of all system components, including the hardware, virtualization platform, and the virtualized guest operating system and application software. Taking into account that workload types vary from organization to organization (or from machine to machine within an organization), the benchmark supports three types of test apps: a Web workload based on SPECweb2005; a Java application server workload based on SPECjAppServer2004; and an IMAP workload based on SPECmail2008.

The workloads are injected at different time periods during the benchmark run, representing the spikes experienced in real-world server environments. Additional sets of virtual machines are added until the overall throughput reaches a peak or workloads fail to meet required quality of service criteria. The test takes around three hours to complete at its default settings, according to SPEC.
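The scaling procedure SPEC describes -- adding sets of virtual machines until throughput peaks or quality-of-service criteria fail -- can be sketched as a simple loop. This is a hypothetical illustration, not SPEC's actual harness; the function names and the toy throughput model are invented for clarity.

```python
# Hypothetical sketch of the scaling loop SPEC describes: keep adding
# sets of VMs ("tiles") until overall throughput stops improving or the
# workloads fail their quality-of-service criteria.
# All names and numbers here are illustrative, not from SPEC.

def run_tile_set(num_tiles):
    """Stand-in for one benchmark run at a given VM count.

    Returns (throughput, qos_ok). Throughput rises, then saturates
    and declines as contention grows -- a toy model of a real server.
    """
    throughput = num_tiles * max(0, 100 - 8 * num_tiles)
    qos_ok = num_tiles <= 10  # pretend QoS holds up to 10 tiles
    return throughput, qos_ok

def find_peak():
    """Add tiles one at a time; stop at the throughput peak or QoS failure."""
    best_tiles, best_score = 0, 0
    tiles = 1
    while True:
        score, qos_ok = run_tile_set(tiles)
        if not qos_ok or score <= best_score:
            break  # QoS violated, or throughput no longer improving
        best_tiles, best_score = tiles, score
        tiles += 1
    return best_tiles, best_score

if __name__ == "__main__":
    tiles, score = find_peak()
    print(f"Peak: {tiles} tiles at throughput {score}")
```

With the toy throughput model above, the loop stops once adding another tile lowers aggregate throughput, reporting the last configuration that improved the score.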
