Our tests show that Windows Vista and Office 2007 not only smash Redmond's previous records for weight gain, but given the same hardware diet, run at less than half the speed of the XP generation.
What Intel giveth, Microsoft taketh away. Such has been the conventional wisdom surrounding the Windows/Intel (aka Wintel) duopoly since the early days of Windows 95. In practical terms, it means that performance advancements on the hardware side are quickly consumed by the ever-increasing complexity of the Windows/Office code base. Case in point: Microsoft Office 2007, which, when deployed on Windows Vista, consumes more than 12 times as much memory and nearly three times as much processing power as the version that graced PCs just seven short years ago, Office 2000.
Despite years of real-world experience with both sides of the duopoly, few organizations have taken the time to directly quantify what my colleagues and I at Intel used to call The Great Moore's Law Compensator (TGMLC). In fact, the hard numbers above represent what is perhaps the first-ever attempt to accurately measure the evolution of the Windows/Office platform in terms of real-world hardware system requirements and resource consumption. In this article I hope to further quantify the impact of TGMLC and to track its effects across four distinct generations of Microsoft's desktop computing software stack.
To accomplish my goal, I'll be employing a cross-version test script – OfficeBench – and executing it against different combinations of Windows and Office: Windows 2000 and Office 2000; Windows XP (SP1) and Office XP; Windows XP (SP2) and Office 2003; and Windows Vista and Office 2007. Tests were first conducted in a controlled virtual machine environment under VMware and then repeated on different generations of Intel desktop and mobile hardware to assess each stack's impact on hardware from the corresponding era.
What does this all mean for Windows IT shops? Should they upgrade to Vista and Office 2007? Or should they stick with Windows XP and Office XP or Office 2003? As I’ve argued in a previous article, “Death Match: Windows Vista vs. XP,” most IT organizations will find they can safely skip a generation and avoid Vista altogether. In addition to sending a message to Microsoft that IT won’t tolerate bloatware, it also buys you time to allow the hardware cycle to catch up with what will hopefully remain a static software target, or at least a slower-moving one (through Windows 7) – a way of putting TGMLC to work for you.
[ Monitor your own Windows and Office system performance with the new Windows Sentinel tools from InfoWorld, which include the DMS Clarity Tracker Agent used in these tests. Share your questions and experiences in the companion blog. ]
About OfficeBench: The OfficeBench test script is a version-independent benchmark tool that uses OLE automation to drive Microsoft Word, Excel, PowerPoint, and Internet Explorer through a series of common business productivity tasks. These include assembling and formatting a compound document along with supporting workbooks and presentation materials, as well as gathering data through simulated browsing of a Web-based research database. (For more detail, see "Cross-generational Windows/Office performance: About OfficeBench.") OfficeBench is available for free download from the exo.performance.network Web site as part of the DMS Clarity Studio testing framework.
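To give a flavor of how OLE automation drives Office from a script, here is a minimal sketch in Python. This is not OfficeBench itself; the task name and helper function are illustrative, and the Word COM calls (shown in comments) assume a Windows machine with Office and the pywin32 package installed. The timing harness itself is plain Python and runs anywhere:

```python
# Sketch of the pattern a cross-version Office benchmark uses:
# dispatch an Office application via OLE/COM, script a task through
# its object model, and record wall-clock time for that task.
import time

def timed(task_name, fn):
    """Run one scripted task and return (name, elapsed seconds)."""
    start = time.perf_counter()
    fn()
    return task_name, time.perf_counter() - start

def word_compound_doc_task():
    # On a real Windows/Office system the body would be COM calls
    # via pywin32, version-independent across Office releases, e.g.:
    #   import win32com.client
    #   word = win32com.client.Dispatch("Word.Application")
    #   doc = word.Documents.Add()
    #   doc.Content.Text = "Compound document body..."
    #   doc.Close(SaveChanges=False)
    #   word.Quit()
    pass  # placeholder so the sketch runs without Office installed

name, elapsed = timed("word-compound-doc", word_compound_doc_task)
print(f"{name}: {elapsed:.3f}s")
```

Because the script talks to Office through late-bound OLE automation rather than version-specific libraries, the same task code can be replayed against Office 2000 through Office 2007, which is what makes cross-generation timing comparisons possible.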
The Stone Age: Windows 2000/Office 2000
Back in 1999, when I was working as an advisor to Intel's Desktop Architecture Labs (DAL), I remember how thrilled we all were to get our hands on Windows 2000 and Office 2000 – finally, a version of the Windows/Office stack that could leverage all of the desktop horsepower we were building into the next-generation Pentium 4 platform. I remember it was also the first time I had a fully scriptable version of the Office suite to work with (previous versions had supported OLE automation only in Word and Excel). Shortly thereafter, the first version of OfficeBench was born, and I began my odyssey of chronicling TGMLC through the years.
First off, let me characterize the state-of-the-art at the time. The Pentium 4 CPU was about to be unveiled and the standard configuration in our test labs was a single-CPU system with 128MB of RDRAM and an IDE hard disk. A joke by today's standards, this was considered a true power-user configuration suitable for heavy number-crunching or even lightweight engineering workstation applications. It was also only marginally faster than the previous-generation Pentium III, a fact that Intel tried hard to hide by cranking up the CPU clock to 1.5GHz and turning its competition with rival AMD into a drag race.
Sadly, I didn't have access to an original Pentium 4 system for this article. My engineering test bed was long ago scrapped for parts, and I doubt that many of these old i840 chip-set-based boxes are still in use outside of the third world. However, I could at least evaluate the software stack itself. Through the magic of virtualization, we can see that, even with only 128MB of RAM, a Windows 2000-based configuration had plenty of room to perform. During OfficeBench testing, the entire suite consumed only 9MB of RAM, while the overall OS footprint never exceeded 132MB of RAM, roughly half of the available memory. Clearly this was a lean, mean version of Windows/Office. It chewed through the test script a full 17 percent faster than its nearest competitor, Windows XP (SP1) and Office XP. View the overall test results. View more detailed test results at xpnet.com.