Ultimate cloud speed tests: Amazon vs. Google vs. Windows Azure
A diverse set of real-world Java benchmarks shows Google is fastest, Azure is slowest, and Amazon is priciest
The most interesting options come from Amazon, which has an even larger number of machines and a larger set of complex pricing options. Amazon charges roughly double for twice as much RAM and CPU capacity, but it also varies the price based upon the amount of disk storage. The newest machines include SSD options, but the older instances without flash storage are still available.
Amazon also offers "reserved instances," which let you pre-purchase CPU capacity for one or three years. If you do this, the machines sport lower per-hour prices. You're locking in some of the capacity but maintaining the freedom to turn the machines on and off as you need them. All of this means it pays to estimate how heavily you'll use Amazon's cloud over the next few years: the more accurately you can commit up front, the more you stand to save.
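The reserved-instance trade-off boils down to a break-even calculation: the upfront fee buys a lower hourly rate, so reserving only pays off past a certain number of hours per term. Here's a minimal sketch of that arithmetic; all of the dollar figures are hypothetical placeholders, not actual Amazon rates.

```python
# Sketch: break-even utilization for a reserved instance.
# All prices below are hypothetical placeholders, not current AWS rates.

HOURS_PER_YEAR = 8760

def breakeven_hours(on_demand_rate, upfront_fee, reserved_rate):
    """Hours of use per term at which reserving becomes cheaper
    than paying the on-demand rate."""
    savings_per_hour = on_demand_rate - reserved_rate
    return upfront_fee / savings_per_hour

# Hypothetical figures: $0.10/hr on demand, versus $300 upfront
# for a one-year term at a $0.045/hr reserved rate.
hours = breakeven_hours(0.10, 300.0, 0.045)
utilization = hours / HOURS_PER_YEAR
print(f"Break-even: {hours:.0f} hours (~{utilization:.0%} of the year)")
```

With these made-up numbers, reserving wins once the machine runs more than about 60 percent of the year; below that, on-demand is cheaper.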
In an effort to simplify things, Google created the GCEU (Google Compute Engine Unit) to measure CPU power and "chose 2.75 GCEUs to represent the minimum power of one logical core (a hardware hyper-thread) on our Sandy Bridge platform." Similarly, Amazon measures its machines with Elastic Compute Units, or ECUs. Its big fat eight-CPU machine, known as the m3.2xlarge, is rated at 26 ECUs while the basic one-core version, the m3.medium, is rated at three ECUs. That's a difference of more than a factor of eight.
This is a laudable effort to bring some light to the subject, but the benchmark performance doesn't track the GCEUs or ECUs too closely. RAM is often a big part of the equation that's overlooked, and the algorithms can't always use all of the CPU cores they're given. Amazon's m3.2xlarge machine, for instance, was often only two to four times faster than the m3.medium, although it did get close to being eight times faster on a few of the benchmarks.
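The gap between rated and delivered performance is easy to quantify for yourself: divide the ECU ratings to get the speedup you're promised on paper, then divide the benchmark wall-clock times to get the speedup you actually observed. A minimal sketch, where the ECU ratings come from the article but the benchmark timings are invented for illustration:

```python
# Sketch comparing a machine's rated speedup (from ECUs) with the
# speedup observed on a benchmark. The wall-clock times below are
# made up for illustration; plug in your own measurements.

def speedup(small_time, big_time):
    """How many times faster the bigger machine finished."""
    return small_time / big_time

RATED_ECU = {"m3.medium": 3, "m3.2xlarge": 26}
rated = RATED_ECU["m3.2xlarge"] / RATED_ECU["m3.medium"]  # ~8.7x on paper

# Hypothetical seconds for one benchmark run on each machine.
observed = speedup(small_time=120.0, big_time=40.0)  # 3x in practice

print(f"rated {rated:.1f}x vs. observed {observed:.1f}x")
print(f"delivered {observed / rated:.0%} of the rated gain")
```

A two- to four-fold observed speedup against an 8.7x rating, as the article found, means the bigger machine delivered only a fraction of its paper advantage on that workload.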
Caveat cloudster
The good news is that the cloud computing business is competitive and efficient. You put in your credit card number, and a server pops out. If you're just looking for a machine and don't have hard and fast performance numbers in mind, you can't go wrong with any of these providers.
Is one cheaper or faster? The accompanying tables show the fastest and cheapest results in green and the slowest and priciest results in red. There's plenty of green in Google's table and plenty of red in Amazon's. Depending on how much you emphasize cost, the winners shift. Microsoft's Windows Azure machines start running green when you take the cost into account.
The freaky thing is that these results are far from consistent, even across the same architecture. Some of Microsoft's machines post green numbers and red numbers side by side. Google's one-CPU machine is full of green but runs red on the Tradesoap test. Is this a problem with the test or with Google's handling of it? Who knows? Google's two-CPU machine is slowest on the Fop test -- and Google's one-CPU machine is fastest. Go figure.
All of these results mean that doing your own testing is crucial. If you're intent on squeezing the most performance out of your nickel, you'll have to do some comparison testing and be ready to crunch some numbers. The performance varies, and the price is only roughly correlated with usable power. There are a number of tasks where it would just be a waste of money to buy a fancier machine with extra cores because your algorithm can't use them. If you don't test these things, you could be wasting your budget.
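The number-crunching itself is simple: multiply each machine's hourly rate by the time it took to finish your benchmark, and compare cost per run rather than raw speed. A minimal sketch, where the machine names, hourly rates, and runtimes are all hypothetical examples to be replaced with your own measurements:

```python
# Sketch: rank machines by cost per benchmark run instead of raw speed.
# Machine names, hourly rates, and runtimes are hypothetical examples.

def cost_per_run(hourly_rate, runtime_seconds):
    """Dollars spent to complete one benchmark run."""
    return hourly_rate * (runtime_seconds / 3600.0)

machines = {
    # name: (hypothetical $/hour, hypothetical benchmark seconds)
    "one-core":   (0.07, 180.0),
    "eight-core": (0.56, 60.0),
}

for name, (rate, secs) in sorted(machines.items(),
                                 key=lambda kv: cost_per_run(*kv[1])):
    print(f"{name}: ${cost_per_run(rate, secs):.4f} per run")
```

In this made-up example, the eight-core machine finishes three times faster but costs nearly three times more per run, which is exactly the kind of result that makes the fancier machine a waste of money for that workload.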