Just when you thought the Backup Wars were Dead...

OK, here's the latest news on the backup wars... and this time I had nothing to do with it.

Red-Gate has commissioned a report from the Tolly Group to run a comparative benchmark between their product, LiteSpeed, and Idera... a decision I would imagine they're already regretting. Get the report here. I hear the benchmark was my idea: after Idera released their own benchmark earlier this year, I was talking to the Red-Gate guys at TechEd and said we should all get together and do a fair benchmark with everyone on the same platform. Apparently they ran with that idea, which is fine, but I'm not sure Tolly has gone about it the right way.

The report has already been through a few revisions. My Quest reps tell me it has changed several times since its release in September, and depending on what changed, that could be something or nothing. I'm not going to make a big deal out of it unless the numbers themselves changed between versions.

However, what I do find interesting is that Red-Gate has come out on top. I've personally run a lot of independent tests against all 3 of these vendors, and haven't ever published the results because their EULAs prevent it. Let me say though that while Red-Gate is a good tool, it is single-threaded and has never been able to prove itself the hands-down winner over either of the other two.

So what does the report say that I find suspect? Well, for starters, the test makes up for Red-Gate's lack of multi-threading by writing its backup to 3 files while pinning LiteSpeed and Idera to 1 file each. That's interesting from a performance-tuning perspective, but it reads more like a report on how to make Red-Gate perform in the same league as the others by overcoming its limitations.

I was curious whether simply adding files is a fair stand-in for a multi-threading comparison, so I put LiteSpeed in my lab last night and ran a couple of quick backups. I backed up a 200GB DB to a single file, then ran the same job again striped across 3 files. Oddly enough, I saw just over a 20% improvement from striping... and yes, to the same RAID array. So it appears that even though you can make Red-Gate perform better than expected, a single-threaded application simply can't compete with a multi-threaded one. It just can't. It would have been just as easy to add striped backups to the Tolly results, and I'm really curious why they didn't run LiteSpeed and Idera against multiple files to show what the difference would have been. If striping truly doesn't add much to the equation, they should have shown that; the fact that they didn't makes me think it clearly does make a difference.
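If you want to reproduce the striping comparison on your own hardware, the native SQL Server BACKUP syntax below shows the idea: the same database backed up to one file, then striped across three files on the same array. This is just a sketch with made-up database names and paths, and it uses the native BACKUP command rather than LiteSpeed's or Idera's own stored procedures, but the principle is the same: each output file gets its own backup stream.

    -- Baseline: back the database up to a single file
    -- (BigTestDB and the X:\Backups paths are hypothetical)
    BACKUP DATABASE BigTestDB
    TO DISK = 'X:\Backups\BigTestDB_full.bak'
    WITH INIT, STATS = 10;

    -- Same backup striped across 3 files on the same RAID array
    BACKUP DATABASE BigTestDB
    TO DISK = 'X:\Backups\BigTestDB_stripe1.bak',
       DISK = 'X:\Backups\BigTestDB_stripe2.bak',
       DISK = 'X:\Backups\BigTestDB_stripe3.bak'
    WITH INIT, STATS = 10;

When each backup finishes, SQL Server reports the pages processed and the MB/sec throughput, so comparing the two runs is straightforward. That's essentially the comparison described above, just run with the vendor tools instead of the native command.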

I'll tell you one thing, though. This report has shaken out one very important FACT. Idera released a benchmark earlier this year claiming to be the fastest backup on the planet, because they ran it on solid-state disks and didn't compare themselves with anyone else. And while the Tolly report shows an unfair bias towards Red-Gate, it does put LiteSpeed and Idera on equal footing, and LiteSpeed comes out with almost double Idera's throughput in more than one graph. In fact, looking at the Tolly graphs, LiteSpeed beats Idera in almost every benchmark. That goes a long way toward proving that Idera's earlier benchmark was released to intentionally deceive the marketplace. If not, Idera would have released comparative benchmarks against their competitors on the same platform, rather than going so far outside industry-standard hardware just to be able to claim the world's fastest backup.

I don't believe there was any such intent from Red-Gate. Their reps have already told me that they merely commissioned the report; they had nothing to do with it after that. Still, it does seem a little odd that a single-threaded app could win by such margins in a report they paid for. I do believe, though, that Red-Gate at least tried to do this right: they went to an independent firm and asked for a report. I think the issues here lie with the Tolly Group's test.

There's more to come on this. I still haven't contacted the Tolly Group, and I haven't heard back from my inquiries to Idera. There are some unanswered questions that I'm not going to bring up until I hear something from these guys and can tell you what actually happened.

Until then, this is what I know so far.
