Here we go again: another expert recommending that people stop using a popular piece of software because it has too many vulnerabilities. In this case, I'm talking about F-Secure's recommendation to abandon Adobe's Acrobat Reader in favor of other PDF rendering programs, like Foxit Reader or any of the free alternatives available.
You'll often read similar recommendations to dump Microsoft's Internet Explorer (I work full-time for Microsoft) and use any other browser instead. To completely protect yourself, they'll advise moving off of Microsoft Windows altogether.
The idea is that protection can be gained by moving to a more secure product or that it's just inherently safer to use a less popular product because it is less likely to be attacked. Now, the former argument I can buy. If one product has weaker security than another product, who can blame you for switching? Of course, that argument is more complex than it first appears.
What is a more secure product? Do you measure that with known bug counts, severity of bugs, time to patch, or how often it is publicly exploited? And is the product you are moving to actually more secure or just attacked less often because it is not as popular? This leads to the other argument: When it comes to software, there's safety in fewer numbers of users. The idea is that when everyone is using the same application or operating system (OS), a computer monoculture is created that leads to more exploits.
On the face of it, it's a compelling argument, one that's hard to argue against. If we all use the same software, then attackers can write one piece of code to exploit us all simultaneously. It seems to make sense that moving away from a monoculture (an argument first popularized in a paper by Dan Geer and others in 2003) would reduce overall security risk.
And for sure, there are compelling factual arguments on a case-by-case basis. If you didn't run (unpatched) versions of Microsoft SQL Server in 2003, the SQL Slammer worm couldn't get you. Besides Windows, I also run OpenBSD, Ubuntu, and Mac OS X at home, and those operating systems are attacked less frequently -- with the exception of my Apache-based Web servers, which are attacked six to eight times more often than my IIS servers. However, the truth is that I don't get exploited on any of these systems unless I intentionally allow it. (I run eight honeypots to monitor malware and hacker behavior.)
The key question is whether a more diverse software landscape -- i.e., the opposite of a computer monoculture -- would be safer over the long run.
Although some may accuse me of using this column to defend my full-time employer, the truth is that no one has presented compelling evidence that moving to a computing multiculture would provide more security protection to more users over the long run. Lots of people have speculated, but no one has proven that security would improve if the Windows world splintered into a dozen different OSes and applications. Conventional wisdom says it might, but could it really?
I think the potential challenges to this way of thinking are many. First, the alternatives people choose are often as insecure as the market leaders they abandon; people may get a temporary decrease in security risk until the software they use becomes more popular.
The monoculture argument about safety in fewer numbers holds water only if the move to the less popular software stays under the radar. If the masses follow, nothing is gained. Ten years ago, Microsoft Office document formats were the only game in town. But then Adobe's PDF arrived, and now PDFs are as common on the Internet as Office files. Not surprisingly, PDF is receiving more than its fair share of attention from hackers; some protection vendors claim that PDF exploits account for nearly half of all exploits in the wild today, surpassing Office's issues.
Adobe Acrobat Reader critics usually recommend moving to Foxit Reader, except that Foxit has already had exploits. Internet Explorer critics recommend moving to Firefox, Safari, Opera, or Chrome, but all of these alternatives are commonly exploited too. Microsoft Office critics often recommend moving to OpenOffice.org, except that it, too, has already been exploited dozens of times, and some expert code analyzers think it is rife with exploits and insecure code. It's frustrating enough to make you want to move to simple text editors and text-only browsers, except those have already been exploited, too, and in any case, they fall well short of the features most of us need. When the world moves to the new product, you're right back where you started.
Life on the run
This isn't the answer, unless you plan on hopscotching around the software world from one program to the next, trying to stay one step ahead of the malicious hackers. Even if that works on a personal level, it's not so easy to manage in the enterprise. Plus, pressuring the original vendor to write more secure code might be a better use of the overall effort. Who wants to be the person who forced users onto another product, only to watch the original become more secure than the new alternative?
Case in point: Since the Microsoft SQL Slammer worm, there have been fewer than a handful of exploits against SQL Server, none widely exploited. In the same time frame, there have been dozens to hundreds of exploits against SQL Server's most popular competitors -- same with IIS versus Apache after the Code Red worm debacle. Vendors that get a brutal lesson often fix their mistakes faster than the competition.
Note: Outside of the current security discussion, I highly recommend trying alternative products to see what feature sets and benefits they offer, and to be able to accurately point out the strengths and weaknesses when comparing against the products your company is using. Sometimes you can find a gem where you least expect it.
I believe the monoculture argument is losing more steam every day. The second rebuttal point is that the most popular applications are cross-platform already and exploits that work against one version usually work against the others -- not always, but more often than not. For example, a Safari browser exploit can be a problem no matter which OS you use. The same rule generally applies to Adobe Acrobat, iTunes, and Flash exploits. Not all work on all platforms, but they will attack most platforms with varying degrees of success.
Third, file formats are becoming the new popular attack vector. If everyone up and moved to different applications, it probably wouldn't change a thing. Let's say the world ended up using 100 different word processors, evenly distributed in use. People still need to communicate, and whatever document or protocol format becomes the de facto data exchange standard would become the de facto attack point. For example, most of the exploits against SQL databases don't target a particular platform or version of SQL, but rely upon SQL injection attacks. Which make of SQL database you are using is far less important than the security of the code it is running.
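To make that point concrete, here is a minimal sketch of why SQL injection is an application-code flaw rather than a database-vendor flaw. It uses Python's built-in sqlite3 module purely as a stand-in for any SQL backend, and the table and input strings are invented for illustration:

```python
import sqlite3

# In-memory database standing in for any SQL backend; the injection
# flaw lives in the application code, not in the database product.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

# Vulnerable pattern: user input concatenated straight into the query.
user_input = "alice' OR '1'='1"
vulnerable = f"SELECT name FROM users WHERE name = '{user_input}'"
leaked = conn.execute(vulnerable).fetchall()   # matches every row

# Safe pattern: a parameterized query treats the input as pure data.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()                                   # matches nothing

print(leaked)  # [('alice',)]
print(safe)    # []
```

Swap sqlite3 for any other database driver and the same concatenation bug produces the same injection; the fix is in how the application builds its queries.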
Fourth, as computing moves into the cloud and as apps, documents, and protocols become more browser based, the differences between the various vendor products will lessen, and again, hackers will focus their attacks on the common links. Will the future attacks be against OSes, applications, data formats, and protocols, or will they leverage the inherent vulnerabilities in the cloud fabric itself?
I'm sure many readers are still discounting all my previous arguments. Suppose the world does move toward a more diverse computing environment. Is it really that much harder for an attacker to attack 20 apps or OSes than one? Yes, of course, but maybe it's not the high hurdle most people think it is.
My fifth rebuttal point is that today's attackers are professional criminals. Mount a point defense and they will get around it. Coding exploits for 20 OSes or applications doesn't take that much more effort in real life than coding one. Look at all of the malware programs today that already use multiple attack vectors. Ten years ago, most malware programs relied on a single exploit. Today, it's common for a single malware program to make use of 5, 10, or even 20 or more attack vectors. Conficker, anyone?
If we all ended up with 20 different apps and 20 different OSes, the attackers would simply begin exploiting more of them at once. All of the most popular apps and OSes are exploited pretty regularly. Attackers would learn to separate their entry exploit vectors and post-exploitation code into two separate but coordinated routines. The Metasploit project has been making this easy for more than half a decade. Are we to assume that rich, professional malware attackers will just give up and go home?
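The separation of entry vectors from post-exploitation code described above can be sketched abstractly. This is a purely illustrative Python skeleton with stub functions (no real attack code; every name here is invented), showing why each additional vector adds little marginal effort once the payload is shared:

```python
# Illustrative only: each "vector" is a stub that reports whether its
# target software is present on the victim. Adding a 21st vector means
# adding one more small probe; the payload is written once and reused.
def try_pdf_reader():
    return False   # stand-in for a PDF-reader-based entry vector

def try_browser():
    return True    # stand-in for a browser-based entry vector

def try_office_doc():
    return False   # stand-in for a document-format-based entry vector

def payload():
    # One shared post-exploitation routine, decoupled from all
    # entry vectors, as the Metasploit-style model separates them.
    return "payload ran"

VECTORS = [try_pdf_reader, try_browser, try_office_doc]

def run():
    for vector in VECTORS:
        if vector():          # first vector that works wins
            return payload()
    return "no vector succeeded"

print(run())  # payload ran
```

The point of the sketch is structural: the expensive part (the payload) is shared, so a more diverse target population mostly just lengthens the `VECTORS` list.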
The idea of multiple attack vectors married with multiple post-exploit mechanisms in a malware program isn't even new. Plenty of worms already do this, but it isn't mainstream because the attackers don't need the additional code and sophistication -- yet. Using it sooner than needed means wasted computing cycles, slower malicious code, and easier detection. It also broadcasts their new offensive techniques to the enemy (i.e., the anti-malware industry and the global community of good).
Lastly, and this is the biggest argument, we can't ignore the fact that most malware programs today (99.99 percent) don't rely upon software security vulnerabilities at all. They just trick the end-user into running malicious code. This attack vector will work for any OS and any application. This point alone should put the monoculture argument to bed. You can change the application, but until we change the end-user thought process, the biggest problem remains.
Two decades ago, I was telling users not to boot up with a floppy disk in the floppy drive. Then not to say yes to that request to run the macro, followed by "don't click on that file attachment." Today, it's "don't run that fake anti-virus program." Will we ever be able to train end-users to avoid running rogue code? Two decades of experience tell me that it isn't as easy as it sounds.
More to the point, in all of those 20 years, moving to a new app or OS never fixed the major security problems (i.e., pervasive anonymity and lack of accountability) that underlie all malicious hacking. If everyone put up with the hassle of installing and learning a new product, we should not be surprised to end up facing the same security threats again. Consider the last two decades of computer security defenses and their current success rate to see the logic of that argument. A lot of effort and time has been wasted without any improvement in overall computer security, because we did not address the underlying problems.
I'm all for moving to a new OS or application because you like the feature set or you know it is more secure, but don't think you can switch and leave all of the other security problems behind. Most security issues are global in nature and don't magically disappear because you are using a different product. Whatever you use, you must follow the same basic security tenets: patch often, don't run untrusted code, use strong passwords, etc.
The answers to our security prayers do not lie in a computing multiculture. Moving to new products has done zero to address the underlying security weaknesses that malware authors routinely exploit, and until we do that, we're just fooling ourselves.