“Third, the additional bugs that external hackers find are commonly found by examining the patches we apply to our software. Look at our vulnerability statistics: most of our hits center on two main features. Both features came to the attention of hackers only after we had released patches for them fixing internally found problems. In both cases we located the vulnerable code and patched it. Within a month, three more related holes were found by the hacker community. OK, so we didn’t do a great job of ferreting out all the errors in those features. After the last round of fixes, we investigated each feature with a more comprehensive analysis and code review. We even hired an external penetration testing team. We found many more holes and patched them. Then, in the next six months, we got hacked again in the same features. There’s plenty of blame going around, along with better solutions, but it doesn’t change the fact that if we had left the original exploits unpatched, we would have avoided three additional, publicly discussed exploits.
“Fourth, every disclosed bug increases the pace of the battle against the hackers. It’s like the anti-virus war: anti-virus vendors detect each new virus, and the virus writers make better viruses. It’s possible that if anti-virus software had never been created, we wouldn’t be dealing with the level of worm and bot sophistication that we face today. If we patch a hole faster than it needs to be patched, it just makes the hackers look harder, sooner than they otherwise would. We are on the losing end of a fight with every hacker wannabe in the world, and every fix we have to make slows down our product and costs money. Why would we want to encourage a bigger war? If we shut up, then when a hacker finally discovers the bug, the war proceeds more slowly, and our customers are on the winning side.
“Fifth, when a bug isn’t announced, most hackers don’t exploit it. The vast majority of our customers remain protected, because even if a bug that hasn’t been publicly disclosed really is known, it’s known only to a small group of hackers. Damage is very limited. You’ve said the same thing in one of your previous columns, which I frequently share with coworkers. Once the bug is publicly known, our products come under attack by thousands of hackers and dozens of worms. Most of our customers are protected as soon as they apply our patches, but for some reason many of our customers never patch, or at least don’t patch until they call us with their system owned and the damage done.
“Industry pundits such as yourself often say that it benefits customers more when a company closes all known security holes, but in my 25 years in the industry, I haven’t seen that to be true. In fact, I’ve seen the exact opposite. And before you reply, I haven’t seen an official study that says otherwise. Until you can provide me with a research paper, everything you say in reply is just your opinion. With all this said, once a hole is publicly announced, or becomes high-risk, we close it. And we close it fast, because we already knew about it, coded a solution, and tested it.”
On first reading, I thought there were so many factual mistakes in this reader’s argument that I didn’t know where to begin. But as I re-read it, I realized he did make some cogent points. As Stephen Northcutt of SANS taught me, “Eat the watermelon and spit out the seeds.” There is a little truth in every argument.