“We knew there was a vulnerability, but we didn’t think it was that bad.”
Every security vulnerability should be ranked by criticality. A remote buffer overflow is more critical than a local DoS (denial of service) problem. But I'm talking about the people who simply ignore the problem. Case in point: A few months ago, I discovered a remotely exploitable directory traversal vulnerability that gave me root access to an Internet-accessible device installed in more than a million consumers' homes. It served as the consumer's primary access to the Internet and provided subscription-based and on-demand digital media content. The device used BSD as the underlying OS and a very old Web server.
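The flaw behind a directory traversal bug is usually a server that joins a client-supplied path onto its document root without checking where the result lands. A minimal sketch of the mistake and the fix, with a hypothetical `/var/www` document root standing in for the device's real configuration:

```python
import os.path

WEB_ROOT = "/var/www"  # hypothetical document root for illustration

def resolve(requested):
    """Map a requested URL path to a filesystem path, rejecting traversal.

    A naive server effectively serves WEB_ROOT + requested, so a request
    like "../../etc/passwd" escapes the document root. Normalizing the
    joined path and verifying it still lies under the root closes the hole.
    """
    # The naive join -- this is where a vulnerable server stops.
    candidate = os.path.normpath(os.path.join(WEB_ROOT, requested.lstrip("/")))
    # Reject any path that escapes the document root after normalization.
    if candidate != WEB_ROOT and not candidate.startswith(WEB_ROOT + os.sep):
        return None
    return candidate

print(resolve("index.html"))        # /var/www/index.html
print(resolve("../../etc/passwd"))  # None -- traversal blocked
```

A server that skips the normalize-and-check step hands an attacker every file its process can read, which on a device running the Web server as root is the whole filesystem.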
When I reported the problem to the programmer, he said that they had known about the problem for a long time, but they could not think of how it could be exploited. I was dumbfounded. This was remote admin access -- a pretty straightforward hack.
I told him that consumers' credit card information could be stolen, that customers' service could be interrupted, that company services could be stolen, and that porn could end up in innocent customers' homes. Further, the simple exploit I was using could easily be wormed, turning those million devices into a bot army for attacking other targets.
The bug was added to the resolution database the next day. If you’re not a trained security person or if you don’t practice reasonable threat modeling, don’t attempt to guess for yourself how bad the bug is.
“We don’t need strong encryption.”
This is normally said when the developer needs to obfuscate plaintext data for confidentiality. Instead of implementing widely used, industry-accepted, decades-trusted cipher algorithms, they make up their own hashing or encryption routines. Some are painfully obvious, using Base64 encoding, a simple substitution cipher, or the computer's IP address or computer name as the private key.
Problem is, although strong encryption may not be needed now, it may be needed later. Once coded, it may never be changed. Legacy code often ends up in newer programs and applications, and what once was nice to have becomes a central, mission-critical need. If you must protect the confidentiality of data, use trusted cipher algorithms and routines. Forget about the halfway attempts.
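Base64 in particular is an encoding, not a cipher: there is no key, so anyone who sees the output can reverse it with one standard-library call. A quick sketch, using a made-up card number as the "protected" data:

```python
import base64

secret = b"4111-1111-1111-1111"  # hypothetical card number, not real data

# "Encrypting" with Base64 just re-spells the same bytes in a new alphabet.
obfuscated = base64.b64encode(secret)
print(obfuscated)  # b'NDExMS0xMTExLTExMTEtMTExMQ=='

# Any attacker recovers the plaintext with one call -- no key required.
recovered = base64.b64decode(obfuscated)
assert recovered == secret
```

The same reversibility applies to substitution ciphers and keys derived from the machine's own name or IP address: the "secret" ships alongside the data it is supposed to protect.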
The most startling point in all of these statements is that they were said to me by career computer security professionals, not by unknowing outsiders. Sometimes it makes rational sense to skip real security or to lower the security bar, but most of the time it is just incorrect rationalization and laziness. Don't fall into the trap.