Why your security sucks

A conversation with InfoWorld security expert Roger Grimes reveals why the latest burst of attacks is just business as usual

Security breaches are getting more mainstream media play than ever before, mainly thanks to organizations like Anonymous and Lulz Security hitting high-profile targets. The cost of the destruction, particularly to Sony, has been high. But ironically, attackers who do it for the glory rather than money may be providing a kind of public service.

"Security is always this bad," says Roger Grimes, InfoWorld's stalwart security expert and author of our Security Adviser blog. But when criminals compromise financial institutions and other corporate targets, as they do all the time, the victims like to keep it as quiet as possible. At least the new wave of very public assaults shines a bright light on the awful state of security.


Roger knows firsthand how terrible it is. As a security consultant, he would tell clients he could break into their network within 24 hours, but it almost always took him less than an hour. Nor has he ever failed to obtain a CEO's password in just a few minutes with a little social engineering. "Hacking hasn't changed in 20 years," he says. "I guess that's the sad thing. In the past 20 years, hacking is no harder and defending is no better."

End-user security holes
The maddening thing is that simple measures, well known for years, would prevent most attacks. According to Roger, 90 percent of exploits involve users downloading and installing items they shouldn't. Often, these exploits begin with scareware messages that tell users their system has been compromised and that they should install an antivirus program to remove the infection, which of course turns out to be malware itself.

It's hard to train users to ignore fake alerts, Roger says, especially when they don't know what a real virus alert looks like:

I've asked every company: Do you give a picture to your employees of what your antivirus program looks like when it finds a virus? Never. They never do. Ever. If this is the No. 1 problem in most environments today -- and it is -- why are we as defenders not even doing the simple stuff? Is it too hard to take a picture and tell an end-user, "This is what your product looks like?" It's not. I'm not sure if it's lethargy or what. Every company I've ever said that to ... none of them have ever taken the picture. They're like "Oh, you're right, good idea," and then they don't do anything.

The lack of even elementary training is one problem. Another is that people rarely face consequences for failure. "In my entire career, and I've been doing this since 1987, [I only know] of one department that had some firings because of a horrific hacking event -- that was from SQL Slammer," Roger says. In the vast majority of cases, neither end-users nor IT professionals are penalized for their role in a security disaster.

Putting off patching
Perhaps even more frustrating is IT's failure to keep software patches up to date. This lapse, Roger notes, was practically an open invitation for the Sony hack: "You had Web servers that were knowingly unpatched for months. A Web server, the most attacked thing on the planet, knowingly unpatched. That borders on negligence."
