Kevin Finisterre isn't the type of person you expect to see in a nuclear power plant. With a beach ball-sized Afro, aviator sunglasses and a self-described "swagger," he looks more like Clarence Williams III from the '70s TV show "The Mod Squad" than an electrical engineer.
But people like Finisterre, who don't fit the traditional mold of buttoned-down engineer, are playing an increasingly important role in the effort to lock down the machines that run the world's major industrial systems. Finisterre is a white-hat hacker. He prods and probes computer systems, not to break into them, but to uncover important vulnerabilities. He then sells his expertise to companies that want to improve their security.
Two years ago, Finisterre, founder of security testing company Digital Munition, found himself swapping emails with a staffer at Idaho National Laboratory's Control Systems Security Program, a project funded by the U.S. Department of Homeland Security that is the first line of defense against a cyber attack on the nation's critical infrastructure.
Finisterre caught the attention of INL in 2008, when he released attack code that exploited a bug in the CitectSCADA software used to run industrial control environments. He'd heard about the INL program, which helps prepare vendors and plant operators for attacks on their systems, and he thought he'd drop them a line to find out how good they really were.
He was not impressed.
Is INL already working with the hacker community? Finisterre wanted to know. He received an off-putting response. The term "hacker" denotes a person of a "dubious or criminal nature" who would "not be hireable by a national laboratory," an INL staffer told him via email.
"He basically lectured me about how INL doesn't interact with hackers and I should be very careful throwing that word around," Finisterre recalled. "I was like, 'Dude, I really hope you're joking, because you're supposed to be at the forefront of the research on this.'"
Call it an early skirmish in a culture clash between two worlds: the independent security researchers accustomed to dealing with tech firms such as Microsoft and Adobe, who have learned to embrace the hacker ethos, and the more conservative companies that develop and test industrial control systems, who often act like they wish these white-hat hackers would go away.
Earlier this year, Dillon Beresford, a security researcher at the consultancy NSS Labs, found a number of flaws in Siemens' programmable logic controllers. He had no complaints about the U.S. Department of Homeland Security's Industrial Control Systems Cyber Emergency Response Team (ICS-CERT), run out of INL. But he said Siemens did a disservice to its customers by downplaying the issues he'd uncovered. "I'm not pleased with their response," Beresford said earlier this year. "They didn't provide enough information to the public."