Dear Bob ...
I'm a consultant who helps organizations achieve and maintain certification to the ISO/IEC 27001 information security standard. Among much else, the standard requires that they define key performance indicators (KPIs) for security.
This seems a very reasonable requirement and good management practice, but I've not been able to identify indicators that satisfy me -- though the ones I've used seem to keep the auditors happy.
The problem is that security is, in some respects, like a fire or burglar alarm -- failures are (hopefully) rare events. The fact that you haven't had a fire or break-in this month may be a worthy achievement, but it doesn't necessarily tell you much about what will happen next month. So keeping a record of rare events, while certainly important, doesn't help you assess the effectiveness of your security.
Equally, there are plenty of indicators that are straightforward and inexpensive to gather (good!): the number of spam e-mails blocked, the number of virus infections trapped, the number of packets stopped at the firewall. The problem with these statistics is that their trends are ambiguous. If the number of detected viruses has gone up, is that because there are more attacks going on, or because a new version of the software is more effective at detecting infections?
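The confound your correspondent describes can be made concrete with a little arithmetic. The sketch below (all numbers invented for illustration) models the detection count as attack volume times detection rate, and shows two very different months producing an identical rise in the KPI:

```python
def detected(attacks, detection_rate):
    """Viruses the scanner reports: attacks seen times the fraction it catches."""
    return round(attacks * detection_rate)

# Month 1: 1000 attacks, the scanner catches 60% of them.
month1 = detected(1000, 0.60)                 # 600 detections

# Month 2, explanation A: attacks surge to 1500, detection rate unchanged.
month2_more_attacks = detected(1500, 0.60)    # 900 detections

# Month 2, explanation B: attacks unchanged, but an update lifts detection to 90%.
month2_better_scanner = detected(1000, 0.90)  # 900 detections

# The raw count cannot distinguish "things got worse" from "the tool got better."
print(month1, month2_more_attacks, month2_better_scanner)
```

The detection count alone is one equation in two unknowns; without an independent estimate of attack volume (or of detection rate), the trend admits either reading.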
I've asked many industry gurus for good examples of security metrics, without success. Have you any thoughts?
Dear Unsatisfied ...
Many thoughts. Not sure if any of them are useful thoughts, but many thoughts nonetheless.
Starting with this: Requiring KPIs is probably a useful idea, so long as everyone understands you can't start with the question of what constitutes a good KPI.
KPIs are metrics. As is always the case with metrics, the question "How should we measure this?" has to come after the question "When we use the word 'this,' what does it refer to?"
So never mind the security KPIs. The place to start is to ask, what are the organization's security goals?