Stupid consultant tricks

Metrics are dull unless you do them wrong. If you do, the disaster that ensues will make you wish for dull.

"Do you promise measurable improvement?"

Many of our consulting competitors make measurable improvement a centerpiece of their sales pitch. We don't, which is why the CIO I was speaking to asked if I'd make that promise.

Having long ago learned to answer most questions with questions of my own, I said, "What measure would you like us to improve?" His reply: "Aren't you supposed to tell me?"


Advice Line is about what it's going to take to become a next-generation IT organization. Sooner or later, we're going to have to tackle the eye-glazing, mind-numbing, incredibly boring, and absolutely essential subject of metrics. So now that you've recovered from your Thanksgiving tryptophan-induced drowsiness (a myth, by the way), we're here to restore the effect, without the use of any amino acids.

Starting with this: If you decide to engage consultants to help you with any aspect of organizational effectiveness, it's perfectly valid to insist on measurable improvement. Accepting the consultant's offer to define the measure, in contrast, is a sucker bet. Consultants who can't improve a measure they themselves choose are consultants who -- sorry, sarcasm has temporarily failed me. Let's just say they long ago left ineptitude in the dust and are sprinting toward utter incompetence.

Stupid consultant trick No. 1: Switching metrics midrace

I once watched another consultant fail to improve a measure he'd chosen himself (cycle time). Rather than figure out what was going wrong, he announced that he'd made a mistake. What really mattered wasn't cycle time, he explained. It was how accurately the department in question predicted cycle time -- how close its actual cycle times came to their original estimates. This measure (a quality metric) had improved, he proudly pointed out.

It was intellectually dishonest, but the rubes usually fall for it. To be fair, the consultant in question wasn't lacking in integrity. What he lacked was an understanding of how metrics work and what they're for. What was particularly embarrassing was that after all this took place, a member of the project team discovered a bug in the reporting program. Once that was fixed, it turned out cycle time had improved after all. Oops!

Stupid consultant trick No. 2: Cutting cycle time to spite throughput

Another pastime consultants enjoy is improving the measure with which they're most comfortable -- most often, quality or cycle time. In a particularly egregious case, the consulting team did an exceptional job, cutting the cycle time of the process in question by nearly two-thirds. It was a huge success, supported by inarguable metrics.

From that, the consultants concluded that all of the employee complaints about the new process were nothing more than natural resistance to change. As it turned out, that "natural resistance to change" had more to do with the new process having cut throughput in half than with any emotional attachment the employees might have had to the old way of doing things.

Stupid consultant trick No. 3: Getting what you measure

As long as I'm telling tales of metrics-challenged management consultants, here's another: the exceedingly nice fellow who suggested his client establish employee turnover as a core measure of management quality. When a member of the management team asked if undesired turnover might not be a better metric, the consultant responded that the difference between the two metrics would probably be in the decimal places, which meant the additional complexity wouldn't be worth the trouble.
