
Overcoming cognitive bias

I haven't yet written much about cognitive biases -- anchoring, availability, framing, the sunk cost fallacy, to name just a few. (They do come up in some of the books I've reviewed.) Harvard Business Review has just published an article that helps managers identify cognitive biases in others -- like the people they work with, or the people who work for them. The article, "The Big Idea: Before You Make That Big Decision . . ." by Daniel Kahneman (one of the originators of the idea of cognitive bias), Dan Lovallo, and Olivier Sibony, is free on the Harvard Business Review website through July 4.

In it, the authors set out a checklist to help managers who are reviewing others' recommendations avoid getting trapped by someone else's cognitive biases. The checklist is easy to reproduce, and I've set it out below. The article's examples, not surprisingly, come from the corporate world, but I have no trouble thinking of parallels from my experience in government and non-profits. The checklist by itself is a good shorthand for critical thinkers to keep in mind, but the article explaining it is well worth reading in full.

Here's the checklist, from the article:
A. Preliminary questions to ask yourself:
1. Is there any reason to suspect errors driven by self-interest in the recommendation?
2. Have the people making the recommendation fallen in love with it?
3. Were there dissenting opinions within the recommending team?

B. Questions to ask the recommenders:
1. Could their read of the situation be over-influenced by salient analogies?
2. Have they considered all the credible alternatives?
3. If they had to make this decision again in a year, what information would they want? Can they get any of it now?
4. Where did their numbers come from?
5. Are the team's assumptions justified, or are they amplified by a halo effect, a tendency to see a story as simpler than it really is?
6. Are the people making a recommendation doing it because they're attached to past decisions?

C. Questions to ask about the proposal:
1. Is the base case overly optimistic?
2. Is the worst case bad enough?
3. Is the recommendation overly cautious?

Some of the considerations here are reminiscent of ideas raised in How to Measure Anything by Douglas Hubbard. Interested readers can click on the link to learn more about that fascinating and complex book.
