I haven't yet written much about cognitive biases - anchoring, availability, framing, the sunk cost fallacy, to name just a few. (They do come up in some of the books I've reviewed.) Harvard Business Review has just published an article that helps managers identify cognitive biases in others -- like the people they work with, or the people who work for them. The article, "The Big Idea: Before You Make That Big Decision . . ." by Daniel Kahneman (one of the originators of the idea of cognitive bias), Dan Lovallo, and Olivier Sibony, is free on the Harvard Business Review website through July 4.
In it, the authors set out a checklist to help managers who review the work of others avoid getting trapped by someone else's cognitive biases. The checklist is easy to reproduce, and I've set it out below. The article's examples, not surprisingly, come from the corporate world, but I have no trouble thinking of examples from my own experience in government and non-profits. The checklist is a good shorthand for critical thinkers to keep in mind, but the article's explanation of each question is what makes it so helpful.
Here's the checklist, from "The Big Idea: Before You Make That Big Decision . . ." by Daniel Kahneman, Dan Lovallo, and Olivier Sibony:
A. Preliminary questions to ask yourself:
1. Is there any reason to suspect errors driven by self-interest in the recommendation?
2. Have the people making the recommendation fallen in love with it?
3. Were there dissenting opinions within the recommending team?
B. Questions to ask the recommenders:
1. Could their read of the situation be over-influenced by salient analogies?
2. Have they considered all the credible alternatives?
3. If they had to make this decision again in a year, what information would they want? Can they get any of it now?
4. Where did their numbers come from?
5. Are the team's assumptions justified, or are they amplified by a halo effect, a tendency to see a story as simpler than it really is?
6. Are the people making a recommendation doing it because they're attached to past decisions?
C. Questions to ask about the proposal:
1. Is the base case overly optimistic?
2. Is the worst case bad enough?
3. Is the recommendation overly cautious?
Some of the considerations here are reminiscent of ideas raised in How to Measure Anything by Douglas Hubbard. Interested readers can click on the link to learn more about that fascinating and complex book.