Over a year ago, I blogged about intuition by comparing the viewpoints of two of my favorite authors – Ayn Rand of Atlas Shrugged fame and Malcolm Gladwell of Outliers fame. Rand was not a fan of intuition and famously said:
- “As a human being, you have no choice about the fact that you need a philosophy. Your only choice is whether you define your philosophy by a conscious, rational, disciplined process of thought and scrupulously logical deliberation – or let your subconscious accumulate a junk heap of unwarranted conclusions, false generalizations, undefined contradictions, undigested slogans, unidentified wishes, doubts and fears, thrown together by chance, but integrated by your subconscious into a kind of mongrel philosophy and fused into a single solid weight: self doubt, like a ball and chain in the place where your mind’s wings should have grown.”
By contrast, Gladwell in his book Blink describes intuition as mental processes that work rapidly and automatically from relatively little information – something he calls thin-slicing. He believes that many spontaneous decisions are as good as—or even better than—carefully planned and considered ones. Gladwell believes experts are especially able to thin-slice, but this ability can be corrupted by their likes and dislikes, prejudices, and stereotypes (even unconscious ones). Experts can also be overloaded by too much information (e.g., analysis paralysis).
Thinking Fast and Slow takes essentially the same position as Gladwell’s book Blink, but it is much more in-depth and comprehensive. Gladwell is a relative subject-matter dilettante compared to Kahneman, who won a Nobel Prize in Economics and has been studying this subject (decision-making) his entire life.
Kahneman does an especially good job in describing how fast and efficient intuition is, which he calls System 1. But he also vividly describes the slower, lazy System 2, which monitors the functioning of System 1 and overrules it where necessary. This overruling of System 1 by System 2 is necessary because, although System 1 is generally accurate, it makes lots of fundamental mistakes.
Individuals can’t do much to make their System 1 decisions more accurate, so ultimately they can improve their thinking and decision-making only by training their System 2 to be vigilant for common System 1 mistakes and to stop being so lazy. The book is replete with examples of common System 1 errors, such as:
- WYSIATI (what you see is all there is). Instead of thinking about how limited our knowledge is, we assume that we already know all that is needed to make a decision.
- Priming and framing. Certain words prime our thinking, and the framing of an issue can control our decision.
- The law of small numbers. We draw confident conclusions even when our sample sizes are far too small to support them. Our bias is toward confidence instead of doubt.
- Anchors. Our thinking is significantly affected by the anchor number (starting point). That is why you feel better about buying a $20 shirt that was previously $40 as compared to a $15 shirt that was previously $25, even though both shirts are essentially the same.
As I said, the book is filled with too many examples to mention. The example that totally confuses me is called “Causes trump statistics” and involves Bayesian inference:
- A cab was involved in a hit-and-run accident at night. Two cab companies, the Green and the Blue, operate in the city. You are given the following data:
- 85% of the cabs in the city are Green and 15% are Blue.
- A witness identified the cab as Blue. The court tested the reliability of the witness under the circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colors 80% of the time and failed 20% of the time.
- What was the probability that the cab involved in the accident was Blue rather than Green?
According to Kahneman, “The two sources of information can be combined by Bayes’s rule. The correct answer is 41%. However, you can probably guess what people do when faced with this problem: they ignore the base rate and go with the witness. The most common answer is 80%.” Count me the same as most people.
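For the curious, the 41% answer can be checked with a few lines of arithmetic. This is a minimal sketch (the variable names are my own, not from the book) applying Bayes’s rule to the numbers given in the problem:

```python
# Bayes' rule applied to the cab problem.
# Prior: the base rates of cab colors in the city.
p_blue = 0.15   # 15% of cabs are Blue
p_green = 0.85  # 85% of cabs are Green

# Likelihood: the witness identifies colors correctly 80% of the time.
p_says_blue_given_blue = 0.80   # witness correctly says "Blue" for a Blue cab
p_says_blue_given_green = 0.20  # witness mistakenly says "Blue" for a Green cab

# Total probability the witness says "Blue" (both ways it can happen).
p_says_blue = (p_says_blue_given_blue * p_blue
               + p_says_blue_given_green * p_green)

# Posterior: probability the cab was actually Blue, given the testimony.
p_blue_given_says_blue = (p_says_blue_given_blue * p_blue) / p_says_blue
print(f"{p_blue_given_says_blue:.0%}")  # prints 41%
```

The intuition-defying part is the denominator: because Green cabs vastly outnumber Blue ones, a 20% error rate applied to 85% of cabs produces almost as many false “Blue” reports (0.17) as the true ones (0.12), dragging the answer down from the witness’s 80% accuracy to 41%.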
Early in his career, Kahneman was asked by the Israeli government to create a high school textbook for teaching kids how to improve their decision-making. For a bunch of bureaucratic reasons, the project was never completed. Sounds like a great idea to me. Thinking Fast and Slow is one of the most interesting, insightful books that I have read in a long time, and I believe most kids would benefit immensely from being exposed to the concepts.