Mike Kueber's Blog

March 17, 2012

Sunday Book Review #67 – Thinking Fast and Slow by Daniel Kahneman

Filed under: Book reviews — Mike Kueber @ 6:45 pm

Over a year ago, I blogged about intuition by comparing the viewpoints of two of my favorite authors – Ayn Rand of Atlas Shrugged fame and Malcolm Gladwell of Outliers fame.  Rand was not a fan of intuition and famously said:

  • “As a human being, you have no choice about the fact that you need a philosophy.  Your only choice is whether you define your philosophy by a conscious, rational, disciplined process of thought and scrupulously logical deliberation – or let your subconscious accumulate a junk heap of unwarranted conclusions, false generalizations, undefined contradictions, undigested slogans, unidentified wishes, doubts and fears, thrown together by chance, but integrated by your subconscious into a kind of mongrel philosophy and fused into a single solid weight: self-doubt, like a ball and chain in the place where your mind’s wings should have grown.”

By contrast, Gladwell in his book Blink describes intuition as mental processes that work rapidly and automatically from relatively little information – something he calls thin-slicing.  He believes that many spontaneous decisions are as good as—or even better than—carefully planned and considered ones.  Gladwell believes experts are especially able to thin-slice, but this ability can be corrupted by their likes and dislikes, prejudices, and stereotypes (even unconscious ones).  Experts can also be overloaded by too much information (e.g., analysis paralysis).

Thinking Fast and Slow takes essentially the same position as Gladwell’s book Blink, but it is much more in-depth and comprehensive.  Gladwell is a relative dilettante on the subject compared to Kahneman, who has won a Nobel Prize in Economics and has been studying decision-making his entire life.

Kahneman does an especially good job in describing how fast and efficient intuition is, which he calls System 1.  But he also vividly describes the slower, lazy System 2, which monitors the functioning of System 1 and overrules it where necessary.  This overruling of System 1 by System 2 is necessary because, although System 1 is generally accurate, it makes lots of fundamental mistakes.

Individuals can’t do much to make their System 1 decisions more accurate, so ultimately they can improve their thinking and decision-making only by training their System 2 to be vigilant for common System 1 mistakes and to stop being so lazy.  The book is replete with examples of common System 1 errors, such as:

  • WYSIATI (what you see is all there is).  Instead of thinking about how limited our knowledge is, we assume that we already know all that is needed to make a decision.
  • Priming and framing.  Certain words prime our thinking, and the framing of an issue can control our decision.
  • The law of small numbers.  We draw confident conclusions even when our sample is too small to support them.  Our bias is toward confidence instead of doubt.
  • Anchors.  Our thinking is significantly affected by the anchor number (starting point).  That is why you feel better about buying a $20 shirt that was previously $40 as compared to a $15 shirt that was previously $25, even though both shirts are essentially the same.

As I said, the book is filled with too many examples to mention.  The example that totally confuses me comes from a chapter called “Causes Trump Statistics” and involves Bayesian inference:

  • A cab was involved in a hit-and-run accident at night.  Two cab companies, the Green and the Blue, operate in the city.  You are given the following data:
    • 85% of the cabs in the city are Green and 15% are Blue.
    • A witness identified the cab as Blue.  The court tested the reliability of the witness under the circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colors 80% of the time and failed 20% of the time.
  • What was the probability that the cab involved in the accident was Blue rather than Green?

According to Kahneman, “The two sources of information can be combined by Bayes’s rule.  The correct answer is 41%.  However, you can probably guess what people do when faced with this problem: they ignore the base rate and go with the witness.  The most common answer is 80%.”  Count me the same as most people.
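The arithmetic behind Kahneman’s 41% can be worked out in a few lines.  This little sketch (my own illustration, not from the book) applies Bayes’s rule to the cab problem:

```python
# The cab problem: what is the probability the cab was Blue,
# given that the witness said it was Blue?
p_blue = 0.15     # base rate: 15% of cabs in the city are Blue
p_green = 0.85    # base rate: 85% of cabs are Green
p_correct = 0.80  # the witness identifies either color correctly 80% of the time

# P(witness says Blue AND cab is Blue) -- a correct identification
says_blue_is_blue = p_correct * p_blue          # 0.80 * 0.15 = 0.12

# P(witness says Blue AND cab is Green) -- a mistaken identification
says_blue_is_green = (1 - p_correct) * p_green  # 0.20 * 0.85 = 0.17

# Bayes's rule: of all the times the witness says "Blue" (0.29),
# how often is the cab actually Blue?
p_blue_given_says_blue = says_blue_is_blue / (says_blue_is_blue + says_blue_is_green)

print(round(p_blue_given_says_blue, 2))  # 0.41
```

In other words, the witness will say “Blue” 29% of the time, but will be right in only 12 of those 29 cases – about 41%.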

Early in his career, Kahneman was asked by the Israeli government to create a high school textbook for teaching kids how to improve their decision-making.  For a bunch of bureaucratic reasons, the project was never completed.  Sounds like a great idea to me.  Thinking Fast and Slow is one of the most interesting, insightful books that I have read in a long time, and I believe most kids would benefit immensely from being exposed to its concepts.


10 Comments »

  1. you would go with zero… really? obviously while the percentage was low that it was a blue cab, it clearly wasn’t zero…

    my dad always said no lawyer would want me on a jury because i turn off my emotions and use data and logic – as a trial lawyer he always sought people that emoted so he could use their emotions in his client’s favor.

    q

    Comment by Q — March 18, 2012 @ 3:47 pm | Reply

    • Q,

      No, I didn’t go with zero. I went w/ 80%. The 41% reasoning was one layer too deep for me.

      Comment by Mike Kueber — March 18, 2012 @ 6:02 pm | Reply

  2. […] of Education – I have a suggestion that I wish you would consider.  While reading a new book about decision-making called Thinking Fast and Slow by Daniel Kahneman, I was intrigued by the author’s suggestion […]

    Pingback by An open letter to State Board of Education member Ken Mercer « Mike Kueber's Blog — March 29, 2012 @ 10:17 pm | Reply

  3. […] This thesis is very similar to one that described by Daniel Kahneman in Thinking Fast and Slow (Book Review #67).  A major difference, however, is that Kahneman believes that humans can train their strategic […]

    Pingback by Sunday Book Review #73 – The Righteous Mind by Jonathan Haidt « Mike Kueber's Blog — May 5, 2012 @ 8:13 pm | Reply

  4. […] It seems that evolutionary biology increasingly is providing us with solid explanations for human behavior.  And, although this biology provides initial impulses for individuals to act in a certain way, an awareness of what is going on can enable the rational part of the brain to override the instinctual part.  (For a previous posting on this subject, see Thinking Fast and Thinking Slow.)  […]

    Pingback by Hot women and their “bad boys” « Mike Kueber's Blog — May 24, 2012 @ 6:08 pm | Reply

  5. […] something that has greatly interested me since reading Daniel Kahneman’s Thinking Fast and Slow, a book that analyzes decision-making (both intuitive and rational) and suggests ways to improve it.  I […]

    Pingback by Sunday Book Review #112 – The Art of Thinking Clearly by Rolf Dobelli | Mike Kueber's Blog — November 25, 2013 @ 2:23 am | Reply

  6. […] are numerous psychological studies assuring us that, although people may talk altruistically, they are fundamentally directed by their […]

    Pingback by Voting against your interests | Mike Kueber's Blog — December 27, 2013 @ 8:17 pm | Reply

  7. […] Prize-winning economist Daniel Kahneman in his book Thinking Fast and Slow discusses how the brain tends to think quickly and instinctually with amazing success, but sometimes […]

    Pingback by Sunday Book Review #131 – Social by Matthew Lieberman | Mike Kueber's Blog — April 8, 2014 @ 2:33 am | Reply

  8. […] findings provide further support for Daniel Kahneman’s assertion that people need to energetically overrule their oft-mistaken, intuitive System 1 […]

    Pingback by Poker faces | Mike Kueber's Blog — May 1, 2014 @ 12:29 am | Reply

  9. While reading a new book about Kahneman and Amos Tversky – Michael Lewis’s The Undoing Project, I learned more about ignoring the base rate, and now understand why the answer is 41%. Specifically, the green cabs are 85% of the population, and the error rate is 20%, so the witness will incorrectly identify a green cab as blue 17% of the time. Blue cabs are 15% of the population, and the correct rate is 80%, so the witness will correctly identify a blue cab 12% of the time. Thus, a cab will be identified as blue 29% of the time, and the identification will be correct only 41% of the time.

    Comment by Mike Kueber — January 19, 2017 @ 2:55 pm | Reply


