A Field Guide to Thinking Errors

Thinking Errors: Part Three

Posted: September 15, 2015 Written by: W.B. “Bud” Kirchner

This article is all about (still more) reasons we need to be suspicious of our decision-making skills.

I suppose it is possible that, after reviewing the list of biases I flagged as most likely in Part Two of this series ("The Ironic Magnitude of Cognitive Biases"), and perhaps even after my reference in that post to the roughly 150 identified cognitive biases, someone could conclude there is not that much to worry about.

Let me burst that balloon with an overview of several other categories of "thinking errors." These are still more land mines riddling our thought/decision landscape. My sources, in addition to some personal "favorites" I have learned the hard way, are noted in the categories below.

Logical Fallacies

As I stated before, I have drawn many arbitrary lines – for example, there is overlap between the logical fallacies and the social errors below:

Logical fallacies – These are (perhaps overly) simply defined as fake or deceptive arguments. Given their nature, I have used more traditional descriptions here, relying largely on UNIV 130 University Seminar, A Short Course in Intellectual Self-Defense and The Progymnasmata. Again, I have focused on the business-mind perspective, so these are my chosen subset.

  • Ad Hominem Argument
    • The fallacy of attempting to refute an argument by attacking the opposition’s personal character or reputation.
  • Bandwagon
    • The fallacy of arguing that because “everyone” supposedly thinks or does something, it must be right.
  • Begging the Question
    • Falsely arguing that something is true by repeating the same statement in different words.
  • The Complex Question
    • The fallacy of demanding a direct answer to a question that cannot be answered without first analyzing or challenging the basis of the question itself.
  • “E” for Effort
    • The contemporary fallacy that something must be right or valuable simply because someone has put so much sincere good faith effort or even sacrifice into it.
  • Guilt by Association
    • The fallacy of trying to refute or condemn someone’s standpoint, arguments or actions by evoking the negative ethos of those with whom one associates or of a group to which he or she belongs.
  • The Half Truth (also Card Stacking, Incomplete Information)
    • The fallacy of telling the truth but deliberately omitting important key details in order to falsify the larger picture and support a false conclusion.

Now back to my Field Guide to Thinking Errors approach:

Statistics

  • Complications generated by the almost universal misunderstanding of statistics.
  • Confusing correlation with cause and effect.
  • Treating opinion as fact.
  • Confusing relative vs. absolute risk (see the short sketch after this list).
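
To see why that last confusion matters, here is a minimal sketch in Python; the helper and the numbers are my own hypothetical illustration, not drawn from any of the sources above. It shows how the same change in risk sounds dramatic in relative terms yet modest in absolute terms:

    def risk_summary(baseline_risk: float, treated_risk: float) -> str:
        """Compare two risks in both absolute and relative terms."""
        absolute_reduction = baseline_risk - treated_risk        # in percentage points
        relative_reduction = absolute_reduction / baseline_risk  # as a fraction of baseline
        return (f"Absolute risk reduction: {absolute_reduction:.1%}; "
                f"relative risk reduction: {relative_reduction:.0%}")

    # A headline claiming a treatment "cuts risk in half" may describe a drop
    # from 2% to 1% - a 50% relative reduction, but only one percentage point.
    print(risk_summary(baseline_risk=0.02, treated_risk=0.01))
    # Output: Absolute risk reduction: 1.0%; relative risk reduction: 50%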

Philosophy

  • Falsifiability, as described by Karl Popper, holds that we can never prove a universal claim true, only false (think “All swans are white” – until you see a black one). However (and this is the error), not being able to prove that something is wrong does not mean it is right.
  • Thinking the glass is always half full.
  • Thinking in absolute terms.
  • Judging a position’s merit by how many people hold it (it does not depend on that).

Memory

  • Unusual events are better remembered than usual ones.
  • Nobody ever caught a fish as big as they remembered.
  • We tend to forget unpleasant events more quickly than pleasant ones.
  • Almost everyone almost always “knew it all along” – called 20/20 hindsight.
  • Failure is blamed on external forces while success is attributed to internal forces.

Behavioral Economics

  • Nobody understands how to determine utility value.
  • We are bad at estimating both value and odds.
  • We are suckers for the default position.
  • We think value is based on what we compare an item to.
  • Loss aversion outweighs gain perception – losses loom larger than equivalent gains.

Heuristics

  • Labeling things gives credibility (I can’t help but think of a blog using ‘brain’ in the title).
  • Basing a decision on an emotional reaction (affect).
  • Emotions distorting our view of the world.
  • Focus on obvious solution/cause.
  • A mistake from action hurts more than one from inaction.
  • You get what you pay for.

Social

  • Success has many fathers – failure is an orphan.
  • Overly personalizing success or failure.
  • Distorting issues – answering the question you have an answer to, not the one you were asked.
  • Shifting comparisons.
  • Overgeneralizing from isolated cases.

In closing, because this category is so relevant to the theme of our Business Brain Model℠, I will go into more detail with an example of decisions and the related errors.

This example is based on Dan Gilbert’s TED Talk, “Exploring the Frontiers of Happiness.” He starts with his version of Bernoulli’s formula:

Expected gain = odds of gain × value of gain

Then some illustrations of where such a simple analysis can break down when we miscalculate (a short sketch follows the list):

  • Odds: because we base them on what we’re familiar with or can easily picture, whether from the media or our own experiences.
  • Value: because we compare the scenario to the past instead of to the possibilities, and the comparisons we make at the time of purchase do not reflect our actual experience.
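
To make the arithmetic concrete, here is a minimal sketch in Python of the formula quoted above; the numbers are hypothetical (mine, not Gilbert’s), chosen only to show how misestimating either factor skews the expected gain:

    def expected_gain(odds_of_gain: float, value_of_gain: float) -> float:
        """Bernoulli's formula as quoted above."""
        return odds_of_gain * value_of_gain

    accurate = expected_gain(odds_of_gain=0.10, value_of_gain=1000)        # 100.0
    # Miscalculated odds: vivid, easily imagined outcomes feel more likely.
    inflated_odds = expected_gain(odds_of_gain=0.30, value_of_gain=1000)   # 300.0
    # Miscalculated value: comparing to the past rather than the possibilities.
    inflated_value = expected_gain(odds_of_gain=0.10, value_of_gain=3000)  # 300.0

    print(accurate, inflated_odds, inflated_value)

Either error alone is enough to triple the apparent payoff of the same decision.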

Ulysses S. Grant the Army general was a Civil War hero who commanded all the U.S. armies, and Grant the 18th president of the United States led the Reconstruction effort. Yet Grant the civilian was a failed investor who lost his life savings when “he invested in a brokerage firm that went bankrupt….” So what happened to Grant’s judgment? I am not sure whether someone asked or he simply offered an assessment, but in his words: “My failures have been errors in judgment, not of intent.”

Ulysses – I know how you feel!

About the Author: W.B. “Bud” Kirchner is a serial entrepreneur and philanthropist with more than 50 years of business success. He is not a scientist or an academic but he does have a diversified exposure to neuroscience, psychology and related areas. Generally speaking, the ideas he expresses here are business-angled expansions of other people’s ideas, so when possible, he will link to the original reference.