This Nobel Prize-Winning Psychologist Reveals the Cognitive Biases That Lead to Bad Decisions


 In the fall of 1969, Amos Tversky and Daniel Kahneman—two rising stars in the psychology department at the Hebrew University of Jerusalem—formed a formidable friendship that would change how we think about how we think.

Together, the pair would create the field of behavioral economics and revolutionize large parts of cognitive psychology.

After Tversky died in 1996, Kahneman carried the mantle and, in 2002, was awarded the Nobel Memorial Prize in Economic Sciences.

Central to Kahneman’s revolutionary work was the discovery of cognitive biases that affect our decision-making.

Here are five major cognitive biases that lead to bad decisions in life and work.

Amos Tversky and Daniel Kahneman in the late 1970s, photographed in the garden of Tversky’s house in Stanford, California.

5 Cognitive Biases That Affect Your Decision-Making

1. Anchoring

Anchoring refers to the idea that we are easily swayed by irrelevant information presented to us prior to making a decision.

In a classic experiment, Kahneman and Tversky recruited students from the University of Oregon as subjects, spun a wheel of fortune, and asked them to write down the number on which the wheel stopped. 2

The catch, however, was that the wheel of fortune had been rigged to stop only at 10 or 65, rather than at any number between 0 and 100.

Kahneman and Tversky then asked the subjects two questions: Is the percentage of African nations among UN members larger or smaller than the number you just wrote? What is your best guess of the percentage of African nations in the UN?

The results were astonishing.

The average estimate of the participants who saw 65 on the wheel of fortune was 45%, whereas those who saw 10 gave a lower average estimate of 25%.

In other words, the subjects were ‘anchored’ by the wheel of fortune numbers, which swayed their final estimates.
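One common way to model this result is ‘anchoring and adjustment’: people start from the anchor and adjust only partway toward their own best guess. Here is a minimal Python sketch of that idea; the true value, adjustment rate, and noise level are made-up illustrative parameters, not figures from the study:

```python
import random

def estimate_with_anchor(anchor, true_value=30, adjust=0.5, noise_sd=8):
    # Start at the anchor, then adjust only partway toward one's own
    # best guess of the true value, plus some individual noise.
    partial = anchor + adjust * (true_value - anchor)
    return partial + random.gauss(0, noise_sd)

random.seed(42)
for anchor in (10, 65):
    estimates = [estimate_with_anchor(anchor) for _ in range(10_000)]
    mean = sum(estimates) / len(estimates)
    print(f"anchor {anchor}: average estimate ~{mean:.0f}%")
```

Because the adjustment is insufficient, the group anchored at 65 ends up with a far higher average estimate than the group anchored at 10, mirroring the direction of the experimental result.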

Just like the subjects in this experiment, we tend to fall prey to anchoring on a daily basis.

For example, you’re more likely to buy a “special deal” sandwich for $5 if you previously saw another sandwich that cost $20.

Even though the price of the more expensive sandwich is irrelevant to your buying decision, the cognitive bias of anchoring will lead you to perceive the lower-priced sandwich as a bargain, when it may not be.

2. The availability heuristic

The availability heuristic refers to our tendency to make judgments based on information that can be easily recalled from memory.

For example, let’s say you’re driving on a highway at 70 mph like everyone else, but then you see a catastrophic accident and the other motorists slow down to 50 mph.

Because the possibility of an accident is more ‘available’ in your mind, you’re more likely to slow down to 50 mph as well, even though the probability of an accident occurring remains unchanged.

The availability heuristic also distorts how we judge the likelihood of major life events, like divorce. As noted by Kahneman:

“In judging the likelihood that a particular couple will be divorced, for example, one may scan one’s memory for similar couples which this question brings to mind. Divorces will appear probable if divorces are prevalent among the instances that are retrieved in this manner.” 2

In short, the availability heuristic leads to bad decision-making because misleading information tends to come to mind more easily than accurate information.

3. The representativeness heuristic

The representativeness heuristic is a cognitive bias that leads us to make judgments based on how closely something resembles a stereotype or prototype we hold in mind. 3

A classic example of this cognitive bias in action is the recruitment process of most organizations around the globe.

Whilst interviewing and assessing job applicants, hiring managers are more likely to hire a person who fits the stereotype of someone who typically holds that job, even when another applicant of a different gender or race is better suited for the role.

Smart people are especially susceptible to the trap of the representativeness heuristic: for example, Kahneman and Tversky found that even experienced statisticians tend to assume that a small sample of a large population is more representative of that population than it actually is.
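This bias (sometimes called belief in the ‘law of small numbers’) is easy to demonstrate with a quick simulation. The sketch below, using made-up numbers, draws repeated samples from a large population and shows how wildly the means of small samples swing compared with larger ones:

```python
import random
import statistics

random.seed(0)
# A large population of scores (mean 100, sd 15) -- purely illustrative.
population = [random.gauss(100, 15) for _ in range(100_000)]

for n in (5, 50, 500):
    # Draw many samples of size n and see how far their means stray
    # from the population mean of ~100.
    sample_means = [statistics.mean(random.sample(population, n))
                    for _ in range(2_000)]
    print(f"n={n:>3}: sample means have sd ~{statistics.stdev(sample_means):.1f}")
```

A five-person sample routinely misses the population mean by several points, so any ‘pattern’ it shows is mostly noise, yet it still feels representative.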

4. Regression to the mean

Regression to the mean is the statistical tendency of extreme outcomes to be followed by more ordinary ones. Within the context of psychology, the bias lies in our failure to account for this tendency, which leads us to make flawed predictions about the future.

For example, whilst helping the Israeli Air Force to train fighter pilots, Kahneman noticed that instructors—like most coaches and managers—believed that criticism was more useful than praise.

Their basis for this belief was that whenever they praised a pilot for having performed well, he would typically perform worse the next time out, whereas a pilot who was criticized would typically perform better.

Kahneman, however, noticed that both groups of pilots were simply regressing to the mean: an unusually good flight tended to be followed by a worse one, and an unusually bad flight by a better one, regardless of the instructor’s praise or criticism.
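Kahneman’s point is easy to reproduce in a quick simulation. The sketch below (all numbers made up) gives each pilot a fixed skill level plus random day-to-day luck, praises the best first flights, criticizes the worst, and then looks at the next flight:

```python
import random

random.seed(1)

def fly(skill):
    # A flight's quality = the pilot's stable skill + random luck that day.
    return skill + random.gauss(0, 10)

# Hypothetical pilots with fixed skill levels; no one's skill changes.
pilots = [random.gauss(50, 5) for _ in range(10_000)]
first_flights = [(skill, fly(skill)) for skill in pilots]
first_flights.sort(key=lambda pair: pair[1])

criticized = first_flights[:1_000]   # worst first flights get criticism
praised = first_flights[-1_000:]     # best first flights get praise

for label, group in (("criticized", criticized), ("praised", praised)):
    before = sum(perf for _, perf in group) / len(group)
    after = sum(fly(skill) for skill, _ in group) / len(group)
    print(f"{label}: first flight {before:.1f} -> next flight {after:.1f}")
```

The criticized group improves and the praised group gets worse on the next flight, even though the feedback changed nothing; the extreme luck simply washed out.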

Failure to account for regression to the mean often leads to faulty predictions based on misleading information. Here’s an example explained by Kahneman in his book, Thinking, Fast and Slow:

“Depressed children treated with an energy drink improve significantly over a three-month period. I made up this newspaper headline, but the fact it reports is true: if you treated a group of depressed children for some time with an energy drink, they would show a clinically significant improvement.

It is also the case that depressed children who spend some time standing on their head or hug a cat for twenty minutes a day will also show improvement.”

5. Hindsight bias

Hindsight bias refers to our tendency to look back at past events, adjust our worldview to accommodate the surprise, and insist that we “knew it all along.”

This cognitive bias is especially pervasive amongst experts, who tend to look back at surprising election results and technological breakthroughs, and reframe their wrong predictions to match reality.

Within organizations, hindsight bias rewards and promotes reckless risk-takers whose wild gambles happen to pay off. Kahneman explains this phenomenon in Thinking, Fast and Slow:

“Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.”

Don’t Trust Your Intuition

We’d like to think that our expertise and intuition are enough to make smart decisions, but Kahneman’s work on cognitive biases carries a stark warning: left to its own devices, the human mind tends to make systematic errors that lead to costly mistakes and bad decisions.

It’s a humbling reminder that even the smartest person in the room can make the worst decision if they rely on intuition alone.

Ultimately, our ability to identify and manage these cognitive biases will shape the quality of our decisions and our lives.

FOOTNOTES

1. Lewis, M. (2017). The Undoing Project: A Friendship That Changed Our Minds. W. W. Norton & Company.

2. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

3. Kahneman, D., & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3(3), 430–454.