How Cognitive Biases Distort Our Thinking: Heuristics, Stereotypes, and Decision-Making

Stereotypes vs. Objectivity: Why the Brain Distorts Information and Makes Bad Decisions

Based on an excerpt from author Tom Chatfield’s book

Heuristic rules are simplified mental shortcuts that help us quickly assess situations and make decisions. While these rules evolved to help us survive, in the modern world they often distort our perception of news, slow down our work, and reinforce stereotypes. Here’s how basic heuristics work and what can help us evaluate our experiences more objectively.

Four Types of Heuristic Rules

1. The Affect Heuristic

Imagine you’re hospitalized with a rare, potentially fatal illness and must choose between two experimental therapies. Clinical trials of each therapy involved 20,000 patients. Which would you choose?

  • Method A, which resulted in 4,900 deaths.
  • Method B, which saved lives in 70% of cases.

If you do the math, Method A actually saved more people (75.5% survival rate) than Method B (70%). But for many, the vivid information about 4,900 deaths outweighs the mathematical odds. This is the affect heuristic: the tendency to let emotional reactions guide our choices, even when those emotions are misleading.
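
For readers who want to check the arithmetic, the survival rates follow directly from the figures above:

$$
\text{Method A: } \frac{20{,}000 - 4{,}900}{20{,}000} = \frac{15{,}100}{20{,}000} = 75.5\% \text{ survive} \qquad \text{Method B: } 70\% \text{ survive}
$$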

Psychologist Paul Slovic and others have shown that people’s judgments are often swayed by their likes and dislikes. For example, if you identify as conservative, you’re more likely to favor conservative arguments and dismiss opposing ones, and vice versa for liberals. This isn’t simply dogmatism or lazy thinking; it’s our brain’s way of making the world seem more orderly. If something feels good, we downplay its negatives; if it feels bad, we ignore its positives. When making choices, emotional impact often overrides all other factors.

Affect Heuristic: The tendency to use a simplified decision-making scheme based on the intensity of positive or negative emotional reactions to options.

For example, if you want to donate $10 a month to help save the oceans, which appeal is more persuasive?

  • “Are you willing to donate $10 a month to spread awareness about the ecological degradation of the Pacific Ocean?”
  • “Are you willing to donate $10 a month to protect a pod of dolphins threatened by the ecological degradation of the Pacific Ocean?”

The second appeal is more emotionally compelling, even if it’s manipulative. The hard question (“How can my donation do the most good?”) is replaced by an easier one (“Do I prefer raising awareness or saving dolphins?”).

2. The Availability Heuristic

Which are more common in English:

  • Words that start with the letter “K”
  • Words with “K” as the third letter

Most people pick the first, but the correct answer is the second—there are about three times as many words with “K” as the third letter. It’s just harder to recall them, so our brains default to the easier option. This is the availability heuristic: we judge the importance or likelihood of something based on how easily it comes to mind.

For example, people overestimate the risk of dying in a terrorist attack (because such events get massive media coverage) and underestimate the risk of heart disease or car accidents, even though the latter are far more common. For the average American, the risk of dying from heart disease is about 35,000 times higher than from terrorism.

One vivid story can outweigh real statistics. If a celebrity dies from a rare cancer, many people start fearing that specific cancer more than common ones. The easier it is to recall something, the more likely we think it is. This is why repeated statements, even without evidence, start to feel “true.”

Consider these questions:

  • Do you spend more, less, or about the same amount of time reading as the average person?
  • Do you use your mobile phone more, less, or about the same as the average person?

Most people overestimate their own efforts because they remember their own actions better than others’. In a study of married couples, each partner was asked to estimate their share of household chores. The combined total almost always exceeded 100%—each person overestimated their own contribution because it was easier to recall.

The availability heuristic replaces a complex question (“How do we really split chores?”) with a simpler one (“How easily can I recall my own efforts compared to my partner’s?”).

Recency Effect: The tendency to overvalue recent events because they’re more vivid and easier to remember. For example, if you’re asked to name the five greatest musicians of all time, you’ll likely pick more from the last 50 or 100 years, simply because you know more about them.

This effect is harmless in entertainment, but in areas like politics, technology, or economics, it can distort our understanding. To see the world clearly, we must compensate for our instinct to focus on recent events just because they’re fresh in our minds.

3. The Anchoring Heuristic

Read these two paragraphs and fill in the blanks with an age that seems appropriate:

  • “I used to shop at ‘Tasty Wine’ at 997 High Street, next to ‘Meat Shop’ at 999 and ‘Greenest Veggies’ at 995. It was great to buy a nice bottle of wine on the owner’s recommendation. Sadly, he died suddenly last year at age _____.”
  • “After that, I started shopping at ‘Elite Wine’ at 12 High Avenue, between ‘Veggie Empire’ at 10 and ‘Hair Today, Gone Tomorrow’ at 14. It was great to buy a fine bottle of wine on the owner’s recommendation. Sadly, he died suddenly last week at age _____.”

Which age did you pick for each? If the first number is higher, you’ve experienced the anchoring effect. The large street numbers in the first paragraph unconsciously influenced your estimate, even though they’re irrelevant.

Anchoring Effect: The influence of an initial value or reference point on subsequent judgments, even when it’s unrelated to the topic.

We always judge things in context, not in isolation. For example, a price seems reasonable if it’s lower than an initial high price. That’s why salespeople start with a high price, or why restaurants list expensive dishes to make others seem affordable.

A related phenomenon is the focusing effect: the tendency to pay too much attention to one obvious characteristic, leading to an unbalanced assessment. For example, if your friend wants to move to California just for the weather, they may be ignoring other important factors.

Focusing Effect: The tendency to focus excessively on one striking aspect, neglecting the full range of relevant factors.

Again, a complex question (“Should I move to California?”) is replaced by a simple, emotional one (“How do I feel about California’s weather?”).

4. The Representativeness Heuristic

Consider the famous “Linda problem” from psychology:

Linda is 31, single, outspoken, and very bright. She majored in philosophy and was deeply concerned with issues of discrimination and social justice, participating in anti-nuclear demonstrations. Which is more likely?

  • Linda is a bank teller.
  • Linda is a bank teller and active in the feminist movement.

Many people pick the second option, but mathematically, the first is more likely. The second is a subset of the first, so it can’t be more probable. This is the representativeness heuristic: we judge probability based on how much something matches our stereotypes, not on actual statistics.
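
The “subset” point is a general rule of probability: the chance of two things both being true can never exceed the chance of either one alone. In Linda’s case,

$$
P(\text{bank teller and feminist}) = P(\text{bank teller}) \times P(\text{feminist} \mid \text{bank teller}) \le P(\text{bank teller}).
$$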

Representativeness Heuristic: The tendency to judge the likelihood of something based on how much it fits a familiar pattern or story, rather than objective data.

For example, if you hear about a young, fit Englishman who loves the outdoors and tea, you might guess he works in agriculture. But statistically, he’s much more likely to work in healthcare or social services, simply because more people do. Still, we’re drawn to the stereotype that fits the description.
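
To see why the base rates win, here is a back-of-the-envelope calculation with purely illustrative numbers (they are not from the book). Suppose 1 in 100 workers is in agriculture while 13 in 100 are in health or social care, and suppose the description fits agricultural workers ten times as often (50% versus 5%):

$$
P(\text{agriculture and fits}) = 0.01 \times 0.50 = 0.005 \qquad
P(\text{health/social care and fits}) = 0.13 \times 0.05 = 0.0065
$$

Even when the stereotype fits one group far better, the much larger group remains the better guess.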

Again, a complex question (“How many people work in each field?”) is replaced by a simple, emotional one (“Which stereotype does this person fit?”). Stereotyping is a universal feature of perception, affecting not just strangers but even those we know well. It’s just the tip of the iceberg of social biases, which, combined with structural inequality, can lead to serious injustice.

Conclusion

Heuristics are natural and essential for everyday life. But when misapplied, they lead to cognitive biases and poor judgments. This is especially likely when we’re rushed, inexperienced, overwhelmed with information, manipulated (by advertising, media, or sales), or influenced by stereotypes and social prejudices. Recognizing these mental shortcuts is the first step toward more objective thinking.