Cognitive Biases in Profiling and Lie Detection: Key Types and Examples

The Wikipedia list includes 175 cognitive biases. Of course, this is far from a complete catalog of the ways our brain deceives itself. Such self-deception comes easily, since a significant share of our mental processes occurs unconsciously, which makes it possible to appeal to those underlying processes directly, bypassing the conscious mind.

Cognitive biases play a major role in the work of profilers and polygraph examiners, as we often fall under their influence and make mistakes when evaluating people and information about them.

The long Wikipedia list of cognitive biases is organized rather loosely into four thematic groups:

  • Biases related to behavior and decision-making
  • Biases related to probabilities and stereotypes
  • Socially conditioned biases
  • Biases related to memory errors

This classification doesn’t clearly explain why these biases arise or how they can be exploited. Many biases also appear more than once under different names.

There is another way to classify biases, one that focuses on the cause of the mental malfunction behind each incorrect perception of reality. Grouped by cause, biases again fall into four groups, but now the groups become more logical and understandable.

Four Problems That Cause Cognitive Biases

  1. Too much information
  2. Not enough meaning (ambiguity)
  3. The need to act quickly
  4. Filtering information for memory: the brain tends to prefer remembering a simpler, clearer concept over a complex and ambiguous one, even when the latter is more accurate and objective

Perhaps the most interesting is the first group: biases related to information overload. The other groups are conceptually connected to it. Instant filtering, censorship, and selection of information for memory seem to be the central problems of the modern era, when the amount of incoming information is overwhelming. This is likely what produces most cognitive biases and misperceptions of reality.

Subgroups of Biases Related to Information Overload

1. We Notice Things That Are Already Familiar or Frequently Repeated

This is a large group of biases that television often exploits. Repeating the same message many times almost guarantees that a person will miss a detail mentioned only once, and a lie repeated often enough is more likely to be believed.

Examples:

  • Availability heuristic — judging as more probable what is more easily recalled from memory.
  • Attentional bias — perception depends on recurring thoughts; if you constantly think about a topic, you notice news about it more often.
  • Illusion of truth effect — tendency to believe information is true if we’ve heard it many times.
  • Mere exposure effect — tendency to express unwarranted liking for something simply because it’s familiar.
  • Context-dependent forgetting (cue-dependent forgetting) — difficulty recalling information without the context in which it was learned; a single cue can trigger a whole chain of memories. This works emotionally as well: information is easier to recall if you evoke the “anchor” emotions linked to it.
  • Frequency illusion (Baader-Meinhof phenomenon) — after learning about something new, it seems to appear everywhere. This happens because your mind starts tracking mentions of it, reinforcing the impression that it’s suddenly everywhere.
  • Empathy gap — underestimating the influence of visceral factors (hunger, thirst, sexual desire, addiction, pain, strong emotions) on one’s own behavior. People may rationalize their actions, ignoring the true subconscious cause.
  • Omission bias — judging harmful inaction less harshly than equally harmful action, and thus underestimating the consequences of not acting. For example, parents who refuse vaccination prefer the risk of disease complications over the much lower risk of vaccine side effects.
  • Base rate fallacy — ignoring the overall frequency of an event and focusing on case-specific information. For example, if a breathalyzer gives a false positive 5% of the time but never a false negative, and a driver tests positive, what is the actual probability they’re drunk? Intuition says 95%, but the answer depends on the base rate of drunk drivers; see the worked example after this list.
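
A minimal sketch of that last calculation using Bayes’ theorem. The 1-in-1000 base rate of drunk drivers below is an illustrative assumption, not a figure from the article; with it, a positive test corresponds to only about a 2% chance that the driver is actually drunk:

```python
# Base rate fallacy: Bayes' theorem applied to the breathalyzer example.
# The base rate below is an assumption chosen for illustration.

p_drunk = 0.001           # assumed base rate: 1 in 1000 drivers is drunk
p_pos_given_drunk = 1.0   # the test never gives a false negative
p_pos_given_sober = 0.05  # the test gives a false positive 5% of the time

# Total probability of a positive test across both groups of drivers.
p_pos = p_pos_given_drunk * p_drunk + p_pos_given_sober * (1 - p_drunk)

# Posterior probability that a driver who tested positive is drunk.
p_drunk_given_pos = p_pos_given_drunk * p_drunk / p_pos

print(f"P(drunk | positive) = {p_drunk_given_pos:.1%}")  # ~2.0%, not 95%
```

The intuitive answer of 95% ignores the fact that sober drivers vastly outnumber drunk ones, so even a small false-positive rate swamps the true positives.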

2. We Notice and Remember Unusual, Bizarre, or Funny Images More Than Ordinary Ones

The brain exaggerates the importance of unusual or surprising information, while ignoring mundane or expected information, even if it’s important.

Examples:

  • Von Restorff effect (isolation effect) — in a series of similar items, the one that stands out is easier to remember (e.g., a number among letters).
  • Picture superiority effect — images are easier to remember than words; confirmed by many studies.
  • Self-reference effect — people remember information better if it relates to them personally. In advertising, people respond better if the ad features someone similar to them. Birthdays close to one’s own are easier to remember.
  • Negativity bias — negative things are perceived more strongly than positive ones of equal intensity. This applies to thoughts, emotions, relationships, traumatic events, and so on. TV audiences pay more attention to negative news, and one negative trait can overshadow many positive ones in a person. Interestingly, some studies show this bias fades with age, and older people may even develop a positivity bias, perceiving positive information more strongly and treating negative information as a given.

3. We Notice What Changes, Not What Stays the Same

The brain judges new information by the direction of change relative to what came before (improvement or deterioration), rather than assessing it objectively on its own, independent of previous data.

Examples:

  • Anchoring effect — the tendency for numerical estimates to be pulled toward an initial reference value. Used in retail (e.g., quoting a price for several units at once, which lifts sales even when there is no bulk discount) and in donation requests (suggesting a large example donation raises the average donation).
  • Money illusion — tendency to perceive the nominal value of money rather than its real value, leading to misjudgment of price changes and inflation.
  • Framing effect — different reactions to the same choice depending on how it’s presented (as a gain or a loss). For example, penalties for lateness are more effective than rewards for punctuality. In court, defendants are more likely to confess if it’s framed as a step toward release rather than the last step before imprisonment.
  • Weber–Fechner law — the perceived intensity of a stimulus grows as the logarithm of its physical intensity. For example, a chandelier with eight bulbs seems brighter than one with four by about the same step that four seems brighter than two, even though the absolute difference in light output is twice as large (see the short derivation after this list).
  • Conservatism (psychological) — resistance to new information that contradicts established beliefs. This bias is broad enough that it could arguably form a category of its own.
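
A short worked version of the chandelier example, using the textbook formulation of the law (the symbols below are standard notation, not taken from the article): perceived intensity $S$ relates to physical intensity $I$ as

$$S = k \ln \frac{I}{I_0},$$

where $I_0$ is the detection threshold and $k$ a constant that depends on the sense being stimulated. The perceived step from four bulbs to eight then equals the step from two to four, because only the ratio of intensities matters:

$$S_8 - S_4 = k \ln \tfrac{8}{4} = k \ln 2 = k \ln \tfrac{4}{2} = S_4 - S_2,$$

even though the absolute increase in light output (four extra bulbs versus two) has doubled.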

4. We Are Drawn to Information That Confirms Our Beliefs

This is a large and important category, also related to how we filter new data. When overwhelmed with information, people mostly select what confirms their opinions.

Examples:

  • Confirmation bias — seeking out and favoring information that confirms one’s beliefs.
  • Choice-supportive bias — tendency to retroactively attribute positive qualities to a chosen option, finding “rational” reasons for the choice after the fact.
  • Selective perception — paying attention to elements that match expectations and ignoring the rest.
  • Ostrich effect — ignoring negative information related to a choice already made.

5. We Notice Mistakes in Others More Than in Ourselves

Even while reading a list of cognitive biases like this one, it seems that others are affected by them more than we are.

Examples:

  • Bias blind spot — recognizing biases in others but not in oneself; well-studied by Emily Pronin.
  • Naive cynicism — expecting others to be more selfish than they really are. The reasoning goes: “I have no biases—if you disagree with me, you’re biased. Your actions reflect your selfish biases.” This is the opposite of naive realism.
  • Naive realism — believing we see the world objectively as it is, and those who disagree are uninformed, irrational, or biased. In science, this means believing that a theory accepted by the scientific community is absolutely true and fully describes reality.

This cause-based classification of the biases related to information overload seems more logical than Wikipedia’s: at least the main causes of the biases are immediately visible. It remains somewhat conditional, though, since many biases can be explained by several causes at once.
