The Problem of Consciousness: Who Controls Our Thoughts?

Whatever you may think, it’s not a given that you truly know your own thoughts. English philosopher and writer Keith Frankish explains how the problem of consciousness is addressed today in psychology and philosophy, why we are often mistaken about our own beliefs, and whether we can truly be responsible for our decisions if our understanding of our own thoughts and actions is a product of self-interpretation, and often an incorrect one.

Do We Really Know Our Own Thoughts?

Do you think racial stereotypes are false? Are you sure? I’m not asking whether the stereotypes are actually false, but whether you are sure about what you believe. This question may seem odd. After all, we all know what we think, right?

Most philosophers who study the problem of consciousness would agree, believing that we have privileged access to our own thoughts that is largely immune to error. Some claim that we have an “inner sense” that monitors the mind just as our external senses monitor the world. But there are exceptions. The mid-20th-century behaviorist philosopher Gilbert Ryle argued that we learn about our own minds not from an inner sense, but by observing our own behavior, and that our friends might know our minds better than we do ourselves. (Hence the joke: two behaviorists have just had sex; afterward, one turns to the other and says, “That was great for you, dear. How was it for me?”)

Modern philosopher Peter Carruthers offers a similar view (though for different reasons), arguing that our ideas about our own thoughts and decisions are products of self-interpretation and are often mistaken.

Evidence from Social Psychology

There is evidence for this in experimental work in social psychology. It’s well known that people sometimes think they have beliefs that they actually don’t. For example, when given a choice between several identical items, people tend to pick the one on the right. But when asked why they chose it, they invent reasons, claiming that it seemed to have a nicer color or was of better quality. Similarly, if someone performs an action in response to a prior (now forgotten) suggestion, they will make up a reason for doing it. It appears that subjects are engaged in unconscious self-interpretation. They have no real explanation for their actions (choosing the right side, responding to a suggestion), so they infer a likely reason and attribute it to themselves. They don’t realize they are interpreting, but explain their behavior as if they were truly aware of its causes.

Other studies support this explanation. For example, if people are instructed to nod their heads while listening to a recording (supposedly to test headphones), they express more agreement with what they hear than if they are told to shake their heads side to side[1]. And if they are required to choose between two items they previously rated as equally desirable, they later say they prefer the one they chose[2]. Again, it seems they subconsciously interpret their own behavior, taking their nodding as a sign of agreement and their choice as a revealed preference.

The Interpretive Sensory-Access Theory

Based on such evidence, Carruthers makes a strong case for an interpretive view of self-knowledge, set out in his book The Opacity of Mind (2011). It starts with the claim that humans (and other primates) have a special mental subsystem for understanding the thoughts of others, which, drawing on observations of people’s behavior, quickly and unconsciously generates conclusions about what they think and feel (evidence for such a “mindreading” system comes from various sources, including the speed with which infants develop an understanding of the people around them). Carruthers argues that this same system is responsible for our knowledge of our own minds. People do not develop a second, inward-looking “mindreading” system (an inner sense); rather, they gain self-knowledge by turning the outward-looking system onto themselves. And because the system is outward-looking, it has access only to sensory channels and must draw its conclusions from those alone.

The reason we know our own thoughts better than those of others is simply that we have more sensory data to draw on: not only our perception of our own speech and behavior, but also our emotional reactions, bodily sensations (pain, limb position, and so on), and a rich variety of mental images, including a steady stream of inner speech (there is convincing evidence that mental images engage the same brain mechanisms as perception and are processed in a similar way). Carruthers calls this the Interpretive Sensory-Access (ISA) theory, and he supports it with a wealth of experimental evidence.

Surprising Consequences of the ISA Theory

The ISA theory has several striking implications. One is that (with some exceptions) we do not have conscious thoughts and do not make conscious decisions. If we did, we would know about them directly, not through interpretation. The conscious events we undergo are all sensory states of some kind, and what we take to be conscious thoughts and decisions are really sensory images, in particular episodes of inner speech. These images may express thoughts, but they need to be interpreted.

Another consequence is that we can be sincerely mistaken about our own beliefs. Returning to my question about racial stereotypes: I assume you said you believe they are false. But if the ISA theory is correct, you cannot be sure that you really think so. Research shows that people who sincerely say that racial stereotypes are false often continue to behave as if they were true when they are not paying attention to what they are doing. Such behavior is usually described as a manifestation of implicit bias that conflicts with a person’s explicit beliefs. But the ISA theory offers a simpler explanation: people think the stereotypes are true but also believe that it is unacceptable to admit this, so they say the stereotypes are false. Moreover, they say the same thing to themselves in inner speech and mistakenly interpret themselves as believing it. They are hypocrites, but not conscious hypocrites. Maybe we all are.

If all our thoughts and decisions are unconscious, as the ISA theory implies, then moral philosophers have a lot of work to do, since we tend to think that people cannot be held responsible for their unconscious attitudes. Accepting the ISA theory does not mean abandoning responsibility, but it does mean radically rethinking what it involves.

Sources

  1. Wells, G. L., & Petty, R. E. (1980). The effects of overt head movements on persuasion: Compatibility and incompatibility of responses. Basic and Applied Social Psychology, 1(3), 219–230.
  2. Brehm, J. W. (1956). Postdecision changes in the desirability of alternatives. Journal of Abnormal and Social Psychology, 52(3), 384–389.
