What Is Social Consciousness?
Humans have a genuine ability to sense each other's emotional states. Yet we don't arrive at it through deliberate observation pieced together into an intellectual understanding of what's happening with others. (Although sometimes we try, we rarely succeed.) Instead, we rely on finely tuned intuition. It's as if we just know what others are thinking and feeling. Sometimes this knowledge feels so tangible that it seems we directly sense others' thoughts and feelings as some kind of radiation. Of course, that's not really what's happening. But millions of years of evolution have given us the ability to pick up on subtle cues and build detailed models of each other's mental states—mostly intuitively, not consciously.
We attribute the full spectrum of mental content to each other: emotions, intentions, interests, beliefs. The brilliant and complex process of reconstructing another's inner world is called building a model of the mind. In Russian scientific psychology, the English term "Theory of Mind" is commonly translated as "model of the mind" or "model of mental state." This isn't a theory in the intellectual sense; it's a process that happens automatically and inevitably—we simply can't help but do it. But in building models of each other's minds, one component is especially important: reconstructing another's attention.
How can you know if someone will reach for an apple if you don’t know whether they’ve noticed it? And even if you know they’ve noticed the apple, can you predict what they’ll say or do next without understanding the consequences of that attention?
My first step in reconstructing your mental world is to understand that mental processes can be focused on something, that this focus can be narrow or broad depending on circumstances, it can shift from object to object, and it has predictable consequences. Without this, no model of mental state is possible. I need not just a model of the content of the mind, but a model of what the mind is.
If I’m standing in front of you, I can see where you’re looking. An entire scientific field has grown around how the brain processes the direction of another’s gaze. But to reconstruct your inner world, it’s not enough to know where your explicit attention is directed. I also need to figure out your implicit attention. I have to pick up all possible clues from the context, including your movements, facial expressions, words, and my overall knowledge of you. It’s not so important where your gaze is directed or whether I can see your eyes—I need to reconstruct the information that has filtered through your cortical hierarchy and reached higher levels of processing, and understand how these higher levels might influence your behavior.
I guarantee that no one, upon looking at another person, immediately has an intuitive thought like: “The cortical visual pathways of my counterpart are currently processing multiple stimuli; neurons reflecting the shape of the apple have increased activity in response to signals from the frontal lobe; as a result, this increased activity has partially inhibited neighboring neural representations through lateral inhibition based on local interneurons using gamma-aminobutyric acid as a neurotransmitter…”
You could go on like this, but in reality, no one ever attributes true, physiological, neural attention to another person. We don’t need to, especially not in such detail. Instead, my brain builds a much more schematic and efficient model. I intuitively understand: “Right now, this person’s consciousness is perceiving the apple, which could have many different consequences.” Once again, we return to consciousness as a simplified, practical model of attention. But now, the attention schema is used in social intelligence mode: we’re modeling not ourselves, but another person.
The Sally-Anne Test: Understanding False Beliefs
Sally and Anne come to the park with two closed picnic baskets and settle down to relax. After a while, Sally puts her sandwich in basket A and goes to the restroom. While she’s gone, Anne secretly moves the sandwich to basket B and closes the lids again. Which basket will Sally look in first when she returns to find her sandwich? This simple question is commonly used to test a person’s ability to build a model of another’s mental state.
To pass this test, you need to consider the information present in Sally’s inner world. Her knowledge about the sandwich’s location is correct when she puts it in basket A, but becomes false when Anne moves it to basket B. This problem can’t be solved without representing Sally’s mental world as a separate entity that contains information (possibly false) which determines her actions. That’s why this test is sometimes called the “false belief task.” The correct answer: she will look in basket A and see that her sandwich is gone.
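The logic of the false belief task can be made concrete in a few lines of code. This is only an illustrative sketch of my own, not anything from the source: the one essential feature is that the observer keeps a record of Sally's belief that is separate from the state of the world, so the two can diverge when Sally isn't watching.

```python
# Minimal sketch of the Sally-Anne task (illustrative, not the author's
# model). The observer tracks two things: where the sandwich really is,
# and what Sally last saw. Passing the test means answering from
# Sally's belief, not from the world.

def run_sally_anne():
    world = {"sandwich": "A"}          # where the sandwich actually is
    sally_belief = {"sandwich": "A"}   # what Sally saw before leaving

    # Sally leaves; Anne moves the sandwich. The world changes,
    # but Sally's belief does not, because she cannot see the move.
    world["sandwich"] = "B"

    # Sally searches where she *believes* the sandwich is.
    return sally_belief["sandwich"]

print(run_sally_anne())  # prints "A": Sally looks in the wrong basket
```

A child under five, on this picture, answers from `world` rather than from `sally_belief`; the developmental shift is the appearance of that second, separate record.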
Children under five often don’t give the correct answer. In their view, if the sandwich is in basket B, that’s where Sally should look. Why would she open a basket where the sandwich isn’t? When children pass the five-year mark, their social thinking adjusts, and the task becomes intuitively clear to them. As adults, we usually do a pretty good job tracking others’ mental states.
Do Animals Have a Theory of Mind?
Some success on the false belief task has also been shown by chimpanzees. Let's play out the Sally-Anne scenario for them. Sally puts a fruit in box A, leaves, and Anne moves it to box B. Sally returns to get the fruit. By tracking the chimpanzees' eye movements, researchers see that they look more at box A (where Sally left the fruit and where she's likely to look first) than at box B, where the fruit actually is. Apparently, chimpanzees take Sally's mental content into account and anticipate her actions.
Crows also seem capable of solving similar tasks. They often hide food but dislike it when other birds steal their stashes. One bird hides a treat while another watches; then the watcher flies away. The first crow then carefully re-hides the food, probably so it won't be stolen later. The hoarding bird seems to understand that the watcher saw where the food was hidden, and that after the move the watcher's belief will be false: when it returns, it will look in the wrong place.
On the whole, very few non-human animals can solve the false belief task, and even the exceptions above are debated. But I think it's premature to conclude that other animals lack a model of mental state. The false belief task is a very high bar. Watching several boxes with various contents is an intellectually complex task, comparable to a shell game. It's no surprise that only humans can consistently handle it. I'm interested in something simpler: the concept of an inner world. We know that Sally has an inner mental world, and that this world contains information and will guide her behavior based on that information. Do other animals have the same intuitive understanding? Do they know what it means for another to "realize something"?
Scientists who study animal behavior prefer simple explanations. Instead of assuming an animal has a concept of another’s consciousness, it’s easier to suppose the animal has simply learned a set of simple rules. For example, a zebra doesn’t need to know that a lion has noticed it. It just needs to run from anything big and toothy. Stimulus in, reaction out. If a zebra has a large enough database of such associations, it will survive. However, it’s worth noting that this hypothesis, so typical of “stimulus-response” psychology, is actually quite naive.
A large database of learned associations is not the simplest or most effective way to navigate a complex environment. From a computational perspective, a model-based approach would be simpler, since one model can handle a wide variety of situations. For a zebra, it might be easier and computationally cheaper to build a schematic model in which the lion has mental content, objects from the environment enter that content, and when that happens, the lion’s mental model can guide its behavior.
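The computational contrast above can be sketched directly. In this hypothetical comparison (my illustration, not the author's), a stimulus-response table needs one memorized entry per situation, while a tiny model of the predator's attention covers every predator with a single rule:

```python
# Hypothetical comparison of the two strategies from the text.
# Names and scenarios here are invented for illustration.

# Strategy 1: a database of learned associations.
# Every new predator/posture pair needs its own memorized entry,
# so the table grows with the number of situations encountered.
associations = {
    ("lion", "facing_me"): "flee",
    ("lion", "facing_away"): "graze",
    ("hyena", "facing_me"): "flee",
    ("hyena", "facing_away"): "graze",
}

# Strategy 2: a schematic model of the other animal's attention.
# One rule generalizes: if the predator's mental content includes me,
# its behavior may be guided toward me, so I should flee.
def model_based_reaction(predator_present, i_am_in_its_attention):
    if predator_present and i_am_in_its_attention:
        return "flee"
    return "graze"

print(model_based_reaction(True, True))   # flee
print(model_based_reaction(True, False))  # graze
```

The second strategy is cheaper precisely because it stores a model rather than cases: a predator the zebra has never seen before is handled by the same rule, with no new table entry required.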
The idea that a zebra “understands” another animal’s consciousness seems implausible to us, but that’s only because we consider consciousness a noble trait—linked to culture and unique to humans. Zebras lack complexity and poetry. But such thoughts are just our ego talking. I believe consciousness is an ancient component of the model of mental state—a simple and effective model needed to predict the behavior of other animals, and it likely evolved long before humans. I wouldn’t be surprised if zebras, other mammals, birds, and maybe even some reptiles use this convenient construct—consciousness (in varying complexity)—to predict how others will behave.
Is Human Consciousness Unique?
A common stereotype is to think that only we humans possess higher consciousness. We assume that other animals either lack consciousness altogether or have a less developed form. This view fits the widespread belief that consciousness arises from complexity. Since humans have the most complex brains among animals, we must have the best consciousness. But of all the mental talents we like to boast about—language, math, tool use, etc.—consciousness is probably the most primitive and the least distinguishing.
I’ll admit that the content of consciousness—thoughts, ideas, beliefs, insights, awareness of death—is probably more complex in humans than in other animals. But the very fact of having consciousness, the ability to experience subjective states and attribute similar experiences to others, is such a basic necessity that many other members of the animal kingdom may share it with us. If the attention schema theory is correct, then consciousness definitely isn’t unique to humans.