Encyclopedia of Propaganda Methods: Part 3
Feedback Mechanisms
People who participate in an event are more likely to change their views in favor of the opinion promoted by the event’s script than passive observers; numerous psychological experiments have confirmed this. The illusion of taking part in a discussion of a current issue produces greater changes in opinions and attitudes than passively receiving the same information. To keep the audience from feeling a one-sided influence and a sense of detachment as mere recipients, modern media make wide use of various forms of so-called “feedback”: live call-ins, phone voting on questions, and more. This window dressing is designed to give the mass audience the illusion of participating in the information process.
If you’ve ever tried to call in to a live TV show to ask a question, you know that before you go on air a producer will screen the nature of your question. You will be allowed on only if your question fits the channel’s editorial policy, and if you start saying something off-script, you will be cut off mid-sentence, as many callers have discovered in practice.
Pseudo-sociological polls (where viewers vote by phone during a broadcast) are often just a way to shape public opinion, not reflect it—a form of propaganda manipulation. Questions are worded to create the “right” perspective on an issue, steering our thinking in a specific direction. The main principle: “Never ask a question unless you can get the answer you want.” Framing issues in terms of “gains” is more convincing than in terms of “losses.” Here’s a simple, exaggerated example:
- Question: “What role do you want your country to play in the modern world?”
- Answer options:
- A) I want my country to be a raw materials donor for developed nations and a source of cheap labor for multinational corporations.
- B) I want my country to become a dumping ground for chemical and nuclear waste from around the world.
- C) I want my country to become a powerful state with a developed economy and a high standard of living.
Guess which answer most people will choose? Almost all “interactive phone polls” are structured this way.
Poll organizers have long known that subtle changes in wording can produce very different answers. For example, the researchers B. Lockerbie and S. Borrelli found that the share of Americans supporting aid to the Contras in Nicaragua between 1983 and 1986 ranged from 13% to 42%, depending on how the question was phrased: mentioning Ronald Reagan or using ideological labels increased support, while mentioning the dollar amount or presenting both sides of the issue reduced it.
A striking example of this manipulation was carried out by a Moscow radio station in the late 1990s. After repeatedly reporting on gasoline shortages in St. Petersburg, they asked listeners, “Should we supply fuel to Yugoslavia?” without mentioning expert opinions or the loss of export revenue for Russia if supplies were cut. As a result, 75% of callers said no.
When the communicator’s unfavorable opinion dominates, the goal of “feedback” becomes to correct and change public attitudes. Displaying rigged poll results, filtering calls, and organizing “public opinion” through planted individuals are all aimed at making dissenters feel like “black sheep”—that most people think differently. Sometimes, disagreement is deliberately or accidentally allowed, such as airing an angry call or publishing a protest, to highlight the “objectivity” of the source.
Staged Events
A type of feedback is the so-called staged-events technique, especially interactions between high-ranking officials and “ordinary people.” These can be direct (answering citizens’ questions by phone, chatting with “random passersby”) or indirect (press conferences, media briefings). Most often this “communication” is a well-rehearsed performance. French President de Gaulle, for example, was never asked a question at a press conference for which he had not prepared; U.S. presidents likewise answer pre-submitted questions, and political consultants routinely drill their leaders on the questions they expect. During the Monica Lewinsky scandal, President Clinton’s advisers had him rehearse press conferences so that he would appear calm and confident. Boris Yeltsin’s press secretary, D. Shevchenko, noted that all questions at Yeltsin’s press conferences were approved in advance, and that Yeltsin even knew the seating chart of the correspondents.
TV programs are staged to make it seem as though the head of state is spontaneously answering questions from citizens by phone or online. But the best improvisation is a well-prepared one: anyone watching a leader confidently field dozens of tough questions in a short span can conclude that the answers were worked out in advance by a team. Questions are selected on the basis of sociological research so as to reflect pressing social issues or to build a positive image. The public then watches a TV show called “The Head of State Answers Citizens’ Questions Live,” followed by approving comments from “ordinary people” (“I never realized our president was so democratic and approachable!”).
Flanking Maneuvers
Western propaganda theorists have found that at U.S. campaign rallies, Republicans don’t encounter Democrats and vice versa. Most people turn off radio or TV programs that defend views contrary to their own. This is a psychological defense mechanism that maintains internal balance and confidence, protecting against cognitive dissonance. Thus, election propaganda rarely recruits many new supporters; it mostly reinforces existing views. The conclusion: to succeed, a propagandist must get people to listen and break through or bypass negative predispositions.
During WWII, the BBC included weather reports in broadcasts to Nazi Germany. These neutral, easily verifiable facts were meant to lend credibility to all BBC broadcasts. Such tricks, designed to create an impression of impartiality and objectivity, have evolved into a sophisticated system of “flanking maneuvers.”
The main tactic is “propaganda by information” (factual propaganda): providing accurate, verifiable information in measured doses. This includes names, street addresses, and other details that “package” propaganda messages. For example, during WWII, British intelligence obtained lists of all German submarine commanders, their families, and even their mistresses. Propaganda broadcasts would address specific crews, mentioning personal details to demonstrate knowledge, then deliver anti-war messages. Later, during the Cold War, American and British stations broadcasting to the USSR would mention real names and addresses from phone books to increase credibility.
Major Western media like CNN, BBC, and DW have built reputations for objectivity using factual plausibility. Most of their airtime is devoted to news programs that present politically oriented viewpoints as impartial news. The impression of “objectivity” is created by including opposing opinions or facts that seem unfavorable to their own countries’ official positions.
Other flanking methods include mimicking the audience’s views and presenting ideas as natural extensions of accepted beliefs, aiming for gradual, evolutionary involvement in new ideological or political perspectives. Western researchers found that slow, subtle exposure to propaganda is effective for people with unsettled views or no strong group affiliations.
American psychologists have described how people adopt unfamiliar views: someone stumbles upon new information, finds it interesting, and only later learns it’s from an “outsider.” If they’d known earlier, they might not have read it, but now their predisposition is breached. With repeated exposure, interest grows, and they may even seek out similar information. If new information resonates with their needs and isn’t addressed by traditional sources, they may completely shift perspectives—often believing they haven’t changed at all.
Sometimes, people knowingly engage with hostile propaganda out of curiosity, and over time, their prejudice may fade. Believing they’ve formed their own position, they actually come under the influence of a different ideology.
Another recommendation for flanking tactics is to hide true intentions and avoid conflict with the audience’s accepted values and worldview. “Propaganda only fails when it looks like propaganda,” experts say. It’s important to flatter the audience, praise national qualities, express “compassion” for certain groups, and do so without overtly criticizing opponents or openly praising “our own.”
For example, the CIA’s 1950s document “Psychological Warfare Against the USSR” recommended highlighting shared values between Russians and the free world, emphasizing truthfulness, compassion, family, and hospitality, and assuring Russians that the West seeks only their freedom and prosperity. Similar themes were suggested for stories about the U.S., such as America’s peacefulness, respect for sovereignty, and cultural ties with Russia. American propaganda still operates in this vein, adjusted for current political realities and new “enemies of the civilized world.”
Distraction Techniques
For propaganda and any manipulation, suppressing psychological resistance is key. Most experts believe propaganda should combine entertainment, information, and persuasion. Entertainment means anything that sparks interest and masks the true message, blocking critical perception.
In the 1960s, it was found that messages opposing someone’s beliefs are more effective if the recipient’s attention is distracted (for example, by playing popular music). This makes it harder to process the information and form counterarguments—the basis of resistance. These findings increased the effectiveness of manipulation in the press and on TV. Newspapers began using “kaleidoscopic” layouts, mixing important news with gossip, rumors, sensational stories, photos, and ads. TV started carefully selecting distracting visuals. Today, almost all news broadcasts are a kaleidoscope of attractive images and unrelated news, making it hard for viewers to process information critically. As I. Kalinauskas notes, when there’s too much diverse information, people can’t process it meaningfully and must accept it as a whole, uncritically.
Sometimes, traditional events are used to distract from political actions that would otherwise draw public attention. For example, Boris Yeltsin’s resignation on December 31, 1999, went largely unnoticed as people were preparing for New Year’s and then recovering from celebrations.
Unprecedented and unusual events (murders, disasters, terrorist acts, scandals) are especially effective distractions, often used by politicians to push through questionable actions. Distraction techniques also include campaign concerts and public festivities, where pop stars urge people to “Vote or lose!” or “Defend Russian culture!”—often for a fee.
“Eyewitnesses”
This is a highly effective technique for creating emotional resonance. Many random people are interviewed, and their words are used to construct the desired emotional and semantic narrative. The most powerful effect comes from crying elderly women, children, or young people with disabilities.
A classic example comes from the 1991 Gulf War. In October 1990, global media circulated the story of a 15-year-old girl who claimed to have seen Iraqi soldiers remove 15 babies from incubators in a Kuwaiti hospital and leave them to die on the cold floor. Her identity was said to be withheld for her family’s safety. President George H. W. Bush cited the story ten times in the forty days before the start of the war, and the Senate referred to it repeatedly during its debates. It later emerged that the girl was the daughter of the Kuwaiti ambassador to the U.S., and that most of the other “witnesses” had been coached by the PR firm Hill & Knowlton. By then, it no longer mattered.