Encyclopedia of Propaganda Techniques: Part 2

Boomerang Effect

When official media outlets in the late 1980s launched a coordinated attack on Boris Yeltsin, he unexpectedly became a national hero and won the presidential election in June 1991. Later, Vladimir Zhirinovsky became the target of intense ridicule, which only fueled his success in the December 1993 Duma elections. In December 1995, the Communist Party (CPRF), another target of television attacks, won the largest share of Duma seats. In the fall of 1999, a smear campaign by ORT against Moscow Mayor Yuri Luzhkov only strengthened his support among Moscow voters. A similar situation had occurred a year earlier in Ukraine with Kyiv Mayor Oleksandr Omelchenko. American commentators discussed the same problem decades earlier, after Franklin Roosevelt won the presidency despite overwhelming press opposition. The conclusion: if you want to gain popularity with the masses, create the image of a justice fighter persecuted by the authorities.

The boomerang effect is a trap that those in power often fall into. By organizing a total smear campaign against an opponent, they push him to the point where he starts to evoke sympathy and pity from the public. The same happens when authorities try to fight negative rumors circulating in society—trust in those rumors only grows.

The boomerang effect can also work in the opposite direction. For example, oversaturating the airwaves with ads for a particular candidate can irritate the audience.

Halo Effect

The halo effect is based on a psychological trait: the human tendency to think in false analogies. It consists of two common stereotypes:

  1. “Being near means being together.” Being seen next to a famous or high-ranking person raises one’s status in the eyes of others. That’s why people love to show off photos with “important” people. Politicians love to be seen with popular artists or athletes, hoping some of the public’s affection for celebrities will rub off on them.
  2. Success in one area means ability in others. People tend to assume that someone who has achieved success in one field must be competent in others, when in fact many who excel in one area are completely helpless elsewhere. Nevertheless, this stereotype is widely exploited in politics and business advertising. Political parties often recruit popular artists, athletes, journalists, and writers before elections. Businesspeople seeking political careers use the message: “We are successful, self-made people who have proven ourselves and are ready to bring our success to society as a whole!” In essence, this is a rebranding of the old Bolshevik slogan that any cook can run the state; only now, instead of a cook, it is a successful businesswoman who opened a chain of laundromats.

Former Olympic champions, generals, bakery owners, or even bank managers who have succeeded in their fields are not necessarily good lawmakers. This simple idea is often ignored by media outlets that create propaganda hysteria around the next “savior of the nation.”

Primacy Effect

Dr. Goebbels introduced a key principle to modern propaganda: the person who speaks first is always right. Later, psychologists found that the candidate who first convincingly presents himself as a winner during an election campaign is the one the public recognizes as such. Researchers at Yale University and others concluded that a propagandist’s success is largely ensured if their information reaches the audience before their opponents’. When conflicting information is received (and can’t be verified), people tend to believe what they heard first. Changing an already formed opinion is very difficult.

This effect is exploited in massive “dirt-dumping” campaigns. In the end, the person smeared always looks guilty in the public eye, because he is forced to defend himself; the accuser’s status is perceived as higher than that of the accused. As Hitler said: “The public will always prefer to believe a 90% unproven accusation over a 100% proven refutation.”

As early as 1925, the American psychologist M. Lundt formulated the “law of precedence”: the first message about an event has a stronger impact than any subsequent one. Whoever reports the news first satisfies the audience’s demand for information and shapes its initial attitude toward the event. This makes the source more attractive to the audience and builds a lasting preference for it over slower competitors.

Because of this, every media outlet today tries to be the first to present its version of events. This leads to problems, such as journalists’ rush to broadcast “breaking news” during terrorist attacks, which can actually help terrorists coordinate their actions based on media reports.

Presence Effect

This technique, introduced by Nazi propaganda, is now described in all journalism textbooks. It includes tricks to imitate reality, often used in “battlefield reports” and crime stories, such as staging the “real” capture of criminals or car accidents after the fact. The illusion of a “combat situation” is created by shaky cameras, blurred focus, people running, gunshots, and screams—making it seem as if the cameraman is filming under fire.

From a journalist’s memoir: “Before the American bombings in Afghanistan, the Taliban and Northern Alliance rarely shot at each other, mostly firing into the air. Tanks fired at fixed points to avoid hitting anyone. Journalists would arrive at the front lines to film combat, but there was none. So, they paid for a staged shootout, filmed it, and then global news agencies reported that journalists had come under fire.”

This illusion of authenticity has a powerful emotional impact: we feel as if we are witnessing real events, not realizing it is a cheap trick. The same technique is widely used in commercial advertising, where staged “candid” scenes create the image of simple, “ordinary” people. Commercials often feature an “Aunt Sally” character voiced by a professional actress who mimics “real people” with fake pauses, stumbles, and uncertainty. It is a primitive but effective way to capture the audience.

Information Blockade

A quote from a Russian Interior Ministry memo sums it up: “Where the price of information is measured in human lives, there is no place for democracy, transparency, or so-called objective coverage…” If you’re used to following “anti-terrorist operations” through CNN or Russian channels, you encounter information blockades daily. For example, after the September 11, 2001 attacks, the U.S. government issued “recommendations” to American media, urging them not to cover Al-Qaeda’s activities in detail or quote Osama bin Laden. Denying the enemy a public platform is a key goal of propaganda warfare.

Information blockades are closely tied to information dominance and are used in wartime and peacetime alike (for example, during elections). The authorities first create an information vacuum around an issue, then fill it with one-sided, favorable coverage. Since public interest is high, that coverage spreads widely, ensuring a single interpretation of events and, in essence, mass brainwashing. Alternative viewpoints remain practically inaccessible to the public.

Information blockades accompany almost all military-political conflicts. The 1991 Gulf War, for example, was portrayed as “clean” and just: Western media praised “surgical strikes” and downplayed or ignored evidence to the contrary. Even former U.S. Attorney General Ramsey Clark, who brought back footage of civilian suffering shot in Baghdad by a well-known American cameraman, was effectively censored: no American TV network agreed to air the material.

During NATO’s airstrikes on Yugoslavia, elections to the European Parliament were under way in the UK. The small Socialist Labour Party included footage of the bombing’s devastation in its campaign ad, but the BBC, despite its reputation for objectivity, simply cut the segment.

Why such censorship? As Britain’s wartime leader David Lloyd George said during World War I: “If people knew the truth about the war, it would be stopped tomorrow. But they don’t know, and never will.”

G. Pocheptsov, analyzing the Chechen war, classified information control as follows:

  1. Control of verbal labels (e.g., “carpet/precision bombing,” “territory cleansing”) to obscure deadly realities.
  2. Control of visuals—no images of wounded, dead, or lost equipment on TV.
  3. Control of interpretation—e.g., a minister’s order banning interviews with Chechen fighters on TV.

Information blockades are also common during election campaigns. Biased media create hype around a favored candidate, while opponents are denied a platform. At best, they are shown “without sound,” with the footage accompanied by slanted commentary. Instead of letting Mr. N speak, journalists talk about him, interpret his actions, and so on. Thus, information blockades are closely linked to another propaganda technique: commentary.

Information blockades are used not only for political but also economic purposes. In the early 1990s, during the market reforms of Yegor Gaidar and Anatoly Chubais, there was a real information blockade—experts warning about the dangers of “Chubais-style privatization” were kept from the public. In the mid-1990s, information blockades covered up numerous banking scams and financial pyramids. Electronic media, broadcasting aggressive ads, did not provide warnings or give airtime to experts who could have explained the risks. Similarly, in 1997-1998, information blockades covered up the Russian government bond (GKO) crisis, which led to the 1998 financial crash. The public was not informed about parliamentary debates on the issue, which could have at least mitigated the consequences.

Use of Mediators

This technique is based on two principles. First, research shows that the strongest influence on the average person’s opinion comes not from mass media campaigns, but from circulating myths, rumors, and gossip. Second, effective information influence is not direct from the media, but through respected, familiar people—“opinion leaders”—who transmit opinions and rumors. Personal communication is more significant than “official” media messages.

Most people form opinions through conversations with family, friends, and colleagues, developing a shared approach based on familiar values and norms. Opinions on everything from which detergent to buy to whom to vote for are shaped by certain authorities—either formal leaders (parents, bosses) or recognized experts (opinion leaders). Thus, media influence is always indirect.

Gleb Pavlovsky, a key architect of modern Kremlin information policy, put it this way: “You can’t fetishize electronic media. The growing chorus of praise (for Putin) broadcast nationwide has the opposite effect. People never repeat official slogans among themselves—they talk about what they find important and interesting. Media matter only insofar as they provide content and talking points for real grassroots political discussion.”

Today, entire newspapers, magazines, and TV programs, under the guise of news, are devoted to creating and spreading plausible rumors that can then travel through interpersonal channels. In recent years, specialized internet projects have emerged for the same purpose (SMI.Ru, VERSII.Ru, GAZETA.Ru, RusGlobus, and others).

Let’s look at how this works. After receiving a message, the recipient doesn’t immediately accept or reject it. Consciously or subconsciously, they seek advice from those around them, especially the “opinion leaders” in their group. These are highly respected group members whose opinions are especially valued. They play a decisive role in shaping the group’s attitude toward the issue in the propaganda message.

This phenomenon is captured by the two-step flow of communication model, developed in the U.S. by Paul Lazarsfeld in the 1940s and elaborated with Elihu Katz in the 1950s. The model considers two stages of interaction: between the source and the micro-social authorities (opinion leaders, or “mediators”), and between the mediators and the individual members of their groups.

In practice, this led to propaganda and advertising messages being targeted more at micro-group leaders, and to the use of people whose opinions matter to others. American experts believe that to shape public opinion, it’s enough to influence just 10% of the group—the opinion leaders—who will then transmit the message to the masses. Mediators can be informal leaders, politicians, religious figures, cultural icons, scientists, athletes, military personnel, celebrities, etc.—there’s an authority for every audience. In psychology, this is called “fixation on authorities.”
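
The two-step logic can be made concrete with a toy simulation; this is a minimal sketch, and the group size, the 10% leader share, and both persuasion probabilities are invented purely for illustration. It contrasts a message broadcast directly to everyone with the same message relayed through trusted mediators:

    import random

    random.seed(1)

    # Toy two-step flow: a broadcast alone persuades few people, but an
    # endorsement relayed by "their" opinion leader persuades many.
    N_LEADERS = 100        # the "10%" of opinion leaders in a group of 1,000
    N_FOLLOWERS = 900
    P_DIRECT = 0.05        # chance the broadcast alone persuades a person
    P_RELAY = 0.60         # chance a leader's endorsement persuades a follower

    def hit(p):
        return random.random() < p

    # Scenario A: pure broadcast, no mediators.
    broadcast_only = sum(hit(P_DIRECT) for _ in range(N_LEADERS + N_FOLLOWERS))

    # Scenario B: the campaign first secures all the leaders; each follower
    # then hears the message in person from their own leader.
    via_mediators = N_LEADERS + sum(hit(P_RELAY) for _ in range(N_FOLLOWERS))

    print(f"broadcast only: {broadcast_only} of 1000 persuaded")
    print(f"via mediators:  {via_mediators} of 1000 persuaded")

Under these invented numbers the broadcast alone persuades roughly 5% of the group, while routing the same message through the mediators persuades well over half, which is exactly why campaigns concentrate on the 10%.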

Most people tend to imitate the behavior of their opinion leaders. That’s why celebrities are chosen for advertising and political campaigns—they have a wide fan base, many of whom don’t bother to assess their idols’ competence in politics or other areas. On the other hand, commercials and propaganda often use “ordinary people like us.” It depends on the product or idea. For example, to convince us a medicine is necessary, a professional doctor (or someone who looks like one) is most effective. For everyday products, we rely on the opinions of friends, relatives, and neighbors.

The main goal of all advertising and PR campaigns is to use “fixation on authorities” to get the target audience to buy a product or service. For example, by giving a product to an influential group leader, you make it fashionable and prestigious. A simple example: to promote a certain brand of watch among teenagers, organizers held club parties where the watches were given to top DJs, pop group members, and TV hosts. Contest winners also received watches. This created the illusion that all the “best” people have the “best” watches, prompting many teens to want the same models as their idols.

Classifiers

Psychologist Gordon Allport wrote that the essence of any language is to classify and categorize the “continuous hum of information” we encounter every second. This is what gives language its power to persuade. When we name something, we highlight a specific feature and ignore others. The way an object is described and the manner in which an action is presented direct our thoughts and emotions in a certain direction. Classifiers format information so that the recipient accepts the imposed definition of the situation. They are, in essence, “seasoning words” for any message—either to describe one’s own “positive and constructive position” or to negatively characterize the opponent (see Principle of Contrast).

Classifiers in Russian media usually reflect the current policy direction of the authorities. For example, news reports about protests might describe participants as “elderly people, the unemployed, criminals, alcoholics, drug addicts, or radical youth with extremist views.” The action might be said to be “funded by” (Berezovsky, criminal groups, foreign intelligence, international terrorists), with the “goal” being “destabilization,” “creating a negative image of the country,” or “hindering government work.” Government agencies are described as “offering civilized solutions,” “ready for constructive dialogue,” “taking a positive and pragmatic stance,” or “not accepting ultimatums.” Law enforcement “strictly follows the law,” “acts appropriately,” “uses necessary force,” and “bravely opposes destructive forces.” Ordinary citizens “do not support the protest,” “condemn the instigators,” “understand law enforcement actions,” and “approve of the president’s actions.” To enhance the effect, classifiers are usually accompanied by appropriate visuals.
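
The mechanics are easy to show in a short sketch. This is illustrative only: the “pro-government” labels paraphrase the examples above, while the mirror-image “opposition” lexicon is invented here for contrast. The bare event never changes; only the seasoning words do:

    # Classifiers as "seasoning words": the bare event stays the same,
    # only the evaluative labels change with the editorial line.
    EVENT = "a protest rally in the city square"

    LEXICONS = {
        "pro-government": {
            "participants": "marginal groups and radical youth",
            "motive": "funded by shadowy sponsors to destabilize the country",
            "authorities": "offering a civilized solution and constructive dialogue",
        },
        "opposition": {
            "participants": "citizens from all walks of life",
            "motive": "organized at the grassroots to demand accountability",
            "authorities": "refusing to listen and hiding behind riot police",
        },
    }

    def frame(event, line):
        c = LEXICONS[line]
        return (f"{event}: attended by {c['participants']}, "
                f"{c['motive']}; the authorities are {c['authorities']}.")

    for line in LEXICONS:
        print(f"[{line}] {frame(EVENT, line)}")

Both outputs report the same rally; a reader who sees only one of them receives the imposed definition of the situation along with the news.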

We won’t discuss the use of classifiers in commercial advertising, as it’s obvious (just recall any overused ad slogan).

Commentary

The goal is to create a context that guides people’s thoughts in the desired direction. A news report is accompanied by a commentator’s interpretation, offering several plausible explanations. The commentator’s skill is in making the desired explanation seem most credible. As American expert O’Hara wrote in “Information Media for the Millions,” a news anchor’s message may seem objective, but their tone, pauses, and facial expressions can have the same effect as an editorial.

Several auxiliary techniques support the commentary:

  1. “Two-sided messages” present arguments both for and against a position, preempting the opponent’s arguments and building the audience’s immunity to them.
  2. Positive and negative elements are dosed: adding a bit of criticism to a positive assessment makes it more believable, while including some praise in a negative assessment increases its effectiveness. All critical remarks, facts, and comparisons are selected so that the desired conclusion seems obvious.
  3. Facts are chosen to strengthen or weaken particular statements, but the conclusions are left for the audience to draw on its own.
  4. Comparative materials are used to emphasize importance, trends, and scale.

The desired effect can also be achieved by the way the message is structured (see “Poison Sandwich”).

Stating as Fact

The desired state of affairs is presented by the media as an accomplished fact, acting on the audience through suggestion rather than evidence. Examples:

  • “The opposition camp is in disarray!”
  • “The head of the presidential administration is rapidly losing influence…”
  • “More and more deputies are leaving the agrarian faction in parliament…”
  • “The ‘Our Home is Russia’ movement is rapidly losing supporters. Its electorate dropped from 10% to 8% last month…”
  • “The ‘New Generation Team’ electoral bloc is gaining momentum and will surely win seats in the next parliament…”

This technique is used to create certain moods in society. Most people think in stereotypes: “Where there’s smoke, there’s fire,” “If everyone’s talking about it, it must be true.” People are made to feel like they’re in the minority, making them passive and more likely to support whoever seems to represent the “majority.”
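
This “everyone is saying it” dynamic can be sketched with a toy conformity cascade in the spirit of Granovetter’s threshold model; all parameters below are invented for illustration. Each person voices support only when the support they perceive around them exceeds a personal threshold, so an outlet that inflates the starting figure can tip the whole cascade:

    import random

    random.seed(7)

    # Each person has a personal threshold: the share of visible support
    # needed before they join in. Iterating the perceived share to a
    # fixed point shows how an inflated starting claim flips the outcome.
    N = 10_000
    thresholds = [min(1.0, max(0.0, random.gauss(0.35, 0.10))) for _ in range(N)]

    def final_support(claimed_share):
        share = claimed_share
        for _ in range(100):  # iterate until the share stabilizes
            share = sum(t < share for t in thresholds) / N
        return share

    print(f"honest report of 5% support -> settles at {final_support(0.05):.0%}")
    print(f"claimed 'majority' of 50%   -> settles at {final_support(0.50):.0%}")

With these numbers an honest 5% collapses to nothing, while the same population, told that half the country already agrees, cascades to near-unanimity.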

This kind of propaganda is usually presented as news or the results of sociological research, reducing critical thinking—people don’t realize they’re being fed disinformation. To add authority, “opinion leaders” are widely used: popular journalists, political scientists, sociologists, etc. It’s psychologically hard to reject authority—it takes courage and free will. After all, those in power can always hire a likable host, a beloved actor, a respected academic, or even a “sex symbol”—there’s an authority for every audience.

This technique is also used in business advertising. For example, a business magazine headline: “Demand for prestigious downtown offices is starting to exceed supply!” (Artificially creating a sense of scarcity.)

False Analogy

One of the most dangerous psychological traps is the tendency to think in analogies and build pseudo-logical sequences. Most people are used to thinking in “cause and effect” terms, which seem logical and under conscious control. But we often extrapolate a “specific cause—specific effect” link to unrelated objects, which is where the trap lies.

Propagandists actively exploit this trait. During the “perestroika” era, for example, market reformers used the metaphor “you can’t be a little bit pregnant” to argue that the planned economy had to be destroyed completely in favor of the market. But pregnancy and economics have nothing in common: real economies always combine several coexisting structures, so in that sense every economy is “a little bit pregnant” with several systems at once. Expert warnings about this logical error were ignored, which suggests the assault on logic was deliberate. Another example: “The British Empire collapsed, so the USSR had to collapse too!” No justification for the analogy is offered. Why compare with Britain rather than China or the USA? And if the USSR’s collapse was inevitable, then so is Russia’s, since it too is an empire. The use of false analogies underlies the related technique of creating associative links.