Information as a Weapon: How and Why It Can Be Dangerous
Each of the three realms (physical, informational, and virtual) has its own units, reflecting the distinct properties of each space. In the physical world, only one object can occupy a given spot at a time: if there is a monument in a square, there cannot also be a tree in the same place. In the informational space, a news story about the monument can change over time, creating multiple versions. In the virtual space, an unlimited number of versions can coexist: stories, movies, or paintings about the monument can be endless.
To sum up:
- Physical space: one version
- Informational space: multiple versions
- Virtual space: infinite versions
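For readers who think in data structures, this one/many/unbounded distinction can be sketched as three kinds of containers. This is only a toy illustration, not part of the original argument; the class names and examples below are invented here.

```python
from dataclasses import dataclass, field
from typing import Iterator, Optional

# Toy model: how many "versions" of the same object each space can hold.

@dataclass
class PhysicalSpot:
    """One location, one object: placing something new replaces what was there."""
    occupant: Optional[str] = None

    def place(self, obj: str) -> None:
        self.occupant = obj  # the previous occupant is simply gone

@dataclass
class InformationalRecord:
    """A news story accumulates revisions over time; all of them remain."""
    versions: list[str] = field(default_factory=list)

    def revise(self, text: str) -> None:
        self.versions.append(text)

def virtual_variants(subject: str) -> Iterator[str]:
    """The virtual space: an endless stream of retellings of the same subject."""
    n = 1
    while True:
        yield f"retelling #{n} of the story about {subject}"
        n += 1

square = PhysicalSpot()
square.place("monument")   # one version only
square.place("tree")       # the monument is displaced

story = InformationalRecord()
story.revise("A monument was unveiled on the square.")
story.revise("The monument on the square is being contested.")  # versions coexist

stream = virtual_variants("the monument")
print(square.occupant, len(story.versions), next(stream))
```

The only point of the sketch is that the second and third containers never force a choice between versions, which is precisely what makes those spaces attractive to propaganda.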
Propaganda can erect a monument in a square, but it’s even more interested in the informational and virtual spaces because of their limitless possibilities. While a monument remains unchanged until it’s replaced (which is rare), informational and virtual spaces allow for endless changes, providing a constant “informational massage” to both individual and collective consciousness.
Why Is Information Dangerous?
Information can contradict an existing worldview, and its main danger lies in destroying that worldview. When people’s mental models are shattered, chaos can ensue both in their minds and in society at large.
To prevent this, societies and states establish physical, informational, and virtual boundaries. Physical borders are visible, while informational sovereignty is maintained through national news systems that curate content for domestic audiences. Virtual boundaries exist because virtual products have commercial value and are not freely distributed—they require payment. However, these products can also be used as tools of soft power, and their commercial value often helps them cross borders more easily. For example, “Harry Potter” spreads internationally with ease, while a scientific monograph does not, simply because the latter isn’t as profitable.
Social media and the internet as a whole break down all types of boundaries, helping to create a unified worldview for the “global citizen.” At the same time, countries with stronger informational and virtual influence—like those in the First World—can more easily spread their content abroad. Their news, movies, and celebrities often seem more attractive than local ones. When the worldview behind this content clashes with a nation’s own, bans may be imposed. For example, radical Islam resists globalization and Westernization, sometimes even through terror, while Iran creates its own competing products—its own animation instead of Disney, its own dolls instead of Barbie, and so on. As historian Klyuchevsky once said, by adopting a foreign product, we also adopt the mindset of its creators.
Powerful states promote their ideas in religion and ideology, giving rise to propaganda—a term that originally described the Vatican’s missionary work. Today, even in the West, it’s openly acknowledged that Fox News, Press TV, and Russia Today are forms of propaganda, continuing the tradition of politically controlled media from the Cold War era. As a result, counter-propaganda measures are taken: the UK banned Iran’s Press TV, and Ukraine banned Russian TV channels.
Social networks have introduced more democratic models of influence, greatly increasing the accessibility of information that can harm society. They have become the main channel for Russian influence in the West, as Russia cannot compete as effectively in traditional media.
Researcher Waltzman has demonstrated the variety of Russian attacks using social media. Watts, in turn, described how this information spreads: “There is a systematic process of spreading introduced information online: like-minded individuals, supporters, aggregators (gray accounts), and covert agents of influence (black accounts) share coordinated Russian propaganda with key network nodes, either one-on-one or one-to-many.” The goal is to transmit, amplify, and cement influential content and themes in the minds of targeted voters. Often this content appeals to the far left or the far right, or plays on anti-government sentiment and divisive social issues. Both profit-seekers and political propagandists aim for high viewership within narrow target audiences.
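Watts’s description reads like a simple tiered diffusion process, and a toy simulation can make the “transmit, amplify, cement” dynamic more concrete. The tiers, audience sizes, and share probabilities below are invented purely for illustration; they are not drawn from Watts, Waltzman, or any real dataset.

```python
import random

# Toy diffusion sketch of the tiered spreading Watts describes: covert "black"
# accounts seed a message, "gray" aggregators amplify it, ordinary supporters
# relay it onward. All numbers here are invented for illustration only.

random.seed(1)

TIERS = {
    # tier: (number of accounts, audience reached per share, probability of sharing)
    "black (covert agents)": (5, 2_000, 1.00),   # always push the seeded message
    "gray (aggregators)":    (40, 5_000, 0.80),  # amplify most of what they receive
    "supporters":            (3_000, 150, 0.15), # share only what fits their views
}

def run_wave() -> dict[str, int]:
    """Return impressions generated by each tier for one seeded message."""
    impressions: dict[str, int] = {}
    for tier, (accounts, audience, p_share) in TIERS.items():
        sharers = sum(1 for _ in range(accounts) if random.random() < p_share)
        impressions[tier] = sharers * audience
    return impressions

wave = run_wave()
for tier, count in wave.items():
    print(f"{tier:>22}: ~{count:,} impressions")
print(f"{'total':>22}: ~{sum(wave.values()):,} impressions")
```

Even with these made-up numbers, the small covert core generates little reach on its own; the aggregators and ordinary supporters account for most of the impressions, which is why reaching “key network nodes” matters more than the size of the covert layer itself.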
All of this demonstrates the basic idea that information is power. But for information to have power, people must believe it—accepting it as true or at least possible. In the first case, the targeted worldview is destroyed; in the second, doubt is introduced.
The Cognitive Target of Information
The recipient of information is crucial, as the impact is on their cognitive system. The goal in all three spaces (physical, informational, virtual) is to influence the cognitive system of both individuals and the masses. Our thinking is the main target of any information attack.
Military experts also emphasize the cognitive aspect: “For any practitioner of information warfare, the cognitive element is of utmost value. All activity is pointless unless it changes the understanding and perception of the chosen audience.” This is especially true in the British approach to information operations, where the goal is to change the target’s behavior.
Frames and Agenda Setting
Let’s look at specific elements in messages that affect cognition. These can be built into the message itself or arise during transmission. Sometimes, information acts as a trigger, launching its destructive effect.
The first such element is the frame. Scholars like Lakoff and Iyengar have written extensively about frames. Frames are structures that shape how news is presented, focusing on selection and emphasis. In the 1990s, Entman defined a frame as: “A frame selects some aspects of perceived reality and makes them more salient in a communicating text, to promote a particular problem definition, causal interpretation, moral evaluation, and/or treatment recommendation.”
Frames act as independent players in information transmission, yet audiences do not perceive them as separate from the news; they are taken as part of it. The average person cannot distinguish the frame from the news itself. Thus, all coverage is inherently non-neutral: it carries the worldview of the journalist or publication, often in a very concrete way. This serves two functions: no one can say information wasn’t provided, but at the same time, the “weapon” is delivered. Readers or viewers rarely realize that two messages are being implanted in their minds: the news and the frame. Moreover, people tend to consume media that matches their political beliefs and dismiss opposing outlets as unreliable, so the additional information seems natural; it fits their worldview.
This is evident in Iyengar’s classification of news into episodic and thematic. Thematic news requires more journalistic effort, expert input, and context, while episodic news simply reports isolated events. TV news is mostly episodic, which leads us to blame individuals for problems rather than institutions. For example, if a pedestrian is hit at a crosswalk, we blame the people involved. But if we’re told the crosswalk is poorly lit and accidents there are frequent, we blame the city authorities. Because episodic coverage deflects blame from institutions, this is another reason it dominates, beyond the fact that it is cheaper to produce.
As Iyengar said: “People are given episodic coverage of many issues: crime, terrorism, poverty, racial inequality. There’s a tendency to attribute responsibility to individuals rather than institutions or broader social forces. Poverty is seen as a result of laziness; crime as a manifestation of antisocial traits. But when people encounter thematic frames, they’re more likely to hold society and politics accountable. Thematic framing leads people to hold society accountable; episodic framing has the opposite effect.”
Frames can be seen as points of view. We see events from a certain perspective, and without a point of view, it’s hard to understand an event. Multiple perspectives create thematic news; a single perspective creates episodic news.
According to Entman, frames can come from the communicator, the text, the recipient, and the culture. Communicators have frames (schemas) that organize their worldview. Texts have frames in the form of keywords, stereotypes, sources, and specific sentences that reinforce interpretations. Recipients’ frames may not match those of the communicator or the text. Cultural frames reflect the thinking of most people in a social group.
De Vreese proposed two types of frames: issue-specific (problem frames) and generic frames that apply to various issues. Frames always promote not just the news but a particular worldview. Lakoff noted that the first frame introduced is very hard to replace; it’s better to build a new frame alongside the existing one than to fight it. That’s why politicians are urged to comment on events before their opponents do—the first interpretation is usually accepted by the public, and alternative interpretations require much more effort to spread.
Another tool of information management is agenda setting. Our brains can only focus on a small fraction of events. Governments aim to get their information into the “top five” issues that attract public attention. Creating, spreading, and maintaining such information is a priority. “Spin doctors” are professionals who prepare the public for upcoming events, draw attention during the event, and keep it afterward—much like coverage of a major political congress.
McCombs, in his research on agenda setting, wrote: “The media must be successful not only in telling us what to think about, but also in telling us how to think about it.” This is a deeper goal than simply informing, and it’s why we talk about attacks on thinking.
McCombs identified two levels of agenda setting. The first is “objects”—problems, events, people, etc.—where the agenda determines their importance. The second is “attributes” of those objects—some are highlighted, others downplayed. Our mental image of an object is constructed from the attributes we’re told about.
Weaver noted the similarity between the second level of agenda setting and framing. Both focus on how issues or objects are represented, not the objects themselves. However, framing also involves moral, causal, and recommendation aspects.
The Weaponization of Media
We can now see media as a kind of weapon, one in which its informative and dangerous sides are united. This is the focus of Nissen’s research on the “weaponization of social media.” Nissen argues that modern wars target not territory, but populations and political decision-making. By this criterion, Donbas, rather than Crimea, is the example of modern warfare.
Another characteristic: new wars are fought not for geopolitical interests or ideology, but for economic interests and identity. As Kaldor said: “Old wars were fought for geopolitical interests or ideology (democracy or socialism). New wars are fought in the name of identity (ethnic, religious, or tribal). The goal is access to the state for specific groups, not broad public policy. The rise of identity politics is linked to new communication technologies, migration, and the erosion of inclusive ideologies like socialism or postcolonial nationalism. Most importantly, identity politics is constructed through war. Political mobilization based on identity is now the goal of war, not just a tool.”
Indeed, the existence of an external enemy helped build a new Ukrainian identity, though this process later stalled, possibly due to a struggle over language.
Another warning: “In the past, war was about seizing territory with military means. In new wars, battles are rare, and territory is seized politically by controlling the population. A typical technique is forced displacement of those with different identities or opinions. Violence is often directed at civilians as a way to control territory, not at enemy forces.”
Kaldor also distinguishes between identity politics (seeking power for a specific group) and ideological politics (seeking power to implement a program). In new wars, competition for power is based on identity, not policy debates.
Economist Paskhaver described Ukraine’s economy as a network of monopolies, where owners, in alliance with bureaucracy and politicians, receive monopoly rents and share them with their “protectors.” This social matrix is deeply embedded in daily life, including ordinary citizens who must use corruption to realize or defend their rights. This matrix is stable because it fits the dominant survival strategy.
Kaldor showed that identity politics can be based on any identity: national, clan, religious, linguistic. Today, it has largely replaced old understandings of war and international relations. Social constructivism adds that, in the social sciences, how you frame an object determines how you go about looking for solutions. Much depends on how you define human motivations, which cannot be established objectively. All you can do is find an interpretation that lets you act and then see how useful it proves. One interpretation leads to one set of actions, another to a different set, even if the situation itself doesn’t change.
Frames are worldviews. Whoever controls them controls the world. Whether a president is good or bad is a matter of interpretation. Media can strengthen a weak position or undermine a dominant one. As Kaldor said: “While old wars were associated with state-building, new wars tend to dismantle states.” The main tools for this? Words and images.
Information as Part of Larger Structures
There are two more levels at which information becomes dangerous. The first lies above information itself, where it forms part of more complex structures at the cognitive or semantic level. The second is the process of communication itself, in which ordinary circulation can unintentionally spread dangerous information. For example, at a U.S. Senate Intelligence Committee hearing, “unintentional” spreaders were identified: WikiLeaks, Twitter, and journalists chasing sensationalism who spread unverified information. Facebook can also serve this function.
Information can be part of ideological, scientific, or other concepts that may work to undermine the structure of a nation-state. While this is more about ideological than cognitive defense, the broader idea of cognitive security is also relevant—protecting against foreign influence.
The Future of Information as a Weapon
In 2013, the U.S. military held a symposium on using neuroscience in influence and deterrence. Even the concept of deterrence can be reinterpreted in this context. Scholars suggested that deterrence should focus on decision-makers, not states, and take into account the neurotraits of leaders. For example, it was claimed that roughly 60% of men carry a variant of the MAOA gene linked to a predisposition to violence, with the environment playing a key role in whether that predisposition is activated.
Another view challenges the idea that ideology causes radicalization. Instead, radicalization arises from anger at perceived injustice, shame over inaction, and the search for status. Jihadist videos radicalize by provoking emotional reactions. Chats and social media act as echo chambers, increasing polarization. When like-minded people gather, they become more radical.
One practical suggestion: to create an effective message, start by identifying whom the audience will trust. Therefore, begin not with the message, but with preparing the person or context that will make the message credible.
The military and business sectors are moving faster in using new methods, thanks to better funding for both real and potential threats. For example, the Pentagon has even developed instructions for a zombie attack.
Information flows targeting foreign populations are created and spread by organizations. It’s hard for individuals to resist these industrial-scale flows—it’s professionals versus amateurs. That’s why counter-structures are created, like Poland’s Center for Countering Russian Propaganda. However, there’s often a mismatch between the large budgets for such organizations and their modest results, since readers focus on the information itself, not on counter-propaganda about it. Other aspects of the fight are also difficult, such as the frequent change of legal entities behind certain media outlets to avoid tax debts.
To conclude, Kaldor’s observation is worth noting: “Negotiations belong to old thinking. There’s an idea that new wars can be resolved through negotiations, but in new wars, sides often speak in extremist logic. They don’t fight each other—they both kill civilians. Negotiations can legitimize both sides. We’re used to thinking you either fight and the other side surrenders, or you negotiate and reach an agreement. In reality, these steps are no longer solutions. First, you can’t ‘win’ a new war. All you do by entering a war is make it worse. That’s what’s happening in Afghanistan and Iraq. Second, negotiations legitimize insurgent commanders. So we need to think differently.”