The Fusion of “Technical” and “Human” Technologies in Modern Politics
We live in a world profoundly transformed by technology. Past inventions, however important, did not interfere with our minds the way modern technologies do: the steam engine did not reshape humanity's mental processes the way the computer has. New generations now see and understand the world differently, producing phenomena such as "clip thinking" and other processes we are only beginning to comprehend.
Modern technologies like Facebook not only interfere with our lives but treat us as part of their “production processes.” This began with advertising and public relations, which aimed to build a consumer society by automating human behavior and turning people into consumers—effectively making them part of the production chain. Emotions, too, have been harnessed by technologies that work with the human brain, resulting in a fusion of “technical” and “human” technologies. Today, there is even talk of the “post-human” era, where the “human” aspect diminishes and the “technical” aspect grows within people themselves.
The combination of technology and humanity gives rise to new “human-technical structures,” with social media being a prime example. These platforms target individuals, but users do not control them, even though they may see themselves as creators. Social media exploits a key human trait—the desire to share new information. As a result, users find themselves in an “information paradise,” surrounded by more information than they can process.
Social Media and Discursive Wars
Social media has become a battleground for discursive wars, where opposing viewpoints coexist in the same information space. Previously, these views existed in separate spaces, but now people experience “information schizophrenia,” receiving conflicting messages simultaneously. Discursive wars managed by foreign actors have become a feature of many countries’ internal affairs. Any situation that deviates from the norm is quickly used to spark internal discursive conflict, weakening national unity and identity—especially during elections, when external players seek to support candidates favorable to their interests.
Yesterday's geopolitical rivals are today's adversaries in discursive wars, which are cheap and can be waged anonymously through social media. No one is physically harmed, yet minds are transformed in the desired direction. This shift happened rapidly, before our eyes. As Antonio García-Martínez, former head of Facebook ad targeting, put it: "If you had told me in 2012 that Russian agents in the Kremlin would buy Facebook ads to undermine American democracy, I would have asked where your tinfoil hat was. Now we live in an upside-down political reality."
Cyberattacks and Information Operations
Alongside the creation of conflicting discourses, cyberattacks on servers and data destruction have become common. The leading American tech magazine Wired detailed several such cyberwars initiated by Russia. In 2015-2016, attacks targeted the Democratic National Committee’s servers in the U.S.; in 2017, Ukrainian government structures were hit with the NotPetya malware. The U.S. Department of Justice published a 448-page report on Russian interference in the 2016 presidential election.
However, the effectiveness of Cambridge Analytica’s work during Trump’s campaign is now questioned. García-Martínez said, “What they did was mostly nonsense. This has been done before. There’s no reason to think it was especially effective.” Another expert, E. Wilson, noted, “Psychographic data is only as good as the creativity you can generate.”
Chris Wylie, a well-known figure from Cambridge Analytica, told NBC News: “Large-scale information operations are about distorting people’s perceptions. The company [Cambridge Analytica] was born from military contracts. The goal isn’t always standard political communication, where people know they’re being persuaded. It’s about changing what people think and perceive in the real world.”
Regarding Steve Bannon, Wylie said: “Steve wanted a weapon for his culture war. We gave him a way to achieve his goal, which was to change American culture.” Meanwhile, the Clinton campaign used broader targeting based on age, location, and gender. García-Martínez described this as “old-school work—like television on Facebook.”
The Intersection of Culture, Politics, and Technology
Bannon, Trump’s strategic advisor during the campaign, believed that controlling culture would allow control over politics—managing one sphere through another. Traditionally, media was seen as influencing politics, but Bannon introduced the idea that culture could also shape politics.
Wylie, who originally wrote his dissertation on fashion, explained: “Politics and fashion are both about identity. They’re both cyclical. There are many aesthetics in both. Broadly, both are products of our culture. Political movements can be seen as fashion trends, where many people suddenly adopt a new idea. How we accept fashion and politics is tied to who we are—our personalities. For me, they’re just different expressions of ourselves in different contexts.”
He described his work as “profiling people and targeting them to dominate the information environment around them. Once you break their connections to other information sources, you place them in an environment where you have more control over what they see. This is important because they still feel in control, thinking they’re making their own decisions to click, share, or interact with random accounts. They don’t see the thought process and strategy behind it all.”
This is a vivid description of discursive process management that occurs outside the awareness of those being managed. Everything seems natural to them.
Election Interference and Social Media Manipulation
Wylie predicted that in the 2020 election, threats would come from a broader range of countries, not just Russia, but also China, North Korea, Iran, and even some U.S. allies. “If you have a trade dispute, who’s to say Mexico won’t interfere?”
Another detail from the 2016 election was the controversy over Trump paying less than Clinton for Facebook ads. The reality was that Trump's content was more provocative and generated more engagement, so Facebook's engagement-based pricing favored him: Clinton's campaign paid 100 to 200 times more to reach the same number of people. Social media was also flooded with negative comments, and studies show that a negative first comment often triggers a cascade of negativity.
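The mechanics of that gap can be sketched in a few lines. The model below is deliberately toy: Facebook's actual ad auction is proprietary, and the function, bids, and engagement rates here are invented solely to illustrate how engagement-weighted pricing produces a cost gap of this magnitude.

```python
# Illustrative sketch of an engagement-weighted ad auction. Facebook's
# real ranking model is proprietary; this shows only the mechanism the
# text describes: higher predicted engagement makes the same reach cheaper.

def cost_per_unit_reach(bid: float, predicted_engagement: float) -> float:
    # Effective ad rank ~ bid * predicted engagement, so the price of a
    # unit of reach falls as predicted engagement rises.
    return bid / predicted_engagement

provocative = cost_per_unit_reach(bid=1.0, predicted_engagement=0.10)
conventional = cost_per_unit_reach(bid=1.0, predicted_engagement=0.001)

print(f"provocative content:  ${provocative:.2f} per unit of reach")
print(f"conventional content: ${conventional:.2f} per unit of reach")
print(f"ratio: {conventional / provocative:.0f}x")  # 100x with these toy numbers
```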
Ukraine’s Role and Information Channels
Ukraine became entangled in the U.S. election through Paul Manafort, Trump’s former campaign manager. Senate intelligence reports mention Manafort instructing his deputy Rick Gates to provide internal campaign data to Konstantin Kilimnik, who was expected to share it with others in Ukraine and with Oleg Deripaska. Kilimnik, a graduate of Russia’s Military University for military intelligence translators, was Manafort’s close associate and facilitated communications with Russian and Ukrainian oligarchs. He was known as “the Cat” and was fluent in English and Swedish.
The Senate report found that African Americans were the main target of Russian social media operations, with the Internet Research Agency in St. Petersburg as the source. Russian activity increased after Election Day: up 59% on Facebook, 238% on Instagram, 84% on YouTube, and 52% on Twitter. Committee chairman Richard Burr stated, “Russia is waging an information war against the U.S. that did not begin or end with the 2016 election.”
One St. Petersburg troll reported earning $1,400 a week for writing posts, saying, “I was much younger and didn’t think about the moral side. I just liked to write.” However, this “internet machine” was indeed trying to change the world, covering topics from Putin and Obama to Ukraine and the Ebola virus, targeting every corner of American mass consciousness.
The goals of the Trump campaign and Russian trolls coincided: to “suppress the African American vote.” Russian trolls aimed to confuse, distract, and ultimately prevent African Americans and other pro-Clinton voters from supporting her, spreading false information such as claims that Clinton received money from the Ku Klux Klan. Oxford researchers found that Black Americans received more Facebook and Instagram ads than any other group, with over 1,000 different messages reaching 16 million users. Fake accounts like Blacktivist, created by Russians, received 4.6 million “likes” and encouraged voters to support Green Party candidate Jill Stein or not vote at all.
Analysts called this an “immersive influence ecosystem,” where different pages supported each other, creating deep audience penetration from multiple sources. During elections, audiences crave more information, making them especially vulnerable. The result was a systemically organized information flow that no one could escape, even though everyone believed they were immune to influence.
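One way to make "penetration from multiple sources" concrete is to count how many users encountered the same operation through several apparently independent pages. The sketch below is illustrative only: the page names and audiences are invented, and real analyses of this kind run over millions of accounts.

```python
# Toy measurement of multi-source exposure within a coordinated set of
# pages; all names and audiences here are invented for illustration.
from collections import Counter

coordinated_pages = {
    "page_a": {"user1", "user2", "user3"},
    "page_b": {"user2", "user3", "user4"},
    "page_c": {"user3", "user5"},
}

# Count how many coordinated sources reached each user.
exposure = Counter(
    user for audience in coordinated_pages.values() for user in audience
)
multi_source = {user for user, n in exposure.items() if n >= 2}
reached = set.union(*coordinated_pages.values())

print(f"{len(multi_source)} of {len(reached)} reached users saw the operation "
      "from two or more apparently independent pages")
```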
Profiling, De-anonymization, and the Power of Data
Many court documents include Manafort’s emails recommending detailed “talking points” for Ukrainian politicians. Manafort wrote that he spent five hours with Yanukovych on election night, and recent articles identify Kilimnik as a GRU (Russian military intelligence) representative. Both Manafort and Kilimnik promoted the narrative that Ukraine, not Russia, interfered in the 2016 U.S. election, and that the “black ledger” of Manafort’s payments was fake. The Senate report states that Kilimnik “almost certainly helped organize the first public messages that Ukraine interfered in the U.S. election,” a narrative later repeated by President Trump and Rudy Giuliani.
Kilimnik had several nicknames, including “Kostya, the GRU guy” in Moscow and “Manafort’s Manafort” in Kyiv. His closeness to Manafort was due to his language skills, as Manafort did not speak Russian or Ukrainian and needed someone to interpret at meetings. Kilimnik also shared internal Ukrainian information with the U.S. embassy and, as it turned out, with the Italian embassy as well. Numerous articles describe Kilimnik as a Russian intelligence officer who was both a source and recipient of information on Ukrainian affairs, making him a channel for influencing U.S. politics.
The Senate report also mentions another Soviet-born figure, Aleksandr Kogan, who emigrated as a child and later worked with Cambridge Analytica and Lukoil. Kogan gave presentations in Russia and claimed that social media data could not effectively predict individual behavior, though he also stated in lectures that Facebook “knows more about you than anyone in your life.”
Russian researchers have published studies on linguistic models of stress, well-being, and dark personality traits in Russian Facebook texts, as well as on profiling the age and gender of online users based on their writing. These efforts amount to de-anonymizing Facebook texts to determine personal characteristics through linguistic analysis—a form of verbal recognition technology akin to facial recognition.
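To make the approach concrete, here is a minimal sketch of such a profiling pipeline: predicting a personal attribute from surface features of text. The texts and labels below are invented, and the cited studies work with Russian-language corpora and far richer linguistic features; this shows only the general shape of the technique.

```python
# A toy author-profiling pipeline: character n-gram features plus a
# linear classifier. Texts and labels are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "lol that meme is soooo funny, sharing now!!!",
    "omg can't wait for the weekend",
    "I read an interesting analysis of the election coverage today.",
    "The committee's report raises serious methodological questions.",
]
age_group = ["young", "young", "older", "older"]  # hypothetical labels

# Character n-grams capture spelling, punctuation, and morphology, the
# same surface cues stylometric profiling relies on.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(texts, age_group)
print(model.predict(["haha this is sooo true!!"]))  # likely ['young'] on this toy data
```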
KGB General F. Bobkov once wrote that his service successfully identified the authors of 90% of anonymous letters, often because the pool of suspects was small. He also noted, however, that every terrorist act was preceded by anonymous letters, and in those cases there were no such clues.
Technologies exploit a human weakness: on Facebook, people feel completely free and do not realize that modern methods are actively analyzing their “likes” to infer psychological traits. Likes and shares become a form of constant psychological testing, allowing people to be grouped by susceptibility to certain messages. People open their minds, and technology provides the keys. One might want to shout: “Guys, you’re visible and audible, even when you’re sitting in a dark room with your computer or smartphone!”
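The published research on likes-based prediction (notably Kosinski, Stillwell, and Graepel's 2013 study) follows roughly this recipe: represent each user as a sparse vector of page likes, compress it into latent dimensions, and regress a trait score on those dimensions. The sketch below runs that recipe on synthetic data; it is an illustration of the idea, not anyone's production system.

```python
# Toy likes-to-traits prediction: SVD over a user-by-page "likes"
# matrix, then a linear model from the latent space to a trait score.
# All data is synthetic; the planted structure stands in for the real
# correlations between likes and psychology.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_users, n_pages = 500, 2000

group = rng.integers(0, 2, n_users)            # hidden user type (unobserved)
p = np.full((n_users, n_pages), 0.02)          # baseline probability of a like
p[group == 1, :100] = 0.15                     # type-1 users favor a block of pages
likes = (rng.random((n_users, n_pages)) < p).astype(float)
trait = group + rng.normal(0.0, 0.3, n_users)  # trait correlates with hidden type

# Compress the likes into latent dimensions, then regress the trait on them.
latent = TruncatedSVD(n_components=40, random_state=0).fit_transform(likes)
model = Ridge().fit(latent[:400], trait[:400])
print("held-out R^2:", round(model.score(latent[400:], trait[400:]), 2))
```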
Conclusion
The main conclusion is that technology is increasingly prevailing over humanity, and the most effective steps are taken without regard for ethics, solely to achieve the desired result. The post-truth era disguises itself well, presenting itself as truth.
References
- Russia, China, Iran exploit George Floyd protests in U.S.
- Intelligence and Security Committee of Parliament (UK). Russia
- Newman L.H. The Russian Disinfo Operation You Never Heard About
- Dziedzic S. et al. Morrison Government plans to set up taskforce to counter online disinformation
- Madrigal A.C. Russia’s Troll Operation Was Not That Sophisticated
- Madrigal A.C. What Facebook Did to American Democracy
- Greenberg A. The Untold Story of NotPetya, the Most Devastating Cyberattack in History
- Greenberg A. How an Entire Nation Became Russia’s Test Lab for Cyberwar
- Greenberg A. Petya Ransomware Epidemic May Be Spillover From Cyberwar
- Report On The Investigation Into Russian Interference In The 2016 Presidential Election
- Allen J. et al. Cambridge Analytica’s effectiveness called into question despite alleged Facebook data harvesting
- Full transcript: Former Facebook ad targeting boss Antonio García-Martínez on Too Embarrassed to Ask
- Cafolla A. The whistleblower: Chris Wylie on fashion, culture wars & the alt-right
- ‘The Capabilities Are Still There.’ Why Cambridge Analytica Whistleblower Christopher Wylie Is Still Worried
- Oremus W. Did Facebook Really Charge Clinton More for Ads Than Trump?
- Trump and the weird attention economy of Facebook
- How reading online comments affects us
- The psychology of social media
- Radulova N. Fear and Loathing in Runet
- Report of the Select Committee on Intelligence, United States Senate, on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election. Volume 5: Counterintelligence Threats and Vulnerabilities
- Herb J. et al. Bipartisan Senate report details Trump campaign contacts with Russia in 2016, adding to Mueller findings
- Mazzetti M. et al. G.O.P.-Led Senate Panel Details Ties Between 2016 Trump Campaign and Russia
- Investigation: Konstantin Kilimnik—possible link between Manafort and Russian intelligence
- Court record: Konstantin Kilimnik attended Trump’s inauguration
- Pisnya N. New victim of Mueller’s investigation—Manafort’s “Russian brain” Konstantin Kilimnik
- Zholobova M. et al. Absolutely Soviet Man. Portrait of Konstantin Kilimnik, Russian patriot who worked for Trump’s circle
- Nemets A. Who is Konstantin Kilimnik?
- “GRU Agent” Kilimnik turned out to be an FBI informant and State Department agent
- AP published details of Manafort and Kilimnik’s cooperation
- McCabe D. G.O.P.-Led Senate Panel Affirms Russia Attacked Election, and Urges Action
- MacFarquhar N. Inside the Russian Troll Factory: Zombies and a Breakneck Pace
- Graham D. Trump’s ‘voter suppression operation’ targets black voters
- Swaine J. Russian propagandists targeted African Americans to influence 2016 US election
- Washington J. African-Americans see painful truths in Trump victory
- Bush D. Inside the Trump campaign’s strategy for getting Black voters to the polls
- Supplemental Motion in Limine Exhibits, July 25, 2018
- Konstantin Kilimnik
- Vogel K.P. et al. Russian Spy or Hustling Political Operative? The Enigmatic Figure at the Heart of Mueller’s Inquiry
- Stone P. Konstantin Kilimnik: elusive Russian with ties to Manafort faces fresh Mueller scrutiny
- Winter T. et al. Manafort associate is Russian spy, may have helped coordinate e-mail hack-and-leak, report says
- Vogel K.P. Manafort’s man in Kiev
- Schmidt M.S. Trump and Miss Moscow: Report Examines Possible Compromises in Russia Trips
- Solomon J. Key figure that Mueller report linked to Russia was a State Department intel source
- Phillips C. et al. Paul Manafort was ‘a grave counterintelligence threat,’ Republican-led Senate panel finds
- Henderson A. et al. 7 damning revelations from the new Senate report on Trump and Russia
- Cadwalladr C. et al. Cambridge Analytica: links to Moscow oil firm and St Petersburg university
- Cambridge Analytica’s Russia Connection
- Pinchuk D. et al. Academic in Facebook storm worked on Russian ‘dark’ personality project
- Lamond J. The Origins of Russia’s Broad Political Assault on the United States
- Hakim D. et al. Data Firm Tied to Trump Campaign Talked Business With Russians
- Cohen M. et al. Cambridge Analytica researcher touted data-mining in Russia speech
- Panicheva P. et al. Towards a linguistic model of stress, well-being and dark traits in Russian Facebook texts
- Panicheva P. et al. Lexical, Morphological and Semantic Correlates of the Dark Triad Personality Traits in Russian Facebook Texts
- Panicheva Polina Vadimovna
- Announcement of dissertation defense
- Litvinova T. et al. Profiling the Age of Russian Bloggers
- The Dark Side of Facebook
- Panicheva P. How to determine mood and schizophrenia by text. Podcast
- Litvinova T.A. Profiling the author of a written text
- Litvinova T.A. On the problem of establishing the characteristics of the author of a written text