The Russian invasion of Ukraine has been full of surprises, starting with Vladimir Putin’s blatant lies that he is not making war on the country. The Ukrainians’ heroic resistance has changed the conversation about declining enthusiasm for democracy, and Western allies have gone from chronic disunity to concerted resolve. For the first time in its 72-year history, NATO mobilized its rapid response force.
Something else has also been upended: the scale and effects of propaganda. And here we have grounds for worry as we enter a new era of international tension that will have grave consequences for the future of democracy around the world.

The idea of institutionalized propaganda has a shorter history than baseball or the typewriter; the latter was invented in the mid-19th century.
The 1911 edition of Encyclopaedia Britannica had no entry for “propaganda.” The Great War changed this. The editors concluded that censorship, war bias, and the “perversion of fact” during that conflict had “cut a Grand Canyon gash in the whole intellectual structure of the world.” In a supplement they felt obliged to publish, “propaganda” ran nearly ten pages of small, densely packed type.
In the decades since, propaganda has been a systematic, pervasive feature of modern government, in both war and peacetime. But in recent years, we have reached an inflection point for a new series of changes.
Propaganda was once a tool of warfare, like artillery and airplanes. Now it is a battlefield unto itself. This change has crested with the Russian invasion and sets a pattern that will continue long after the war is over.
First is the collapse of time, place, and accountability. The infamous “Great Moon Hoax,” originating in the New York Sun in 1835, reached Europe months later. Thanks to the Internet and social media, a fake news item can now span the entire planet in the time it takes to hit “send.”
Not only does this speed make the Internet a potent propaganda weapon; it is also incredibly cost-effective. Tanks and bombers are expensive. Computers are cheap and require little training to use perniciously. They allow for automation through the use of bots – software robots that post content automatically.
In the case of the moon hoax, the culprit – the New York Sun – was obvious. Russia’s web brigades, as its bot units are called, can cloak their attacks even as they “flood the zone.” A 2016 RAND report called Russia’s approach the “firehose of falsehood” model. Ukrainian security services believe that a Russian bot farm has created around 7,000 fake accounts to post about the war, but it is hard to know for certain – which is precisely the point.
A second set of changes lies in propagandists’ relationship to truth. In the case of this war against Ukraine, Putin has unflinchingly pushed a narrative that is, on its face, preposterous. He says repeatedly that Ukraine, a functioning democracy with a Jewish president, is a neo-Nazi state. Up until the moment Russian troops crossed into Ukraine, he claimed they had no intention of making war. Even now, Putin calls the invasion a “special military operation.” His ambassador to the United Nations categorically denied that civilians were targeted even as television footage showed bombed-out hospitals and apartment buildings.
Many Russians believe Putin, even when their relatives in Ukraine phone with horror stories of what is happening to them. We might take some solace from the fact that Russian propaganda seems not to be affecting most countries, though the influence on places like India and China is unclear. Even the most pro-Putin politicians in the United States seem to be changing their tune. But this misunderstands a fundamental aspect of what is taking place and what is so dangerous about the Russian approach.
In the past, the aim of the propagandist was to get people to reach a certain specific conclusion. The most effective propaganda was attached to some facts, as that enhanced the plausibility of the message. The approach was to reduce cynicism about the propagandists and build confidence in their messages and social coherence. Now the end goal is to sow corrosive cynicism and confusion.
This is particularly pernicious because it can be used to make a direct attack on democracy, which depends on public confidence to succeed. A prime example is Russia’s attempt to undermine the 2016 election, not to pick a winner but to pit voters against each other on the basis of race, gender, and ideology in an attempt to erode Americans’ faith in the validity of the system.
Russia’s behavior in Ukraine and elsewhere in recent years should be a wakeup call with regard to the dangers of propaganda. Here are five steps to help recognize and counter it.
First, we need to recognize that combatting propaganda is a priority. The Annual Threat Assessment by the U.S. Intelligence Community, released last Tuesday, observed that new technologies “are disrupting longstanding systems and societal dynamics, forcing individuals, communities, and governments to adjust and find new ways of living, working, and managing.” But the report does not go nearly far enough, focusing almost exclusively on military operations.
Second, muckraking journalists used to say that the best cure for bad publicity by special interests was good publicity that shows up the lies. Similarly, the best antidote for bad propaganda is good counter-propaganda that sticks to facts. The Biden administration’s transparency before the Russian invasion exemplified this approach.
Third is the matter of supporting independent media, which is crucial for the provision of verifiable information. There has been a heartening outpouring of private assistance for independent media in Ukraine, but public funding is essential to achieve the necessary scale. On the U.S. side, outlets like RFE/RL need to maintain their approach of modeling fair, fact-based reporting, rather than being weaponized into propaganda vehicles, as some have done.
Fourth, social media platforms need to create greater capacity to respond to, and enforce, their terms of service, which would mean changing how they adjudicate content on their platforms. And there needs to be a broader discussion about profiting from propaganda. Why did it take a full-scale invasion of Ukraine in 2022 for platforms to stop recommending the Kremlin-friendly Russia Today and allowing it to receive advertising revenue? Why was the invasion of Crimea in 2014 not enough?
Fifth, individuals need better tools to be able to discern potential propaganda. One crucial aspect in this and other recent wars is the use of fake or repurposed videos and images purportedly showing the conflict. Platforms like TikTok have algorithms that often circulate inauthentic videos to millions of people from accounts newly created or repurposed to profit from war. While rethinking algorithms is one important aspect, another is to enable simpler authentication of images, including easier reverse-image searches.
Since World War I, propaganda has insinuated itself into our lives. But that does not mean we cannot find better ways to dig ourselves out of the mess it creates.
John Maxwell Hamilton is a global fellow with the Wilson Center, serves on the faculty of Louisiana State University, and is the author of “Manipulating the Masses: Woodrow Wilson and the Birth of American Propaganda” (2020). Heidi Tworek, Canada Research Chair and associate professor at the University of British Columbia, is the author of “News from Germany: The Competition to Control World Communications, 1900–1945.” Originally published by RealClearPolitics. Republished with permission.