Meeting the ‘Conceptual Family’: Propaganda, ‘Fake News’, Mis- and Disinformation

About the author: Florian Binder is a graduate Political Science student at the University of Potsdam. Having fallen in love with Georgia during his first stay in 2019, he is currently undertaking an exchange semester at Ivane Javakhishvili Tbilisi State University. While his research focus normally lies on Democratization & Autocratization, he is also fascinated by newer forms of hybrid warfare and the concepts surrounding them. Apart from studying, he is a co-host of the “Sociological Perspectives on the Corona Crisis” podcast within the scope of his work for the Social Sciences Center Berlin (WZB).

Why Do We Need to Talk about the Differences between Propaganda, ‘Fake News’, Mis- and Disinformation?

Throughout history, but especially over the last few years, few have been able to avoid encountering certain buzzwords: propaganda, misinformation, disinformation, and the newer and increasingly popular ‘fake news’. However, while scholarship is still debating the exact nature of these concepts (Bennett/Livingston 2018; Freelon/Wells 2020; Jackson 2017), the public and the media alike tend to use the terms interchangeably, likely by virtue of their similarities: they are, after all, members of the same ‘family’ of concepts used to describe the transmission of deceptive and/or misleading information. Understandable as this conflation may be, a better insight into the important distinctions between these terms is necessary in order to understand the different consequences of each phenomenon.

Therefore, I offer this insight by introducing the individual ‘members’ of this conceptual family, with a focus on the most dangerous one: disinformation. Doing so will help you become better acquainted with their respective idiosyncrasies and enable you to analyze their effects in a more sophisticated and nuanced manner.

Propaganda: Isn’t this place just wonderful? Aren’t we great? Aren’t the others just horrible?

The fact that propaganda has been around for hundreds (if not thousands) of years – think of WWI, or of poor Napoleon and the British propaganda that has forever cast him as short (not to mention the Nazi propaganda machine of WWII or Soviet propaganda manipulating its own population) – means that a certain degree of familiarity comes with the term. While there is no single definition of the concept, most scholars agree that it is distinct from disinformation because it is “designed to serve the interests [...] of the propagandist and their political masters” (Lanoszka 2019: 229) with the aim of “persuad[ing] its subject that there is only one valid point of view and eliminat[ing] all other options” (ibid.: 230). Following this train of thought, we can already see that propaganda does not need to be misleading. You should keep this in mind when considering the other concepts, so that you do not confuse them as other authors have. Additionally, if we follow the classical argument of Carr (1940), who emphasized that ‘good’ propaganda is “skewed information which comes as close to the truth as possible”, we realize that – in stark contrast to mis- and disinformation – most states utilize propaganda to some degree (Carr/Cox 2016 [1939]; la Cour 2020).

Misinformation: It’s misleading, but I am sorry!

So far, we have seen that propaganda can be quite ambiguous – sometimes a lie, sometimes a bent truth – making it a bit complicated to grasp. Misinformation, on the other hand, is likely the most straightforward member of the conceptual family, referring simply to the unwitting spread of information that is misleading or deceptive (Freelon/Wells 2020), meaning it can result from ignorance (Fallis 2014: 136; Shu et al. 2020: 2) without any malicious intent (Weedon et al. 2017: 5). For instance, if a friend of mine were to tell her co-workers that I am currently in Russia, while I am actually in Georgia (something that has happened to me), she would be spreading mis- but not disinformation, as she did not intend to mislead her co-workers about the sovereignty of Georgia or my current location.

‘Fake News’: I don’t know exactly what it is, but it’s not true!

Now before we get to disinformation, the centerpiece of this post, we need to address the concept that has captured the public eye ever since Donald Trump made it increasingly popular (even though, contrary to his own claims, he did not invent it) – ‘fake news’. While the person actually credited with popularizing the phrase defined the concept as false content, created to deceive and with an economic motive, the term has become more and more ambiguous, leading some authors to “suggest caution when adopting” (Bennett/Livingston 2018: 124) it. Nonetheless, some scholars still use it interchangeably with disinformation (Asmolov 2018), even though it very often “does not meet the definition” (Jackson 2017: 4). Therefore, it is best to treat ‘fake news’ as a social phenomenon (Tandoc et al. 2018) and utilize it with some caution, instead opting for the mother of all concepts in our ‘family’ – disinformation – when discussing intentionally misleading information.

Disinformation: It’s misleading, but that was the point!

Scholars agree that disinformation is “nothing new” (Fallis 2015: 402). But what they cannot seem to agree on is whether the term – derived from the Russian dezinformatsiya, which appeared in the 1920s (Rid 2017: 2) – was already in use in the West in the 1950s (Shu et al. 2020: 2) or only arrived there much later, around the 1980s (Mahairas/Dvilyanski 2018: 21). Fascinating as this discussion might be, it is not what is most important here. What you, the reader, need to know is that although there are various definitions of the concept (Fallis 2009; Floridi 1996; Skyrms 2010), they all share three important features:

A) “Disinformation is a type of information” (Fallis 2015: 404). Uncontroversial, I know! However, this means disinformation only needs to represent something as being a certain way, so it does not have to be a statement, but can come in various forms (Fallis 2014).

B) It is a kind of misleading information, used to create false beliefs. But this does not mean the information has to be false (Fallis 2015).

C) The most important feature of disinformation is that there needs to be intent behind its spread (even though the target, of course, might not actually be misled; Fallis 2015).

Nonetheless, we still need a good definition that shows how disinformation differs from the other concepts, and luckily, Fallis (2015) provides us with a great one: disinformation is “misleading information that has the function of misleading” (ibid.: 422), a phrasing that captures the intentional feature. Still, you need to keep in mind that spreading disinformation is a participatory endeavor! So if your aunt Karen ‘accidentally’ shares misleading information on Facebook, it can still be disinformation because of the original creator’s intent. This is important to keep in mind, as disinformation campaigns often make use of ‘unwitting agents’ who are not aware of their role in the systematic effort to mislead an audience.

Forms of and Opportunities for Disinformation: Trust me, I am definitely real, and this is valid information!

Traditionally, disinformation – often embedded in bigger frameworks like Soviet active measures – consisted of false texts, doctored photos and other forgeries (Kux 1985: 23f) and aimed to achieve its goals through so-called 4D (Dismiss, Distort, Distract and Dismay) strategies. The strategies have largely remained the same, but the advent of the internet, and especially social media, has led to more advanced forms of disinformation and has also increased the opportunities for states or organizations to hide their malign intentions while spreading it. Contributing to this is the fact that the transformation of the media has been accompanied by a “breakdown of trust in democratic institutions” (Bennett/Livingston 2018: 127), so that information intended to sow confusion and discord today often encounters social media users who are more than ready to share it and who already struggle with spotting falsehoods (Scheufele/Krause 2019).

Before sharing some good news about this concerning development, I want to give an example of one of the – in my opinion – most stunning acts of disinformation of recent years, one not related to well-known Russian campaigns like the “Lisa case” or anti-vaccine efforts, to give you a better understanding of how disinformation can work nowadays. Last year, Adam Rawnsley of the Daily Beast and others noticed something suspicious: many conservative American news outlets appeared to have fallen victim to an ingenious disinformation campaign. Not only did it turn out that some of their top freelance columnists were spreading false and skewed (propaganda) stories about Iran, Qatar and other Middle Eastern countries while praising the UAE – the columnists themselves were not real at all. While the use of so-called “sock puppets” – fake online identities disguising the intent of a ‘spreader’ (Bu et al. 2013) – or troll farms is not new, there was something special about this campaign: not only were the identities of the 19 different personas completely fictitious, but some of them were actually AI-generated. This example showcases not only how disinformation campaigns have entered a new era with AI coming into play, but also how disinformation does not need to be false content (some of the published articles contained true information): what makes it disinformation is, again, the intent of the people behind the campaign to deceive their audience about the true identity of the “columnists”.

These newer forms of disinformation, and the campaigns built on them, can be frightening, and we definitely should display great caution when dealing with states, organizations or even individuals that employ them. But there is some good news: some authors have pointed out that it remains unclear how effective and influential these campaigns actually are, pointing to multiple barriers – such as uncertainty over information from adversaries, preexisting prejudices, and countermeasures (Lanoszka 2019) – that may impede their success.

Therefore, while all of us should aim to acquire more precise knowledge about the subtle differences among the concepts in our ‘family’ – and I hope this blog post has contributed to yours – there is no reason to despair, as there are strategies and tools available to all of us for fighting (and possibly beating) disinformation. Still, especially those among you who might be interested in hybrid warfare or newer forms of media discourse study (but also everyone else) should keep the differences explained in this blog in mind, as you will inevitably come across disinformation as a central concept in these research fields.


Abrams, Zara. 2021. “Controlling the Spread of Misinformation.” https://www.apa.org.

Angry Planet. n.d. “ICYMI: Fake Journalists Are the Latest Disinformation Twist.” Retrieved 10.01.2022.

Asmolov, Gregory. 2018. “The Disconnective Power of Disinformation Campaigns.” Journal of International Affairs 71 (1.5): 69–76.

Beaujon, Andrew. 2019. “Trump Claims He Invented the Term ‘Fake News’ – Here’s an Interview With the Guy Who Actually Helped Popularize It.” Washingtonian (blog). October 2, 2019. Retrieved 08.01.2022.

Bennett, W Lance, and Steven Livingston. 2018. “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions.” European Journal of Communication 33 (2): 122–39.

Blakemore, Erin. 2020. “How Photos Became a Weapon in Stalin’s Great Purge.” HISTORY. Retrieved 07.01.2022.

Bu, Zhan, Zhengyou Xia, and Jiandong Wang. 2013. “A Sock Puppet Detection Algorithm on Virtual Spaces.” Knowledge-Based Systems 37 (January): 366–77.

Carr, Edward Hallett. 1940. Propaganda in International Politics. Reprinted. Oxford Pamphlets on World Affairs, no. 16. Oxford: Clarendon Press.

Carr, Edward Hallett, and Michael Cox. 2016. The Twenty Years’ Crisis, 1919-1939. London, United Kingdom: Palgrave Macmillan.

Cour, Christina la. 2020. “Theorising Digital Disinformation in International Relations.” International Politics 57 (4): 704–23.

Epatko, Larisa. 2017. “These Soviet Propaganda Posters Once Evoked Heroism, Pride and Anxiety.” PBS NewsHour. July 11, 2017. Retrieved 07.01.2022.

Fallis, Don. 2009. “A Conceptual Analysis of Disinformation,” February.

Fallis, Don. 2015. “What Is Disinformation?” Library Trends 63 (3): 401–26.

Floridi, Luciano. 1996. “Brave.Net.World: The Internet as a Disinformation Superhighway?” The Electronic Library 14 (6): 509–14.

Freelon, Deen, and Chris Wells. 2020. “Disinformation as Political Communication.” Political Communication 37 (2): 145–56.

Fürstenau, Marcel. 2020. “How the Nazis Used Poster Art as Propaganda.” Deutsche Welle. Retrieved 09.01.2022.

Hopper, Tristin. 2016. “Greatest Cartooning Coup of All Time: The Brit Who Convinced Everyone Napoleon Was Short.” National Post, April 28, 2016. Retrieved 06.01.2022.

Jackson, Dean. 2017. “Issue Brief: Distinguishing Disinformation from Propaganda, Misinformation, and ‘Fake News.’” National Endowment for Democracy, October 17, 2017. Retrieved 12.12.2021.

Jowett, Garth, and Victoria O’Donnell. 2019. Propaganda & Persuasion. Seventh edition. Los Angeles: SAGE.

Fallis, Don. 2014. “The Varieties of Disinformation.” In The Philosophy of Information Quality, edited by Luciano Floridi and Phyllis Illari, 135–61. Synthese Library. Cham: Springer International Publishing.

Keller, Franziska B., David Schoch, Sebastian Stier, and JungHwan Yang. 2020. “Political Astroturfing on Twitter: How to Coordinate a Disinformation Campaign.” Political Communication 37 (2): 256–80.

Kux, Dennis. 1985. “Soviet Active Measures and Disinformation: Overview and Assessment.” The US Army War College Quarterly: Parameters 15 (1).

Lanoszka, Alexander. 2019. “Disinformation in International Politics.” European Journal of International Security 4 (2): 227–48.

Mahairas, Aristedes, and Mikhail Dvilyanski. 2018. “Disinformation – Дезинформация (Dezinformatsiya).” The Cyber Defense Review 3 (3): 21–28.

Meister, Stefan. 2016. “NATO Review - The ‘Lisa Case’: Germany as a Target of Russian Disinformation.” NATO Review. July 25, 2016. Retrieved 05.01.2022.

Mejias, Ulises A, and Nikolai E Vokuev. 2017. “Disinformation and the Media: The Case of Russia and Ukraine.” Media, Culture & Society 39 (7): 1027–42.

Owen, Laura Hazard. 2019. “Old People Are Most Likely to Share Fake News on Facebook. They’re Also Facebook’s Fastest-Growing U.S. Audience.” Nieman Lab (blog). 2019. Retrieved 07.01.2022.

Rawnsley, Adam. 2020. “Right-Wing Media Outlets Duped by a Middle East Propaganda Campaign.” The Daily Beast, July 6, 2020, sec. Media. Retrieved 06.01.2022.

Rid, Thomas. 2017. “Disinformation: A Primer in Russian Active Measures and Influence Campaigns.” Hearing Transcript. Hearings Before the Select Committee on Intelligence, United States Senate. Washington, DC: United States Senate.

Salmon, Natasha. 2017. “Donald Trump Takes Credit for Inventing the Word ‘Fake.’” The Independent. October 8, 2017. Retrieved 05.01.2022.

Scheufele, Dietram A., and Nicole M. Krause. 2019. “Science Audiences, Misinformation, and Fake News.” Proceedings of the National Academy of Sciences of the United States of America 116 (16): 7662–69.

Shu, Kai, Amrita Bhattacharjee, Faisal Alatawi, Tahora H. Nazer, Kaize Ding, Mansooreh Karami, and Huan Liu. 2020. “Combating Disinformation in a Social Media Age.” WIREs Data Mining and Knowledge Discovery 10 (6): e1385.

Skyrms, Brian. 2010. Signals: Evolution, Learning, and Information. Illustrated edition. Oxford; New York: Oxford University Press.

Snegovaya, Maria. 2015. “Putin’s Information Warfare in Ukraine: Soviet Origins of Russia’s Hybrid Warfare.” Washington, D.C.: Institute for the Study of War.

Steiner, Zora. 2018. “Use of Propaganda in World War I Postcards.” Retrieved 04.01.2022.

Tandoc, Edson C., Zheng Wei Lim, and Richard Ling. 2018. “Defining ‘Fake News.’” Digital Journalism 6 (2): 137–53.

Vanian, Jonathan. 2021. “Russian Disinformation Campaigns Are Trying to Sow Distrust of COVID Vaccines, Study Finds.” Fortune. July 23, 2021. Retrieved 11.12.2021.

Weedon, Jen, W. Nuland, and Alex Stamos. 2017. “Information Operations and Facebook.” Facebook.

From series of MCERC Academic Blogs on "Information Warfare and Security" created by Georgian and International students.