Steven Livingston
January 22, 2019

In April 2018, George Washington University’s School of Media and Public Affairs and the Elliott School of International Affairs, Harvard University’s Carr Center for Human Rights Policy, and the World Bank Group sponsored a two-day conference on disinformation, human rights, and peacebuilding.1 The conference, named “Contentious Narratives,” brought together experts from academia, non-governmental organizations, think tanks, and journalism. This special issue of the Journal of International Affairs is an outcome of that conference; in it we focus on disinformation’s effects on liberal institutions and norms. In this introductory article, we consider the nature of digital disinformation and introduce the contributors.

Well before the Brexit referendum in June 2016 and the American elections in November 2016, disinformation campaigns were clearly evident to careful observers. In May 2008, Russia sent hundreds of unarmed troops to the Georgian breakaway region of Abkhazia under the pretense of repairing railroads. Soon, thousands of additional Russian soldiers, fighter aircraft, and elements of the Black Sea Fleet were positioned along Russia’s border with Georgia. In August 2008, when Georgian President Mikheil Saakashvili sent troops into the separatist South Ossetia region, Russia launched airstrikes and a ground invasion that quickly extended into parts of Georgia itself. Robert Kagan notes that “the Russo-Georgian war established a modus operandi that [Russian President Vladimir] Putin would employ against Ukraine almost exactly six years later. In both cases, the Russian attack was preceded and accompanied by extensive cyberwarfare and ‘fake news.’"2

Russia’s 2014 annexation of Crimea and its strange noninvasion-invasion of Ukraine were also wrapped in deception. Despite clear evidence to the contrary, Russia claimed it had no military units in Ukraine.3 As implausible as the assertion was, it followed a certain logic: by denying the obvious, Putin created a unifying narrative at home and a professional dilemma for many American and European news organizations. According to their professional norms, in most instances Western journalists are obligated to report both sides of a story. Doing so here meant including the Kremlin’s implausible denials about the otherwise obvious Russian invasion of Ukraine.

It also set the stage for long-term framing of the Russo-Georgian War of 2008. In August 2018, RT, the Russian state television channel, ran a story entitled “10 Years Since Georgia Attacked South Ossetia and Russia – Not the Other Way Around” in which Georgia was the aggressor and Russia the victim.4 This fits a pattern identified by Timothy Snyder: “Soviet communism was a politics of inevitability that yielded to a politics of eternity. Over the decades, the idea of Russia as a beacon for the world gave way to the image of Russia as a victim of mindless hostility.”5 The politics of inevitability is a habit of mind that understands progress as inevitable and self-evident. The politics of eternity, conversely, “places one nation at the center of a cyclical story of victimhood. Time is no longer a line into the future, but a circle that endlessly returns the same threats from the past."6

In this context, disinformation is an element of a broader strategy, varyingly referred to as nonlinear or hybrid warfare, the Gerasimov Doctrine, or what Hans Klein in this issue calls the Gerasimov narrative.7 It involves the use of trolls, bots, and state media to propagate destabilizing narratives or to deflect blame; it also involves doxing—the hacking of compromising information that is used to feed disinformation. These methods weaponize the openness and deliberately self-critical inclinations of Western democratic society.

There are at least two types of disinformation. One involves attacks on social cohesion that are intended to deepen social distrust. The other is more tactical and focused on undermining truth-claims. In this sense, it is also supportive of broader authoritarian strategic objectives. Facts are what the leader says they are; everything else is “fake news."8

Social cohesion involves shared beliefs among citizens and the sense that they are a part of a common moral community, one that encourages reciprocal trust.9 In simple daily routines, such as taking a bus, one must “trust in the abilities of the experts who invented the bus, in those of the unknown bus driver, and those of all the other unknown drivers on the road."10 Modern society is filled with complexities that demand reliance on assumed competencies and shared norms. By exaggerating existing social divisions and amplifying threats, disinformation can erode social trust and social cohesion. Presumed and exaggerated threats posed by immigrants, guns (their presence or their confiscation), Islam, and LGBT rights are common themes. Gun violence in general has been a favorite troll and bot target. For example, the February 2018 Parkland school shootings were followed by an immediate increase in Russia-linked Twitter accounts pushing shooting-related hashtags and topics.11 The same pattern was also found following the 2017 Las Vegas shooting.12 In this form, disinformation does not necessarily involve falsehoods or “fake news” as much as it does exaggeration and hyperbole.

A second type of disinformation, more tactical and calculated, comes in response to informational challenges presented by human-rights investigators, independent journalists, and foreign government inquiries into Russian or Russian-allied war crimes. It involves the propagation of sometimes dozens of contradictory explanations for the same event. Humans tend to reject contradictory narratives, either by turning attention away from dissonant information or by simplifying it. The former is called selective exposure, the latter motivated reasoning.13 Heavy cognitive loads can also increase stereotyping as a simplification strategy.14 This is related to what psychologists call the “fundamental attribution error,” the tendency to ascribe behavioral outcomes to personal character rather than to circumstance.15 Whatever the psychological explanation, the outcome is the atrophy of political agency and interest.

In this special issue of the Journal of International Affairs, our contributors offer a rich array of ideas about the nature of disinformation and suggestions for managing the phenomenon. Ambassador John Shattuck offers a cogent review of the rise of illiberal regimes, the central geopolitical feature of the assault on reason and facts found in disinformation. “The Orban model,” referring to Hungarian Prime Minister Viktor Orban’s illiberal regime, “is now being copied across Europe by illiberal movements in Poland, the Czech Republic, Romania, Austria, Italy, France, Britain and the Netherlands.” In the next section, Samantha Bradshaw and Phil Howard remind us that disinformation is not just a Russian practice as they review government actors across regime types. Douglas Guilbeault then urges us to think broadly when considering the nature of disinformation. He argues that disinformation technologies have harnessed the influence architectures already built into profit-maximizing social-media platforms. Christina Fink points to the responsibility of social-media platforms for propagating violence. Again, our attention is broadened beyond the Russian disinformation narrative. Emma Flaherty and Laura Roselle most directly address the idea that disinformation is meant to erode social cohesion. They find that because Brexit was inherently corrosive to the European Union, question-raising narratives were directed instead to news topics more useful for undermining the European liberal order. Natalia Chaban and Ben O’Loughlin note that “disinformation skirmishes” revolve around larger embedded narratives that societies and their leaders hold about themselves and their identities. Again, social cohesion is at play.

Leading the next group of articles in a section on disinformation in conflict, Gregory Asmolov points to the broader nature and objectives of disinformation. He argues that a major purpose of disinformation is to sabotage horizontal connections among nonstate actors: it is an assault on social cohesion. This strengthens the state’s capacity to construct an image of an external enemy. By breaking the bonds among social-network ties, disinformation and counter-disinformation open spaces filled by state narratives about enemies and victimhood, as Snyder emphasized. Rachel Brown and Laura Livingston bring a practitioner’s eye to questions about how technologies affect conflicts and images of the “other.” Drawing on election-related violence prevention in Kenya, Brown and Livingston offer a framework for thinking about how to minimize risks while seizing opportunities. Ben Nimmo addresses similar issues concerning a dispute between Persian Gulf powers, where Twitter bot campaigns distorted the online debate by artificially inflating some hashtags.

Our final tranche of articles looks at disinformation theory and practice. Michael Bossetta argues that targeted spear-phishing attacks are a key component of contemporary influence operations. Relative to the rest of the articles in this issue, Matthew Flynn takes a refreshingly contrarian view, arguing that network connectivity empowers online activism and challenges authoritarian regimes. States promoting openness online enjoy a permanent, asymmetrical, strategic advantage in cyberspace. Michael Jensen understands disinformation in terms of divisive politicization of identities. Trolls hope to amplify narratives already found in American debates and use them to undermine cohesion. Matthew Levinger offers an argument aligned with Flaherty and Roselle, as well as Chaban and O’Loughlin: disinformation campaigns gain credibility by leveraging “master narratives” of national decline and rebirth. This once again also echoes Snyder’s ideas about the politics of eternity. In the closing article, Hans Klein offers an important corrective to how we speak of the Gerasimov Doctrine. It is better understood as the way Russian experts explain the Arab Spring in the Middle East and the Color Revolutions in Russia’s near-abroad. Russian analysts explain the breakdowns of order in various societies as the fruits of democracy promotion campaigns by rival states allegedly intent on regime change.

Together, these articles help crystallize many of the complex themes that surround disinformation. But what stands out most of all, perhaps, is the question of social cohesion. In the end, the solution to disinformation might be more domestic and less about bots, trolls, and state-sponsored media. The solution might be found in ourselves and in the way we conduct our own politics.

Steven Livingston is a professor of Media and Public Affairs and International Affairs at George Washington University. He holds appointments in the School of Media and Public Affairs and the Elliott School of International Affairs. He is also a senior fellow at the Carr Center at the Harvard Kennedy School at Harvard University. He has had recent fellowships or visiting appointments at the Freie Universität Berlin, the University of Canterbury in New Zealand, the University of St. Gallen in Switzerland, and the Brookings Institution in Washington, DC. He serves on the Scientific Freedom and Responsibility Committee of the American Association for the Advancement of Science and on the Technology Advisory Board of the International Criminal Court at The Hague. With Gregor Walter-Drop, Livingston edited Bits and Atoms: Information and Communication Technology in Areas of Limited Statehood (Oxford University Press, 2014). Among other publications, including approximately 50 articles and chapters, he has written Africa’s Evolving Infosystems: A Pathway to Security and Stability (NDU Press, 2011) and Africa’s Information Revolution: Implications for Crime, Policing, and Citizen Security (NDU Press, 2013).


1 Contentious Narratives Conference (Washington, DC: George Washington University, 2–3 April 2018),

2 Robert Kagan, “Believe it or Not, Trump’s Following a Familiar Script on Russia,” Washington Post, 8 August 2018, right-into-putins-hands/2018/08/07/c1aec698-9a60-11e8-b60b-1c897f17e185_story.html?utm_term=.476203c1cc7e.

3 For evidence to the contrary, see “Satellite Imagery Assessment of the Crisis in Crimea, Ukraine, Part One: Sevastopol” (Geospatial Technologies and Human Rights Project, American Association for the Advancement of Science, April 2014); “Satellite Imagery Assessment of the Crisis in Crimea, Ukraine, Part Two: Border Deployments” (Geospatial Technologies and Human Rights Project, American Association for the Advancement of Science, April 2014).

4 “10 Years Since Georgia Attacked South Ossetia and Russia – Not the Other Way Around,” RT, 8 August 2018,

5 Timothy Snyder, The Road to Unfreedom: Russia, Europe, America (Crown/Archetype, Kindle Edition, 2018), 33.

6 Ibid.

7 Abby Norman, “Welcome to the Age of Digital Warfare,” Futurism, 8 December 2017; Mark Galeotti, “I’m Sorry for Creating the ‘Gerasimov Doctrine,’” Foreign Policy, 5 March 2018.

8 Jason Schwartz, “Trump’s ‘Fake News’ Mantra a Hit with Despots,” Politico, 8 December 2017.

9 Christian Albrekt Larsen, The Rise and Fall of Social Cohesion: The Construction and De-construction of Social Trust in the USA, UK, Sweden and Denmark (Oxford: Oxford University Press, 2013).

10 Christian Albrekt Larsen, “Social Cohesion: Definition, Measurement and Developments” (2014),

11 Erin Griffith, “Pro-Russian Bots Flood Twitter After Parkland Shooting,” Wired, 15 February 2018,

12 Tim Johnson and Greg Gordon, “A Familiar Pattern? Russian Trolls on Twitter ‘Rile’ Americans on Gun Violence,” McClatchy DC Bureau, 22 February 2018, nation-world/national/national-security/article201646339.html.

13 Milton Lodge and Charles Taber, “Three steps toward a theory of motivated political reasoning,” Arthur Lupia, ed., Elements of Reason: Cognition, Choice and Bounds of Rationality (New York: Cambridge University Press, 2000).

14 Monica Biernat, Diane Kobrynowicz, and Dara L. Weber, “Stereotypes and Shifting Standards: Some Paradoxical Effects of Cognitive Load,” Journal of Applied Social Psychology 33, no. 10 (October 2003), 2,060–2,079.

15 Anouk Rogier and Vincent Yzerbyt, “Social Attribution, Correspondence Bias, and the Emergence of Stereotypes,” Swiss Journal of Psychology 58, no. 4 (1999), 233–240.