Russian Trolls and Fake News: Information or Identity Logics?
This article examines Russia’s use of misinformation to influence the 2016 U.S. election by reviewing the content of one of the largest repositories of publicly available Twitter data from Russia’s Internet Research Agency (IRA). The article focuses on assessing whether Russia’s communication strategy appealed to an informational logic of false information or to an identity logic. The evidence shows that Russia, through the IRA, was primarily invested in identity rather than informational claims, particularly given that the bulk of Russia-linked communications sought to define then-candidate Donald Trump’s identity and those of his opponents. The article concludes by identifying where future research is needed.
In the immediate aftermath of the 2016 presidential election in the United States, “fake news” became the story. During the closing days of the election, social media disseminated more stories containing misinformation—claims based on false factual statements—or information from dubious sources than well-sourced news.1 Before President Donald Trump took office, the U.S. Director of National Intelligence issued a report assessing that social media played a key role in executing Russia’s campaign to undermine candidate Hillary Clinton and boost Trump’s chances of winning the election.2 If Russia’s use of social media had been limited to the diffusion of fake news as a primary strategy, its tactics would be hard to square with these aims. In informational terms, the factual inaccuracy of fake news would seem to undermine its diffusion and uptake, yet evidence shows that fake news spreads faster than the truth. This article examines the content of Russia’s influence operation to understand the logic of its strategy.
Modern democracy has been about the selection of representatives based on the proximity of parties and candidates to the policy preferences of citizens, and making correct choices depends on obtaining accurate information.3 However, new evidence suggests that social differentiation, which had been the basis of preferences, is increasingly politicized and that vote choice is more a function of political identity than of political preferences.4 While the term fake news suggests informational or epistemic deficits, to the extent that these communications are not just factually deficient but also seek to manipulate political identities, the consequences for democratic politics may be far more damaging. First, the covert nature of these communications means they are manipulative, as the sources and aims of the communications remain hidden.5 Second, such manipulation undermines the development of organic political narratives from within a population because it politicizes and colonizes social identities.6 Third, the politicization of identities can undermine capacities for collective problem-solving, as social divisiveness prevents persons from coming together to address common problems.7
These factors are amplified when the source of a fake-news campaign is a foreign adversary. Digital networks and social media greatly increase the capacities of a foreign adversary to conduct influence operations on an ongoing basis and considerably lessen the political risks, because such operations can be run beyond the reach of domestic law enforcement.8 Although foreign influence campaigns are by necessity public in terms of the communications that are unleashed on a target, they often involve “cut-outs”—persons not directly linked to the instigator—who provide varying degrees of deniability for a perpetrator. This fact can undermine traditional deterrence to the extent that the chain of culpability remains ambiguous and strategies for effective retribution uncertain.9
Intentionally unreliable information sources would normally seem unbelievable, which should undermine their use. Nonetheless, factually untrue claims circulate faster on social media than truths, and dubious sources had greater uptake on social media than established, professional outlets during the 2016 U.S. election.10 To understand how Russia interfered in the election, this research looks at the Twitter “troll” accounts that operated out of the Internet Research Agency (IRA), the subject of a 2018 indictment by Special Counsel Robert Mueller’s investigation.11 It examines whether these communications appeal to an informational logic of false information or to an identity logic. The evidence shows that they were primarily invested in identity rather than informational claims.
Fake News: Information and Identity Logics within Political Communication
The term fake news is highly contested; the academic literature contains at least 34 attempts to define it.12 Here we define it in informational terms, as disinformation. That is, “intentional falsehoods spread as news stories... to advance political goals.”13 As an informational problem, fake news is hardly new.
From the earliest days of political parties in the United States, newspapers closely aligned with political parties pushed propaganda, and concerns about “yellow journalism” and fake news surfaced throughout the nineteenth and early twentieth centuries.14,15
The nature of fake news has important democratic consequences that necessitate identifying effective strategies to combat it. Access to accurate information has long been recognized as critical for voters to make “correct” decisions.16 Information is critical to political decision making because it provides “data about the current developments in the status of those variables which are the objects of contextual knowledge” so people may successfully act.17 Voters are thought to make logically correct choices to the extent they select candidates whose positions are closest to their own based on the information available. If the IRA troll accounts are attempting to undermine the U.S. democratic process by tricking voters into making choices predicated on incorrect information, we would expect the following two implications:
1. these accounts will present themselves as news outlets to position themselves as believable sources of information; and
2. the content of these accounts will make incorrect claims about policy positions and current states of affairs, thereby undermining the connection between preferences and choices.
In contrast to understanding communications as a medium for information transfer, research in both constructivist international relations theory and information warfare suggests that communications play a pivotal role in shaping situations and identities. The IRA termed its social media operations during the 2016 U.S. campaign “information warfare,”18 which uses “information [as] an enabler, a ‘source multiplier,’ a tool that increases one’s ability to shape the operational environment.”19 Historically, the Soviet Union and now Russia have sought to engage in what they term “reflexive control,” which changes decision parameters, and thus preferences and the manner in which persons identify with a situation, moving targets “into making a predetermined decision and action.”20 In that sense, influence operations have worked on identities rather than on the purely informational order implied by the term “fake news.”
Identities are “a source of meaning and experience.”21 Concern for self-identity stems from a need for ontological security, allowing persons to connect and endow their lives with meaning—a continuity that can become endangered in the face of a changing external environment.22 Russian influence operations often play on these vulnerabilities by creating conditions of existential anxiety to realign political identifications.23 Identities define relationships, indicating both who or what one is identified with and identified against. Underlying the range of activities from persuasion to propaganda and manipulation, there is a rhetorical process at work wherein persons come to identify or fail to identify with objects, events, and actors in the political system; identify their preferences, or fail to, with candidate and policy positions; and even reorder their preferences such that they come to believe their preferences are conjoined or not, independent of facts.24 Today, American identities are increasingly fragile and politicized.25
Within an identity logic, communications seek to shape a target’s disposition toward objects, events, and other agents with a moralized inflection so as to induce action or attitudes to take action, regardless of truth.26 In a world where trust in institutions and the media is declining, people tend to trust those whom they identify as being like themselves more than professional and governmental sources.27 On social media, Russia conducts influence operations using traditional spycraft methods adapted for the digital environment: assets—in this case citizens—are targeted online at first through innocent conversation to build rapport before attempts are made to move their beliefs about political entities and events.28 We draw two empirical indicators of the identity logic:
1. early communications will focus more on rapport building than targeting political objects and actors; and
2. communications will predominantly focus on defining actors in moralizing terms rather than states of affairs.
Data and Methods
The analyses are based on a dataset of 203,482 tweets created between 14 July 2014 and 26 September 2017, around the time when IRA employees learned they were under investigation by the FBI.29 Figure 1 contains the distribution of these tweets broken down by type. Although Twitter suspended these accounts and their data are otherwise irretrievable, NBC News received a dataset of these tweets, which it posted online.30 This is one of the largest known public datasets of IRA tweets. The accounts were identified using a list of 2,752 Twitter profiles that were exhibits in a public hearing held by the United States House Permanent Select Committee on Intelligence. The profiles were linked to the IRA by Twitter based on a variety of internal metrics that have not been made public.31
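To make the dataset preparation concrete, the following is a minimal sketch of the filtering step. The file name and the column names are hypothetical stand-ins for whatever the published NBC News dump actually uses; only the date window comes from the text above.

```python
# A minimal sketch of the dataset preparation step. The file name and the
# column names (user_key, created_str, text) are hypothetical stand-ins
# for whatever the published NBC News dump actually uses.
import pandas as pd

tweets = pd.read_csv("ira_tweets.csv", parse_dates=["created_str"])

# Restrict to the study window: 14 July 2014 through 26 September 2017.
window = tweets[
    (tweets["created_str"] >= "2014-07-14")
    & (tweets["created_str"] <= "2017-09-26")
]

print(f"{len(window):,} tweets from {window['user_key'].nunique()} accounts")
```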
This article measures the role of these accounts in pushing information and identity content in two ways. First, we mined Twitter profiles for references to “news,” “journalism,” “journalist,” or “reporter,” and coded tweets for whether they mentioned any of the candidates that U.S. authorities had identified as targets of IRA support or opposition, matching first or last names and Twitter handles. Second, we used a Latent Dirichlet Allocation (LDA) model to examine the topics of tweets produced between 1 October and 8 November 2016, dates inclusive, as that represented the most intense period of activity primarily concerned with the campaign.32 We selected a 26-topic solution because it was the most substantively interpretable and it optimized metrics of semantic coherence, topic exclusivity, and held-out likelihood. Informational claims report on policy positions and states of affairs, the informational terms thought, in varying degrees, to move voters.33 Therefore, information was operationalized as claims about policy positions or the status of contextual aspects relevant to a policy domain. We identified identity logics as communications that sought to define actors—collective or individual—with a moral disposition, which would activate an attitude or induce action.
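For illustration, the sketch below approximates both measurement steps in scikit-learn. The selection metrics named above (semantic coherence, exclusivity, held-out likelihood) point to tooling such as R’s stm package, so this Python version is a rough analogue rather than the authors’ pipeline; it substitutes held-out perplexity for the held-out likelihood metric, and the file and column names are hypothetical.

```python
# A rough analogue of the two measurement steps using scikit-learn. File
# names (ira_profiles.csv, ira_tweets_oct_nov.csv) and column names
# (description, text) are hypothetical.
import re
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.model_selection import train_test_split

# (1) Flag profiles that present themselves as news producers.
NEWS_RE = re.compile(r"\b(news|journalism|journalist|reporter)\b", re.IGNORECASE)
profiles = pd.read_csv("ira_profiles.csv")
profiles["news_outlet"] = profiles["description"].fillna("").str.contains(NEWS_RE)
print(f"share posing as news: {profiles['news_outlet'].mean():.1%}")

# (2) Topic-model the 1 October - 8 November 2016 tweets.
docs = pd.read_csv("ira_tweets_oct_nov.csv")["text"].fillna("")
vectorizer = CountVectorizer(stop_words="english", min_df=5)
dtm = vectorizer.fit_transform(docs)
train, held_out = train_test_split(dtm, test_size=0.1, random_state=0)

lda = LatentDirichletAllocation(n_components=26, random_state=0).fit(train)

# Held-out perplexity as a crude stand-in for the held-out likelihood metric.
print(f"held-out perplexity: {lda.perplexity(held_out):.0f}")

# The nine highest-weighted terms per topic, as reported in Figure 3.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = ", ".join(terms[i] for i in weights.argsort()[::-1][:9])
    print(f"topic {k:2d}: {top}")
```

In practice, one would fit models across a range of topic counts and compare these diagnostics before settling on a solution, as the article does in selecting 26 topics.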
Findings
If the primary aim of the Russian troll accounts was to push fake news, we would expect them to present themselves as news outlets, independent of whether they were pushing factually incorrect claims about policy or sought to define candidate identities. Only 7.5 percent (34 of 454) of accounts included some mention of news production in their user profiles, usually local news, with which audiences may identify more readily than with national news. Second, we analyzed tweets for candidate mentions. These findings are presented in Figure 2.
Figure 2 shows that candidate mentions were limited early on, becoming central only from August 2016 through the end of the campaign. This suggests a move after the national political conventions to begin defining candidates and their alternatives: Senator Bernie Sanders received more attention after he was out of contention than before, and Green Party nominee Jill Stein received more mentions than even Sanders. The bulk of the messaging was about then-candidate Donald Trump, followed by Hillary Clinton. In contrast, much of the communication between 2014 and early 2015 focused on aphoristic quotes and discussion of popular culture rather than political topics.
We used topic models to analyze the contents of these messages. Figure 3 presents the nine most frequent terms for each topic.34 None of the topics contain terms mentioning trade, jobs, health care, immigration, or the other policy areas raised during the campaign. We examine the first five topics in detail, as they play off each other and highlight the wider Russian strategy during the election. The most prevalent topic referred to the FBI investigations into Clinton’s emails and the Clinton Foundation and to reporting of scandals related to the Russian-hacked emails distributed by WikiLeaks. This topic merges allegations of Clinton scandals, painting her as someone beset by corruption and criminality. The next topic concerns the horserace of the campaign and includes claims that the polls were rigged, painting Trump’s campaign as more viable than it was thought to be at the time.
The third most prevalent topic concerns Trump taking on the political establishment and the campaign in swing states. This topic describes who Trump and his enemies in the media and establishment are rather than their policy positions. The fourth most common topic was built around hashtags not directly concerning politics, such as #ruinadinnerinonephrase or #makemehateyouinonephrase, which were followed by jokes and other phatic communications that build relationships with people; in other words, a way of establishing rapport and a traditional form of Russian spycraft.36 The fifth topic contains references to the Black Lives Matter movement but points to African American supporters of Trump and extant racial stratification, in addition to blasting Clinton as unable or unwilling to improve the situation. This strategy aimed to delink the identification the African American community had with President Barack Obama as well as with Hillary Clinton’s husband, former President Bill Clinton. None of these topics discuss candidate policy positions or provide false policy-relevant informational claims, but they create an image of Trump as a winner and Clinton as ineffectual, corrupt, and criminal.
Conclusions
The Russian intervention in the 2016 U.S. election was a problem in terms of the divisive politicization of identities rather than a fake news problem in the sense of deceptive informational claims. The bulk of the communications sought to define Trump’s identity and those of his opponents. This follows the identity logic historically practiced by Russian information operations to both sow division and cultivate assets, first on neutral grounds to build relationships and later to move their views on political matters. As the campaign reached its height in October and November 2016, the principal topics defined Clinton in negative and moralizing terms whereas Trump was defined in positive ways as a fighter and potentially a winner. This finding helps explain the logic behind the communication strategy of the Russian intervention because it appeals to, reinforces, and connects social and political identities. The identity logic activates moral attitudes in citizens or receivers because it resonates with them, not necessarily because of exposure to a preponderance of disinformation. Identities can be shifted if existential anxieties cause persons to de-identify with the existing order represented by Clinton and either identify with a hopeless narrative of political reality, leading to nonvoting, or identify with Trump as the figure who can save them.
It is unlikely that this particular manifestation of fake news can be resolved through aggressive fact-checking because the problem lies on an identity order, not an informational one. Furthermore, it is not simply a matter of curbing foreign influence, because much of what the trolls sought to amplify were statements already being made by Americans. Future research is needed to unpack the domestic and foreign drivers of the fake-news problem, how different aspects of foreign and domestic operations reinforce each other, and what this means for the future of news.37 As the news media are no longer dominated by the limited options and professional journalism that characterized the latter half of the twentieth century, the future of news might begin to look more like the past.
Dr. Michael Jensen is a senior research fellow at the Institute for Governance and Policy Analysis at the University of Canberra as well as Editor in Chief of the Journal of Information Technology and Politics. His research has been published by Cambridge University Press, Palgrave, and Routledge as well as the International Journal of Press/Politics, the Journal of Public Policy, and the Journal of Political Marketing. Jensen has given invited presentations at Oxford University, the University of California, National Sun Yat-sen University in Taiwan, the Electoral Studies Center in Taipei, and the Autonomous University of Barcelona.
Notes
1 Craig Silverman, “This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook,” BuzzFeed, 17 November 2016, https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook; Philip N. Howard et al., “Social Media, News and Political Information during the US Election: Was Polarizing Content Concentrated in Swing States?,” arXiv preprint arXiv:1802.03573, 2018; Clint Watts, Fake News and Russian Information Operations (Philadelphia, PA: Foreign Policy Research Institute, 2017), http://www.fpri.org/multimedia/2017/03/fake-news-russian-information-operations/.
2 DNI, “Background to ‘Assessing Russian Activities and Intentions in Recent US Elections’: The Analytic Process and Cyber Incident Attribution,” (Washington, DC: Director of National Intelligence, 6 January 2017), https://www.dni.gov/files/documents/ICA_2017_01.pdf.
3 Anthony Downs, An Economic Theory of Democracy, 1st ed. (Harper and Row, 1957).
4 Lilliana Mason, Uncivil Agreement: How Politics Became Our Identity (Chicago, IL: University of Chicago Press, 2018); Christopher H. Achen and Larry M. Bartels, Democracy for Realists: Why Elections Do Not Produce Responsive Government (Princeton, NJ: Princeton University Press, 2016).
5 Jürgen Habermas, “Reply to My Critics,” in John B. Thompson and David Held, eds., Habermas: Critical Debates (Macmillan, 1982), 263-65.
6 Jürgen Habermas, The Theory of Communicative Action: Reason and the Rationalization of Society, trans. Thomas McCarthy (Beacon Press, 1987).
7 Jennifer L. Hochschild and Katherine Levine Einstein, Do Facts Matter? Information and Misinformation in American Politics (University of Oklahoma Press, 2015); Henrik Paul Bang, Foucault’s Political Challenge: From Hegemony to Truth (Springer, 2016).
8 Ben Cardin, “Putin’s Asymmetric Assault on Democracy in Russia and Europe: Implications for U.S. National Security” (Washington, DC: Senate Committee on Foreign Relations, 10 January 2018), https://www.foreign.senate.gov/press/ranking/release/cardin-releases-report-detailing-two-decades-of-putins-attacks-on-democracy; Clint Watts, “Disinformation: A Primer in Russian Active Measures and Influence Campaigns” (Washington, DC: Senate Select Committee on Intelligence, 30 March 2017).
9 Martin C. Libicki, “The Convergence of Information Warfare,” Strategic Studies Quarterly 11, no. 1 (2017).
10 Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science 359, no. 6380 (9 March 2018): 1146–51, https://doi.org/10.1126/science.aap9559; Howard et al. (2018).
11 Robert Mueller, United States v. Internet Research Agency, Case No. 1:18-cr-00032-DLF (U.S. District Court for the District of Columbia, 16 February 2018).
12 Edson C. Tandoc, Zheng Wei Lim, and Richard Ling, “Defining ‘Fake News,’” Digital Journalism 6, no. 2 (7 February 2018), 137–53, https://doi.org/10.1080/21670811.2017.1360143.
13 W. Lance Bennett and Steven Livingston, “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions,” European Journal of Communication 33, no. 2 (1 April 2018), 124, https://doi.org/10.1177/0267323118760317.
14 Harold A. Innis, Empire and Communications (Toronto: Dundurn Press Ltd., 2007); Si Sheppard, The Partisan Press: A History of Media Bias in the United States (McFarland, 2007).
15 W. Joseph Campbell, Yellow Journalism: Puncturing the Myths, Defining the Legacies (Greenwood Publishing Group, 2001).
16 Michael X. Delli Carpini and Scott Keeter, What Americans Know about Politics and Why It Matters (New Haven: Yale University Press, 1997); Bernard Grofman, Information, Participation, and Choice: An Economic Theory of Democracy in Perspective (Ann Arbor: University of Michigan Press, 1995).
18 Mueller, United States v. Internet Research Agency, 6.
19 Leigh Armistead, Information Operations: Warfare and the Hard Reality of Soft Power (Potomac Books, Inc., 2004), 1.
20 Kevin McCauley, Russian Influence Campaigns against the West: From the Cold War to Putin (North Charleston, SC: CreateSpace, 2016), 11.
21 Manuel Castells, The Power of Identity: The Information Age: Economy, Society, and Culture (John Wiley & Sons, 2010), 6.
22 Anthony Giddens, Modernity and Self-Identity: Self and Society in the Late Modern Age (Stanford University Press, 1991), 53.
23 Todd C. Helmus et al., “Russian Social Media Influence,” International Security and Defense Policy Center (Santa Monica, CA: Rand Corporation, 2018), https://www.rand.org/pubs/research_reports/RR2237.html.
24 Kenneth Burke, A Rhetoric of Motives (Berkeley: University of California Press, 1969), 20–21.
25 Achen and Bartels, Democracy for Realists; Mason, Uncivil Agreement.
27 Daniel Kübler and Hanspeter Kriesi, “How Globalisation and Mediatisation Challenge Our Democracies,” Swiss Political Science Review 23, no. 3 (2017), 231–45, https://doi.org/10.1111/spsr.12265; James R. Clapper and Trey Brown, Facts and Fears: Hard Truths from a Life in Intelligence (New York, NY: Viking, 2018), 267.
28 Clint Watts, Messing with the Enemy: Surviving in a Social Media World of Hackers, Terrorists, Russians, and Fake News (Harper, 2018), 85.
29 Mueller, United States v. Internet Research Agency.
30 Ben Popken, “Twitter Deleted Russian Troll Tweets. So We Published More than 200,000 of Them,” NBC News, 14 February 2018, https://www.nbcnews.com/tech/social-media/now-available-more-200-000-deleted-russian-troll-tweets-n844731.
31 There was a total of 454 user names in the dataset. This reflects both an incomplete survey of all troll accounts and the possibility that the remaining accounts may have been tasked with other programs, as the US is not the only theater where the IRA is engaged.
32 Although tweets are textually limited, LDA has successfully been used by computer scientists to analyze a very similar dataset of tweets to the one used here. Savvas Zannettou et al., “Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and Their Influence on the Web,” arXiv:1801.09288 [cs], 28 January 2018, http://arxiv.org/abs/1801.09288. In the October-November 2016 subset of tweets examined in these models, only 72 of 33,828 tweets were not in English.
34 Space constraints prevent a detailed discussion of the 28 terms that define each topic.
35 The topic model returns words stemmed to their roots, allowing comparability of words in topics independent of grammatical positions.
36 Watts (2018); Alice E. Marwick and Danah Boyd, “I Tweet Honestly, I Tweet Passionately: Twitter Users, Context Collapse, and the Imagined Audience,” New Media & Society 13, no. 1 (1 February 2011), 114–33, https://doi.org/10.1177/1461444810365313.
37 The IRA did not just post on social media; it also organized and funded offline political events (Mueller, United States v. Internet Research Agency).