Social media has emerged as a powerful tool for political engagement and expression. However, state actors are increasingly leveraging these platforms to spread computational propaganda and disinformation during critical moments of public life. These actions serve to nudge public opinion, set political or media agendas, censor freedom of speech, or control the flow of information online. Drawing on data collected from the Computational Propaganda Project’s 2017 investigation into the global organization of social-media manipulation, we examine how governments and political parties around the world are using social media to shape public attitudes, opinions, and discourses at home and abroad. We demonstrate the global nature of this phenomenon, comparatively assessing the organizational capacity and form these actors assume, and discuss the consequences for the future of power and democracy.


The manipulation of public opinion via social media platforms has emerged as a critical issue facing contemporary digital society. There are many dimensions to this problem: junk news is spreading like wildfire via social media platforms during key episodes in public life; bots are amplifying opinions at the fringe of the political spectrum; nationalistic trolls are harassing individuals to suppress speech online; the financial model that supports high-quality news and journalism is facing increasing competition from social media advertising; strategic data leaks targeting political campaigns are undermining the credibility of world leaders and democratic institutions; and the lack of transparency around how social media firms operate is making regulatory interventions difficult. Evidence is beginning to illuminate the global impact of the darker side of political communication, including disinformation campaigns, negative campaigning, and information operations.

Governments and political parties around the world are spending significant resources to generate content, direct public attention, and manipulate the opinion of foreign and domestic audiences via social media.1 These “cyber troops” are state-sponsored organizations tasked with conducting disinformation campaigns on the internet. Disinformation takes many forms,2 but cyber-troop activity involves the purposeful distribution of fake, misleading, fabricated, or manipulated content. These actors rely on “computational propaganda”—the use of automation, algorithms, and big data analytics—to influence or deceive social-media users.3 Unlike lone-wolf coders, hacker collectives, or nonstate actors who also use social media to express speech or achieve political goals, cyber troops are publicly funded and often highly coordinated government actors who use social media to spread disinformation and attempt to generate false consensus. These strategies erode the quality of democracy by undermining trust in leaders, media, and institutions, in addition to serving as another tool to restrict freedom of expression in repressive regimes.

This paper discusses the global organization of social-media disinformation and compares government actors across regime types. Drawing on data collected from the Computational Propaganda Project’s 2017 investigation into the global organization of social-media manipulation, we examine how governments and political parties around the world are using social media as a tool of information warfare on both domestic and foreign audiences. We highlight the broad trends of this phenomenon, drawing conclusions about the form, organization, and capacity these institutions assume.

Power and Counter-power: Technology and the State

The study of technology and information has been central to the domain of international studies. Scholars have dedicated significant resources to understanding how technology disrupts power and politics, enhances or constrains human rights, reorganizes political interactions and institutions, and elevates cyberspace as a new domain of conflict. Much of the literature assumes that low barriers to entry and reduced communication costs afforded by technology have altered the balance of power in favor of minor political players. This assumption informs a wave of research that has explored issues such as terrorist recruitment and coordination online,4 the disruptive power of marginal social movements,5 and independent coder and hacker collectives performing discursive political actions.6 Indeed, many singular and small-scale actors have achieved impressive political gains by using the internet to catch authoritarian elites off-guard during the Arab Spring.7

At a high level, scholarship on technology, power, and counter-power has tended to focus on 1) the growing role and capacity of non-state actors, 2) cases that are regionally specific or limited in scope, or 3) cases whose success was serendipitous. While there is a growing body of important research on these modular phenomena, there is no reason to expect that the transfer of norms of technology use involves only social movements and democracy advocates.8 Many kinds of political actors—across geographic borders and regime types—can learn how to adapt technology to implement digital solutions in support of their own interests or goals.9 Indeed, many authoritarian regimes have successfully leveraged the internet and social-media technologies to extend control and censorship over information and expression.10 In addition, states—sometimes viewed as disadvantaged by the shift in communication power—can learn to exploit social media architecture to reassert control and sovereignty in cyberspace.

With increasing evidence of Russian involvement in the United Kingdom’s Brexit Referendum and interference in the 2016 United States election, social-media manipulation is clearly a powerful tool for political influence.11 Previously, most public concern and academic inquiries into cyber power had been preoccupied with the “hard power” capabilities that affect both the digital and real-world domains, such as cybercrime and data theft, attacks that damage critical infrastructure, or online surveillance.12 During the past five years, international affairs have become saturated with examples of governments leveraging social media to manipulate public opinion, assigning personnel and financial resources to disinformation and propaganda campaigns online. These campaigns tend to employ what scholars have referred to as “soft power” and persuasion,13 framing and agenda setting,14 ideological hegemony,15 symbolic power,16 or sharp power to achieve desired outcomes.17 This emerging domain of inquiry should be central to the field of international studies, as it provides an opportunity to expand and refine the concept and exercise of cyber power, and acts as a countervailing example of state power in the digital age.

The Organization of Social Media Manipulation

This section draws on the Computational Propaganda Project’s 2017 investigation into state-sponsored social-media manipulation, which examined the political economy and organizational behavior of global cyber troops in 28 countries. These countries included: Argentina, Australia, Azerbaijan, Bahrain, Brazil, China, Czech Republic, Ecuador, Germany, India, Iran, Israel, Mexico, North Korea, Philippines, Poland, Russia, Saudi Arabia, Serbia, South Korea, Syria, Taiwan, Turkey, Ukraine, the United Kingdom, the United States, Venezuela, and Vietnam. The inventory captured various actors across many regime types that used disinformation campaigns on social media in an attempt to influence public opinion. It relied on open-source information, such as news reports, academic and think tank studies, and publicly available government documents, combined with expert consultations, to publish a country-by-country report of government-sponsored social-media disinformation campaigns worldwide.18

While national contexts are always important to consider, we suggest it is worth trying to generalize about the experience of organized disinformation campaigns by regime type to develop a broad, comparative understanding of this phenomenon. Table 1 highlights four key trends in the capacity, organization, form, and targets that disinformation campaigns take on across regime types.

Regime Type (countries studied) | Modal Actors | Level of Formal Organization | Level of Capacity (average number of tools) | Modal Targets
Democracy (Argentina, Australia, Brazil, Czech Republic, Ecuador, Germany, India, Israel, Mexico, the Philippines, Poland, Serbia, South Korea, Taiwan, the United Kingdom, the United States) | Political Party | Medium | 4.63 | Domestic
Authoritarian (Azerbaijan, Bahrain, China, Iran, North Korea, Russia, Saudi Arabia, Turkey, Venezuela, Vietnam) | Government | High | 4.4 | Domestic
Crisis State (Ukraine, Syria) | Government | Low | 3.5 | Domestic

Table 1: A Comparative Redux of Government Capacity for Social Media Manipulation Around the World

Source: Authors, calculated based on data collected from Bradshaw and Howard (2017).

Modal Actors

The first column of results, modal actor types, presents the most common actor type active in each regime category. For each country, we sought evidence that government agencies and political parties were employing computational propaganda, either through their own capacities or working alongside civil society groups, private citizens, or independent contractors. Based on the data we collected, we determined the modal actor for each regime type by counting how frequently each actor type was identified across the cases in that category.
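
To make this calculation concrete, the following sketch shows one way a modal actor could be derived from case-level evidence. The data, actor labels, and field names are hypothetical illustrations, not the project's actual coding scheme.

from collections import Counter

# Hypothetical case-level evidence: each entry is one actor type identified
# in one country case, grouped here by regime type (illustrative data only).
evidence_by_regime = {
    "democracy": ["political party", "political party", "government", "civil society"],
    "authoritarian": ["government", "government", "private contractor"],
    "crisis state": ["government", "government"],
}

def modal_actor(actor_types):
    # The mode is simply the most frequently observed actor type.
    return Counter(actor_types).most_common(1)[0][0]

for regime, actors in evidence_by_regime.items():
    print(regime, "->", modal_actor(actors))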

In authoritarian regimes, the modal actor types are government ministries. These include cyber troops who work for Vietnam’s Ministry of Education and the Internet Research Agency in Russia. Most of the authoritarian states in our sample (Bahrain, China, Iran, North Korea, Saudi Arabia, Venezuela, and Vietnam) had a government organization responsible for disinformation campaigns. Often, these organizations were part of a larger cybersecurity team tasked with securing cyber infrastructure and content, such as Bahrain’s National Cyber Crime Unit or Iran’s Supreme Council of Cyberspace. This finding fits into the broader work on censorship and repression by authoritarian regimes that leverage technology to exert control over cyberspace.19

In contrast, the modal actor types in democracies are political parties. We found evidence of political parties using computational propaganda in 12 of the 16 democracies in our sample (Argentina, Australia, Brazil, Germany, India, Mexico, the Philippines, Poland, Serbia, Taiwan, the United Kingdom, and the United States). These actors would often target domestic audiences during elections or other important political events, such as referenda.

Level of Formal Organization

The second column presents the level of formality among cyber troop teams. Here, we collected evidence on the organizational behavior of cyber troops, including coordination across teams, clear levels of hierarchy, and reward structures (such as evidence of performance bonuses or scholarships). We used a simple calculation to determine the level of formal organization, summing the evidence of each variable for each country and averaging across the regime type. This column is reported on a low-medium-high scale: a score of 1-3 represents a low level of organization, 3-7 is medium, and 7 or above is high.
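
As a rough illustration of this scoring, the sketch below sums binary evidence of organizational features for each country and averages across a regime type. The feature names, country rows, and the treatment of the band boundaries are assumptions made for illustration only.

# Hypothetical country-level coding: 1 if evidence of the feature was found,
# 0 otherwise. Feature names are illustrative stand-ins for the variables
# described in the text (coordination, hierarchy, reward structures, etc.).
country_codings = [
    {"coordination": 1, "hierarchy": 1, "rewards": 1},
    {"coordination": 1, "hierarchy": 0, "rewards": 0},
]

def organization_score(codings):
    # Sum the evidence for each country, then average across the regime type.
    totals = [sum(row.values()) for row in codings]
    return sum(totals) / len(totals)

def organization_band(score):
    # Bands follow the low/medium/high scale described above; treating 3 as
    # low and 7 as high is an assumption, since the stated ranges overlap.
    if score <= 3:
        return "low"
    if score < 7:
        return "medium"
    return "high"

score = organization_score(country_codings)
print(score, organization_band(score))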

Authoritarian regimes have the highest level of formal organization, where teams often work in a structured environment with managers and reporting structures. Cyber troops are assigned daily or weekly tasks, such as making a certain number of posts or friending real people. Staff will also receive formal training. In some cases, organizations award scholarships and other forms of recognition to encourage more online activity. This is unsurprising given the amount of control such regimes already exert over their populations.

Democracies have a slightly lower degree of formal organization. This is in part because teams that work with political parties often form around the time of an election and dissolve when the campaign has ended. When coordination does occur, it is usually around military interventions that use tools of computational propaganda to combat terrorism or counter extremism online. These psychological operations, often carried out in a military setting, are more formally organized with major investments made in research and development.

Level of Capacity

The third column presents the capacity of cyber troop teams to conduct disinformation campaigns online. Different teams use different tools and techniques to manipulate public opinion, such as political bots, content creation, targeted advertisements, fake personas, and trolling or harassment. We collected evidence of the existence of these capacities in each country, then calculated the average number of tools found per country within each regime type to determine the level of capacity.
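
A minimal sketch of this calculation, again with invented data, averages the number of distinct tools observed per country within a regime type. The tool labels follow the text, but the country inventories themselves are hypothetical.

# Hypothetical inventories of the tools observed in each country of one
# regime type; the labels mirror those named in the text.
tool_inventories = {
    "country_a": {"bots", "content creation", "targeted ads", "trolling"},
    "country_b": {"bots", "fake personas", "content creation"},
}

def capacity_score(inventories):
    # Average number of distinct tools observed per country.
    counts = [len(tools) for tools in inventories.values()]
    return sum(counts) / len(counts)

print(round(capacity_score(tool_inventories), 2))  # 3.5 for this toy data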

Democracies have the highest level of capacity to conduct disinformation campaigns. This is unsurprising. Much of the research and innovation in this area began in democratic military settings, as defense organizations invested resources into understanding how ideas go viral on social media. Today, however, most of the innovation in tools and techniques occurs around election cycles, with political parties and strategic communication firms driving it. A common tool of democracies is the use of political bots, which amplify follower counts on social media and ensure certain stories or hashtags begin trending.20 In Argentina, Australia, Brazil, Ecuador, Germany, Mexico, the Philippines, Taiwan, the United Kingdom, and the United States, we found evidence of political parties using political bots to distort partisan conversations by generating high follower counts or amplifying certain hashtags or narratives over others.

Although they are highly coordinated, authoritarian regimes have a slightly lower level of capacity for conducting disinformation campaigns. While there are a few authoritarian regimes that have invested significant funds and developed sophisticated tools for disseminating disinformation, most authoritarian regimes rely on blunt tools to silence freedom of speech, including trolling, harassment, and targeting journalists or political dissidents with hate speech or threats. Indeed, trolling journalists has been recognized as a new and widespread threat to freedom of the press and freedom of expression around the world.21 In Azerbaijan, Bahrain, Russia, and Turkey, we found evidence of state-sponsored trolling behavior.

Modal Targets

Finally, the modal targets column presents the most common target of coordinated disinformation campaigns. We collected evidence of various incidents where cyber troops were used to shape the public debate. Some incidents focused on domestic audiences, such as the use of computational propaganda as a tool of social control and censorship. Other incidents focused on foreign adversaries, such as foreign influence operations. We then organized this information by regime type to calculate the modal target of each.
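
This mirrors the modal-actor calculation above: incidents are labelled by the audience they targeted, and the most common label per regime type is taken as the mode. The incident data in the sketch below are invented for illustration.

from collections import Counter

# Hypothetical incident-level coding: each incident is labelled "domestic"
# or "foreign" according to the audience it targeted.
incidents_by_regime = {
    "democracy": ["domestic", "domestic", "foreign"],
    "authoritarian": ["domestic", "domestic", "domestic", "foreign"],
    "crisis state": ["domestic", "domestic"],
}

for regime, targets in incidents_by_regime.items():
    modal_target = Counter(targets).most_common(1)[0][0]
    print(regime, "->", modal_target)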

Across all regime types, the modal target was domestic audiences. In democracies, this tended to manifest as political parties using disinformation campaigns during elections. In authoritarian regimes, these campaigns act as another tool of censorship and repression. In only a few instances did we find cyber troops using disinformation campaigns on foreign audiences. Examples here include Russia targeting the United States as well as Baltic and European countries, and foreign operations between China and Taiwan, North and South Korea, and Israel and Palestine.

Limitations

While the findings of this paper serve to inform a broader understanding of the global organization of social media manipulation, there are several limitations to this methodology that should be considered. First, data about disinformation campaigns are spotty at best. Many of these activities occur in secretive military contexts, or behind the proprietary walls of private actors. Thus, painting a complete picture of government actors’ activities online is extremely difficult, and there will be gaps in the data and cases collected. The cases we identified are by no means an exhaustive account of the extent to which governments and political parties use disinformation campaigns. Nevertheless, inventorying them is an important first step toward a broader understanding of this phenomenon and its implications for democracy.

Second, the data collected for the analysis were based on open sources in the English language. While expert consultations were conducted to ensure the accuracy of data collected in non-English country-specific contexts, we only consulted individuals once we had identified a case of disinformation. Thus, there may be many other cases that were not captured because of this language limitation.

Finally, we used very simple measures to calculate the modal actor types, level of formal organization, level of capacity, and modal targets. Although a more rigorous model could be adopted with more data points, this simple model demonstrates a few important differences that are already emerging between regime types and in how different actors use social media to manipulate public opinion. These findings establish a baseline understanding of this phenomenon and should encourage further research on the topic.

Conclusion

Cyber troops invest significant funds and resources in an attempt to sway public opinion over social media. Increasingly, governments and political parties around the world are investing in the tools and techniques of computational propaganda in order to shape the outcomes of elections,22 disrupt diplomatic efforts,23 and undermine peacebuilding efforts.24 It is valuable to know this because it means investigators of such phenomena have bigger objects and subjects of study. But it also means that even the most traditional ways of analyzing the causes and consequences of modern peace, conflict, trade, diplomacy, and a myriad of other international processes must consider that political actors have a significant new tool for political communication and for disrupting the political signals of rival actors.

By taking a global perspective on computational propaganda and the actors involved in spreading disinformation on social media, we can identify emerging trends and track the evolution of this phenomenon over time. It is clear that more state actors are attempting to exploit social media to exercise power in the digital age. Between 2017 and 2018, the size of our sample grew as more political parties and governments began to experiment with computational propaganda.25 Domestic and international affairs are now replete with examples of state actors using and abusing social media to achieve political goals. These examples demonstrate the changing nature of cyber power and the addition of computational propaganda to the arsenal of cyberwarfare. As innovation continues in areas such as artificial intelligence, machine learning, the Internet of Things, and big-data analytics, the nature and strategy of computational propaganda will also evolve. Scholarship must understand how power is assumed and exercised through computational propaganda, and the consequences of this phenomenon for society and democracy.

 

Philip N. Howard is a statutory Professor of Internet Studies at the Oxford Internet Institute and Balliol College at the University of Oxford. He has courtesy appointments as a professor at the University of Washington’s Department of Communication and as a fellow at Columbia University’s Tow Center for Digital Journalism.

Samantha Bradshaw is a DPhil candidate at the Oxford Internet Institute, a Researcher on the Computational Propaganda Project, and a Senior Fellow at the Canadian International Council. Prior to joining the COMPROP team, she worked at the Centre for International Governance Innovation in Waterloo, Canada, facilitating the Global Commission on Internet Governance. Samantha holds an M.A. in global governance from the Balsillie School of International Affairs and a joint honors B.A. in political science and legal studies from the University of Waterloo.

 

NOTES

1 Samantha Bradshaw and Philip N. Howard, “Troops, Trolls and Troublemakers: A Global Inventory of Social Media Manipulation,” Computational Propaganda Project Working Paper (2017). Samantha Bradshaw and Philip N. Howard, “Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation,” Computational Propaganda Working Paper (2018).

2 Claire Wardle, “Fake News, It’s Complicated” (2017), https://firstdraftnews.org/fake-news-complicated/

3 Samuel C. Woolley and Philip N. Howard, “Political Communication, Computational Propaganda, and Autonomous Agents: Introduction,” International Journal of Communication 10 (2016), 4882–90.

4 Rose Bernard, “These are not the terrorist groups you’re looking for: an assessment of the cyber capabilities of the Islamic State,” Journal of Cyber Policy 2, no. 2 (2017), 255–65.

5 Taylor Owen, “Disruptive Power: The Crisis of the State in the Digital Age,” Oxford Studies in Digital Politics (Oxford: Oxford University Press, 2015).

6 Jessica L. Beyer, “Expect Us: Online Communities and Political Mobilization,” (Oxford: Oxford University Press, 2014); Gabriella Coleman, “Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous,” (London: Verso, 2014).

7 Philip N. Howard and Muzammil M. Hussain, “Democracy’s Fourth Wave? Digital Media and the Arab Spring” (Oxford: Oxford University Press, 2013); Helen Margetts et al., “Political Turbulence: How Social Media Shape Collective Action” (Princeton: Princeton University Press, 2015).

8 Mark R. Beissinger, “Structure and Example in Modular Political Phenomena: The Diffusion of Bulldozer/Rose/Orange/Tulip Revolutions,” Perspectives on Politics 5, no. 2 (2007), 259–76; Philip N. Howard, “The Digital Origins of Dictatorship and Democracy” (2010).

9 William Dobson, The Dictator’s Learning Curve: Tyranny and Democracy in the Modern World (London: Vintage Press, 2012).

10 Rebecca MacKinnon, “China’s Networked Authoritarianism,” Journal of Democracy 22, no. 2 (2011), 32–46; K. E. Pearce and S. Kendzior, “Networked Authoritarianism and Social Media in Azerbaijan,” Journal of Communication 62, no. 2 (2012), 283–98.

11 United States Department of Justice, United States of America v. Internet Research Agency (18 U.S.C. §§ 2, 371, 1349, 1028A), 2018.

12 Ronald Deibert, Black Code: Surveillance, Privacy and the Dark Side of the Internet (Toronto: Signal, 2013); K. Zetter, Countdown to Zero Day (New York: Crown Publishers, 2014); J. Healey, Cyber Warfare in the 21st Century: Threats, Challenges and Opportunities (Washington, DC: House Committee on Armed Services, 2017).

13 Joseph Nye, “Soft Power,” Foreign Policy 80 (1990), 153–71; Joseph Nye, Soft Power: The Means to Success in World Politics (New York: Public Affairs, 2004).

14 P. Bachrach and M. S. Baratz, “Decisions and Nondecisions: An Analytical Framework,” The American Political Science Review 57, no. 3 (1963), 632–42.

15 S. Lukes, Power: A Radical View (London: Macmillan, 1974).

16 P. Bourdieu, “Symbolic Power,” Critique of Anthropology 4, nos. 13–14 (1979), 77–85.

17 C. Walker and J. Ludwig, “The Meaning of Sharp Power,” Foreign Affairs (November 2017).

18 Bradshaw and Howard (2017).

19 Ronald Deibert et al., Access Denied (Cambridge: MIT Press, 2008); Rebecca MacKinnon, Consent of the Networked: The Worldwide Struggle for Internet Freedom (Basic Books, 2012). 

20 Samuel Woolley and Philip N. Howard, “Computational Propaganda Worldwide,” Computational Propaganda Working Paper Series (Oxford: Oxford Internet Institute, 2017), Executive Summary.

21 Reporters Without Borders, “Online Harassment of Journalists: The Trolls Attack” (2018), https://rsf.org/en/news/rsf-publishes-report-online-harassment-journalists.

22 Philip N. Howard et al., “Social Media, News and Political Information During the US Election: Was Polarizing Content Concentrated in Swing States?,” Computational Propaganda Project Data Memo (28 September 2017).

23 Shawn Powers and Markos Kounalakis, “Can Public Diplomacy Survive the Internet? Bots, Echo Chambers and Disinformation” (Washington, DC: U.S. State Department Advisory Commission on Public Diplomacy, May 2017).

24 Aastha Nigam et al., “Harvesting Social Signals to Inform Peace Processes Implementation and Monitoring,” Big Data 5, no. 4 (2017), 337–55.

25 Bradshaw and Howard (2018).