A New Approach to Assessing the Role of Technology in Spurring and Mitigating Conflict: Evidence From Research and Practice
Ongoing research and discussions examine the role of technology—information communication technologies in particular—in conflict, principally focusing on whether technology is a good or bad thing for peace and social progress. This narrow focus overlooks important considerations about how technology interacts with human nature and surrounding contextual dynamics, and in so doing obscures opportunities to prevent harm and to leverage technology for good. This article explores interdisciplinary research and lessons from violence-prevention efforts in Kenya to propose a new approach for understanding and harnessing technologies to unite rather than divide and to identify and mitigate risks while seizing opportunities that new and emerging technologies may afford.
Recognizing Twitter’s role in Iran’s 2009 post-election protests, a former top U.S. National Security Council advisor nominated Twitter for the Nobel Peace Prize, saying “When traditional journalists were forced to leave [Iran], Twitter became a window for the world to view hope, heroism, and horror...Without Twitter, the people of Iran would not have felt empowered and confident to stand up for freedom and democracy.”1 These remarks coincide with inquiries into the role of information and communication technologies (ICTs) in increasing empathy, connection, access to information, and accountability for human-rights abuses.2 Research examining social media’s role in coordinating and publicizing the Arab Spring protests3 has even led commentators to dub events in Egypt, Iran, and Tunisia as “Twitter revolutions.”4
Recent research, however, has begun connecting new ICTs to increased polarization and group-targeted violence. ICTs have been linked to ethnic cleansing in Myanmar,5 expanding white supremacist movements in the United States,6 and accelerating ISIS recruitment globally.7 Furthermore, at least one study has concluded that “the availability of cell phone coverage significantly and substantially increases the probability of violent conflict.”8 These examples are only a fraction of a broader debate on the role of technology in conflict. In its simplest form, the debate asks, “Do new ICTs increase or reduce the risk of violence? Do they fuel or constrain human conflict?”
As conflict prevention and resolution practitioners focused on the role of communication in intergroup, identity-based violence, we believe the current conversation overlooks key considerations. In focusing on whether ICTs drive conflict or connection, help or hurt democracy, and protect or undermine human rights, the inquiry misses the larger picture of how ICTs interact with broader contextual and human factors—psychological, neurological, political—that are critical to understanding and effectively responding to the connection between ICTs and conflict.
This paper proposes a new approach to examining the role of ICTs in conflict, which combines interdisciplinary insights on intergroup conflict with a practical, on-the-ground perspective of communication and narrative dynamics. We hope to equip practitioners and researchers to better evaluate existing and prepare for new ICTs, to mitigate risks, and to maximize ICTs’ potential to drive peace and positive social change.
To situate our proposed approach, we highlight that the interplay between communication and conflict runs throughout history. We then discuss how different features of ICTs extend this trend, changing how information spreads and interacting with and shaping offline events. After reviewing interdisciplinary research on how ICTs might interact with human nature, we examine local actors’ roles in identifying and responding to ICTs in conflict, specifically focusing on work utilizing SMS to avert election-related violence in Kenya. Last, we consider these findings’ practical implications as we look ahead to new and emerging ICTs.
Communications in Conflict
Communication plays a fundamental role in how humans organize and understand themselves as individuals and within groups: how people seek connection, develop relationships, form communities, recognize social norms, and embrace, internalize, and spread narratives to make sense of the world and humans’ place within it. Through communication, individuals develop and share stories about what it means to be “us” and how to interact with “them.”
Communication and intergroup conflict are thus intimately connected. Recognizing the role of communication in inciting Nazi atrocities, the United Nations’ Convention on the Prevention and Punishment of the Crime of Genocide criminalized “direct and public incitement to commit genocide.”9 Research on the role of communication in violence led to the term “dangerous speech,” speech with a “special capacity to catalyze mass violence.”10 Related frameworks go beyond the communication’s content, examining its interaction with the surrounding context: the channels through which it travels, the speaker’s authority vis-à-vis their audience, and how content taps into existing narratives.11
These dynamics are hardly unique to Facebook, Twitter, or WhatsApp. New communications technologies have long shaped how humans understand and interact with the surrounding world by increasing the speed and distance over which communication travels, democratizing content creation, and enhancing access to information, among other features. As historian Robert Darnton observed, “[t]he marvels of communication technology in the present have produced a false consciousness about the past...[that communication] had nothing of importance to consider before the days of television and the Internet.”12
History is rife with examples of how new ICTs have impacted intergroup relations and even revolution. Adolf Hitler’s chief propagandist identified film as “one of the most modern and far-reaching means of influencing the masses,” and revolutionized the medium to reinforce pro-Nazi narratives.13 While exiled in a Paris suburb, Iranian Ayatollah Khomeini utilized cassette tapes, then cutting-edge, to circumvent censorship and speak directly to his followers.14 And the International Criminal Tribunal for Rwanda noted, “without a firearm, machete, or any physical weapon,” radio—the medium with the widest reach—had contributed to the Rwandan genocide.15
Improvements in communication technologies have also challenged authoritarian regimes. Soviet dissidents used secret photocopiers and fax machines to bypass state regulations and develop and distribute samizdat publications.16 In Nazi Germany, the White Rose resistance group relied on portable typewriters and duplicating machines to circumvent censorship and print anti-Nazi leaflets more quickly than would otherwise have been possible.17
Recent ICTs and Conflict
New ICTs continue to facilitate access, information-sharing, and connection at greater speed and across broader distances. In recognizing this, we do not view new ICTs as upending the importance of offline conditions in influencing on-the-ground events.18 Instead, we view ICTs as one factor in a web of dynamics—online and offline—that can collectively incite or defuse violence.
Social media and messaging apps enable narratives to instantaneously spread across distance while still appearing credible, magnify trigger events, and allow those not physically present to influence on-the-ground events. In Sri Lanka, offline altercations between Buddhist and Muslim Sri Lankans were recorded and uploaded to Facebook and later WhatsApp, fueling anti-Muslim rumors, deadly protests, and revenge attacks throughout the country.19 In Myanmar, through spreading and amplifying rumors that previously circulated offline, Facebook posts and closed messaging groups have played a “determining role” in the ethnic cleansing of the Rohingya.20 ICTs similarly enable ISIS operatives to coach recruits, cultivate offline networks, and coordinate violence with only an online connection to the attackers.21
These same ICTs have also contributed to democratic transitions. Social media networks laid the groundwork for offline collective action throughout the Arab Spring.22 In Moldova, tweets publicizing anti-government protests generated greater attention and international coverage, contributing to new elections that ushered in the country’s first noncommunist government in more than 50 years.23 Pro-democracy forces in Ukraine’s Orange Revolution used online and mobile networks to bypass a self-censoring media, mobilize grassroots support, and coordinate demonstrations.24
ICTs and Human Nature
In addition to interacting with the surrounding context, ICTs tap into human tendencies: the need to belong, susceptibility to norms, sensitivity to threat, and preference for information that confirms existing worldviews. Accordingly, to understand ICTs’ role in conflict, we must also understand how they interact with human nature. Research from cognitive neuroscience, social and behavioral psychology, and communications, among other fields, offers valuable insights that may inform and enhance conflict mitigation and resolution strategies. Consider the interrelated insights below:
Group Membership and Social Norms: Individuals are motivated to join and identify with groups to satisfy needs for belonging and identity.25 Threat or uncertainty—conditions common in conflict settings—can strengthen group identification, increasing the likelihood that people will act to protect their in-group.26 Perceptions of normal or expected behavior for one’s group (social norms) powerfully shape actions.27 Through portraying violence as normal or expected behavior for “good group members” while policing or punishing moderate voices or those advocating against violence, group norms may lead individuals to perpetrate violence even if doing so violates their private beliefs.28
ICTs facilitate the creation of groups across geographies, enabling individuals to transcend local community norms. In broadcasting posts that group members like and share, ICTs such as Facebook provide additional channels for depicting particular viewpoints or behaviors as normal or expected for those (initially) online groups. This is neither inherently good nor bad. It may enable new groups across conflict lines to promote norms espousing peace and tolerance. At the same time, formerly fringe or extremist voices may form new communities and gain a bigger platform to depict violence as expected, even desirable.29
Contact, Empathy Gaps, Dehumanization, and Threat Construction: In exposing individuals to out-groups or in tapping into powerful emotions like fear and perceived threat, technologies facilitating intergroup contact may drive empathy across groups or strengthen intergroup divisions and the related willingness to protect one’s in-group.30 Social media posts depicting day-to-day stories may strengthen intergroup empathy and reduce dehumanization and misconceptions. However, in circulating stories of abuses allegedly perpetrated against one’s in-group, ICTs can spread fear and threat, triggering revenge cycles across geographies. Consider how ISIS exploits anti-Muslim discrimination to encourage attacks globally.31
Misinformation, Cognitive Biases, and Emotions: Human emotions and beliefs inform how people interact with information. Through motivated reasoning, people often reject information that challenges their existing beliefs while seeking out and accepting information that confirms them. Individuals are also more likely to share information—including misinformation—that evokes fear, surprise, or disgust.32 Falsehoods may continue to influence individuals even after they are corrected, because misinformation “sticks.”33
In democratizing content creation and enhancing access to information, ICTs enable greater exposure to new and diverse ideas. This may facilitate the spread of content that challenges prevailing conflict narratives and divisions. However, it may also connect individuals to information that legitimizes existing viewpoints, creating echo chambers that reinforce biases and further polarize.34 In removing traditional media gatekeepers, ICTs also reduce barriers to the spread of fear- and shock-inducing information, regardless of its truth, which individuals are more likely to forward. Such content may drive perceptions of threat and related pre-emptive intergroup violence.
The Power of Local Knowledge
This article emphasizes that, to begin anticipating how a new ICT might impact intergroup dynamics, we must understand the ICT itself, as well as how it interacts with human tendencies. Another crucial ingredient is local, contextual understanding of the interplay between ICTs and group identities, salient narratives, and information flows within a given environment.
As conflict-prevention practitioners, we have seen firsthand how community leaders have used their local awareness and social capital to analyze and counteract the use of ICTs for violence. This awareness can help guard against well-intentioned efforts backfiring, whether by failing to generate local buy-in or by inadvertently reviving historical grievances.35
This article examines local peace activists’ work with Sisi ni Amani-Kenya (SNA-K), a civil society organization,36 in which they analyzed the role of SMS messages in Kenya’s 2007-2008 post-election violence and used that same medium to promote peace during the country’s 2013 elections.37 To better understand how the messages had contributed to violence, the SNA-K team engaged their communities in a context analysis. The analysis revealed how SMS had tapped into long-standing ethnic divisions to spread rumors, create social pressure to participate in violence, accelerate revenge cycles, and facilitate attacks.38
The SNA-K team assessed how the rumors had spread across online and offline mediums, including mobile networks, pamphlets, and in-person interactions. While political speeches and radio had also spread rumors, SMS messages carried unique weight because they came from family and peers, trusted messengers. Mobile networks enabled the rumors to circulate quickly across broad geographies, mobilizing violence almost instantaneously.39
The team drew on this analysis and its social capital to use SMS to prompt Kenyans to reconsider violence. They built a 65,000-person mobile network spanning Kenya, creating the infrastructure for messages emphasizing peace to outpace messages spreading rumors and fear.40 When potential trigger events emerged, the network could immediately send targeted messages warning against misinformation and emphasizing local unity and the consequences of violence.41 The team complemented this online network with offline outreach and civic education events. Subsequent evaluations revealed that the messages helped prevent the spread of rumors and contributed to the elections remaining peaceful.42
Kenya is not an isolated case. Across the examples above, local leaders harnessed their contextual awareness and social capital to address ICTs’ role in conflict, recognizing that ICTs do not exist in a vacuum.
Practical Implications and Next Steps
This article has highlighted how ICTs interact with human tendencies and the surrounding local context to impact conflict. We seek to underscore that, to evaluate and effectively address the role of communication in conflict, we must develop an approach responsive to these underlying factors.
This integrated approach will position us to better anticipate, understand, and respond to new technologies—such as virtual reality (VR), augmented reality, and blockchain—their expansion to broader audiences, and their continued impact on how individuals organize and hopefully cooperate as a society. Indeed, a single technology may interact with surrounding dynamics to fuel violence or drive unity. Consider artificial intelligence’s role in spreading fake and hyper-partisan news and how, using human and contextual insights, it is being programmed to identify and remove that same content.43
We can also look ahead as technologies grow in reach. Anticipating the VR “take off,” we can consider how it will impact conflict and intergroup dynamics. Research is already examining the relationship between VR and empathy.44 How might VR drive empathy across groups to reduce conflict? Alternatively, how might it fuel greater in-group empathy and increased willingness to perpetrate violence to protect one’s own group?45 It is also critical to contextualize the technology: within a community, how might VR interact with information flows, local narratives, and intergroup relations?
If, for each new technology, we can think about how it changes communication and ask how those changes might amplify or constrain violence, we may be better positioned to act proactively, rather than reactively.
Rachel Brown is the Founder and Executive Director of Over Zero, the author of Defusing Hate: A Strategic Communication Guide to Counteract Dangerous Speech, a former Genocide Prevention Fellow at the United States Holocaust Memorial Museum’s Simon-Skjodt Center for the Prevention of Genocide, and the Founder and former CEO of Sisi ni Amani Kenya.
Laura Livingston is the Programs Manager at Over Zero. She previously advised human rights, transitional justice, and rule of law programming in the Balkans, Sri Lanka, and East Africa.
Notes
1 Mark Pfeifle, “A Nobel Peace Prize for Twitter?,” The Christian Science Monitor, 6 July 2009, https://www.csmonitor.com/Commentary/Opinion/2009/0706/p09s02-coop.html.
2 Laura Ralston, “Can the Internet Solve Conflict?,” (Washington, DC: The World Bank, October 2014), http://blogs.worldbank.org/futuredevelopment/can-internet-solve-conflict; Jennifer Carolan, “Empathy technologies like VR, AR, and social media can transform education,” TechCrunch, 22 April 2018, https://techcrunch.com/2018/04/22/empathy-technologies-like-vr-ar-and-social-media-can-transform-education/.
3 Interview with Wael Ghonim, CNN, 11 February 2011, http://transcripts.cnn.com/TRANSCRIPTS/1102/11/bn.02.html; Muzammil M. Hussain and Philip N. Howard, “What Best Explains Successful Protest Cascades? ICTs and Fuzzy Causes of the Arab Spring,” International Studies Review 15 (2013), 48–66, https://pdfs.semanticscholar.org/8086/e350af546f8b587c1f39a71eb8d4a6fc3ac8.pdf.
4 Pew Research Center, “Iran and the ‘Twitter revolution,’” 25 June 2009, http://www.journalism.org/2009/06/25/iran-and-twitter-revolution/; Peter Beaumont, “The truth about Twitter, Facebook and the uprisings in the Arab world,” The Guardian, 25 February 2011, https://www.theguardian.com/world/2011/feb/25/twitter-facebook-uprisings-arab-libya.
5 Marzuki Darusman, 12 March 2018, http://www.ohchr.org/EN/HRBodies/HRC/Pages/NewsDetail.aspx?NewsID=22798&LangID=E; Euan McKirdy, “When Facebook becomes ‘the beast,’” CNN, 6 April 2018, https://www.cnn.com/2018/04/06/asia/myanmar-facebook-social-media-genocide-intl/index.html.
6 Jessie Daniels, “The algorithmic rise of the ‘alt-right,’” Contexts, Winter 2018, https://contexts.org/articles/the-algorithmic-rise-of-the-alt-right/; Max Fisher and Amanda Taub, “How Everyday Social Media Users Become Real-World Extremists,” New York Times, 25 April 2018, https://www.nytimes.com/2018/04/25/world/asia/facebook-extremism.html; Vann R. Newkirk II, “White Supremacy is the Achilles Heel of American Democracy,” The Atlantic, 17 April 2018, https://www.theatlantic.com/politics/archive/2018/04/white-supremacy-is-still-americas-biggest-security-threat/557591/.
7 J.M. Berger, “Tailored Online Interventions: The Islamic State’s Recruitment Strategy,” CTC Sentinel 8, no. 10 (October 2015), https://ctc.usma.edu/app/uploads/2015/10/CTCSentinel-Vol8Iss1036.pdf; Rukmini Callimachi, “Not ‘Lone Wolves’ After All: How ISIS Guides World’s Terror Plots from Afar,” New York Times, 4 February 2017, https://www.nytimes.com/2017/02/04/world/asia/isis-messaging-app-terror-plot.html; Emerson Brooking and P.W. Singer, “War Goes Viral,” The Atlantic, November 2016, https://www.theatlantic.com/magazine/archive/2016/11/war-goes-viral/501125/.
8 Jan H. Pierskalla and Florian M. Hollenbach, “Technology and Collective Action: The Effect of Cell Phone Coverage on Political Violence in Africa,” American Political Science Review 107, no. 2 (2013), 207–24.
9 Convention on the Prevention and Punishment of the Crime of Genocide, 9 December 1948, 102 Stat. 3045, 78 U.N.T.S. 277.
10 Susan Benesch, “Countering Dangerous Speech: New Ideas for Genocide Prevention,” (Washington, DC: U.S. Holocaust Memorial Museum, 2014), https://www.ushmm.org/m/pdfs/20140212-benesch-countering-dangerous-speech.pdf; Jonathan Leader Maynard and Susan Benesch, “Dangerous Speech and Dangerous Ideology: An Integrated Model for Monitoring and Prevention,” Genocide Studies and Prevention: An International Journal 9, no. 3 (2016), 74, 78, 87.
11 Jonathan Leader Maynard, “Rethinking the Role of Ideology in Mass Atrocities,” Terrorism and Political Violence 26, no. 5 (2014), 821–41; Maynard and Benesch (2016), 74, 78, 87.
12 Malcolm Gladwell, “Small Change: Why the Revolution Will not be Tweeted,” The New Yorker, 4 October 2010, https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell.
13 Jay W. Baird, “From Berlin to Neubabelsberg: Nazi Film Propaganda and Hitler Youth Quex,” Journal of Contemporary History 18, no. 3 (July 1983), 495; Leonard W. Doob, “Goebbels’ Principles of Propaganda,” Public Opinion Quarterly 14, no. 3 (1950), 428.
14 Eric Schmidt and Jared Cohen, “The Digital Disruption,” Foreign Affairs (November/December 2010), http://cddrl.fsi.stanford.edu/sites/default/files/schmidt_the_digital_disruption.pdf; John Rahaghi, “New Tools, Old Goals: Comparing the Role of Technology in the 1979 Iranian Revolution and the 2009 Green Movement,” Journal of Information Policy 2 (2012), 151–82.
15 Prosecutor v. Nahimana, Case No. ICTR 99-52-T, Judgment and Sentence, paragraphs 972, 1099 (3 December 2003).
17 Margie Burns, “Sophie Scholl and the White Rose,” (The International Raoul Wallenberg Foundation), http://www.raoulwallenberg.net/holocaust/articles-20/sophie-scholl-white-rose/.
19 Michael Safi, “Sri Lanka accuses Facebook over hate speech after deadly riots,” The Guardian, 14 March 2018, https://www.theguardian.com/world/2018/mar/14/facebook-accused-by-sri-lanka-of-failing-to-control-hate-speech; Fisher and Taub (2018).
20 Tom Miles, “UN investigators cite Facebook role in Myanmar crisis,” Reuters, 12 March 2018, https://www.reuters.com/article/us-myanmar-rohingya-facebook/u-n-investigators-cite-facebook-role-in-myanmar-crisis-idUSKCN1GO2PN; Sheera Frenkel, “This is What Happens When Millions of People Suddenly Get the Internet,” BuzzFeed News, 20 November 2016, https://www.buzzfeed.com/sheerafrenkel/fake-news-spreads-trump-around-the-world?utm_term=.xjwzoJMXM#.khYMby464.
21 Callimachi (2017); Bridget Moreng, “ISIS’ Virtual Puppeteers: How they Recruit and Train ‘Lone Wolves,’” Foreign Affairs, 21 September 2016, https://www.foreignaffairs.com/articles/2016-09-21/isis-virtual-puppeteers.
23 Some commentators note there were few Twitter accounts within the country, suggesting that the government engineered the protests. Anne Applebaum, “In Moldova, The Twitter Revolution that Wasn’t,” The Washington Post, 21 April 2009, http://www.washingtonpost.com/wp-dyn/content/article/2009/04/20/AR2009042002817.html. Others highlight that the number of Twitter users is irrelevant: “On a good network, you don’t need to have the maximum number of connections to be powerful...” Evgeny Morozov, “Moldova’s Twitter revolution is NOT a myth,” Foreign Policy, 10 April 2009, https://foreignpolicy.com/2009/04/10/moldovas-twitter-revolution-is-not-a-myth/.
24 Joshua Goldstein, “The Role of Digital Networked Technologies in the Ukrainian Orange Revolution,” Berkman Klein Research Publication (2007), 8–9.
25 Rebecca Littman and Elizabeth Levy Paluck, “The Cycle of Violence: Understanding Individual Participation in Collective Violence,” Advances in Political Psychology 36 (2015), 83.
27 Margaret E. Tankard and Elizabeth Levy Paluck, “Norm Perception as a Vehicle for Social Change,” Social Issues and Policy Review 10, no. 1 (2016), 184, 189.
28 Ravi Bhavnani, “Ethnic Norms and Interethnic Violence: Accounting for Mass Participation in the Rwandan Genocide,” Journal of Peace Research 43, no. 6 (2006), 651–69.
30 Mina Cikara, Emile Bruneau, and Rebecca Saxe, “Us and Them: Intergroup Failures of Empathy,” Annenberg School for Communication Department Papers (2011), 11–12.
31 Samantha Mahood and Halim Rane, “Islamist narratives in ISIS recruitment propaganda,” Journal of International Communication 23 (December 2016); Simon Cottee, “Why It’s So Hard to Stop ISIS Propaganda,” The Atlantic, 2 March 2015, https://www.theatlantic.com/international/archive/2015/03/why-its-so-hard-to-stop-isis-propaganda/386216/.
32 Soroush Vosoughi, Deb Roy, and Sinan Aral, “The spread of true and false news online,” Science, 9 March 2018; Steve Lohr, “It’s True: False News Spreads Faster and Wider. And Humans are to Blame,” The New York Times, 8 March 2018, https://www.nytimes.com/2018/03/08/technology/twitter-fake-news-research.html.
33 Ibid.; Gallup and Knight Foundation, “Assessing the Effect of News Source Ratings on News Content” (2018), 15–16, https://kf-site-production.s3.amazonaws.com/publications/pdfs/000/000/256/original/KnightFoundation_SRR_Client_Report_061518v4_ab.pdf.
34 Fisher and Taub (2018); Joshua Bleiberg and Darrell M. West, “Political polarization on Facebook,” The Brookings Institution, 13 May 2015, https://www.brookings.edu/blog/techtank/2015/05/13/political-polarization-on-facebook/; John Bohannon, “Is Facebook keeping you in a political bubble?,” Science (May 2015).
35 See, e.g., Ena Dion and Philippe Leroux-Martin, “When Reforms, Meant to Ease Violence, Backfire Instead,” (Washington, DC: United States Institute of Peace, 8 August 2017), https://www.usip.org/publications/2017/08/when-reforms-meant-ease-violence-backfire-instead.
36 Rachel Brown, one of this article’s authors, founded and directed SNA-K, providing unique insights into its approach and programming.
37 In 2007-2008, a contested Kenyan presidential election resulted in widespread violence. Support for the candidates and the subsequent conflict fell along ethnic lines. Human Rights Watch, “Ballots to Bullets: Organized Political Violence and Kenya’s Crisis of Governance,” March 2008, https://www.hrw.org/report/2008/03/16/ballots-bullets/organized-political-violence-and-kenyas-crisis-governance.
38 For instance: “Fellow Kenyans, the Kikuyus have stolen our children’s future...We must deal with them...[through] violence...” “No more innocent Kikuyu blood will be shed. We will slaughter them ... For justice, compile a list of Luos and Kalus...” Ofeibea Quist-Arcton, “Text Messages Used to Incite Violence in Kenya,” NPR, 20 February 2008, https://www.npr.org/templates/story/story.php?storyId=19188853.
39 “Programming for peace: Sisi Ni Amani Kenya and the 2013 elections,” GCS Occasional Paper Series on ICTs, Statebuilding, and Peacebuilding in Africa (December 2014), https://global.asc.upenn.edu/app/uploads/2014/12/SisiNiAmaniReport.pdf.
41 For instance, “We the people of Dandora let us stand strong to promote peace in our community...” A subscriber replied: “we were planning to avenge the death of our friends....after we received the message...[we] thought that revenge would not bring our friends back.” Neelam Verjee, “Will Kenya’s elections transform text messages from deadly weapon to peace offering,” Quartz, 2 March 2013, https://qz.com/58510/will-kenyas-elections-transform-the-text-message-from-deadly-weapon-to-peace-offering/.
42 “Programming for Peace...” (2014).
43 Anjana Susarla, “How Artificial Intelligence can detect—and create—fake news,” The Conversation, 3 May 2018, https://theconversation.com/how-artificial-intelligence-can-detect-and-create-fake-news-95404; Bernard Marr, “Fake News and How Artificial Intelligence Tools Can Help,” Forbes, 16 May 2018, https://www.forbes.com/sites/bernardmarr/2018/05/16/fake-news-and-how-artificial-intelligence-tools-can-help/#37fe2150271d.
44 Nicola S. Schutte and Emma J. Stilinović, “Facilitating empathy through virtual reality,” Motivation and Emotion 41, no. 6; Béatrice S. Hasler et al., “Virtual Peacemakers: Mimicry Increases Empathy in Simulated Contact with Virtual Outgroup Members,” Cyberpsychology, Behavior, and Social Networking 17, no. 12 (2014); Chris Milk, “How virtual reality can create the ultimate empathy machine,” TED Talk, March 2015, https://www.ted.com/talks/chris_milk_how_virtual_reality_can_create_the_ultimate_empathy_machine.
45 Emile Bruneau, Mina Cikara, and Rebecca Saxe, “Parochial Empathy Predicts Reduced Altruism and the Endorsement of Passive Harm,” Social Psychological and Personality Science (June 2017), 934–42.