A blog about social media, datafication and development
Facebook meets Myanmar: How the Digital Revolution Incites Violence

What’s the headline?

This post reflects on how Facebook – as a gateway for the spread of disinformation – incites violence and political division, analyzed within the context of Myanmar’s political history and a recent boom in Internet activity driven by the tech giant’s expansion tactics.

How do people in emerging democracies with a history of media censorship, authoritarian rule and ethnic conflict respond to suddenly getting access to a social network like Facebook? What is the strategy behind such conglomerate expansion? And what roles do datafication and digital divide play in this tale?

 

What’s the backstory?

An increasing number of academics are warning us about the likelihood of social platforms escalating political division and polarization. Recent events in Myanmar reiterate this notion; international (particularly Western) media and human rights spokespeople have blamed Facebook for being used as a platform to spread extreme speech and disinformation, ultimately igniting turmoil targeting the Rohingya Muslim minority.

It should be said that a long historical trajectory plays a part in this tale. To get up to speed, this interactive timeline underlines a few major events.

We start our chapter in 2011, when a semi-elected government keen to bolster its credibility around the world (partly) addressed media-related censorship; the authorities furthermore started pinning matters of freedom of speech and assembly onto the national agenda. Simultaneously, thanks to the welcoming of foreign telecommunications companies and Chinese cellphones, prices for mobile services plummeted (Fink, 2020, pg. 44).

This mobile phone surge was not a one-off occurrence. Mobile phone subscriptions reportedly quadrupled between 2005 and 2011 in the top 20 recipient countries of humanitarian aid. And these devices seem to replicate existing power relations, creating a “digital divide” in which those without access become the most marginalized (Read et al., 2016, pg. 9).

By striking deals with these companies, Facebook became one of the major drivers of uncontrolled Internet growth in Myanmar. Phones are generally sold with the Facebook app preloaded, and some carriers exempt its use from data charges altogether. This transition multiplied Facebook’s user numbers in only a matter of years, and it is fair to say the application today is practically the definition of the World Wide Web in Myanmar. To provide a comparison: some 21 million Internet users were counted in 2019, amounting to two fifths of the population – all of them active on Facebook (Kyaw, 2019, pg. 2).

 

How Facebook has been welcomed locally

Essentially, various Myanmarese demographics have embraced Facebook wholeheartedly. And given the circumstances, it’s not very odd. Nearly 50 years of military government had imposed a system of strict media censorship – both controlling foreign news outlets domestically and requiring international agencies to hire local citizens in order to control the news leaving Myanmar (Leong, 2020, pg. 100). When human rights abuses were acknowledged by pro-democratic forces online, “the military government started requiring permission for website creation, on the one hand, and began bolstering its online news presence on the other” (Leong, 2020, pg. 100).

The term “fake news” has been trending of late, thanks in large part to social networks such as Facebook, which raises important issues of digital literacy and the skills needed to use these technologies. While the ICTs discussed in this text are not new on a global scale, the timing and short timeframe of their expansion in Myanmar provide a distinct glimpse into the effect algorithms have on our everyday lives. Likewise, dangerous speech as a phenomenon is not new to Myanmar, but its societal and cultural contexts provide distinctive factors in understanding why it has been particularly potent there.

 

The ‘systematic hiccup’

For one, the majority population is Buddhist, and the role of monks is highly regarded; the government, too, is dominated by Bamar Buddhists, by whom many ethnic minority groups have traditionally been marginalized (Whitten-Woodring et al., 2020, pg. 408). As such, a key player in this controversy is the Buddhist ultranationalist movement that recently surfaced and grew into a viral hub for the like-minded thanks to effective use of Facebook. The clear targets of this approach were the minority Muslim community in general, and the Rohingya population in particular.

Arguably, Facebook’s reach, interactivity, and viral potential spurred a climate of fear and division, which in at least one notable case led to offline violence (Fink, 2020, pg. 44-45). In 2014, a fabricated post about the rape of a Buddhist woman by two Muslim men was widely shared, among others by a respected ultra-nationalist monk named Wirathu, who seemingly issued a call to action. Mob violence broke out, resulting in two deaths and dozens of injuries (Whitten-Woodring et al., 2020, pg. 410).

This notion can be related to social movement scholars Charles Tilly and Sidney Tarrow, who define contentious politics as a form of interaction “in which actors make claims bearing on someone else’s interests, leading to coordinated efforts on behalf of shared interests or programs.” This happens, for instance, when “ordinary people – often in alliance with more influential citizens and with changes in public mood – join forces in confrontation with elites, authorities, and [in this case] opponents.”

 

Facebook’s role in monitoring activities

On the monitoring side we find human-rights activists, elected officials, Buddhist monks, and interfaith organizations – all working to reduce or prevent violence. But they face threatening opposition, which, Christina Fink writes, “has had a chilling effect on those who would speak out in defense of social inclusion” (Fink, 2020, pg. 46). International press and UN officials have described the Rakhine State security operations as ethnic cleansing. Yet most Myanmarese disagree. Through state media and Buddhist-nationalist Facebook pages, the only information on their News Feeds has been about displaced and traumatized Buddhists and Hindus who were understandably terrified of Rohingya militants.

As a result, many used Facebook to call for counter-operations against the Rohingya. The Myanmar government ultimately had electoral incentives not to push any counter-narrative on Facebook or in state media against the Buddhist nationalist movement, since majority sentiment sympathized with the provocation. In 2018, however, an official social-media oversight team was set up to identify “cases that harm the stability and tranquility of the country” (Fink, 2020, pg. 47-48).

As of mid-2018, Facebook did not have a content-monitoring office in Myanmar – only individual reviewers in other countries. Following criticism, the social network blacklisted two ultranationalist monks and the largest Buddhist nationalist organization, and pledged to increase its number of content reviewers. In hindsight, according to Fink, a “largely technological approach, focused primarily on artificial intelligence searches for key words, is too crude” to assess the context and the players engaging in dangerous speech (Fink, 2020, pg. 48-49).
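Fink’s point about keyword searches being “too crude” can be illustrated with a minimal sketch. The blocklist, the example posts, and the matching rule below are all invented for illustration; real moderation systems are far more elaborate, but the underlying limitation – matching words without reading context or intent – is the same:

```python
import re

# Toy blocklist of slurs; all keywords and example posts are invented.
BLOCKLIST = {"vermin", "invaders"}

def flag_post(text: str) -> bool:
    """Flag a post if any of its words appears on the blocklist.

    This sees only words, never context or intent.
    """
    words = set(re.findall(r"\w+", text.lower()))
    return bool(words & BLOCKLIST)

# False positive: a news report *quoting* hateful language gets flagged.
print(flag_post("Reporters condemned leaflets that called refugees vermin"))  # True

# False negative: coded, euphemistic incitement passes untouched.
print(flag_post("Time to clean our town of the unwanted guests"))  # False
```

The second example is exactly the kind of dangerous speech Fink describes: locally legible calls to action that no keyword list can anticipate, which is why context-aware human review matters.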

But in order to react to unwanted behavior, one must also understand how Facebook is actually being used in Myanmar. Recent interviews with Facebook users in the country indicate that adding strangers to expand one’s information network, in the belief that this influences the News Feed, is common. While this does not necessarily show a good understanding of Facebook’s algorithm, it reveals an understanding that Facebook responds to users’ actions in conditional ways: if one adds valuable “Informative Friends,” then posts perceived as useful appear in the News Feed. However, by adopting Facebook as their primary (and often only) source of news, citizens help fuel a collectively created moral space of social conduct and acceptable use of the app (Leong, 2020, pg. 101-102).

 

Countermovements

There are various local actors tied to counter-movements or otherwise opposing the violence: religious key players as well as peace NGOs, Muslim rights groups and social media activists. Some of these formations carried over from the multiple civil wars and have now shifted their focus to the Buddhist-Muslim tensions; others have emerged in response to the Buddhist mobilization. Hateful speech, disinformation and calls for offline violence all flourish via Facebook. Some Buddhist monks have countered this propaganda with their own posts and videos presenting Buddhism as tolerant and non-violent. Other groups monitor hate speech or strategically deploy counter-messaging against it (Orjuela, 2019, pg. 141-142).

Civil society players have also made direct interventions to manage or prevent conflict. For example, local community monitors report risks of conflict escalation. Yet local activists describe a situation where few people are willing or able to openly criticize the leading Buddhist nationalist groups. Counter-voices are dismissed as Westernized propaganda, while other potentially influential figures remain silent because they, too, share similar views of the Muslim community, even if violence is not their tactic of preference (Orjuela, 2019, pg. 141).

In essence, the Buddhist nationalist movement can be understood as a response to societal change related to globalization, semi-democratization and the war-to-peace transition. Buddhist nationalists aim to preserve a status quo of Buddhist (majority) preeminence and devotion perceived to be under threat. The resistance, for its part, wants to preserve another status quo – a society characterized by peaceful coexistence between religious and ethnic groups. Both have mobilized recently, but build on preexisting movements. It should be noted, however, that Facebook’s embedded functions – algorithms that favor the fast dissemination of hateful messages and disinformation, and filter bubbles that shut out opposing views and source criticism – make mobilization for peace on the platform a David-versus-Goliath mismatch. It requires new skills that peace activists often do not hold (Orjuela, 2019, pg. 146-147).

 

Why users rely on disinformation

As Lorian Leong observes, “Facebook was rendered as a symbolic object representative of a multiplicity of information-related concepts and states: as an app of information, as an object validating its users as informed, and as a means to present oneself as informed.” In this self-created environment, the news becomes highly personalized but, in that sense, also framed, re-contextualized, and one-sided. The Facebook algorithm keeps feeding users more of the same content, and these posts are further legitimized by friends and gatekeepers who curate and share them (Leong, 2020, pg. 102-104).

Technology can certainly have a self-reinforcing logic, which becomes troublesome if technologies are neutralized and mainstreamed to the extent that they are no longer subject to fundamental questioning, or come to exclude other methodologies (Beraldo and Milan, 2019, pg. 9).

There are many explanations for why humans rely on disinformation. For one, our psyche is often drawn towards simple answers: we rely on our “gut feeling,” repeated information and things circulating in our immediate sphere. Furthermore, we value information that reinforces our current beliefs in order to avoid doubt. Dogmatic or religious people are said to be even more prone to this type of disinformation. Add to that the history of military rule and the fact that rumors play a big role in Myanmarese culture, and the reasons pile up (Whitten-Woodring et al., 2020, pg. 412).

As Fechter and Hindman note: “technology is co-constitutive of the humanitarian environments it seeks to capture. Rather than merely reflect, or compress, a picture, it has the capacity to construct and define it” (Beraldo and Milan, 2019, pg. 7).

Even users who do not actively seek out disinformation are served it by algorithms designed by Facebook to do just that – feed people content that mirrors their (un)conscious preferences in order to maximize time spent on the site. This narrowcasting is challenging on a global scale, and incendiary information targeting a particular group in a country rife with ethnic conflict and lacking democratic traditions certainly raises the odds of offline violence.
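The narrowcasting feedback loop described in this section can be sketched in a few lines. To be clear, the scoring rule, topics, and data below are invented assumptions for illustration, not Facebook’s actual ranking algorithm; the point is only the self-reinforcing structure:

```python
# Toy sketch of engagement-driven ranking. The scoring rule and data are
# invented and vastly simpler than any real News Feed algorithm.

def rank_feed(posts, clicked_topics):
    """Order posts by the user's past engagement with each post's topic."""
    def score(post):
        return clicked_topics.get(post["topic"], 0)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topic": "nationalism"},
    {"id": 2, "topic": "interfaith"},
    {"id": 3, "topic": "nationalism"},
]

# A user who has only ever clicked nationalist content...
clicks = {"nationalism": 9, "interfaith": 0}
feed = rank_feed(posts, clicks)
print([p["id"] for p in feed])  # nationalist posts crowd the top: [1, 3, 2]

# ...and every new click feeds back into the scores, so the next ranking
# narrows further: the filter bubble is the loop, not any single ranking.
clicks["nationalism"] += 1
```

Even in this caricature, content the user never engages with sinks out of sight after a handful of iterations, which is the structural reason counter-messaging struggles to reach the audiences that most need it.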

 

Conclusion: The inadequate translation of Facebook into a local context

According to William Gaver’s ‘affordance theory’, it is imperative to note both the properties of the Facebook platform and how it is perceived by its users. To be fair, Facebook could indeed afford new voices being both raised and heard, more forums for open dialogue and bottom-up communication, new opportunities for ordinary people to engage politically, various networking possibilities, and increasingly flat and flexible forms of social organization. And while these so-called affordances exist irrespective of whether users know of them, they cannot be utilized as tools for social change unless they are understood as such. Such perceptible affordances must essentially be socio-culturally relevant and understood from within a local context (Svensson, 2018, pg. 12).

Communication platforms are not neutral technologies for people to use. They carry norms and values, in the sense that certain preferred ways of interacting with them are inscribed both in their design and in how users engage with them (Svensson, 2018, pg. 12). In this text, we have touched upon the way Facebook uses its monopoly to steer its customers towards what is good for the enterprise – by introducing the timeline, augmenting advertisements, personalizing information, and so on (Krotz, 2017, pg. 107). While difficult to fully assess, the information collected about Facebook users in Myanmar reinforces the idea that Facebook is near-identical to the Internet there, because users – for reasons of digital illiteracy, lack of funds or otherwise – are exposed only to this medium. Yet, in many cases, the Internet outside of the Facebook sphere is not totally inaccessible.

When considering the social and political conditions under which Facebook was unleashed in Myanmar, it seems clear the network did not weigh the implications of its own expansion. Facebook cannot be held solely liable for the social cleavages that became so salient on the platform, but it seems to have abetted their violent escalation.

 

 

References

Beraldo, David and Stefania Milan. “From data politics to the contentious politics of data.” Big Data & Society, July-December 2019: 1-11. DOI: 10.1177/2053951719885967

Couldry, Nick, and Andreas Hepp. “Conceptualizing Mediatization: Contexts, Traditions, Arguments.” Communication Theory, vol. 23, 2013. DOI: 10.1111/comt.12019

Eriksen, Thomas Hylland. Globalization : The Key Concepts, Taylor & Francis Group, 2014. ProQuest Ebook Central, https://ebookcentral.proquest.com/lib/malmo/detail.action?docID=6209142.

Fink, Christina. “Dangerous Speech, Anti-Muslim Violence, and Facebook in Myanmar.” Journal of International Affairs, vol. 71, no. 1.5, 2018, pp. 43–52. JSTOR, www.jstor.org/stable/26508117. Accessed 10 Oct. 2021.

Hepp, Andreas, et al. “Mediatization: Theorizing the Interplay between Media, Culture and Society.” Media, Culture & Society, vol. 37, no. 2, Mar. 2015, pp. 314–324, doi:10.1177/0163443715573835.

Krotz, Friedrich. “Explaining the Mediatisation Approach.” Javnost – The Public, vol. 24, no. 2, 2017, pp. 103–118. DOI: 10.1080/13183222.2017.1298556

Kyaw, Nyi Nyi. 2019. “Facebooking in Myanmar: From Hate Speech to Fake News to Partisan Political Communication.” ISEAS Yusof Ishak Institute, no. 36, May 2019, pp. 1-10, http://hdl.handle.net/11540/10254.

Leong, Lorian. “Domesticating Algorithms: An Exploratory Study of Facebook Users in Myanmar.” The Information Society, vol. 36, no. 2, 2020, pp. 97–108. EBSCOhost, doi:10.1080/01972243.2019.1709930. Accessed 11 Oct. 2021.

McEwan, Cheryl. Postcolonialism, Decoloniality and Development, Taylor & Francis Group, 2018. ProQuest Ebook Central, https://ebookcentral.proquest.com/lib/malmo/detail.action?docID=5606235.

Myint-U, Thant. The Hidden History of Burma: Race, Capitalism, and the Crisis of Democracy in the 21st Century, Center for Strategic & International Studies. YouTube, 19 Nov. 2019. https://www.youtube.com/watch?v=4_EhKWaHThU

Orjuela, C. “Countering Buddhist Radicalisation: Emerging Peace Movements in Myanmar and Sri Lanka.” Third World Quarterly, vol. 41, no. 1, pp. 133–150. EBSCOhost, doi:10.1080/01436597.2019.1660631. Accessed 20 Oct. 2021.

Orlowski, Jeff. “The Social Dilemma.” Netflix Official Site, Netflix, 9 Sept. 2020, www.netflix.com/se/title/81254224

Read, Róisín, et al. “Data Hubris? Humanitarian Information Systems and the Mirage of Technology.” Third World Quarterly, 2016. DOI: 10.1080/01436597.2015.1136208

Svensson, Jakob. “Empowerment as Development: An Outline of an Analytical Concept for the Study of ICTs in the Global South” Springer Nature Singapore, 2018. J. Servaes (ed.), Handbook of Communication for Development and Social Change, https://doi.org/10.1007/978-981-10-7035-8_43-1

Tufte, Thomas. Communication and Social Change : A Citizen Perspective, Polity Press, 2017. ProQuest Ebook Central, https://ebookcentral.proquest.com/lib/malmo/detail.action?docID=4854000.

Whitten-Woodring, J., et al. “Poison If You Don’t Know How to Use It: Facebook, Democracy, and Human Rights in Myanmar.” International Journal of Press/Politics, vol. 25, no. 3, 2020, pp. 407–425. EBSCOhost, doi:10.1177/1940161220919666. Accessed 11 Oct. 2021.