The Spread of Online Hate Speech against the Rohingya on Facebook and Facebook’s Ineffective Content Moderation
Introduction
In today’s world, thanks to the rapid growth of Internet access, propagators of hate speech increasingly use online channels to spread their ideas. Hate speech can be defined as speech that expresses, promotes, incites or stirs up hatred towards a group of people distinguished by a specific trait or set of attributes such as gender, sexual orientation, nationality, race, religion or ethnicity (Parekh, 2012, p. 40). In Myanmar, the most influential venue for online speech has been Facebook. For Myanmar’s citizens, Facebook is the most accessible and affordable social media platform, and it has gradually become the country’s primary social media platform and primary source of information (Warofka, 2018). Because authorities and major media outlets use Facebook as their main communication channel, most Burmese have come to believe that information shared on Facebook is trustworthy (Independent International Fact-Finding Mission on Myanmar, 2018, p. 340). At the same time, given online user anonymity and the delays and difficulties in monitoring and removing hateful content, Facebook’s dominance has made the platform itself a potent vehicle for the spread of hate speech (Whitten-Woodring et al., 2020, p. 409). As a result, through a series of deliberate operations, hate speech against the Rohingya Muslim minority spread ever more widely on Facebook until it culminated in an offline tragedy. The platformed racism generated by this spread has further amplified hate speech’s negative effects.
The historical background of the Rohingya tragedy
For historical reasons, Myanmar has long had an ethnic conflict involving the Rohingya. Most of the Bamar majority refuse to accept the Rohingya Muslim minority (Patel, 2018, p. 219). Since 1962, Myanmar’s government has steadily limited the Rohingya’s rights and refused their citizenship claims, often treating them as illegal migrants and designating them “Bengali” (HRW, 2014). These limitations affect almost every area of Rohingya life, including healthcare, education, and marriage. By 2018, the Rohingya had become one of the world’s largest stateless groups (Ibrahim, 2018, p. 53).
The media background of the Rohingya tragedy
Myanmar was one of the last countries to join the Internet revolution. Since 2012, the price of mobile phone SIM cards has fallen from hundreds of dollars to as low as one dollar, and smartphones can be bought for less than $10 (Gowan, 2014). In addition, policy reforms brought new media freedom and economic liberalization (Lee, 2019, p. 3209), making it possible for international carriers to enter Myanmar’s mobile phone market. Because of these changes, most of Myanmar’s citizens gained access to the Internet, which has changed the way the country communicates.
Alongside media reform came Facebook’s aggressive pursuit of market share in Myanmar through the introduction of its Free Basics programme. Practically all smartphones sold in Myanmar come preloaded with Free Basics, which gives users access to a restricted version of the Internet consisting essentially of Facebook alone (Davis, 2020, p. 107). Facebook’s effective monopoly meant that most citizens had never heard of Signal, Twitter, Google, or similar services before the military takeover on February 1, 2021 (Tønnesson & Aung, 2021, p. 12). To a great degree, Facebook is the Internet in Myanmar.
The hate speech from ultranationalists and the Burmese military on Facebook
The nation moved from a state of no expression to one of unrestricted expression, and from a society almost completely cut off from the rest of the world to one that was suddenly connected and interlinked. Given this dramatic change, it is hardly surprising that people long denied free speech, suddenly granted it, might not use it responsibly without guidance (Whitten-Woodring et al., 2020, p. 412). Moreover, only about a third of Myanmar’s population has received a primary education, and religious tensions persist across the country, so it is difficult for many citizens to think critically when confronted with a storm of online hate speech.
In Myanmar, fake news and fake accounts have regularly been used to disseminate hate speech on Facebook. In the context of the smartphone revolution, a Buddhist ultranationalist movement targeting the Rohingya Muslim minority formed and gained traction from 2012 onwards. The movement’s growth was owing in part to its efficient use of Facebook to build a virtual community united by shared anxieties (Fink, 2018, p. 44). On Facebook, ultranationalists have portrayed the Rohingya Muslim minority as both a personal danger and a threat to Myanmar, solely because their religion differs from that of the majority (Schissler et al., 2017, p. 379). Ultranationalists have claimed that the birthrates of Muslims, including the Rohingya, are rising, that Muslim economic power is growing, and that Muslims are plotting to take over the nation. In reality, according to census data, Muslims make up fewer than 5% of Myanmar’s entire population, and the Rohingya an even smaller share (Fink, 2018, p. 260). Furthermore, ultranationalists have circulated graphic photographs of ISIS cruelty alongside photos of communal unrest in Myanmar to insinuate that the Rohingya are terrorists and a destabilizing force (van Klinken & Aung, 2017, p. 371). Some ultranationalists have also employed dehumanizing language to describe the Rohingya (see Figure 1, Figure 2), and similar speech abounds on Facebook: venomous posts refer to the Rohingya as rapists, dogs and maggots, and recommend that they be fed to pigs, shot, or killed (Fink, 2018, p. 50). The enormous reach of Facebook, the rapid dissemination of information, the power of demagogic imagery and inflammatory language, and the participatory nature of sharing and commenting have all contributed to rising fear among Burmese Buddhists and other non-Muslims.
Figure 1 Hate speech about the Rohingya on Facebook

Source: https://www.adnews.com.au/news/facebook-losing-the-battle-to-combat-hate-speech-in-myanmar
Figure 2 Various abusive posts

Source: Facebook
Additionally, for years the Tatmadaw, Myanmar’s armed forces, deployed hundreds of personnel to create fake news, fake accounts and popular celebrity pages on Facebook in order to attract large followings (Mozur, 2018). These pages evolved into channels for distributing gruesome photographs, inflammatory messages and fake news. The fake accounts were then used to spread misinformation, stifle critics, and provoke disputes among commenters in order to unsettle the public (Tønnesson & Aung, 2021, p. 23). As described above, historical segregation and division in Myanmar have produced entrenched stereotypes, and successive military governments have fostered rumours, anxieties, and paranoia over the years (Kipgen, 2016, p. 62). Hate speech narratives have therefore gained significant traction in Myanmar because they exploit these existing conditions, and offline attacks against the Rohingya escalated as a result.
Facebook’s failing content moderation for hate speech against the Rohingya
Facebook has recognised that it contributed to the incitement of violence amid Myanmar’s 2017 genocide of the Rohingya Muslim minority. In its own words, Facebook said, “We agree that we can and should do more,” and pledged to devote resources to stopping the spread of hate speech in Myanmar (Facebook, 2018). Since then, Facebook has promised adjustments such as hiring additional content reviewers fluent in Burmese, strengthening its capacity to detect hate speech using artificial intelligence, especially in Burmese, and forming a specialised team to work in the country. However, Facebook had only a few Burmese-speaking employees until 2018 (Tønnesson & Aung, 2021, p. 4), and as of mid-2018 there was still no Facebook content-monitoring office in Myanmar (Fink, 2018, p. 45). Moreover, according to ABC News (2022), a 2022 investigation found that Facebook failed to identify clear hate speech and violent threats against Myanmar’s Rohingya Muslim minority in adverts submitted to appear on its platform. The Associated Press was given exclusive access to a report showing that Global Witness, a human rights organisation, submitted eight sponsored advertisements to Facebook for approval, each containing a different form of anti-Rohingya hate speech. Although eight is not a large number, the content these posts promote is alarming and shocking; nevertheless, Facebook approved all eight advertisements for publication. As Rosie (2022) states, given this result, we can conclude that the vast majority of such hate speech is likely to pass review. The findings indicate that despite Facebook’s assurances of improvement, its porous controls still largely fail to catch hate speech and violent threats on its site. In other words, Facebook has not implemented the change it promised the public: adequate regulation.
There are several reasons for these failures. Facebook’s platform rules have been problematic for vetting hate speech. For one, the platform’s hate speech policies are ambiguous, which gives bad actors opportunities to exploit them. Moreover, the respective responsibilities of the platform’s users and its algorithms concerning violent content are unclear (Matamoros-Fernández, 2017, p. 941): in most cases it is difficult to determine whether human staff or algorithms are responsible for hate speech remaining on the platform. Furthermore, because platforms monitor material on an ‘ad hoc’ basis, regulations are applied with some arbitrariness (Gillespie, 2017, p. 262). Beyond these vetting problems, the root cause of the repeated emergence of hate speech against the Rohingya Muslim minority is the profit-seeking nature of capital. As Matamoros-Fernández (2017) explains, Facebook’s editorial processes encompass its technological infrastructure, its regulations, and users’ own use of technology to flag material; these processes are complicated and widely dispersed, and also involve the labour of outsourced workers around the globe. Rectifying issues across all of these parts would require Facebook to spend substantial manpower and material resources. It should be noted that such adjustments and measures remain largely unseen by the public and governments, and this ‘unseen work’ tends to serve profit and merely meet the platform’s legal obligations rather than address issues of social justice (Roberts, 2014, p. 150).
Platform regulation’s loopholes cause platformed racism
It is worth noting that Facebook’s regulatory loopholes are highly likely to produce platformed racism. Generally, platformed racism has two meanings: first, platforms serve as instruments for amplifying and fabricating racist rhetoric; second, they embody a mode of governance that can be damaging to certain communities (Matamoros-Fernández, 2017, p. 932). In the Rohingya genocide, platformed racism is reflected in the platform’s ambiguous policies, the content-moderation failures described above, and the arbitrary enforcement of its regulations. Myanmar’s ultranationalists and military used Facebook as a tool to fabricate and promote racist rhetoric, while Facebook’s mode of governance failed the targeted community, as its response to anti-Rohingya hate speech demonstrates. Worse still, the impacts are self-sustaining: where platformed racism already exists, platforms are likely to spread or even exacerbate new hate speech, creating a vicious circle.
How to avoid a recurrence of the Rohingya tragedy
Social media companies should prioritise combating the propagation of hatred and violence. They should adequately fund and publicise the security and integrity procedures in place on their platforms in each nation, ensuring that people in all countries and language communities are properly safeguarded from online hate speech and other online violence.
In places like Myanmar, where Facebook was demonstrably used to provoke real-world harm that resulted in the deaths of tens of thousands of people and the loss of thousands of livelihoods and homes, Facebook should at a bare minimum guarantee that it will not be used for future incitement and that victims are compensated. However, relying on private lawsuits or expecting businesses to self-regulate is insufficient; governments must intervene to hold corporations accountable and protect citizens’ rights.
Conclusion
In conclusion, Myanmar’s ultranationalists and the Burmese military used Facebook to spread online hate speech, deploying fake news and fake accounts to stoke public panic against the Rohingya. Amid Facebook’s ineffective content moderation and a series of regulatory loopholes, this deliberate hate speech continued to spread and ferment on Myanmar’s Internet until it led to offline tragedies, and Facebook’s incompetent content moderation bears responsibility for this man-made disaster. Unfortunately, before Facebook has perfected its moderation and oversight systems, platformed racism appears to have taken hold, magnifying the impact of hate speech and fuelling new rounds of online violence.
References
Fink, C. (2018). Dangerous speech, anti-Muslim violence, and Facebook in Myanmar. Journal of International Affairs (New York), 71(1.5), 43–52.
Davis, A. (2020). Hate Speech in Myanmar: The Perfect Storm. In Disinformation and Fake News (pp. 103–114). Singapore: Springer Singapore. https://doi.org/10.1007/978-981-15-5876-4_8
Facebook. (2018). Facebook approves adverts containing hate speech inciting violence and genocide against the Rohingya. Global Witness. https://www.globalwitness.org/en/campaigns/digital-threats/rohingya-facebook-hate-speech/
Fink, C. (2018). Myanmar: Religious minorities and constitutional questions. Asian Affairs (London), 49(2), 259–277. https://doi.org/10.1080/03068374.2018.1469860
Gillespie, T. (2017). Governance of and by platforms. In The SAGE handbook of social media (pp. 254–278).
Gowan, A. (2014, November 22). Cellphone use transforms Burmese life after government opens mobile market. Washington Post. Retrieved from https://www.washingtonpost.com/world/asia_pacific/new-private-companies-spark-mobile-phone-revolutionin-once-isolatedburma/2014/11/21/eb4479c2-6c41-11e4-bafd-6598192a448d_story.html
HRW (Human Rights Watch). (2014). Burma: Government plan would segregate Rohingya. Retrieved from https://www.hrw.org/news/2014/10/03/burma-government-plan-would-segregate-rohingya
Ibrahim, A. (2018). The Rohingyas: Inside Myanmar’s genocide. London, UK: Hurst Publishers.
Independent International Fact-Finding Mission on Myanmar. (2018). Report of the detailed findings of the Independent International Fact-Finding Mission on Myanmar.
Kipgen, N. (2016). Myanmar: A political history. Oxford University Press.
Lee, R. (2019). Extreme Speech in Myanmar: The Role of State Media in the Rohingya Forced Migration Crisis. International Journal of Communication (Online), 3203–.
Matamoros-Fernández, A. (2017). Platformed racism: the mediation and circulation of an Australian race-based controversy on Twitter, Facebook and YouTube. Information, Communication & Society, 20(6), 930–946. https://doi.org/10.1080/1369118X.2017.1293130
Mozur, P. (2018, October 15). A genocide incited on Facebook, with posts from Myanmar’s military. The New York Times. https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html
ABC News. (2022, March 22). “Kill more”: Facebook fails to detect hate against Rohingya. https://abcnews.go.com/Business/wireStory/kill-facebook-fails-detect-hate-rohingya-83576729
Parekh, B. (2012). Is There a Case for Banning Hate Speech? In The Content and Context of Hate Speech (pp. 37–56). Cambridge University Press. https://doi.org/10.1017/CBO9781139042871.006
Patel, C. (2018). The Rohingyas: inside Myanmar’s hidden genocide; Myanmar’s enemy within: Buddhist violence and the making of a Muslim “other.” International Affairs, 94(1), 219–220. https://doi.org/10.1093/ia/iix271
Roberts, S. T. (2014). Behind the screen: The hidden digital labor of commercial content moderation. ProQuest Dissertations Publishing.
Schissler, M., Walton, M. J., & Thi, P. P. (2017). Reconciling Contradictions: Buddhist-Muslim Violence, Narrative Making and Memory in Myanmar. Journal of Contemporary Asia, 47(3), 376–395. https://doi.org/10.1080/00472336.2017.1290818
Tønnesson, S., Zaw Oo, M., & Aung, N. L. (2021). Pretending to be states: The use of Facebook by armed groups in Myanmar. Journal of Contemporary Asia, advance online publication, 1–26. https://doi.org/10.1080/00472336.2021.1905865
van Klinken, G., & Aung, S. M. T. (2017). The Contentious Politics of Anti-Muslim Scapegoating in Myanmar. Journal of Contemporary Asia, 47(3), 353–375. https://doi.org/10.1080/00472336.2017.1293133
Warofka, A. (2018). An independent assessment of the human rights impact of Facebook in Myanmar. Facebook Newsroom, November, 5. Retrieved from https://about.fb.com/wp-content/uploads/2018/11/bsr-facebook-myanmar-hria_final.pdf
Whitten-Woodring, J., Kleinberg, M. S., Thawnghmung, A., & Thitsar, M. T. (2020). Poison if you don’t know how to use it: Facebook, democracy, and human rights in Myanmar. The International Journal of Press/Politics, 25(3), 407–425. https://doi.org/10.1177/1940161220919666