Online Harms and Platforms’ Responsibility

Introduction

With the rapid development of the Internet and social media, more and more people use social media to express their thoughts. According to John Perry Barlow’s “A Declaration of the Independence of Cyberspace”, the internet’s founding free-speech vision was to establish a place where anyone, from anywhere, could freely express their beliefs, however mainstream or marginal, without fear of being coerced into silence or conformity (Barlow, 1996). Through social media platforms, people share their lives and connect with others across distances and regions, and social media has become an indispensable part of daily life. However, its prevalence has also brought problems that harm people, such as online harassment and online hate speech. Platforms have taken only limited measures against these harmful phenomena, and most social media companies adopt a laissez-faire approach to the extremist content on their platforms (House of Commons Home Affairs Committee, 2017).

This blog will present examples of online harassment and online hate speech, and point out the problems that exist in platform governance. In addition, I will discuss how platforms can minimise the future harm caused by users’ misuse of social media.

 

Online Harassment Case – Gamergate

Gamergate is a high-profile case of online sexual harassment. The ex-boyfriend of a female independent game designer wrote a blog post about their terrible breakup and posted it in a thread on the SomethingAwful forums (Massanari, 2015). The ex-boyfriend, Eron Gjoni, censured Zoe Quinn in the post for her alleged infidelities, including “sleeping with a journalist at the gaming site Kotaku” (Malone, 2017). The post then spread extensively on the 4chan forums. Because geek culture and STEM communities often treat women as sexual objects and unwelcome interlopers, women face barriers to entering spaces organised around geek culture and STEM interests. After Quinn posted Depression Quest, an interactive fiction game, to the Steam Greenlight service, she became a target of online harassment and received many rape and death threats (Massanari, 2015). Against this background, Gjoni’s blog post was taken to imply that Quinn’s success was due to her intimacy with a games journalist. After the post circulated, the online sexual harassment of Quinn eventually evolved into a terrible campaign to delegitimise and sexually harass women and their allies in the games industry (Massanari, 2015).

 

Online Hate Speech

The wide use of social media also brings the problem of online hate speech. Hate speech is speech that “expresses, encourages, stirs up, or incites hatred against a group of individuals distinguished by a particular feature or set of features such as race, ethnicity, gender, religion, nationality, or sexual orientation” (Parekh, 2012). According to Bronwyn Carlson and Ryan Frazer’s “Social Media Mob: Being Indigenous Online”, Aboriginal people use social media at a higher rate than non-Indigenous Australians (Carlson and Frazer, 2018). Yet people who express their Aboriginal identity on social media are often questioned. Other users draw on stereotypes of Aboriginality to query whether someone who shares their Aboriginal identity is “really” Aboriginal; for example, Aboriginal users are frequently challenged about their skin colour. Racist stereotypes of “inferiority” or “criminality” persist in other users’ online interactions with Aboriginal people (Carlson and Frazer, 2018). Because of this stereotyping and discrimination, some people decide not to display their Indigenous identity in order to protect their personal safety.

Such discriminatory and prejudicial comments often escalate into racism. As Carlson and Frazer observe, “social media is not a neutral space for social, cultural and ethnic minorities” (Carlson and Frazer, 2018): Indigenous people suffer racial violence and discrimination in social media spaces. News media regularly report online racism against Indigenous people. For example, after a 14-year-old Indigenous boy died, the news media reported that he had been hit and killed by a non-Indigenous driver (Carlson and Frazer, 2018). This news heightened racial tension between Indigenous and non-Indigenous people and prompted many racist remarks.
In addition, online communities on Facebook and AirG have become platforms where young girls fighting over boys display their jealousy (Carlson and Frazer, 2018). The online discrimination and violence against Indigenous people, and the fighting between young girls, can both be considered forms of online hate speech.

 

Platform’s Governance Problem

From the examples of online harms discussed above, online harassment and online hate speech, we can see that problems exist in social media governance. The platforms did not do a good job of stopping harmful messages from spreading. In the Gamergate case, 4chan spread Gjoni’s blog post, and Quinn received rape and death threats because of it; 4chan served as a space for spreading online harassment. A hashtag, “GG”, was created to support Quinn, but it did not curb the harassing behaviour. Many people with ulterior motives used the hashtag to harass the game developer, feminist critics, and their male allies on Twitter and other platforms (Massanari, 2015). Although these attacks all used the “GG” hashtag, they were treated as merely personal behaviour. Discussion of Quinn and “GG” on 4chan was not banned until late September 2014; the harassment continued for more than a month before the relevant content was banned by administrators. This reveals how laissez-faire the platform was about harassment. Beyond such inaction, platform design is also bound up with culture and politics: the platforms that spread the messages about Quinn and “GG” arguably allowed anti-feminist activist groups to dominate the conversation (Massanari, 2015).

The online hate speech examples likewise expose regulatory problems. Social media platforms did not promote the expression of Indigenous identity, nor did they pay enough attention to discriminatory comments and racism. In the case of the death of the 14-year-old Indigenous boy mentioned above, we can see that Indigenous people suffered violence and discrimination online.
After the news of the Aboriginal boy’s death was reported, a post appeared on Facebook claiming that the boy had stirred up racial tension between Aboriginal and non-Aboriginal people before his death (Carlson and Frazer, 2018). The post attracted many racist, violent, and genocidal comments before the page was shut down (Carlson and Frazer, 2018). Platforms’ delayed handling of such comments fuels online racism. In the example of young girls fighting over boys on Facebook and AirG, the platforms were likewise responsible for oversight: social media should be used to share life, not to spread gossip. All of the above shows the problems that exist in platform governance.

 

Social Media Platform’s Responsibility

Social media brings many benefits to modern life, but it also has negative effects, and social media platforms should be held responsible for the harms that arise. As discussed in the “Platform’s Governance Problem” section of this blog, platforms’ regulatory behaviour and environments did not help to solve these harms; they promoted them. Because platforms did not attend to regulation, the government turned its attention to harmful behaviour online and formulated a proposal to regulate the platforms. The approach has two main elements: (1) platforms (both the software system and the business system) should be the object of regulation; and (2) platform operators should carry out risk management (Woods and Perrin, 2022).

For the first element, the design of a platform’s services, its business model, the tools it provides to users, and the resources it offers for users’ complaints and safety should all be regulated, because they influence the flow of information on the platform (Woods and Perrin, 2022). Platforms should attend to three points at which online harm may arise: “the point at which a user engages with the platform”, “the mechanisms by which content is disseminated”, and “the mechanisms by which recipient users engage with content” (Woods and Perrin, 2022). Complaints and reporting systems will also shape people’s behaviour on social media.

For the second element, the operator should be responsible for the risks the platform creates. Three questions relate to this risk. The first is whether harm will arise under given conditions; platforms should take preventive measures to anticipate harms (Woods and Perrin, 2022). The second is to rely on existing evidence to assess the risk of harm (Woods and Perrin, 2022). The third is that platforms should have relevant risk-mitigation measures in place, and those measures should be effective (Woods and Perrin, 2022).
In addition, platform operators have a responsibility to report their operations to the regulator. Platforms should also respond quickly to urgent problems, such as terrorism, to minimise the harm. For all of these reasons, platforms should bear responsibility for minimising the online harms that social media brings.

 

Conclusion

In this blog, I have discussed online harassment, online hate speech, the problems that exist in social media governance, and how platforms should be responsible for online harm. For online harassment, I presented the case of Gamergate: Quinn suffered online sexual harassment because of her ex-boyfriend’s blog post, and because the platform did not handle it in time, the harassment developed into a terrible movement against women across the games industry. For online hate speech, I discussed the expression of Aboriginal identity and online racism. Some Indigenous people choose not to express their Aboriginal identity in order to stay safe and to avoid being challenged over whether they are “really” Aboriginal. The case of the death of the 14-year-old Indigenous boy further illustrates online racism on social media platforms, and social media has also been used as a tool for spreading gossip among girls fighting over a boy.

All of these cases point to problems in the governance of social media. Platforms do not stop harmful messages from spreading in time, and their laissez-faire attitude reflects their political position to a certain extent. Nevertheless, platforms should be responsible for these harms. Since platforms have not regulated themselves well, the government has formulated rules for minimising online harm. The first element is that platforms themselves should be regulated, including the complaints and reporting systems that influence people’s behaviour on social media. The second element is that operators should carry out risk management and report their operations to the regulator in a timely way.
To sum up, platforms have a duty to monitor what happens on their services and to respond in a timely manner to minimise the harms brought about by users’ misuse of social media.


References

Barlow, J. P. (1996). A Declaration of the Independence of Cyberspace. https://www.eff.org/cyberspace-independence

Carlson, B. & Frazer, R. (2018). Social Media Mob: Being Indigenous Online. Macquarie University, Sydney. 

House of Commons Home Affairs Committee. (2017). Hate crime: abuse, hate and extremism online (No. HC609). London: House of Commons.

Malone, N. (2017, July 24). Zoë and the trolls: Video-game designer Zoë Quinn survived Gamergate, an act of web harassment with world-altering implications. Intelligencer. https://nymag.com/intelligencer/2017/07/zoe-quinn-surviving-gamergate.html

Massanari, A. (2015). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329-346. DOI: 10.1177/1461444815608807

Parekh, B. (2012). Is there a case for banning hate speech? In M. Herz & P. Molnar (Eds.), The Content and Context of Hate Speech: Rethinking Regulation and Responses (pp. 37–56). Cambridge: Cambridge University Press.

Woods, L. & Perrin, W. (2022). Obliging platforms to accept a duty of care. In M. Moore & D. Tambini (Eds.), Regulating Big Tech: Policy Responses to Digital Dominance (pp. 93–109). New York, NY: Oxford University Press.