Topic: Watch out! Your Facebook is tracking you!
SID:480026518

(Figure 1. Image created by Elena Lacy)
Since the start of the 21st century, the world media landscape has undergone immense change. Audiences now read news on Facebook or Twitter, young people chat and socialise on Messenger, and celebrities maintain accounts on multiple digital platforms to release the latest information to their fans. Welcome to Web 2.0, in which digital platforms empower users to participate in cultural consumption, social interaction, collaboration and social decision-making, and users in turn become more unfettered from “mass media systems of owner-controlled and easily regulated content production and one-way transmission” (O’Reilly, 2007; Dahlberg, 2007). At the same time, long-standing and much-debated social and legal issues raised by digital platforms, such as threats to privacy, have risen rapidly up the public agenda. Focusing on privacy and possible responses to these threats, this blog explores the following questions: 1) what are privacy and digital privacy, and why does personal data privacy matter; 2) what threatens privacy in the digital age; and 3) what does the current governance framework look like? The article also draws on a case study and several examples to illustrate these threats and their solutions.
In general, privacy refers to the state of being free from “secret surveillance or unauthorized disclosure of individual information” by a person, group, government or institution (Maitland and Lynch, 2020). The Office of the Australian Information Commissioner defines privacy as a fundamental human right that protects freedom of association, thought and speech. One reason privacy concerns have mounted so remarkably in recent years is that privacy is shaped by many variables: Taylor and Rooney (2016, as cited in Goggin et al., 2017) argue that it is related to “cultural context, nation, age, class, gender, sexuality, race, disability, income, occupation and others”. The diversity of digital platforms and cultural contexts helps explain why a majority of participants in the Digital Rights in Australia report said they felt a loss of privacy online and were concerned about their information security (Goggin et al., 2017). Concerns also grow as both users and operators rely ever more deeply on digital devices and platforms, providing and storing their essential data and information through them (Friedewald et al., 2017, as cited in Goggin et al., 2017). Some scholars have therefore proposed the overarching concept of “digital privacy” in response to the excessive permissions demanded of users and the expanding violation of personal information (Gelman et al., 2018).

(Figure 2. Case study: Watch out! Facebook is collecting your data)
Whether Facebook’s personalised advertising crosses the boundaries of personal privacy has been a continuing topic for the public and academics, especially since recommender systems were built into Facebook’s search engine and advertising pipeline. Megan Borovicka, a lawyer, signed up for a Facebook account in 2013 (Fowler, 2021). Although she had not used Facebook and never added any “friends” in the years since, Facebook quietly collected her information, and one Christmas it surfaced a list of her husband’s shopping data. Fowler, the journalist who interviewed her, also tried deleting Facebook and Instagram for two weeks, yet still received notifications when he went shopping or travelling, and found that Facebook had tracked him through at least 95 apps, websites and businesses.
This case first reflects how users’ neglect of privacy, combined with the opacity of privacy policies, breeds carelessness about privacy. For most of the past decade Borovicka kept the app on her phone but never checked it, while the privacy policies changed several times. In general, privacy policies and terms of use aim to present users with consent materials so that the entities handling their data comply with notice requirements (Obar and Oeldorf-Hirsch, 2020). Urban and Hoofnagle’s (2014) research likewise indicates that most consumers are unaware of privacy policies because they suffer from marketplace myopia: they know the potential risks but remain optimistic, and so feel they can skip evaluating the privacy of products and services. In their experimental survey, Obar and Oeldorf-Hirsch (2020) found that participants routinely ignored the privacy and terms-of-service policies of a social networking service: 74% skipped the privacy policy entirely, and most spent an average of 51 seconds on the terms of service, which should take 15–17 minutes to read at an average adult reading speed. This might be captured by Westin’s notions of the “privacy unconcerned” and “privacy pragmatists” (2013, as cited in Flew, 2021), which describe unconcerned or pragmatic attitudes to personal privacy. However, Urban and Hoofnagle (2014) argue that Westin’s taxonomy (“privacy fundamentalists”, “privacy pragmatists” and the “privacy unconcerned”, especially the second category) fails theoretically: it treats acceptance of platforms’ business models as a free choice, when in fact users must accept those models in order to use the services at all, and it shifts the focus onto a supposed balance between privacy and new technology on digital platforms (Westin, 2013, as cited in Flew, 2021).
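To put those reading-time figures in perspective, here is a back-of-the-envelope check, assuming an average adult reading speed of about 250 words per minute and a roughly 4,000-word terms-of-service document (both figures are illustrative assumptions, not numbers taken from the study):

```python
# Back-of-the-envelope reading-time check (illustrative assumptions only:
# neither the reading speed nor the word count is taken from the study).
WORDS_PER_MINUTE = 250   # assumed average adult reading speed
TOS_WORD_COUNT = 4000    # assumed length of a terms-of-service document
OBSERVED_SECONDS = 51    # average time participants actually spent

# Minutes a full read would take at the assumed speed.
expected_minutes = TOS_WORD_COUNT / WORDS_PER_MINUTE

# How many times faster than a full read the participants skimmed.
shortfall = (expected_minutes * 60) / OBSERVED_SECONDS

print(f"Expected reading time: {expected_minutes:.0f} minutes")
print(f"Participants read roughly {shortfall:.0f}x faster than that")
```

Under these assumed figures a full read would take about 16 minutes, consistent with the 15–17 minute estimate, so participants would be skimming nearly twenty times faster than a genuine reading requires.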
Suzor (2019, as cited in Flew, 2021) likewise observes the power and “absolute discretion” these policies grant to operators, digital platforms and online services within this business structure, revealing their near-supreme position in the digital power hierarchy. Accordingly, in research on how transparency shapes mobile privacy decision-making, Betzing et al. (2019) find that making complicated privacy policies more transparent is a fundamental requirement for informed consent and affects users’ attitudes to privacy policies in the long run.
In addition, new technology and its transparency also play an important role in protecting or violating privacy, since such technology grants unprecedented access to consumers’ personal information and allows businesses to provide personalised services (Betzing et al., 2019). In the Facebook case, a striking detail is that Facebook tracked the author’s location and sent him notifications about nearby services. Beyond the information users knowingly consent to share, non-personally identifiable information analysed and generated by platform algorithms poses a new threat to privacy (Flew, 2021). Whether such algorithmically inferred information should fall within the scope of privacy protection remains an open question for both media and ethics scholars.
Furthermore, Betzing et al. (2019) point out that data-driven services are by their nature hard to assess, because the perceived risks of sharing personal data are less visible than the perceived advantages of personalisation. In other words, people readily feel the convenience of personalised algorithms and new technology but rarely notice, or are simply unaware of, what happens to their data. Algorithms now pervade everyday life, and their influence has grown steadily in recent years. Transparency of algorithms and new technology has therefore become an essential topic alongside the development of automated decision-making systems (Burrell, 2016; Pasquale, 2015, as cited in Felzmann et al., 2019), and many scholars stress the importance of transparency in protecting personal privacy (Felzmann et al., 2019). A growing number of digital users recognise and worry about the influence and potential risks of algorithms for their data privacy and other basic human rights. Hence, the American privacy law expert Marc Rotenberg argues that defending basic human rights online, cybersecurity, personal privacy and freedom of expression is made harder by the lack of algorithmic transparency in today’s digital ecosystem (UNESCO, n.d.).
To protect personal privacy from identity theft, misuse of personal data, cyber insecurity, users’ own neglect, opaque privacy policies, risks created by digital platforms and the lack of algorithmic transparency, national governance measures are urgently needed. However, Australian privacy law has not yet updated the definition of personal information, strengthened consent requirements, or enabled the erasure of personal information under the Privacy Act. A significant reference point for national regulation of global digital platforms is therefore the General Data Protection Regulation (GDPR), which came into force on 25 May 2018 and aims to build a strong framework for protecting personal data, enabling Europe to regulate privacy on global digital platforms (Flew, 2021). For example, Art. 5(1) states that “personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness, transparency’)”. Such clauses are a good start for protecting personal data and regulating digital platforms, but many scholars have also pointed out their gaps. Felzmann et al. (2019) suggest that transparency-related regulations may fail to achieve their intended benefits, since research on the effects of applying such clauses remains inconclusive. At a broader level, Flew (2021) notes that current legal and regulatory frameworks are limited in addressing the regulatory problems raised by data fixation and dataveillance. For instance, although the GDPR can resolve some public-interest problems concerning personal data, it cannot remedy the underlying dynamics of data collection and data processing (ibid.). Legislation such as the GDPR is thus a significant step towards protecting privacy, but many issues still urgently need to be solved.
The GDPR can be seen as a form of external governance, but on its own it is not enough to resolve current privacy issues. It is also necessary to consider self-governance and co-governance: the former can establish industry standards, while the latter aims to cover the interests of all stakeholders. For example, Google has set up its Privacy Sandbox to build standards for delivering personalised advertisements while protecting user data, and Facebook has a privacy committee to regulate content on Facebook and Instagram. An Australian example of co-governance is the Australian Privacy Foundation, a non-government organisation devoted to protecting individual privacy rights. It has identified numerous loopholes in legislation and business behaviour, such as its doubts about Facebook’s reality push, which it argues is not about gaming but about utilising data, and the change to the ABC’s privacy policy that allowed ABC followers’ data to be provided to Facebook. These three forms of governance, together with other initiatives, form the platform governance triangle, which provides a framework for analysing widely varying forms of governance (Flew, 2021).
In conclusion, this article has used a Facebook case to illustrate several threats to privacy and three forms of governance. Users’ neglect and the opacity of privacy policies threaten personal privacy in the digital era, and some scholars argue there is an urgent need to simplify privacy policies and reconsider the power structure between digital platforms and their users. In addition, new technology and its transparency also call personal privacy into question, since non-personally identifiable information can now be generated automatically by algorithms and the use of data on most digital platforms is invisible to users. To counter these threats, users should consciously attend to their privacy, the Australian government should develop a sounder and more contemporary privacy legislation system, and social media companies should maintain privacy committees and monitor potential threats. The platform governance triangle of self-governance, external governance and co-governance can thus provide a framework for protecting privacy. Yet even though governments worldwide have promulgated legislation, digital platforms have established committees, and non-government organisations act as watchdogs, the continuing rise of new technologies and shifting social contexts will require all three parties to keep pace in the future.
Reference list
Betzing, J. H., Tietz, M., vom Brocke, J., & Becker, J. (2019). The impact of transparency on mobile privacy decision making. Electronic Markets, 30(3), 607–625. https://doi.org/10.1007/s12525-019-00332-3
Chan, R. (2019). The Cambridge Analytica whistleblower explains how the firm used Facebook data to sway elections. Business Insider.
Dahlberg, L. (2007). Cyberlibertarianism. In the Blackwell Encyclopedia of Sociology (pp. 1–2). Oxford, UK: John Wiley & Sons, Ltd. https://doi.org/10.1002/9781405165518.wbeos0720
Felzmann, H., Villaronga, E. F., Lutz, C., & Tamò-Larrieux, A. (2019). Transparency you can trust: Transparency requirements for artificial intelligence between legal norms and contextual concerns. Big Data & Society, 6(1), 205395171986054–. https://doi.org/10.1177/2053951719860542
Fowler, G. (2021, August 29). There’s no escape from Facebook, even if you don’t use it. The Washington post. Retrieved from https://www.washingtonpost.com/technology/2021/08/29/facebook-privacy-monopoly/
Gelman, S. A., Martinez, M., Davidson, N. S., & Noles, N. S. (2018). Developing digital privacy: Children’s moral judgments concerning mobile GPS devices. Child Development, 89(1), 17–26. https://doi.org/10.1111/cdev.12826
González, F., Yu, Y., Figueroa, A., López, C., & Aragon, C. (2019). Global reactions to the Cambridge Analytica scandal: A cross-language social media study. Companion Proceedings of The 2019 World Wide Web Conference, 799–806. https://doi.org/10.1145/3308560.3316456
Maitland, N., & Lynch, J. (2020). Social media, ethics, and the privacy paradox. Journal of Internet Law, 23(9), 3–14. Retrieved from http://ezproxy.library.usyd.edu.au/login?url=https://www.proquest.com/trade-journals/social-media-ethics-privacy-paradox/docview/2528508858/se-2?accountid=14757
Obar, J. A., & Oeldorf-Hirsch, A. (2020). The biggest lie on the Internet: ignoring the privacy policies and terms of service policies of social networking services. Information, Communication & Society, 23(1), 128–147. https://doi.org/10.1080/1369118X.2018.1486870
Office of the Australian Information Commissioner, Australian Government. (n.d.). What is privacy?
O’Reilly, T. (2007, January). What is Web 2.0: design patterns and business models for the next generation of software. Communications & Strategies, (65), 17+. https://link.gale.com/apps/doc/A180746404/ITOF?u=usyd&sid=bookmark-ITOF&xid=0a2fca95
UNESCO. (n.d.). Privacy expert argues “algorithmic transparency” is crucial for online freedoms at UNESCO knowledge café [Press release]. https://en.unesco.org/news/privacy-expert-argues-algorithmic-transparency-crucial-online-freedoms-unesco-knowledge-cafe
Urban, J., & Hoofnagle, C. (2014). The privacy pragmatic as privacy vulnerable. Retrieved from https://cups.cs.cmu.edu/soups/2014/workshops/privacy/s1p2.pdf