
Introduction
“Big data” has become a buzzword of the digital age, as rapid technological development has made digital devices (e.g., laptops, tablets, smartphones) pervasive. These devices enable people to generate tremendous amounts of data while communicating on social networks. What distinguishes big data is not only its extraordinary quantity but also its quality.
First, big data is generated pervasively and massively: from online records of consumption, communication on digital devices, transactions via online banking, browsing histories, social media postings, and more (Kitchin, 2014; Arifin, Hariadi & Anshari, 2017). Every activity on a digital device produces data, and the proliferation of such devices makes data production omnipresent. The volume is astonishing: as early as 2012, Google processed approximately 2,000,000 searches per minute (James, 2012), while Facebook processed 2.5 billion content postings, 2.7 billion “Like” clicks, and 300 million photo uploads per day (Constine, 2012).
Second, the data explicitly reveals personal profiles and interpersonal relationships in the digital realm. A personal profile can reflect a holistic self-representation (Olshannikova, Olsson, Huhtamäki & Kärkkäinen, 2017). In digital spaces, users create profiles containing everything from basic identity information (e.g., nickname, e-mail address, phone number, demographic background, birthday) to private personal data (e.g., marital status, interests, personal pictures, status updates). From the relational perspective, the data represents social ties between users: information about digital friendships and followership explicitly reveals social relationships and social network patterns. Every click and online activity generates diverse, well-defined, relational data.
Nevertheless, big data raises controversial privacy issues, because data analysis can reveal users’ habits and influence their decision-making. Commercially, the data helps tailor marketing strategies that serve customers and sway consumption decisions. Politically, data garnered from voters allows propaganda to be targeted at them more precisely (Payton & Claypoole, 2014, p. 24). The data attracts thieves as well: the more they know about a person, the easier it is to scam him or her (Payton & Claypoole, 2014, p. 25). The data is valuable to the economy, polity, and society at large, yet it invites privacy infringement when it is vulnerable to breaches and exploitation. As a result, the tremendous flow of data poses serious challenges to privacy.
The Right to Privacy
Privacy is a basic human right. In a digital environment, it has been defined as “the right to keep a domain around us, which includes all those things that are part of us, such as our body, home, property, thoughts, feelings, secrets, and identity” (Onn et al., 2005). This definition grants us the right to control who may access our information and to what extent it is disclosed. We are entitled to exercise the right to privacy over our homes, our bodies, and even our thoughts. The right to privacy should let us protect our valuable, sensitive personal information from any unauthorized individual or group. Its importance should not be neglected, because the right to privacy is intertwined with autonomy and freedom of choice.
1. Autonomy
Privacy and autonomy are related. Having privacy means controlling one’s exclusive space, so an infringement of privacy violates one’s autonomy over the personal sphere (Becker, 2019). When personal information is exposed without permission, people lose the independence to maintain their secrets and privacy. Moreover, surveillance of private information distorts users’ experience and threatens their right to fulfill their desires (Van Otterloo, 2014). Imagine that a bank account password is revealed or kept under others’ observation: the owner loses the autonomy of managing that account. Therefore, harm to privacy diminishes autonomy.
2. Freedom of Choice
Another account of privacy emphasizes freedom of control: control over freedom of choice, control over one’s thoughts and body, freedom from surveillance, and control over the protection of one’s reputation (Solove, 2008, p. 1). The right to privacy grants people the ability to control their sensitive personal information, and freedom of choice grants them the latitude to decide as they prefer. Conversely, privacy violations restrict freedom of expression and decision-making. For instance, if a bank account password is revealed, or transactions are observed by others, people will refrain from depositing into that account. Privacy violations therefore narrow people’s choices in favor of whoever aims to intervene in the choosing, suppressing freedom of choice. Further, the right to privacy can be considered a measure of counterpower: controlling one’s own privacy can modify or nullify the perceived power of others (Johnson, 1974). In other words, the right to privacy means people can control their data at will and enjoy free thought without coercion.
Case Study
Facebook-Cambridge Analytica Data Scandal
Facebook offers cyberspace for users to create personal profiles and connect with friends based on shared interests. Users can communicate with their friends on Facebook and get the latest information about them through common-interest groups and the pages they follow. They can receive updated postings and activities in their friends’ news feeds and notifications (Nyoni & Velempini, 2018).
However, Facebook’s operation as an online social network has drawn criticism for its contradictions. Facebook earns profit mainly through advertising; advertisements are targeted at audiences based on their characteristics, gleaned from information on Facebook without users’ consent. Advertising targeted using private data contradicts the personal expression Facebook invites, posing a threat to users’ privacy. In addition, Facebook collects a large amount of user data without clearly explaining its data policy or obtaining users’ consent. It collects information on social activities such as users’ postings and communications with others. The collection even extends to information from third-party platforms, other Facebook-operated platforms (e.g., Instagram), and the various devices used to log in to Facebook, including the device’s operating system, hardware, location, and connection information such as time zone and IP address (Kirtley & Shally-Jensen, 2019, p. 223). This operation poses a potential threat of data breach, and big data analytics can be used by commercial corporations or law enforcement agencies to profile users.
In 2018, the Facebook-Cambridge Analytica scandal raised awareness of data breaches and privacy issues. Contrary to expectations, Donald Trump won the 2016 presidential election, and a consulting company named Cambridge Analytica was revealed to have played an important role in his success. Cambridge Analytica reportedly adopted a method called behavioral microtargeting: it leveraged big data from social networking sites like Facebook to influence voter decisions and voting behavior. Using data on 220 million Americans, it tailored messages to individual voters according to their psychological traits, infusing them with personal fears, interests, and preconceptions based on profiling data (Isaak & Hanna, 2018; Cadwalladr & Graham-Harrison, 2018). Ethically, this method of collecting data for commercial and political purposes severely violates the right to privacy and harms personal autonomy and freedom of choice. Behavioral microtargeting manipulated voters’ thoughts, limited their access to diverse content, and thereby restricted their ability to act autonomously (Ward, 2018). The instilled psychological influence also damaged freedom of choice during voting: choices at the ballot box were made under biased messaging (Richterich, 2018), so voting behavior was swayed by manipulation rather than by free choice. In this case, users were manipulated for political and commercial purposes through the capture of their personal data, treating human beings as commodified objects open to exploitation. Overall, such manipulation casts doubt on the intrinsic value of human nature when autonomy and freedom of choice are lost.
Challenges of Privacy Protection
Privacy protection must contend with several issues that impede the right to privacy. First, the complexity of privacy policies prevents users from understanding privacy issues. Registration on a platform is designed to be easy: users click to agree to all conditions in the privacy policy without reading the text, and thus remain unaware of how their data will be used (Romansky & Noninska, 2020). On Facebook, users do not fully comprehend the privacy policy because of its length and technical terms, even though Facebook does regulate how data may be used, changed, or removed. Consequently, many users are unfamiliar with the privacy settings that would secure their data (Nyoni & Velempini, 2018). Another concern is that the obscure operation of websites restricts users’ choice and understanding: personal data is transferred automatically without users’ consent, and only 54% of social network users acknowledge the notices about collecting personal data (Romansky & Noninska, 2020). In China, few websites provide users with multiple choices, and privacy policies are often inconsistent with actual network behavior (Lin, Liu, Li, Xiong & Gou, 2022). None of this helps users understand and exercise their right to privacy for data protection.
Regulation on Protecting Privacy
Academic suggestions on legislation and law enforcement can contribute to protecting privacy. First, websites should guarantee transparency about privacy. The public should be able to learn how their data is collected, what data is retained and for what purpose, and what information is shared with third parties. All data collection processes must be transparent to users and must disclose the mechanisms of activity tracking. Websites and applications should empower users by providing complete information about their data. They should also strengthen users’ ability to control it: for example, by honoring “do not track” requests and blocking disclosure through third-party cookies. Users should likewise have the right to delete and terminate personal data from any site, cloud service, or collection device, and a legally mandated age of consent should protect minors when private information is handled (Isaak & Hanna, 2018).
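Mechanically, a “do not track” request is simply an HTTP header that the user’s client attaches to each request; sites are expected, though not everywhere legally required, to honor it. As a minimal illustrative sketch (the `build_request_headers` helper is hypothetical, but the `DNT` and `Sec-GPC` header names are the real legacy Do Not Track and newer Global Privacy Control signals):

```python
# Sketch: how a client signals an opt-out-of-tracking preference via
# HTTP request headers. Whether the site honors the signal is up to
# the site (or to regulation), which is why transparency and
# enforcement matter.

def build_request_headers(do_not_track: bool) -> dict:
    """Return HTTP headers for a request, optionally adding tracking opt-outs."""
    headers = {"User-Agent": "example-client/1.0"}  # hypothetical client name
    if do_not_track:
        headers["DNT"] = "1"      # legacy Do Not Track signal
        headers["Sec-GPC"] = "1"  # Global Privacy Control signal
    return headers

print(build_request_headers(do_not_track=True))
```

The point of the sketch is that the user’s preference travels with every request, so a site that chooses to ignore it does so knowingly, which is exactly the behavior transparency rules aim to expose.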
Moreover, law enforcement is important for protecting the right to privacy, and privacy policy varies from nation to nation according to national ideology. China and the European Union have each regulated big data. In China, privacy policy is aimed at regulating data surveillance by the private sector rather than by the government; in Europe, policy is designed to protect data and privacy against both companies and states. In both cases, privacy protection is pursued by limiting behavioral control over users (Aho & Duffield, 2020).
Conclusion
In the digital age of big data, data exploitation and privacy violations have emerged as public concerns. The privacy of controlling personal data is threatened by commercial and political exploitation, which endangers both autonomy over personal information and freedom of choice. Autonomy and freedom of choice guarantee our right to make decisions independently, without others’ coercion. The Cambridge Analytica scandal indicates that data breaches and privacy violations can harm freedom of choice and autonomy when people are manipulated as objects. Considering the challenges to the right to privacy, law enforcement is needed, and the academic suggestions for transparency and user empowerment should be adopted.
References
Aho, B., & Duffield, R. (2020). Beyond surveillance capitalism: Privacy, regulation and big data in Europe and China. Economy and Society, 49(2), 187-212.
Arifin, F., Hariadi, M., & Anshari, M. (2017). Extracting value and data analytic from social networks: big data approach. Advanced Science Letters, 23(6), 5286-5288.
Becker, M. (2019). Privacy in the digital age: comparing and contrasting individual versus social approaches towards privacy. Ethics and Information Technology, 21(4), 307-317.
Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian, 17, 22.
Constine, J. (2012). How big is Facebook’s data? 2.5 billion pieces of content and 500+ terabytes ingested every day. http://techcrunch.com/2012/08/22/how-big-is-facebooks-data-2-5-billion-pieces-of-content-and-500-terabytes-ingested-every-day.
Isaak, J., & Hanna, M. J. (2018). User data privacy: Facebook, Cambridge Analytica, and privacy protection. Computer, 51(8), 56-59.
James, J. (2012). Data never sleeps: How much data is generated every minute. Domo Blog, 8.
Johnson, C. A. (1974). Privacy as personal control. Man-environment interactions: Evaluations and applications, Part 2, 83-100.
Kirtley, & Shally-Jensen, M. (2019). Privacy Rights in the Digital Age. Grey House Publishing.
Kitchin, R. (2014). The real-time city? Big data and smart urbanism. GeoJournal, 79(1), 1-14.
Lin, X., Liu, H., Li, Z., Xiong, G., & Gou, G. (2022). Privacy protection of China’s top websites: A Multi-layer privacy measurement via network behaviours and privacy policies. Computers & Security, 114. https://doi.org/10.1016/j.cose.2022.102606
Nyoni, P., & Velempini, M. (2018). Privacy and user awareness on Facebook. South African Journal of Science, 114(5-6), 1-5.
Olshannikova, E., Olsson, T., Huhtamäki, J., & Kärkkäinen, H. (2017). Conceptualizing big social data. Journal of Big Data, 4(1), 1-19.
Onn, Y., Geva, M., Druckman, Y., Zyssman, A., Timor, R. L., Lev, I., … & Pery, L. (2005). Privacy in the digital environment. Haifa Center of Law & Technology, 1-12. http://books.google.com/books?id=yeVRrrJw-zAC.
Payton, T., & Claypoole, T. (2014). Privacy in the age of Big data: Recognizing threats, defending your rights, and protecting your family. Rowman & Littlefield.
Romansky, R., & Noninska, I. (2020). Challenges of the digital age for privacy and personal data protection. Mathematical Biosciences and Engineering, 17(5), 5288-5303. https://doi.org/10.3934/mbe.2020286
Richterich, A. (2018). How data-driven research fuelled the Cambridge Analytica controversy. Partecipazione e conflitto, 11(2), 528-543.
Solove, D. J. (2008). Understanding privacy. Harvard University Press.
Tene, O., & Polonetsky, J. (2011). Privacy in the age of big data: a time for big decisions. Stan. L. Rev. Online, 64, 63.
Ward, K. (2018). Social networks, the 2016 US presidential election, and Kantian ethics: applying the categorical imperative to Cambridge Analytica’s behavioral microtargeting. Journal of media ethics, 33(3), 133-148.