More and more data are being created around the world in this age of information explosion, and that data carries tremendous value. Through big data analysis, the deep traits and latent demands of consumers can be identified, delivering value and advantage to organizations. Google, for instance, makes a huge fortune every year by analyzing people's search terms and providing ad-retargeting services. Yet this kind of behaviour can be a serious invasion of personal privacy. In the data age, individuals are themselves a source of data (Romansky & Noninska, 2020). After collecting a huge amount of information from users via its technologies and processes, an organization integrates and analyzes the enormous body of data pertaining to individuals and mines whatever is valuable to the enterprise. For individuals, however, this means disclosing their living situations, consumption habits, and identifying characteristics to others under circumstances they neither control nor are even aware of, which gravely violates personal privacy. As Internet corporations devote greater attention to extracting value from data, a trend of deriving economic benefit from user data will emerge, inevitably infringing on users' personal privacy. Nowadays an increasing number of users reveal and share their lives on social media and supply all kinds of information about themselves, which surely poses hidden risks to their privacy. Many consumers are unaware of the problem of privacy leaks, or fail to recognize the significance of these security concerns.
With the fast growth of information technology in recent years, the digital age has officially arrived. Nevertheless, the issue of individual privacy has begun to afflict everyone more and more. Algorithms can be used to infer people's purchasing patterns, preferences, and desires; an invisible hand is amassing sensitive private details on a massive scale through algorithmic recommendation and data mining (Theodos & Sittig, 2021). As a result, data breaches and disputes over privacy rights have become relatively commonplace. We appear to be caught in a period of tension between openness and privacy.
Global internet behemoths, as well as traditional enterprises, are striving to obtain more user data. Giants such as Facebook and Google hold massive amounts of user data, collected to the greatest extent feasible (Sabin & Harland, 2017). Unfortunately, most consumers are unaware of what data these companies acquire, and the manner of acquisition is rather obscure. Some internet corporations even deploy take-it-or-leave-it clauses: when users open an application, they must accept the terms imposed on them or be unable to use it at all. Such coercive clauses leave the user with little real choice. Admittedly, the era of big data has brought many new experiences and conveniences (Saksena et al., 2021): users benefit from quick and comfortable living experiences made possible by artificial intelligence (AI) and Internet of Things (IoT) technologies such as recommendation algorithms and autonomous driving. At the same time, algorithmic systems gather and exploit user information in the design and manufacture of all kinds of products, and the privacy of customer information is compromised as a result. The networking and exposure of private information have become an unstoppable trend (Pyrrho et al., 2022).
The shocking Facebook leaks
The Guardian and The New York Times revealed on March 17, 2018, that a firm named "Cambridge Analytica" had harvested the data of around 50 million Facebook users. That user data was also used to position and push highly targeted political adverts during the United States presidential election in order to sway people's voting inclinations. Once the leak came to light, Facebook was once again at the centre of a whirlpool of public opinion.
The whole story of the leaks
In 2007, Facebook unveiled the "Facebook Platform" initiative, which, like the strategy of several other social networks, allows users to explore third-party software. In 2013, Aleksandr Kogan, a psychology researcher at Cambridge University, created and published the app "This is your digital life" on Facebook. The app claims to assess a person's character: when launched, it presents a series of personality-questionnaire items, the subscriber answers them, and the software reports what type of personality they have. Crucially, every participant received a $5 reward for responding. At the time, Facebook exposed several API interfaces, so once a user opened the app and granted approval, Kogan could acquire that user's personal information, in accordance with Facebook's own rules, without the participant's full awareness of the consequences. Worse, under Facebook's platform agreement, the application could then directly gather the data of all of the user's connections without those friends' approval and evaluate the collected data. "This is your digital life" set entry requirements for registrants, who had to have over 180 friends and be willing to grant the application access to their own and their friends' details. In the end, Kogan collected the data of more than 50 million Facebook users with the permission of only around 300,000 individuals and sold it to the Cambridge Analytica organization.
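The multiplier at work here, where a few hundred thousand consenting users yield data on tens of millions, follows directly from friend-graph access. The toy simulation below is purely illustrative (the network, its size, and the `exposed_profiles` helper are invented, not Facebook's actual interface); it only shows how permission to read a consenting user's friends inflates an app's reach:

```python
import random

def exposed_profiles(friend_graph, consenting_users):
    """Return every profile readable by an app that, under a
    friend-data permission, can read each consenting user's
    profile *and* all of their friends' profiles."""
    exposed = set()
    for user in consenting_users:
        exposed.add(user)                   # the user who opted in
        exposed.update(friend_graph[user])  # their friends, who did not
    return exposed

# Toy social network: 10,000 users, 200 random friends each.
random.seed(0)
users = list(range(10_000))
friend_graph = {u: random.sample(users, 200) for u in users}

# Only 1% of users install the quiz app...
consenting = random.sample(users, 100)
reach = exposed_profiles(friend_graph, consenting)

# ...yet the app ends up seeing most of the whole network.
print(len(consenting), len(reach))
```

In this sketch, 100 opt-ins expose thousands of profiles, the same cascade that turned roughly 300,000 quiz takers into a 50-million-user dataset.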
What is the reason for this situation?
Returning to the root of the matter, the cause of this Facebook leak lies in its application programming interface, released in May 2007. Third-party applications can be installed and run on Facebook through the interface it provides, giving users access to a huge variety of third-party apps. While Facebook thereby offered consumers a richer experience, it also set the conditions for potential security breaches: when a user opens and uses a third-party app, that app reads personal information, including data users have left on Facebook as well as additional data they generate while using the third-party applications themselves.
Weak awareness of user privacy protection
The most alarming aspect of the Facebook incident is that Kogan collected the private details of more than 50 million subscribers with the permission of only around 300,000 people. Worse, those 300,000 users were unaware that their privacy rights, as well as those of their acquaintances, had been significantly breached. This clearly illustrates that in the age of data, while the public enjoys the many benefits data provides, their awareness of protecting personal information has not grown correspondingly. Social media sites are the channel most vulnerable to user data leaks (Landau, 2021), since users can authorize their personal details to be used by third-party apps on the network.
The advancement of technology and the protection of privacy appear fundamentally at odds in the data era. The Internet's rapid expansion has altered the traditional ways data is shared and distributed, and it has become a necessity in social relationships. The Internet's existence and growth have essentially realized McLuhan's notion of the "global village" and tremendously eased the flow of information between individuals (Martinez-Martin et al., 2020). However, those same technological advances are eroding the barriers protecting people's privacy. Take social media as an illustration: people are at a disadvantage in the contest between technology and data. Customers regularly disclose some of their private details "freely" in order to gain convenience, yet this voluntariness is compelled; there is no other option, and it amounts to a kind of surrender to technology. Personal data is so sought after in the age of data because it has high economic value: after collecting customer data, a data-management business evaluates it and sells the processed data on for secondary use, where it is deployed to precisely position adverts on social media sites, yielding significant economic benefits (Lustgarten et al., 2020).
In essence, the personal privacy crisis of the data age manifests in two ways. The first is the transparency of personal private life, which seriously violates individual personality and dignity. In the data age, everyone's private life is exposed to a variety of data collection tools, and countless private details (preferences, locations, behaviours, interests, and so on) are captured as digital information. Individuals have become "transparent people" in the face of big data analysis. As one scholar declared: "We are indeed entering a new era of mass surveillance." In an era of pervasive surveillance, a great deal of private life that people are unwilling to disclose is disclosed without consent; private weaknesses that were meant to stay hidden are exposed brazenly; image-damaging private information is learned by crowds of onlookers. Everyone can potentially be discussed, watched, evaluated, and condemned; personal images may be destroyed and personal dignity trampled on. This kind of harm is exactly what the traditional concept of privacy tries to prevent (Romansky & Noninska, 2020).
The second manifestation of the crisis is the comprehensive control of personal private life, which seriously violates individual freedom and autonomy. If omnipresent information collection makes personal life transparent and thereby damages personal dignity, then the network applications covering political, economic, and social life interfere just as seriously with individuals' free choice and independent development. Big data can aggregate a great deal of personal information and, through specific algorithms, form a profile of a particular facet of the individual; that profile is then used to predict the individual's future behaviour with considerable accuracy (Golbus et al., 2020). On the one hand, merchants influence our shopping choices through precisely targeted advertising. Such prediction of, and catering to, individual preferences can seriously damage citizens' autonomy and capacity to choose, because personalized recommendations may trap individuals in an information cage. On the other hand, the autonomy of artificial intelligence grows by the day (Quach et al., 2022). As intelligent unmanned systems become more capable of autonomous evaluation, selection, and decision-making, the problems of their status as agents and of corresponding responsibility will become more prominent, leaving individual human beings with less freedom of choice and less responsibility. Once freedom of choice is completely lost, the individual loses his individuality; at worst, we become animals without free will. Mayer-Schönberger warns us: "The future of humanity must reserve some space, allowing us to shape according to our desires. Otherwise, big data will distort the most essential things of human beings, namely rational thinking and free choice."
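The profiling loop described above, which aggregates behavioural traces into a profile and then predicts the next choice to target an advert, can be reduced to a few lines. The sketch below is deliberately simplified and every category, event, and function name is invented for illustration; real profiling systems use far richer features and models:

```python
from collections import Counter

def build_profile(events):
    """Aggregate raw behavioural traces (views, purchases, searches)
    into a per-category interest profile."""
    return Counter(category for category, _ in events)

def predict_next_interest(profile):
    """Predict the user's next interest as the most frequent
    category -- the logic behind a basic recommendation push."""
    category, _ = profile.most_common(1)[0]
    return category

# Invented trace: (category, item) pairs harvested from one user.
events = [
    ("electronics", "phone case"), ("books", "sci-fi novel"),
    ("electronics", "headphones"), ("electronics", "charger"),
    ("travel", "flight search"),
]

profile = build_profile(events)
print(predict_next_interest(profile))  # -> electronics
```

Note the feedback loop implicit even in this toy: if the predicted category is pushed back to the user and acted on, it is re-recorded as a new event, the profile skews further toward that category, and the "information cage" the text warns about tightens.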
The value choice of personal privacy
The data age has arrived, and myriad ways of collecting, analyzing, and applying big data already exist. We cannot go back from the era of big data to an era of small data, let alone no data. Under this premise, everyone faces a difficult choice: what kind of life do we need? Either extreme is unrealistic. Rejecting big data entirely would make life difficult for us; embracing it fully could make us lose ourselves. The real question is how much personal privacy we are willing to pay for the benefits of living with big data. As some experts have put it: "Perhaps we should ask ourselves this question over and over again: how much privacy are we willing to give up for the Internet services and devices we enjoy?"
References:
Golbus, J. R., Price, W. N., & Nallamothu, B. K. (2020). Privacy gaps for digital cardiology data: Big problems with big data. Circulation, 141(8), 613-615.
Landau, S. (2021). Digital exposure tools: Design for privacy, efficacy, and equity. Science, 373(6560), 1202-1204.
Lustgarten, S. D., Garrison, Y. L., Sinnard, M. T., & Flynn, A. W. (2020). Digital privacy in mental healthcare: Current issues and recommendations for technology use. Current Opinion in Psychology, 36, 25-31.
Martinez-Martin, N., Wieten, S., Magnus, D., & Cho, M. K. (2020). Digital contact tracing, privacy, and public health. Hastings Center Report, 50(3), 43-46.
Pyrrho, M., Cambraia, L., & de Vasconcelos, V. F. (2022). Privacy and health practices in the digital age. American Journal of Bioethics, 1-10.
Quach, S., Thaichon, P., Martin, K. D., Weaven, S., & Palmatier, R. W. (2022). Digital technologies: Tensions in privacy and data. Journal of the Academy of Marketing Science, 1-25.
Romansky, R. P., & Noninska, I. S. (2020). Challenges of the digital age for privacy and personal data protection. Mathematical Biosciences and Engineering, 17(5), 5288-5303.
Sabin, J. E., & Harland, J. C. (2017). Professional ethics for digital age psychiatry: Boundaries, privacy, and communication. Current Psychiatry Reports, 19(9), 55.
Saksena, N., Matthan, R., Bhan, A., & Balsari, S. (2021). Rebooting consent in the digital age: A governance framework for health data exchange. BMJ Global Health, 6(Suppl 5)
Theodos, K., & Sittig, S. (2021). Health information privacy laws in the digital age: HIPAA doesn't apply. Perspectives in Health Information Management, 18(Winter), 1l.