Understanding Big Data: Data and Privacy in the Digital Age

image: Photo by Towfiqu Barbhuiya on Unsplash

The rapid development of the digital age has given people an expansive new space in which to live and work. In using it, they leave behind many data footprints, which accumulate and connect to one another. Aggregating these footprints can reveal private information about individuals, and malicious actors use that information to commit fraud and other harms, causing serious disruption and financial loss. The protection of personal privacy online has therefore become a focal point of information and network security in contemporary society. This blog will first introduce data leakage in the digital age, then explore, through case studies, how individuals’ privacy rights have become blurred in the big data environment, and finally discuss how personal privacy can be better protected in the digital age.

Data leakage in the digital age

As the volume of data has grown, databases have emerged to store and manage it. The rapid development of database technology and the widespread use of management systems mean that ever more data is being accumulated, and there is an urgent need to convert this data into valuable knowledge and to reveal its potential value for various applications. Data mining is the data processing technology developed in response to this need: by analysing an enterprise’s data and drawing inductive inferences from it, it helps decision-makers adjust their strategies and make better decisions.
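As a rough illustration of the kind of inductive inference described above, the sketch below aggregates a toy purchase log into per-user interest profiles and picks each user’s dominant category for targeting. All names, categories, and purchases here are invented for illustration; real data mining pipelines are, of course, far more elaborate.

```python
from collections import Counter, defaultdict

# Toy purchase log: (user, product category). Every value here is invented.
purchases = [
    ("alice", "baby products"), ("alice", "baby products"), ("alice", "vitamins"),
    ("bob", "running shoes"), ("bob", "fitness tracker"), ("bob", "running shoes"),
]

# Aggregate each user's data footprint into category counts.
profiles = defaultdict(Counter)
for user, category in purchases:
    profiles[user][category] += 1

# Infer each user's dominant interest and "target" a recommendation accordingly.
for user, counts in profiles.items():
    top_category, _ = counts.most_common(1)[0]
    print(f"{user}: recommend more {top_category}")
```

Even this trivial aggregation shows how scattered footprints, harmless on their own, combine into a profile that can be used to target an individual.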

Driven by the immense value of data, many businesses and organisations collect, process, use, and publish personal information without restriction, and a great deal of user information is shared between large corporations or between corporations and third parties. This use and sharing of user data brings business opportunities, but it also has an alarming impact on individuals. For example, some shopping sites target products or make personalised advertising recommendations based on a user’s purchase behaviour over time (Christiansen, 2011), and big data analysis of people’s behavioural history on the internet has even been used to predict crimes before they are committed. Used in this way, such information becomes a threat to individuals’ security and a significant factor affecting social security.

Research shows that most people are worried about threats to their privacy online and are willing to take action to protect it: 87% of internet users report being concerned about threats to their privacy online, 56% of them “very concerned”, and 70% of US consumers are concerned about online privacy (Paine et al., 2007). This confirms the asymmetry of power between businesses and consumers over personal data (Bandara et al., 2020). As the volume of data held by businesses continues to grow, the lack of a strategy for protecting personal privacy will negatively affect businesses and, ultimately, society.

Privacy breach case studies

The popularity of social media has exposed a great deal of people’s private lives on the internet, and personal information has become readily available in the age of big data. Social media platforms, in particular, mediate the way people communicate, and the decisions these platforms make have a tangible impact on public culture and on their users’ social and political lives (Suzor, 2018).

The Facebook data scandal erupted in 2018, when the London-based political consulting firm Cambridge Analytica was accused of using the personal information of 50 million users to influence the UK’s EU referendum and the US election. According to reports, the company launched a personality analysis quiz app that asked users to authorise access to information about themselves and their friends’ Facebook data. Although only 270,000 users agreed, a snowball effect meant the app eventually obtained data on more than 50 million Facebook users. What caused the panic was that Cambridge Analytica then sold the information of those 50 million users to a third party (Graham-Harrison & Cadwalladr, 2018). Facebook argued that the companies involved had obtained user information with permission but then sold it on to third parties. Yet Facebook had long been aware of the vulnerability, and that is the main reason the breach occurred.

“User permission” is an essential criterion for determining whether a company’s use of user information is legal. When installing a new app, users are commonly asked to grant access to contacts, geolocation, and other information, yet few companies clearly explain the purpose, duration, and method of that access. Facebook user information had previously been accessed and used by Cambridge Analytica without users’ knowledge, and political advertising was then targeted at those users to support the Trump campaign in the 2016 US presidential election (Liberini et al., 2020). There is no denying the growing strategic importance of data in the digital age, and there is broad consensus on the commercial value of big data, even though only a small percentage of it is commercially viable. Those who do harm under the banner of “protecting user privacy” grab data deliberately and indiscriminately, and the casualty is the security and privacy of users’ data. Facebook violated individuals’ privacy by not telling them exactly what personal information was being collected or for what purposes it was being used.

This case also raises an essential question about how privacy interests are weighed: what counts as a right to privacy that belongs solely to the individual and is not subject to interference by governmental authority or disclosure by the mass media? The law uses the principle of “public interest” to determine the boundaries of rights in such a relationship. Personal privacy that does not concern the public interest is protected by law; where a matter does concern the public interest, that protection does not apply, or it is restricted, as a defence against governmental authority or the mass media.

Another case involves the China Consumers Association (CCA), which in 2018 evaluated 100 apps on their collection of personal information and their privacy policies. It found that many apps were suspected of excessive collection or use of personal information, especially location information and browsing history (Borak, 2019). The collection of personal information has always been a concern for consumers when installing apps. According to the evaluation results, three types of personal data, “location information”, “address book”, and “mobile phone number”, are the categories most commonly over-collected by mobile apps. In addition, users’ photos, property information, biometric information, and transaction records were all over-collected and overused.

In the current context of big data, there is a trade-off between the right to privacy and free online services, given the amount of information that can be made available (Flew, 2021). The tension between internet users’ ownership of personal data and their ability to control it has sparked a widespread debate about the right to privacy online. The debate focuses on users’ inability to control or influence how their personal data is used and processed, and it also emphasises companies’ role in controlling and managing that data. The private sector’s control over personal information often contrasts with that of government authorities, who are neither able nor willing to provide substantial protection for users’ data. When applications leak user data, it is difficult for individual users to combat the risk of their privacy being fully exposed. As Taraszow et al. (2010) argue, the rapid growth of social media has made people accustomed to disclosing personal information on it. Whenever users use their smartphones, shop online, or interact on social media, they must hand over control of their data to the service provider, and the picture is further complicated by multiple transactions and the involvement of numerous third-party channels. The boundaries between identifiable and non-identifiable personal data, and the meaning of “consent”, are increasingly blurred (Flew, 2021). The edges of individual data rights have faded or disappeared, and the protection of citizens’ privacy faces serious challenges.

Comparative analysis of the cases

Comparing the two cases above shows that the right to privacy has developed along different paths under different legal systems and cultural traditions, and that it carries two different kernels of value in different cultural contexts. In Whitman’s (2003) terms, the Facebook case is about liberty interests, where the primary source of threat is the government, while the CCA case is about human dignity, where the primary source of threat is the mass media. There is no doubt that the value of information is becoming increasingly important, and balancing the use of personal data against the protection of personal information has become a new legal issue. Individuals need to protect their information from being used by others while also benefiting from disclosing and using it at their own discretion. Businesses collect personal information for economic gain, and economic development contributes to the overall progress of society, which in turn translates into the public interest.

However, the excessive collection of personal information can also harm the interests of individuals and the state. Governments collect personal information for social management and to improve the efficiency of governance, but this process can result in excessive intrusion of public power into private space. At the same time, the ubiquity of media in public spaces can lead people to reveal personal information without realising it.

In addition, the cases above reveal that the difficulties of privacy protection in the digital age manifest themselves in three ways. First, there is extreme information asymmetry between data-collecting companies and users: users have no concrete knowledge of how personal information is collected or in what ways it is used. Second, because digital information can be easily copied, stored, and distributed, users’ privacy cannot be recovered once it has been compromised. Finally, the lack of quantifiable standards for penalties and compensation leaves a great deal of uncertainty in privacy protection policies, and cross-border data flows make breaches even harder to address.

How can personal privacy be better protected in the digital age?

Personal information has public attributes, and the development of the digital economy cannot be separated from the reasonable use of personal information. Protecting individual privacy is therefore both a basic premise and an essential guarantee for the high-quality development of the digital economy. Yet the Facebook data leaks of recent years remain alarming, even though the US has a well-developed legal system for privacy and a social culture of industry self-regulation.

Based on the analysis of personal privacy protection issues in this blog, a complete and understandable security solution that meets the needs of personal privacy protection should be the goal for the future. For example, the law should require social media platforms to obtain explicit consent from users about what information is collected and for what purposes, and technical solutions should focus on achieving de-identification, secure processing, and complete deletion of personal information. Of course, in the information age it is unrealistic to expect complete protection of personal privacy as long as people use the internet. It is therefore essential to integrate legislation and industry regulation with technology and business practice, and to keep them in sync, so as to maximise the utility of data use while minimising privacy breaches, meeting current needs and addressing the challenges ahead.
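To give a very rough sense of what “de-identification through technical means” might look like in practice, the sketch below pseudonymises a record before it is shared: a direct identifier (a hypothetical email field) is replaced with a salted hash, and other identifying fields are dropped. This is only one possible approach, sketched under invented field names; pseudonymisation alone does not prevent re-identification once datasets are combined, which is precisely the aggregation risk discussed earlier.

```python
import hashlib
import secrets

# A random salt; in practice it would be stored securely and never shared with the data.
SALT = secrets.token_hex(16)

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a salted hash and drop fields not needed downstream."""
    token = hashlib.sha256((SALT + record["email"]).encode()).hexdigest()
    return {
        "user_token": token,             # stable pseudonym, not reversible without the salt
        "purchase": record["purchase"],  # only the attribute actually needed is kept
        # name, email, and phone number are deliberately omitted
    }

raw = {"email": "alice@example.com", "name": "Alice", "phone": "555-0100", "purchase": "vitamins"}
print(pseudonymise(raw))
```

Complete deletion, by contrast, would require the salt and all stored copies of the raw record to be destroyed as well, which is why technical measures need to be backed by legal and organisational obligations.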

Conclusion

The privacy of users’ information is a matter for the entire media ecology, and it can touch on the dignity, property, and even personal safety of every individual. The government therefore needs to step up its efforts to improve current privacy protection policies, and platforms need to develop practical privacy protection measures while helping users become more aware of privacy protection and improving their media literacy. In moving towards a digital society, we crave trust more than ever. But like any other area of law, the legislative process for privacy and personal information protection is a cycle of response and lag, response and lag again. To build the cornerstone of trust in the digital society alongside the legal response, we also need technology and practice to work together.

References

Bandara, R., Fernando, M., & Akter, S. (2020). Managing consumer privacy concerns and defensive behaviours in the digital marketplace. European Journal of Marketing, 55(1), 219–246. https://doi.org/10.1108/ejm-06-2019-0515

Barbhuiya, T. (2021). Privacy [Photograph]. Unsplash. https://unsplash.com/photos/FnA5pAzqhMM

Borak, M. (2019). Tech in Asia. Retrieved April 2, 2022, from https://www.techinasia.com/chinese-apps-collecting-data

Christiansen, L. (2011). Personal privacy and Internet marketing: An impossible conflict or a marriage made in heaven? Business Horizons, 54(6), 509–514. https://doi.org/10.1016/j.bushor.2011.06.002

Flew, T. (2021). Regulating platforms. Polity, pp. 72–79.

Graham-Harrison, E., & Cadwalladr, C. (2018, March 17). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. Retrieved from https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election

Paine, C., Reips, U.-D., Stieger, S., Joinson, A., & Buchanan, T. (2007). Internet users’ perceptions of ‘privacy concerns’ and ‘privacy actions.’ International Journal of Human-Computer Studies, 65(6), 526–536. https://doi.org/10.1016/j.ijhcs.2006.12.001

Suzor, N. P. (2018). Lawless: The secret rules that govern our digital lives. Center for Open Science. http://dx.doi.org/10.31235/osf.io/ack26

Taraszow, T., Aristodemou, E., Shitta, G., Laouris, Y., & Arsoy, A. (2010). Disclosure of personal and contact information by young people in social networking sites: An analysis using Facebook profiles as an example. International Journal of Media & Cultural Politics, 6(1), 81–101. https://doi.org/10.1386/macp.6.1.81/1

Whitman, J. Q. (2003). The two Western cultures of privacy: Dignity versus liberty. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.476041