Introduction

Modern science and technology, represented by data and information, have ushered in a digital age and brought unprecedented changes to the way humans exist and live. Human nature and dignity face serious challenges in the digital era, one of which is the crisis of privacy. Privacy is an exclusive, hidden space in which the self can exist, and the right to privacy is one of the fundamental human rights of modern society. Although it is not a new issue, the privacy crisis is one of the major social problems of the digital age. Karppinen (2017) has noted that the growing focus on human rights to some extent reflects the increasing threats to human rights in the digital age. With the widespread use of artificial intelligence and big data technologies, there is great potential for serious infringement of individual privacy. Such threats may exist everywhere, and no effective measures to curb them are in sight. At the heart of the privacy issue in the digital age is the privacy of data. On the one hand, this is a privacy problem caused by the widespread and ubiquitous use of data technology; on the other hand, the fundamental characteristic of privacy today is that it exists in the form of data. One could say that the privacy crisis of our time is a data privacy crisis.
Google is a world-renowned Internet company whose products are used around the world, so it is important to examine the privacy issues Google raises. This blog presents an in-depth analysis of the impact of Google’s products on users’ privacy and concludes that the large amount of private information collected by Google across its complex product lines poses a risk of leaking users’ data. More importantly, with the growth in the number of users, the expansion into new fields of innovation, and the new policy on the use of cookies, these privacy threats tend to expand. Government and society should act at both the policy and the technical level to prevent privacy leaks.
Recently, Google announced that it will bring the Privacy Sandbox to Android, introducing newer, more private advertising solutions. These solutions will limit the sharing of user data with third parties and operate without cross-app identifiers, including advertising IDs. Broadly speaking, this is an upgrade to the way Google runs its digital advertising business, intended to improve user privacy by allowing advertisers to deliver ads in a more secure way without tracking individual users. In fact, back in 2019 Google announced that it would drop support for third-party cookies in Chrome and introduce a “privacy sandbox”. However, that plan was followed by an investigation by the UK Competition and Markets Authority (CMA) and strong opposition from the advertising industry and technology organisations.

Why is Google pushing the plan?
Google’s former privacy policy posed a great risk to personal privacy: users had no idea what data about them was being collected and used. As Marwick and Boyd (2018) noted, the ability to achieve privacy often requires the privilege to make choices and to create structures that make that freedom possible. If the platform is not inclined to protect users’ privacy, it is difficult for users themselves to keep their data from leaking.
Google has been hit with several lawsuits accusing the platform of illegally invading the privacy of millions of users by tracking their online activity even in its browser’s ‘private’ (Incognito) mode, which opens as an unmarked window. The complaints allege that Google collects information from users through services such as Google Analytics and Google Ad Manager, and through website tools and smartphone apps, regardless of whether users click on Google-supported advertisements. This allows Google to learn about users’ friendships, hobbies, eating and shopping habits, and even the most intimate and potentially embarrassing content they search for online. While users may treat private browsing mode as a safe haven from prying eyes, there are concerns that Google and its competitors track users’ identities across all browsing modes, combining data from private and regular browsing to build a richer user profile for personalised advertising.
During registration and use of Google products, Google collects a large amount of user information and builds a user information database. As Google’s product line and operational data keep expanding, this database grows ever larger, which creates a potential privacy-security problem: if the massive store of user information is compromised by hackers, it will cause significant privacy leaks. Users expose their habits, interests and attitudes in digital form simply by using the browser. For instance, when a user types a few keywords, the search engine intelligently suggests the related terms that user searches for most often, which shows that Google maintains a basic profile of each user containing a great deal of information. Moreover, Google may disclose user information to developers for commercial reasons in order to increase the competitiveness of its products. For example, Google Play, the official Google app store, has provided paid user information to app developers to enhance the competitiveness of Android apps. This also puts the security of user privacy at risk.
In the light of current developments, information security issues arising from Google’s products show a growing tendency: the analytics generated from user data provide significant support for Google’s product development and marketing strategies, which makes it difficult for Google to leave user information unused. Google’s exploitation of user data can lead to the disclosure of user information and the violation of privacy rights through a variety of channels, such as internal disclosure and external hacking. Google’s privacy policy centralises the use of user information from many products for data analysis. This can lead to information being collected without the user’s knowledge, increasing both the aggregation of private user information and the probability of privacy breaches caused by the cross-use of data across multiple products.
In fact, Google’s privacy policy is about interconnecting the data behind its different products. By linking products, Google’s businesses gain an adequate data foundation and complementary support, and the overlap of users between products is strengthened. This not only makes strong products more competitive but also supports the development of relatively weak ones. The synergy of multiple products has led to a more refined approach to Google’s offerings. At the same time, however, Google has a long product line, a large number of users and a huge amount of user data, and cross-using that data puts user privacy at risk. It makes user data harder to protect, increases the frequency of access to user data within Google, and multiplies the ways in which hackers can steal user privacy through Google products. In addition, users are granted different privacy rights in different Google products, while the privacy policy allows user information to be shared across more than 60 of them, making the user’s online presence far more tangible and identifiable. It is a significant invasion of user privacy.
The privacy problems of the Privacy Sandbox
The leakage of user information described above is mainly due to cookies, which are small text files stored locally that a web server can use to determine, for example, whether a user is visiting for the first time. However, because a cookie can contain personal information about the user, it is also used by marketing companies and third-party analytics companies to build a user profile, which can then be used to push targeted advertising. From a basic identification technology, cookies have become an advertising-delivery aid: users can tell that their information is being passed to advertising platforms when they see ads that match their preferences and needs. But it is not clear how much private information the platforms can actually see or what exactly is being collected, which raises questions about personal privacy.
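To make the mechanism concrete, here is a minimal sketch of how a web server can use a cookie to tell a first-time visitor from a returning one. It is an illustrative example only; the handler, the cookie name (visitor_id) and the ID scheme are assumptions made for this sketch, not a description of how Google’s servers actually work.

```python
# Minimal sketch (assumption: a plain Python HTTP server, not Google's stack)
# showing how a cookie distinguishes first-time from returning visitors.
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.cookies import SimpleCookie

class CookieDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        self.send_response(200)
        if "visitor_id" in cookies:
            # Returning visitor: the browser sent back the ID issued earlier.
            body = f"Welcome back, visitor {cookies['visitor_id'].value}"
        else:
            # First visit: issue a new identifier and ask the browser to store it.
            visitor_id = uuid.uuid4().hex
            self.send_header("Set-Cookie", f"visitor_id={visitor_id}; Path=/; HttpOnly")
            body = "First visit: a visitor_id cookie has been set"
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CookieDemoHandler).serve_forever()
```

The same mechanism, applied by third parties across many sites, is what turns a convenience feature into a tracking tool.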
Google’s Chrome will certainly need to change too. But unlike other browser developers, Google is also a huge advertising platform, and its advertising business is tied to cookie tracking. Google therefore had to find a way to keep both users and advertiser-publishers happy, and this is why the privacy sandbox was created: to allow advertisers to advertise and earn revenue without using third-party cookies. In this way, users’ privacy would be better protected from leakage while advertisers could still deliver their advertisements to the right audiences.
The controversy surrounding the introduction of Google’s ‘privacy sandbox’ is first and foremost about the potential for the new technology to create new privacy problems. The FLoC technology used here means that users with similar browsing behaviour are grouped into cohorts and advertisers then display ads relevant to each cohort, so that individual users can remain hidden within their interest groups. However, Cyphers (2021) argues that categorising people based on their browsing behaviour is likely to lead to discrimination in areas such as employment and housing. At the same time, FLoC itself could pose a privacy risk: advertisers could use browser fingerprinting to further narrow down potential customers within a group, and could combine cohort IDs with other signals to single out individual users. With a cohort ID remaining the same across websites, ad tracking gains an opportunity to link external channels and access user data, which could reveal even more private information than cookies do.
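To illustrate the idea of cohort-based grouping, the sketch below computes a short, SimHash-style cohort ID from a list of visited domains, in the spirit of what FLoC proposed: similar browsing histories map to the same small ID, which is what advertisers would see instead of an individual identifier. The bit width, hash function and example domains are assumptions made for illustration; this is not Google’s actual algorithm.

```python
# Illustrative sketch of cohort assignment (not Google's FLoC implementation):
# a locality-sensitive hash groups similar browsing histories under one short ID.
import hashlib

COHORT_BITS = 16  # assumption: a short ID so that many users share each cohort

def cohort_id(visited_domains: list[str], bits: int = COHORT_BITS) -> int:
    """SimHash-style ID: similar domain sets tend to produce the same value."""
    weights = [0] * bits
    for domain in visited_domains:
        h = int.from_bytes(hashlib.sha256(domain.encode()).digest()[:8], "big")
        for i in range(bits):
            weights[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i in range(bits) if weights[i] > 0)

# Two users with overlapping histories often land in the same cohort,
# while the cohort ID itself says nothing about which exact sites were visited.
print(cohort_id(["news.example", "shop.example", "video.example"]))
print(cohort_id(["news.example", "shop.example", "blog.example"]))
```

The risk described above follows directly from this design: because the ID is stable across websites, it can be combined with fingerprinting signals to narrow a cohort back down to a single user.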
The privacy sandbox is a much more radical feature. Previously, advertisers needed permission from users before they could track their behaviour, so the choice lay in the users’ hands. The privacy sandbox, by contrast, does not require the user’s consent at all, which is tantamount to putting that power directly into Google’s own hands.
The problem with the privacy sandbox is that it is entirely up to Google to decide what level of user information advertisers can see, reinforcing Google’s power at the expense of independent advertising platforms, publishers and app developers, not to mention the fact that users have no real insight into how their private information is handled. In the digital advertising space, Google is the unabashed giant and a direct stakeholder. This reality means that the adoption of the privacy sandbox in Android is not necessarily good news for consumers. It is true that the privacy sandbox may make it difficult for other advertisers to track users, but all of that relies on Google’s business ethics. In short, Google has set strict thresholds for other advertisers’ access to user data, yet has gained so much power for itself that no one can regulate what it does with users’ privacy.
Conclusion

No matter what method is used, Google will not give up collecting more user data, because doing so would mean a rapid drop in revenue. As the Internet advertising giant, Google generated as much as 80% of its revenue from advertising in 2020, so trends in digital advertising have a direct impact on the company’s financial position. Although the company has been fined many times for leaking users’ privacy, it is not surprising that Google has not made real changes: a commercial company is very likely to choose greater profit and ignore the privacy crisis on its platform. Kummer and Schulte (2019) note that the success of new digital technology platforms may depend to a large extent on the ability of service providers to collect and analyse sufficient personal information; on the other hand, if providers store too much personal data, it can lead to widespread unease and loss of trust in the market. With so much personal information in one place, Google is becoming the biggest personal privacy risk on the Internet. The company’s status as both a major player and a rule-setter in the industry will inevitably be questioned. Google cannot claim the legal right to make such decisions unilaterally, and external regulation is needed to solve the problem.
Reference List
Cyphers, B. (2021). Google’s FLoC Is a Terrible Idea. Electronic Frontier Foundation. Retrieved April 8, 2022, from https://www.eff.org/zh-hans/deeplinks/2021/03/googles-floc-terrible-idea
Karppinen, K. (2017). Human rights and the digital. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media and Human Rights (pp. 95–103). Routledge. https://doi.org/10.4324/9781315619835
Kummer, M., & Schulte, P. (2019). When Private Information Settles the Bill: Money and Privacy in Google’s Market for Smartphone Applications. Management Science, 65(8), 3470–3494. https://doi.org/10.1287/mnsc.2018.3132
Marwick, A. E., & Boyd, D. (2018). Understanding Privacy at the Margins: Introduction. International Journal of Communication, 12, 1157–1165.
Miller, A. R., & Tucker, C. (2009). Privacy protection and technology diffusion: The case of electronic medical records. Management Science, 55(7), 1077–1093.