Introduction
The Internet, perhaps more than any other institution, has made us aware of our limitations in the face of human cultural production (Andrejevic, 2019). In this vast world of digital information, internet searches alone surface more content than a human being could read in a lifetime. To get more out of our cultural experience in a limited amount of time, we have to be selective about what we view, and automated content curation seems to offer a viable answer to this explosion of cultural production and distribution. Indeed, much of the social work that shapes the cultural world has shifted to automated systems (Andrejevic, 2019). But this also means that people are walking into the trap of “filter bubbles”. This blog will address the impact of the “filter bubble” phenomenon on how people think: how it creates cognitive isolation and polarises opinion.
What is a filter bubble?

The logic of the filter bubble rests on algorithms and big data. It is a phenomenon in which an Internet user’s information environment is gradually personalised by algorithmic editing. When browsing, people unconsciously click on what interests them and ignore what they don’t like. The filter-bubble algorithm is an ever-improving predictive mechanism: it uses our past search and click history to screen out what we dislike or disagree with, and recommends only what we already like. For the viewer, it seems like a perfect arrangement. It spares us the time of wading through content we don’t enjoy and steers us toward communities of people like us.
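To make the mechanism concrete, here is a minimal sketch in Python of the kind of predictive filtering described above. It assumes a toy setup in which items carry topic tags and a user’s history is a list of liked or disliked topics; the names, weights and data shapes are illustrative, not any real platform’s implementation.

```python
# A minimal sketch of interest-based filtering: items a user is predicted
# to dislike are removed before they are ever seen. Purely illustrative.
from collections import Counter

def recommend(candidates, history, k=5):
    """Rank candidate items by affinity to the user's past behaviour."""
    affinity = Counter()
    for topic, liked in history:              # history: [(topic, True/False), ...]
        affinity[topic] += 1 if liked else -3  # disliked topics are filtered hard
    scored = [(sum(affinity[t] for t in item["topics"]), item)
              for item in candidates]
    # Keep only items the model predicts the user will like; everything
    # else silently disappears, which is the "bubble".
    return [item for score, item in sorted(scored, key=lambda s: -s[0])[:k]
            if score > 0]

feed = recommend(
    candidates=[{"title": "Cat video", "topics": ["cats"]},
                {"title": "Budget debate", "topics": ["politics"]}],
    history=[("cats", True), ("cats", True), ("politics", False)],
)
print([item["title"] for item in feed])  # -> ['Cat video']
```

Note the feedback loop: today’s recommendations generate tomorrow’s history, so every run narrows the candidate pool a little further.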
Cognitive isolation
From a commercial point of view, the algorithms behind the “filter bubble” do bring the public a great deal of convenience. Take the Chinese shopping app Taobao as an example: the first page people open is “Guess Your Likes”. Taobao collects data on your habits (what you browse often, what you have bought before, and so on) to suggest items you might like; even the prices are tailored to you. Soon you will find that an item you searched for on Taobao is advertised to you repeatedly on other platforms such as RED, Weibo and TikTok. It seems like a wonderful thing: the information you need appears without any effort. But is that really the case? Commercial uses aside, the highly customised nature of culture and politics online means each of us gets different results for the same query. Under the filter bubble’s algorithmic mechanism, we readily see what the Internet wants us to see, which is not necessarily what we really need. This intelligent filtering traps each of us in an information bubble of our own. We rarely get to decide what is let in, and, more importantly, we have no way of knowing what has been filtered out. As Eric Schmidt has said, it will become very hard for people to watch or consume something that has not in some sense been tailored for them (Mork, 2019).
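As for the item that “follows” you from Taobao onto other apps, a plausible mechanism is a shared advertising identifier, as mobile ad IDs provide. The sketch below is a hedged illustration under that assumption; every name and the whole data flow are hypothetical, not any ad network’s actual API.

```python
# Hypothetical sketch of cross-platform ad retargeting via a shared ad ID.
segments = {}  # ad_id -> set of interest segments, held by an ad network

def record_search(ad_id, query_topic):
    """Called by the shopping app: file the user under an interest segment."""
    segments.setdefault(ad_id, set()).add(query_topic)

def pick_ad(ad_id, inventory):
    """Called by any other app in the network: reuse the same profile."""
    interests = segments.get(ad_id, set())
    matches = [ad for ad in inventory if ad["topic"] in interests]
    return matches[0] if matches else None

record_search("device-123", "running shoes")  # searched on one app
ad = pick_ad("device-123", [{"topic": "running shoes", "ad": "Shoe sale!"}])
print(ad)  # the same interest resurfaces on a different platform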
Case study of TikTok
Many apps have a registration step in which you select the topics you are interested in. This is the first step into the trap. You may have had the feeling that time passes very quickly on TikTok; Chinese netizens joke, “Five minutes on TikTok, three hours on earth”, meaning that people think they have spent only five minutes browsing when three hours have passed in the real world. TikTok, a short-video platform that has boomed in recent years, uses big-data algorithms to cater to its audience’s preferences in order to gain popularity. One survey showed that 90% of TikTok users visit the app every day, with an average daily active time of 52 minutes, and younger generations tend to spend even more time on it (Aslam, 2022). According to the TikTok statistics report published by App Ape Lab, TikTok users open the app more times a day than users of any other social media platform (Sheikholeslamy, 2018). The average Twitter user opens the app around 15 times a day; TikTok users open theirs more than twice as often, 38 to 55 times a day (Aslam, 2022). The culprit is the algorithm that creates the “filter bubble”.

In fact, TikTok has admitted that its algorithm creates “filter bubbles” (Kasana, 2020). But it also gives users the power to choose: under “Manage personalised content recommendations” in its settings, it explicitly describes how the algorithm works and what information the platform collects. Searches, long-pressing “not interested”, and interactions including likes, favourites, follows and comments all feed the algorithm information about your behaviour so that it can tailor TikTok’s recommendations to you. Before TikTok, the slogan of its parent company ByteDance’s popular app Jinri Toutiao (“Today’s Headlines”) was: “What you care about is the headline”. Like TikTok, it constantly recommends new content based on what you already like. The result is an “addiction” that separates you from differing opinions and keeps you isolated in your own cultural or ideological bubble.
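The paragraph above lists the behavioural signals TikTok says it collects; how those signals are weighted is not public. The sketch below therefore assumes a simple linear scoring scheme, purely for illustration.

```python
# The signal types below are the ones the settings page reportedly lists;
# the weights and the linear scoring are assumptions, not TikTok's model.
SIGNAL_WEIGHTS = {
    "like": 2.0,
    "favourite": 2.5,
    "comment": 2.0,
    "follow_author": 4.0,
    "search": 3.0,            # actively searching a topic is a strong signal
    "not_interested": -10.0,  # the long-press "not interested" action
}

def interest_score(events):
    """Collapse a user's interactions with one topic into a single score."""
    return sum(SIGNAL_WEIGHTS.get(e, 0.0) for e in events)

# The feed then over-samples whichever topics currently score highest:
topics = {"dance": ["like", "like", "search"],
          "news":  ["not_interested"]}
ranking = sorted(topics, key=lambda t: interest_score(topics[t]), reverse=True)
print(ranking)  # -> ['dance', 'news']: the bubble tightens with every session
```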
A highly customised TikTok can get its audience lost in its recommendations, and this is not a good thing. For us, getting information from the internet is as much a part of everyday life as eating and drinking. We are like disobedient children, eating only our favourite junk food and secretly throwing away the vegetables, while the algorithmic Internet is like a doting mother who caters to our appetites unconditionally. In the long run the diet loses its balance, and we end up surrounded by junk information. When individuals attend only to what they choose or what pleases them, and reduce their exposure to everything else, they become confined, like silkworms, in a self-made “information cocoon”. In other words, the isolation of information caused by personalised information systems creates a state of cognitive isolation, which prevents people from forming a true and complete understanding of the world.
The polarisation of political opinion
Not only that: the “filter bubble” phenomenon is also reflected in the polarisation of opinion. On social media platforms, people tend to stay within circles of information that suit their preferences, shaped by their own positions and social networks, and this can lead to division, even confrontation, between circles. It plays out in political issues as well. Under the algorithmic logic of social networks, political polarisation on Western platforms appears as the polarisation of online groups. This is due in large part to the fact that the racial populism circulating online is picked up and amplified by recommendation algorithms, effectively driving the spread of white supremacy, xenophobic rhetoric and radicalism through political discourse across the Western world.
Case study of the “Unite the Right” rally in Charlottesville
The Charlottesville incident can be seen as the moment the algorithmically connected white nationalist movement in the US first moved offline. On a late summer evening in 2017, hundreds of far-right demonstrators gathered in Charlottesville, Virginia, carrying tiki torches in defence of a statue of Robert E. Lee, a Confederate general who has become a symbol of slavery and white supremacy (Katz, 2017). The rally, organised mainly online, was called the “Unite the Right” rally. Afterwards, then US President Donald Trump defended the far-right protesters, saying they were not all neo-Nazis and white supremacists, and blamed part of the violence on what he called the “alt-left” (Jacobs & Laughland, 2017). This was tantamount to promoting white supremacist rhetoric from the highest office in the country. Spread algorithmically through search engines and social networking platforms, the extremist ideology of white supremacy has turned the internet into a rallying point for America’s alternative right. Trump’s “very fine people on both sides” rhetoric caused resentment on both sides (Parker, 2019); atomised individuals with assorted grievances, scattered across the social web, were algorithmically reassembled into two antagonistic political camps.

White supremacists march at the University of Virginia in 2017 (Photos by Evelyn Hockstein for The Washington Post)
In fact, in the online world, meaning and representation are no longer the only major domain of politics; human emotion has become a political force that cannot be ignored. This politics of emotion stems largely from the algorithmic capture and prediction of netizens’ feelings. Algorithm-led steering of emotion is not only a contributing factor in the polarisation of opinion; it also has the potential to become a means of social control. In a digital context, the “filter bubble” can be seen as a precise method of directing and disciplining attention, and the echo-chamber effect it creates makes it easy to manipulate people’s emotions and sway their choices. Emotional politics is also visible among Western voters. In recent years, intelligent algorithms have helped a number of Western political candidates with populist overtones onto the political stage; Donald Trump, the former leader of the US Republican Party who is widely seen as representing white interests, is one of them. By facilitating the spread of populist memes and one-sided false news, algorithms have helped populist candidates exacerbate ideological polarisation through “filter bubbles”. This big-data-driven emotional polarisation has become an alternative form of demagoguery in Western political elections.
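As a toy illustration of this dynamic, suppose a feed ranks posts purely by predicted engagement, and that emotionally charged posts reliably attract more of it. The numbers below are invented; the point is only that an engagement-maximising objective can favour outrage without anyone programming “outrage” in explicitly.

```python
# Toy model: rank purely on predicted engagement, where engagement is
# assumed to grow with emotional intensity. All numbers are invented.
posts = [
    {"text": "Measured policy analysis", "emotional_intensity": 0.2},
    {"text": "OUTRAGEOUS scandal!!!",    "emotional_intensity": 0.9},
]

def predicted_engagement(post):
    base = 1.0
    # Assumption: each unit of emotional intensity multiplies engagement.
    return base * (1 + 4 * post["emotional_intensity"])

feed = sorted(posts, key=predicted_engagement, reverse=True)
print([p["text"] for p in feed])  # the angriest post takes the top slot
```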

Donald Trump wins the 2016 presidential election and delivers his victory speech in New York City. (Photo by Mark Wilson/Getty Images)
In general, populist logic has found a favourable space to develop on social networks. When an online group with populist tendencies becomes wrapped up in real political activity, such as partisan struggle, election campaigns or extremist nationalist movements, it takes on new polarising forms. This suggests that online political polarisation carries populist and nationalist overtones. Building on the Internet’s racial populism, algorithms push search results from racist websites and communities to those already inclined to identify with racist ideas, precisely targeting them and cultivating their shared hatred. Media platforms governed by “filter bubble” algorithms are relatively closed environments, in which voices with similar views are repeated in our ears, sometimes in exaggerated or otherwise distorted forms. Most people in such an environment come to believe that these distorted stories are the whole truth, and people tend to hold more extreme positions after talking to like-minded people. Because social media platforms are especially prone to the “echo chamber effect” under the influence of the “filter bubble”, the algorithm not only accelerates the spread of white supremacist ideology in the US but also deepens racial antagonism.
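The group-polarisation claim can be illustrated with a classic bounded-confidence simulation: agents interact only with opinions close to their own (the “filter”) and drift toward what they hear, and the population splits into clusters that no longer communicate. This is a standard toy model in the spirit of Deffuant-style opinion dynamics, not a model of any particular platform.

```python
# Bounded-confidence sketch: like minds converge, unlike minds never meet.
import random

random.seed(1)
opinions = [random.uniform(-1, 1) for _ in range(100)]  # opinion spectrum
THRESHOLD = 0.3  # how different an opinion can be and still get through

for _ in range(20000):
    a, b = random.sample(range(len(opinions)), 2)
    if abs(opinions[a] - opinions[b]) < THRESHOLD:   # only like minds meet
        midpoint = (opinions[a] + opinions[b]) / 2
        opinions[a] += 0.5 * (midpoint - opinions[a])
        opinions[b] += 0.5 * (midpoint - opinions[b])

# Count the surviving opinion clusters (values within 0.05 of a neighbour).
clusters = []
for o in sorted(opinions):
    if not clusters or o - clusters[-1][-1] > 0.05:
        clusters.append([o])
    else:
        clusters[-1].append(o)
print(f"{len(clusters)} isolated opinion clusters remain")
```

Lowering THRESHOLD, that is, tightening the filter, leaves more and smaller clusters: the stricter the bubble, the more fragmented the public.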
Conclusion
In conclusion, on social networks where political and cultural expression is exceptionally active, the precise targeting of algorithmic technology has severely reduced people’s chances of encountering divergent views. While rational, fact-based analysis and cultural diversity appeal to everyone in principle, people still choose to seek out whatever supports their emotional worldview. Staying too long in an individually tailored “bubble”, never admitting new ideas, leaves us surrounded by junk information. This not only throws our information diet off balance; it is also utterly detrimental to uniting a divided society. The Internet, once expected to open up the world’s information to everyone and bring people together, is instead pulling people into their own corners (Newman et al., 2017).

People are content to accept the automated distribution of content and to huddle in media communities that confirm their beliefs. This is a worrying state of affairs. Exposure to a wider range of perspectives is an effective way to ameliorate fragmentation and polarisation (Andrejevic, 2019). We need to see new ideas, hear different voices and feel the diversity of the world, not isolate ourselves in a personal network bubble, becoming, without realising it, the frog at the bottom of the well of the information world.
Reference List:
Andrejevic, M. (2019). Automated media (1st ed., pp. 44–72). Routledge. https://doi-org.ezproxy.library.sydney.edu.au/10.4324/9780429242595
Aslam, S. (2022, March 13). TikTok by the numbers (2020): Stats, demographics & fun facts. Omnicore. https://www.omnicoreagency.com/tiktok-statistics/
Jacobs, B., & Laughland, O. (2017, August 16). Charlottesville: Trump reverts to blaming both sides including “violent alt-left.” The Guardian. https://www.theguardian.com/us-news/2017/aug/15/donald-trump-press-conference-far-right-defends-charlottesville
Kasana, M. (2020, September 12). TikTok admits algorithms create “filter bubbles” that shield users from differing views. Input. https://www.inputmag.com/culture/tiktok-lifts-the-cover-off-its-algorithm-data-practices
Katz, A. (2017). Unrest in Virginia. TIME. https://time.com/charlottesville-white-nationalist-rally-clashes/
Mork, L. (2019, December 5). Algorithmic personalization: My Google is not your Google. Minitex. https://minitex.umn.edu/news/elibrary-minnesota/2020-07/algorithmic-personalization-my-google-not-your-google
Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2017). Reuters Institute digital news report 2017 (pp. 30–31). Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/Digital%20News%20Report%202017%20web_0.pdf
Parker, A. (2019, May 7). How Trump has attempted to recast his response to Charlottesville. Washington Post. https://www.washingtonpost.com/politics/how-trump-has-attempted-to-recast-his-response-to-charlottesville/2019/05/06/8c4b7fc2-6b80-11e9-a66d-a82d3f3d96d5_story.html
Sheikholeslamy, A. (2018, July 26). Average daily activation count: 43 times a day! Teenagers are addicted to “TikTok”! App Ape Lab. https://en.lab.appa.pe/2018-07/addicted-to-tiktok.html
Wilson, M. (2016). Donald Trump wins the 2016 presidential election and delivers his victory speech in New York City [Photograph]. Teen Vogue.