Have you ever wondered why the longer you spend on a media platform, the more homogeneous the content you see becomes? Why, after watching one video, you cannot help checking out the next? This is not an illusion but the result of recommendation algorithms. On media platforms, each of us has a virtual ID, and the content we see is decided by recommendation algorithms. This blog takes TikTok as a case study to analyze the filter bubbles of short video platforms under the influence of recommendation algorithms, and how to avoid their negative effects from the perspective of platforms, governments, and every one of us.
Why use recommendation algorithms?
According to Andrejevic (2019), Netflix has close to 15,000 titles, Spotify has around 40 million songs from some 2 million different artists, and Amazon.com offers some 30 million books. Keeping up with this volume of digitized content requires new forms of cultural curation on internet media platforms, such as recommendation algorithms. Taina Bucher (2018) describes this process as “automated sociality”. Recommendation algorithms are also the product of fierce competition in the short video market: numerous short video platforms use them to attract users with precisely targeted content and to keep audiences engaged, so as to achieve accurate dissemination to audiences. Although this practice solves the problem of information flooding to a certain extent, it tends to trigger the “filter bubble” effect, which narrows personal vision, increases social and political polarization and extremism, and weakens social cohesion. Therefore, in the era of algorithms, how to effectively avoid the “filter bubble” effect and solve the problem of content homogenization is one of the most important issues for Internet companies, governments, and the public.
The term “filter bubble” was introduced by Internet activist Eli Pariser (2011a) in his book The Filter Bubble: What the Internet Is Hiding from You. Pariser (2011b) claims that every one of us lives in a “personalized universe of information”, in which the computer records the traces left by our searching and browsing on the web, calculates and infers our favorite information and preferences, and pushes relevant information accordingly, thereby delivering a personalized information feed to each user. In the new media era, especially on the “battlefield” of short video, platforms have started to compete in using algorithms in order to push content precisely and keep users sticky. However, although this personalized bubble meets part of the audience’s needs, it screens out information beyond those needs and makes the diversified delivery of information difficult. To a certain extent, the filter bubble guides the audience in a one-sided and stereotypical way.
Formation of filter bubbles: Taking TikTok as an example

Fig2. TikTok
From: https://i.imgur.com/8JWG5ST.png
According to Axios, TikTok recently shared some insight into the machine learning that, the company says, helps amplify user engagement, and touched on how the machine learning behind its wildly popular short video app can create filter bubbles (Fischer, 2021). The most common bubble-filtering mode is to recommend content based on basic user information: the user’s gender, age, location and so on are recorded as background information, and corresponding content is delivered to the user. TikTok is linked to social media such as WeChat, QQ, and Weibo; users can log in through these services with one click, and the system can use this information to drive the content recommendations on the home page, as the sketch below illustrates.
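As a rough illustration of profile-based recommendation (the field names, demographic tags, and matching rules here are purely assumptions for the sake of the example, not TikTok’s actual implementation), the idea can be sketched in a few lines of Python:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Hypothetical background information gathered at sign-up or from linked accounts
    gender: str
    age: int
    location: str

@dataclass
class Video:
    title: str
    target_age_range: tuple = (13, 100)                   # illustrative demographic tag
    target_locations: set = field(default_factory=set)    # empty set = no location targeting

def profile_based_feed(user: UserProfile, pool: list[Video], k: int = 8) -> list[Video]:
    """Return up to k videos whose demographic tags match the user's profile.
    A toy model of recommending from background information alone."""
    def matches(v: Video) -> bool:
        in_age = v.target_age_range[0] <= user.age <= v.target_age_range[1]
        in_loc = not v.target_locations or user.location in v.target_locations
        return in_age and in_loc
    return [v for v in pool if matches(v)][:k]

# Example: a 20-year-old user in Sydney is only shown videos targeted at that demographic.
user = UserProfile(gender="f", age=20, location="Sydney")
pool = [Video("campus vlog", (16, 25)), Video("retirement tips", (55, 100))]
print([v.title for v in profile_based_feed(user, pool)])   # ['campus vlog']
```

Even this crude demographic matching already narrows what a given user can see before any behavioral data has been collected.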

Fig3. Popular Topics in TikTok
From: https://i.imgur.com/S7CTvxW.png
Users’ browsing, liking, commenting and retweeting behaviors are also an important basis for judging users’ preferences on the TikTok platform. Whenever TikTok users log into the app, they are shown eight videos from various genres. Based on feedback such as which videos users choose to watch and engage with, the algorithm then serves another eight videos weighted toward the preferred genres. Usually, once a user likes or comments on a video or a topic, the platform recommends similar content, which sustains their interest and keeps the topic snowballing; a sketch of this feedback loop follows below. TikTok also collects data on users’ social relationships through binding with the address book, WeChat, and other social media platforms. For example, once a friend joins TikTok, TikTok recommends them, and interaction among friends also gives higher priority to content posted by those friends. The system also gathers information on users’ devices, account settings, captions, hashtags, language preferences, and location. Based on this information, TikTok identifies and categorizes people into “clusters” and keeps them in “bubbles”.
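In simplified form, this engagement-driven feedback loop can be modelled as a set of genre scores that rise with every interaction and then bias the next batch of recommendations. The weights, genres, and sampling method below are assumptions made for illustration, not TikTok’s disclosed parameters:

```python
import random
from collections import defaultdict

# Illustrative engagement weights; TikTok's real weighting is not public.
ENGAGEMENT_WEIGHTS = {"watch_full": 1.0, "like": 2.0, "comment": 3.0, "share": 4.0}

class FeedbackLoop:
    """Toy model of interest-based feedback: engagement raises a genre's score,
    and higher-scoring genres dominate the next batch of recommendations."""

    def __init__(self, catalog: dict[str, list[str]]):
        self.catalog = catalog                      # genre -> list of video titles
        self.scores = defaultdict(lambda: 1.0)      # every genre starts with equal weight

    def record(self, genre: str, action: str) -> None:
        self.scores[genre] += ENGAGEMENT_WEIGHTS.get(action, 0.0)

    def next_batch(self, k: int = 8) -> list[str]:
        genres = list(self.catalog)
        weights = [self.scores[g] for g in genres]
        picked = random.choices(genres, weights=weights, k=k)
        return [random.choice(self.catalog[g]) for g in picked]

loop = FeedbackLoop({"pets": ["cat video", "dog video"],
                     "cooking": ["pasta recipe"],
                     "news": ["daily brief"]})
for _ in range(20):                 # repeated likes on a single genre...
    loop.record("pets", "like")
print(loop.next_batch())            # ...make the next batch skew heavily toward it
```

The point of the sketch is that the loop is self-reinforcing: every interaction with a genre makes that genre more likely to appear again, which is exactly how a bubble hardens over time.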
The negative effects of TikTok’s “filter bubble”
Quality content loss
TikTok’s filter bubble lets a few fixed topics monopolize traffic resources, and it is often the more vulgar content with intense sensory stimulation that takes up most of this traffic and gets repeatedly recommended, as in the wide spread of memes. Take #mixuebingcheng as an example: it is a commercial jingle that became a meme and attracted a huge number of users to recreate videos using the song. The song is adapted from the folk song “Oh! Susanna”, but after the adaptation its lyrics no longer make much sense, and the content under the topic is homogeneous, meaningless imitation. As a result, a large number of high-quality videos are not widely recommended because they do not cater to popular preferences or are not strongly stimulating, leading to a loss of quality content and a decline in creators’ enthusiasm.
Content homogeneity and audience polarization
As the filter bubble is mainly based on the audience’s interests and search preferences, the platform infers users’ preferences from their browsing records and then pushes the short videos it expects them to enjoy. Over time, because of the filter bubble, the information a user receives takes on a single character, and to a certain extent the audience’s biases and stereotypes are reinforced instead of broken. This is because filter bubbles lead to excessive preference-guided filtering, so that the audience sees only part of the available content and lacks a complete understanding of it. Audiences may develop a persistently self-assured and narrow-minded perception as a result. People influenced by the “filter bubble” tend to become immersed in their own worlds, pursuing their own interests while disregarding contact and engagement with others, resulting in a polarized audience. According to Pariser (2011a, p. 13), personalization filters serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas.

Fig4. Negative Effect of Filter Bubbles
From: https://i.imgur.com/lXo8pbd.png
Disorderly social norms
The “filter bubble” seems to affect individuals one at a time, but its influence extends across the whole platform: because each audience member receives different information and has different interests and concerns, their views become highly individualized and lack collective cohesion. When social norms call for a degree of consensus, these “scattered beads” of audience members each “express their own opinions”, undermining the orderliness of social norms.
Breaking the “filter bubble”

Fig5. Breaking the “bubble”
From: https://i.imgur.com/j8jtwjB.png
Platforms
Platforms need to improve their user insight technology, adopt cross-domain recommendation, and optimize topic curation. Firstly, since online media platforms such as TikTok rely mainly on recommendation algorithms to analyse audience preferences, breaking the “filter bubble” requires moving beyond this single signal and continuously improving user insight technology. For example, the existing algorithm could add further measurement indicators for the content it pushes, such as content effectiveness, timeliness, and user satisfaction, or it could calculate the average intensity of a topic across the various circles a user belongs to, so as to measure people’s information needs and priorities in a smarter, more complete, and more accurate way, as the sketch below illustrates.
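A minimal sketch of such multi-indicator scoring (the indicator names and weights are illustrative assumptions, not any platform’s real formula) might look like this:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    interest_match: float   # fit with the user's inferred preferences, 0..1
    effectiveness: float    # e.g. informativeness or completion quality, 0..1
    timeliness: float       # recency, 0..1
    satisfaction: float     # aggregated user-satisfaction signals, 0..1

# Illustrative weights: interest is no longer the only criterion.
WEIGHTS = {"interest_match": 0.4, "effectiveness": 0.2, "timeliness": 0.2, "satisfaction": 0.2}

def score(c: Candidate) -> float:
    return sum(getattr(c, name) * w for name, w in WEIGHTS.items())

def rank(candidates: list[Candidate], k: int = 8) -> list[Candidate]:
    return sorted(candidates, key=score, reverse=True)[:k]

videos = [Candidate("viral meme", 0.9, 0.2, 0.9, 0.4),
          Candidate("in-depth explainer", 0.6, 0.9, 0.5, 0.8)]
print([c.title for c in rank(videos, k=2)])   # the explainer now outranks the pure interest match
```

Once indicators other than predicted interest carry weight, content that the user has never engaged with can still reach the top of the feed, which is precisely the lever for loosening the bubble.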
TikTok’s algorithm is being tweaked to avoid showing users the same types of videos too often. This is significant because TikTok says the changes were made to avoid encouraging content that could have a harmful impact on users. According to Fischer (2021), TikTok is experimenting with strategies to avoid recommending content that, while not dangerous when viewed sparingly, can be detrimental when viewed sequentially, such as extreme dieting videos. Experts in medicine, clinical psychology, AI ethics, and other fields are contributing to these initiatives. TikTok’s algorithm was originally designed to avoid serving people repetitive videos from the same creators, which could weary them. The new adjustments broaden these efforts by ensuring that videos are not simply from different creators with different music, but also do not repeat the same topics, which might be harmful if watched repeatedly. TikTok is also experimenting with letting users choose which topics or hashtags they do not want to see in their primary “For You” feed. A person who recently lost a dog, for example, may choose to avoid videos about pets, while a vegetarian may wish to avoid videos featuring meat.
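One way to implement both ideas, per-batch repetition caps and user-chosen topic exclusions, is a simple re-ranking pass over already-scored candidates. Everything below (the cap value, the (title, topic) pair format, the function name) is an assumed simplification rather than TikTok’s actual mechanism:

```python
from collections import Counter

def diversify_feed(ranked, excluded_topics, max_per_topic=2, k=8):
    """Re-rank a relevance-ordered list of (title, topic) pairs so that no topic
    dominates a batch and user-excluded topics or hashtags never appear.
    A toy illustration, not TikTok's actual mechanism."""
    seen = Counter()
    feed = []
    for title, topic in ranked:
        if topic in excluded_topics:
            continue                          # user opted out of this topic/hashtag
        if seen[topic] >= max_per_topic:
            continue                          # cap repeats of the same topic per batch
        feed.append((title, topic))
        seen[topic] += 1
        if len(feed) == k:
            break
    return feed

# Example: a user who recently lost a dog filters out pet videos entirely.
candidates = [("puppy tricks", "pets"), ("easy pasta", "cooking"),
              ("kitten rescue", "pets"), ("daily news brief", "news")]
print(diversify_feed(candidates, excluded_topics={"pets"}))
```

The cap addresses sequential exposure to the same potentially harmful theme, while the exclusion set gives users the direct control over their “For You” feed described above.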
Besides, cross-domain recommendation mainly refers to recommending content from outside the user’s existing “information cocoon”. Take the recommendation system “Individuality” as an example: if user A keeps browsing the same kind of content, the system will try to explore other, cross-domain categories as much as possible, including the same kind of item in different materials, items from entirely different categories, and different kinds of items sharing the same color and material, in order to counteract the influence of the “filter bubble”.
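A rough sketch of this exploration idea, assuming a simple catalog keyed by category and an arbitrary exploration ratio (both of which are illustrative assumptions, not parameters of any real system):

```python
import random

def cross_domain_batch(history, catalog, explore_ratio=0.3, k=8):
    """Mix familiar categories with deliberately unfamiliar ones.

    history:       categories the user has recently browsed
    catalog:       dict mapping category -> list of item titles
    explore_ratio: fraction of the batch drawn from categories the user has
                   NOT browsed (an assumed knob, not a real product setting)
    """
    familiar = [c for c in catalog if c in set(history)]
    unfamiliar = [c for c in catalog if c not in set(history)]
    n_explore = max(1, int(k * explore_ratio)) if unfamiliar else 0

    batch = []
    for _ in range(k - n_explore):
        c = random.choice(familiar or list(catalog))
        batch.append(random.choice(catalog[c]))
    for _ in range(n_explore):
        c = random.choice(unfamiliar)
        batch.append(random.choice(catalog[c]))
    random.shuffle(batch)
    return batch

# Example: a user who only browses "sneakers" still gets a few items from other domains.
catalog = {"sneakers": ["white runners", "high tops"],
           "home decor": ["ceramic vase"],
           "books": ["travel guide"]}
print(cross_domain_batch(history=["sneakers"], catalog=catalog, k=5))
```

Reserving even a small fixed share of every batch for unexplored categories guarantees that the feed never collapses entirely into the user’s existing preferences.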
Moreover, in the face of the one-sided and stereotyped information brought about by the “filter bubble”, media platforms should deliberately avoid the homogenization of topics. Through algorithmic technology, they should not only recommend content of interest to users but also recommend other types of information and, more importantly, content carrying mainstream values, so as to guide the audience, convey positive social energy, and create a healthy short video environment.
However, there is a dilemma: in a context of media surfeit, people find themselves both exposed to a broader range of information and less inclined to take into consideration the larger community of which they are part and the perspectives of the unknown others who comprise it (Andrejevic, 2019). Exposing people to a variety of information therefore does not solve the problem if online media platforms fail to lead users to consider other users’ needs, perspectives, and values, including those who live in other “bubbles”, because exposure to countervailing views and evidence rarely leads to careful reconsideration and thoughtful deliberation, and more often produces a further hardening of people’s positions (Andrejevic, 2019).
Users
In terms of the audience, it is significant for them to consciously and actively “poke the bubbles”. In addition to technological and platform efforts, it is even more important that audiences themselves have the awareness to break the “filter bubble”: continuously improving their media literacy and network literacy, consciously and actively accepting information beyond their preferences, integrating the information they receive into the information of the wider society, and exploring the unknown information within it, so that a multi-faceted information environment can be created through the efforts of multiple parties.
Government
In addition to media platforms and the public, government regulation plays an important role in breaking the bubbles. Taking China as an example, the “Internet Information Service Algorithmic Recommendation Management Provisions” (2022) require algorithmic recommendation service providers to respect users’ rights, including the right to know about the algorithm, which obliges providers to disclose the algorithm’s core principles, aims, and operational procedures. Users should also be able to choose options that are not tailored to their personal traits and to turn off the algorithmic recommendation service altogether (Agrawal, 2022). At present, TikTok, WeChat, Taobao, Baidu, Weibo and many other platforms allow users to turn off “personalized recommendations” with one click.
However, filter bubbles are not the only cause of negative effects such as addiction and the misleading of adolescents; these effects also result from the broader social environment. For example, a China Comment report mentioned the problem of left-behind children in the mountainous Yimeng area of Shandong Province who are addicted to short videos. According to media reports, most left-behind families in the area are headed by the elderly, who take care only of food and housing and are willing but unable when it comes to education. Here, the “information cocoon” is not only a sub-topic of the urban-rural problem; problems of inter-generational communication, family education, school education and youth psychology are all magnified through it. Therefore, the “information cocoon” and its related problems involve media literacy but are not limited to it, and recognizing this allows us to better address the problem of adolescent addiction.
References:
Agrawal, A. (2022). China creates new rules to control algorithm recommendation services. JURIST Legal News & Research Services, Inc. https://www.jurist.org/news/2022/01/china-creates-new-rules-to-control-algorithm-recommendation-services/#
Andrejevic, M. (2019). Automated media (1st ed., pp. 44–72). Routledge. https://doi-org.ezproxy.library.sydney.edu.au/10.4324/9780429242595
Arendt, H. (1982). Lectures on Kant's political philosophy. University of Chicago Press.
Bucher, T. (2018). If ... then: Algorithmic power and politics. Oxford University Press.
Fischer, S. (2021). Exclusive: TikTok tackles filter bubbles. Axios.
Kasana, M. (2020, September 12). TikTok admits algorithms create “filter bubbles” that shield users from differing views. Input. https://www.inputmag.com/culture/tiktok-lifts-the-cover-off-its-algorithm-data-practices
Pariser, E. (2011a). The filter bubble: What the Internet is hiding from you. Penguin.
Pariser, E. (2011b, March). Beware online “filter bubbles” [Video]. TED. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles
Sunstein, C. R. (2001). Republic.com. Princeton University Press.