Have you ever noticed that if you are a K-pop fan, much of the short video content you see while browsing TikTok is related to Korean culture; or that if you like photography, most of your feed relates to camera skills? In short, you see whatever you like or are interested in whenever you access this viral short video platform. It is unsurprising that you, as a user of this internet platform, are living within algorithmic filter bubbles.
***
Recently, many short video platforms have launched an offensive to gain a foothold in the highly competitive video streaming market. These platforms use algorithmic recommendation technology to attract users with on-target content, keep them engaged, and achieve accurate distribution. Although this approach may have alleviated the problem of irrelevant information overload to a certain extent, it can easily produce the filter bubble effect and lead to polarization. This article therefore analyzes the formation, impact, and possible governance of the filter bubble, taking TikTok as its case.
What is the Filter Bubble?
Media content was traditionally filtered through manual collection, processing, and review (Zuiderveen Borgesius et al., 2016). Yet with the rapid development of internet technology, computer algorithms can now deliver information to audiences with precision, greatly increasing clicks and attention.

The idea of the Filter Bubble was first introduced by internet activist Eli Pariser in 2011. He noticed that two people searching for the same term on Google could get completely different pages of results (Bruns, 2019). He then found that search engines can readily learn users’ preferences and filter out heterogeneous information, creating a personalized world for users in the digital era. At the same time, a virtual wall separating different information and ideas gradually builds up, leaving users in a net bubble that prevents the exchange of diverse views. In Pariser’s view, the filter bubble is not only oriented toward audiences’ interests but encompasses all aspects of their lives, creating a unique and customized information field for them online (Andrejevic, 2019). In other words, a computer can personalize information for any audience simply by recording their digital behaviors online, continually calculating and inferring their preferences, and then pushing out relevant information based on this algorithmic process.
Formation: TikTok creates Filter Bubble

Marshall McLuhan proposed the well-known theory that the Medium is the Message, arguing that a medium’s form shapes which kinds of communication and social activity humans can engage in through it (McLuhan, 1964). TikTok’s short videos replace textual content with 15 seconds of vivid video, creating a visual, low-effort form of presentation that penetrates every gap in users’ fragmented time. TikTok is, in essence, a short video streaming platform that relies heavily on recommendation algorithms. According to the algorithmic mechanism TikTok itself has disclosed (TikTok, 2020), the basis for its personalized recommendations, and hence its filter bubbles, can be summarized in three layers of logic.
The first layer is based on basic user information, the simplest, most fundamental, and most common filtering model in TikTok’s algorithm system. Users’ gender, age, location, and other basic personal details are recorded in the backend, and corresponding video content is then delivered to them. The very first time a new user accesses the platform, there is no behavioral data to capture, so the user’s login details serve as the primary data TikTok collects, from which it sketches a general portrait of the user’s basic profile. New users are encouraged to log in to TikTok through a third-party one-click login option, using authorizations such as Facebook, Twitter, or Google. This allows TikTok to draw directly on the user’s information from those platforms, obtaining their nickname, profile picture, geographic location, and so on, to profile their characteristics. Whenever TikTok is linked to other social media, the backend system can use this profile data to make the most basic #ForYou home page recommendations. There are also visitors who are not logged in; for them, TikTok captures basic interactions, such as the time spent on a certain type of content. The algorithm uses this information to build an initial profile of those visitors, providing targeted videos and filtering out irrelevant content.
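This cold-start logic can be illustrated with a minimal sketch. Nothing here reflects TikTok’s actual implementation; the field names, category labels, and mappings are all hypothetical, chosen only to show how a sparse profile can seed initial recommendations before any viewing behavior exists.

```python
# Hypothetical sketch of profile-based cold-start recommendation.
# Field names and category mappings are illustrative assumptions,
# not TikTok's actual system.

def cold_start_categories(profile):
    """Map a new user's basic profile to initial content categories."""
    categories = []
    # Location can seed region-specific trending content.
    if "location" in profile:
        categories.append(f"trending:{profile['location']}")
    # Age bracket can seed broadly age-appropriate genres.
    age = profile.get("age")
    if age is not None:
        categories.append("genre:teen" if age < 18 else "genre:general")
    # Interests imported from a linked third-party account, if any.
    for interest in profile.get("linked_account_interests", []):
        categories.append(f"interest:{interest}")
    return categories

new_user = {"location": "KR", "age": 21,
            "linked_account_interests": ["k-pop", "dance"]}
print(cold_start_categories(new_user))
# ['trending:KR', 'genre:general', 'interest:k-pop', 'interest:dance']
```

Even this toy version shows why the bubble begins at sign-up: the very first feed is already narrowed by whatever a linked account reveals.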

The second layer of logic is based on users’ social behavior, chiefly their interactions with media content on TikTok. Browsing, dwelling on, liking, commenting on, or sharing a video are all important signals for determining users’ preferences. Once these are collected, the system can analyze content preferences and recommend similar information that fits each user’s data model. The accuracy of the recommendations is judged against all types of interactive behavior, and the data model is adjusted accordingly to shape subsequent recommendations. The key basis for #ForYou is thus to filter related content according to users’ own choices and to distribute it in a way that pleases them. Meanwhile, through binding with third-party social media, once a user’s friends have joined TikTok, the platform gives priority to recommending friends’ content. As a result, TikTok’s powerful algorithm connects users to their interests and friend circles; by recording their online behaviors, it eventually wraps them in a highly homogenized stream of information, a bubble.
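The interaction-based layer can be sketched as a simple weighted tag model. The signal names, weights, and scoring rule below are assumptions for illustration, not TikTok’s real algorithm; they merely show how repeated engagement with a topic mechanically pushes similar content to the top of a ranked feed.

```python
# Hypothetical sketch: weighting interaction signals to score content tags.
# Weights and signal names are illustrative assumptions only.

SIGNAL_WEIGHTS = {"view": 1.0, "like": 3.0, "comment": 4.0, "share": 5.0}

def update_preferences(preferences, video_tags, signal):
    """Increase the score of each tag on a video by the signal's weight."""
    weight = SIGNAL_WEIGHTS[signal]
    for tag in video_tags:
        preferences[tag] = preferences.get(tag, 0.0) + weight
    return preferences

def recommend(preferences, candidates, k=2):
    """Rank candidate videos by the summed score of their tags."""
    return sorted(candidates,
                  key=lambda v: sum(preferences.get(t, 0.0) for t in v["tags"]),
                  reverse=True)[:k]

prefs = {}
update_preferences(prefs, ["k-pop", "dance"], "like")   # k-pop: 3, dance: 3
update_preferences(prefs, ["k-pop"], "share")           # k-pop: 8
candidates = [{"id": 1, "tags": ["k-pop"]},
              {"id": 2, "tags": ["cooking"]},
              {"id": 3, "tags": ["dance"]}]
print([v["id"] for v in recommend(prefs, candidates)])  # [1, 3]
```

Note how the cooking video is never shown: content with no overlap with past behavior simply loses the ranking, which is the filter bubble in miniature.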

The third, distinctive mode of filter bubble formation on TikTok is the overlay of recommendations and trending topics. Media content that receives more likes, comments, or shares is automatically identified by the system as quality content and is allocated additional referral traffic, resulting in large numbers of short videos with millions or even tens of millions of views. This stacking of recommendations further inflates the bubble.
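The feedback loop in this overlay can be made concrete with a toy simulation. The threshold and growth factors below are invented numbers, not anything TikTok has published; the point is only the structure: content that crosses an engagement threshold grows much faster than content that does not.

```python
# Hypothetical sketch of the popularity feedback loop: videos whose
# engagement crosses a threshold get boosted distribution, which tends
# to earn them further engagement. All numbers are illustrative.

def boost_round(videos, threshold=100, boost=2.0, organic=1.1):
    """One round of distribution: boosted videos grow much faster."""
    for v in videos:
        factor = boost if v["engagement"] >= threshold else organic
        v["engagement"] = round(v["engagement"] * factor)
    return videos

videos = [{"id": "a", "engagement": 120},   # already above threshold
          {"id": "b", "engagement": 80}]    # below threshold
for _ in range(3):
    boost_round(videos)
print(videos)
# [{'id': 'a', 'engagement': 960}, {'id': 'b', 'engagement': 107}]
```

After three rounds the initially popular video has grown eightfold while the other has barely moved, illustrating how small early differences compound into the million-view disparities described above.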
To summarize, the platform filters content based on users’ preferences, in effect providing them with the Daily Me: a digital daily newspaper customized for each individual according to his or her own labels and interests, an effect very similar to the filter bubble (Negroponte, as cited in Andrejevic, 2019). The information users receive is automatically filtered through their social connections and online behaviors, and TikTok aggregates them under the same hashtags, letting users hear ever more of the same voices.
Impact: The negative effects of Filter Bubble
After analyzing the formation of filter bubbles on TikTok, it is clear that personalized recommendations allow users to encounter their favorite content to the maximum extent. Although some argue that filter bubbles on TikTok have improved the efficiency of information searching, they have, on balance, exacerbated the narrowing of knowledge. Users are passively placed in a limited circle, unknowingly receiving one-sided, homogenized information. Because the filter bubble works on users’ perceptions and thinking patterns, they are gradually isolated in information silos, making it difficult for them to reach the real and wider world. By constantly pushing short video content that matches users’ preferences, the platform lets them immerse themselves in their own discourse system and in the cognitive framework created by TikTok.

Over time, the filter bubble effect leaves audiences with a one-sided and stereotyped impression of the world. This is especially serious for users who rely on TikTok as their only source of information. The algorithm does not provide them with the most comprehensive content; users are exposed only to media content that has already been filtered by the system. In approaching TikTok, users unwittingly give up access to everything else, resulting in individual information bias. This Daily Me effect and its insulated environment prevent further exposure to opposing or inconsistent voices. Users ultimately become one-dimensional people who care only about themselves (Marcuse, 2013). Once this happens, audiences can easily develop fixed perceptions.
The personalized recommendation mechanism thus increases the prevalence of opinion polarization. When users interact with like-minded people, their own perceptions, opinions, and views are reinforced (Mäs & Flache, 2013). They then become more dependent on staying within this relatively closed information bubble. When users are passively guided by algorithmic filters for long periods, they tend to develop two completely opposing cognitive stances: absolute truth or absolute fallacy. When they form a homogeneous circle with people who share the same perceptions or similar interests online, this can lead to negative incidents such as cyberbullying or online harm directed at users who disagree.
At the same time, as different groups of audiences are keen on different types of information, each user’s concerns and thoughts are also different and, most importantly, one-sided. When the filter bubble effect spreads across the entire platform, it can make audiences’ views highly individualistic and lacking in collective cohesion. Whenever social norms require uniformity, such audiences can easily become rabble-rousers and disrupt the orderliness of those norms.
In short, the filter bubble effect inadvertently regulates users’ thinking and behavior. Before the popularization of the internet, users had the right to choose broadly among content in newspapers, television, and other traditional media. The emergence of automated algorithmic recommendation has, in a sense, deprived them of this right to choose and has determined their information consumption and production to a certain extent. The consequences of this loss were also illustrated by Pariser in his TED talk (Pariser, 2011): the internet recommends what it believes users want to see, not necessarily what they need to see. In the long run, users’ perceptions narrow and their minds solidify as they continue to receive homogenized information, and they lose the awareness and ability to actively seek out different information and news. Ironically, instead of feeling the pain of this disenfranchisement, users inside the filter bubble may even feel heightened pleasure and enjoyment.
Governance: How can the Filter Bubble be avoided?
As the formation of the filter bubble rests largely on the internet platform itself, the key solutions to this dilemma will have to rely on the platform’s own technological improvements. In an official announcement in 2021, TikTok informed the public that it is continuously working to optimize its existing algorithm to avoid showing similar video content to users too often (TikTok, 2021). The platform should be more intentional about avoiding the homogenization of topics and about improving its topic sets. By updating the algorithmic logic, it will be possible to better balance the comprehensiveness and focus of users’ information needs. At the same time, governments should actively take steps to strengthen the regulation of TikTok and other algorithm-driven online platforms.
Admittedly, it is difficult for us, as ordinary users, to directly influence what platforms or governments do to improve the situation. To burst the filter bubble, it is therefore all the more important for the user community to improve their media and online literacy and to consciously and actively seek out information beyond their personal preferences. Users should deliberately expand their news sources along many different paths to create a multi-faceted, diverse information environment. In addition, on a platform like TikTok that offers a personalization system, it helps to proactively follow different categories of hashtags or topics, expanding the range of media content that can be pushed out and thus weakening the filter bubble’s impact. As the filter bubble results from a combination of technology, people, and society, countering it requires technological improvement, platform optimization, and human deliberation together. Only through the joint efforts of all these sides can better governance and communication outcomes for TikTok be achieved. In short, once users use online information platforms wisely, they will not only enjoy a personalized experience that is algorithmically effective and accurate, but will also promote the diversity of recommended content and avoid the dangers of the filter bubble.
References
Andrejevic, M. (2019). Automated Culture. In Automated Media (pp. 44-72). Routledge.
Bruns, A. (2019). Filter Bubble. Internet Policy Review, 8(4), 1-14.
Dasdan, A. (2021). Isolated in online social spaces: The filter bubble algorithm [Online image]. Lynbrook High School. https://lhsepic.com/9373/in-depth/isolated-in-online-social-spaces-the-filter-bubble-algorithm/#
Feyissa, S. (2020). Person holding an iPhone running TikTok. Unsplash. https://unsplash.com/photos/Yaw9mfG9QfQ
Marcuse, H. (2013). One-dimensional man: Studies in the ideology of advanced industrial society. Routledge.
Mäs, M., & Flache, A. (2013). Differentiation without distancing: Explaining bi-polarization of opinions without negative influence. PLoS ONE, 8(11).
McLuhan, M. (1964). The Medium Is The Message. In Understanding Media: The Extensions Of Man (pp. 23-35). Signet.
Negroponte, N. (1995). Being Digital. Coronet.
Pariser, E. (2011). The Filter Bubble: What The Internet Is Hiding From You. Penguin.
Pariser, E. (2011, March). Beware online “filter bubbles” [TED Video]. TED. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en
TikTok. (2020, June 19). How TikTok recommends videos #ForYou. https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you
TikTok. (2021, Dec 16). An update on our work to safeguard and diversify recommendations. https://newsroom.tiktok.com/en-us/an-update-on-our-work-to-safeguard-and-diversify-recommendations
Zuiderveen Borgesius, F., Trilling, D., Möller, J., Bodó, B., De Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1), 1-16.