Introduction
As the digital age advances, the news media industry is undergoing profound changes.
Algorithms, artificial intelligence and other technologies are increasingly woven into people's daily lives, continually offering users novel experiences. People often talk about how humans shape machines, but rarely about how machines shape us (Carah & Louw, 2015). These new technologies and capabilities pose challenges for people and society. The emergence of algorithmic recommendation has greatly influenced communication; in particular, the filter bubbles it produces have attracted widespread attention (Möller et al., 2018). This paper defines the filter bubble, analyses its causes, illustrates its effects with examples, and offers suggestions for the governance and use of filter bubbles.
Definition of Filter Bubbles
Algorithmic recommendation has become a standard service on digital platforms. With the practical application of personalised recommendation, information dissemination has shifted from traditional manual gatekeeping to algorithmic gatekeeping (Andrejevic, 2019). Its purpose has also shifted from delivering news events to recommending content that matches user preferences. Users passively accept this tailored information, and filter bubbles form as a result.
Pariser (2011) proposed that the Internet creates a unique information world for each of us. He argues that users' preferences are shaped because their searches are recorded online; on this basis, the algorithm pushes information to users to maintain user stickiness. Each platform acts as a bubble separating the user from the rest of the information available, immersing the user in the world of their preferred content. It is difficult for users to take the initiative to break through these boundaries and discover new information: personalised search, news push algorithms and other user-oriented recommendation technologies limit the scope and channels through which users obtain further information, and users are gradually confined to a homogenised information bubble (Pariser, 2011). Whether socialising, shopping or reading the news, the audience is thus tightly surrounded by filter bubbles. Technology is a double-edged sword: the algorithm improves the productivity of the Internet, but its rise also brings many challenges, such as infringement of user privacy in data collection, algorithmic bias and discrimination, fake news, information narrowing and opinion polarisation.
Formation of Filter Bubbles
The algorithm that provides personalised recommendation services is a set of coded programs that relies on data collected across the Internet to establish a digital link between user data and content data. Through this link, the platform can identify content that matches the user's interests and preferences. These results are not static: the algorithm automatically tracks real-time data streams, analyses the user's changing online behaviour and optimises the recommended content (Andrejevic, 2019).
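To make this mechanism concrete, the following is a minimal sketch of the kind of content-based recommender described above: a user-interest vector is matched against item vectors, and the profile shifts toward whatever the user engages with. All names, topics and weights here are illustrative assumptions, not any platform's actual system.

```python
import numpy as np

# Illustrative only: a toy content-based recommender.
# Each item and each user profile is a vector over the same topic space.

TOPICS = ["politics", "sports", "tech", "entertainment"]

def recommend(user_profile, items, k=3):
    """Rank items by similarity to the user's interest vector."""
    scores = {name: float(np.dot(user_profile, vec)) for name, vec in items.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

def update_profile(user_profile, item_vec, learning_rate=0.2):
    """Shift the profile toward content the user engaged with."""
    profile = (1 - learning_rate) * user_profile + learning_rate * item_vec
    return profile / np.linalg.norm(profile)  # keep it normalised

# Hypothetical catalogue: item name -> topic vector.
items = {
    "election_news": np.array([0.9, 0.0, 0.1, 0.0]),
    "match_report":  np.array([0.0, 0.9, 0.0, 0.1]),
    "gadget_review": np.array([0.1, 0.0, 0.9, 0.0]),
    "celebrity_qa":  np.array([0.0, 0.1, 0.0, 0.9]),
}

profile = np.array([0.5, 0.5, 0.5, 0.5])  # an initially undecided user
profile /= np.linalg.norm(profile)

# The user clicks political news; the profile tilts, and so do the rankings.
profile = update_profile(profile, items["election_news"])
print(recommend(profile, items))
```

Each engagement nudges the profile further toward the clicked topic, which is precisely the feedback loop through which a bubble hardens.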
Algorithms have thus acquired the power to influence how we think and behave in daily life. Within the filter bubble, the user's choice of information is largely ceded to the algorithm.
Filter bubbles are not entirely caused by algorithmic recommendation; they owe at least as much to self-selection. Personalised recommendation is based on the user's behavioural data, and the user retains control over whether to adopt the recommended content: a filter bubble forms when the user chooses to believe and accept the recommendations. Even after the mass media has carried out a first layer of information screening, the audience still conducts a second screening based on self-interest, driven largely by the psychology of conformity and following the mainstream (Nguyen et al., 2014). People are more willing to engage with content consistent with or close to their own positions and attitudes while avoiding and ignoring opposing or conflicting information.
While it may seem that the algorithm drives the spread of information, the content the algorithm pushes is ultimately determined by the audience. Therefore, the formation of filter bubbles stems both from internal human needs and emotions and from the external push of algorithmic recommendation technology.
Consequences and Case Study
The explosive growth of information brought by the Internet makes it difficult for users to quickly find content of interest amid the flood of material, and big-data algorithms play a positive role in improving distribution efficiency. For example, TikTok analyses how often users watch certain kinds of videos and how long they dwell on them, then pushes more videos those users are likely to enjoy, increasing time spent on the platform and user satisfaction. To improve distribution efficiency and enhance user engagement, many media outlets have therefore begun to use algorithmic recommendation to filter out content users are not interested in while retaining and refreshing content they like (Pasquale, 2015). This can bring media platforms more commercial value, for instance through the attention economy and live streaming.
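As a rough illustration of this kind of engagement-driven ranking (TikTok's actual system is proprietary, so every signal name and weight below is an assumption), a feed ranker might combine watch counts, dwell time and completion rate into a single score:

```python
from dataclasses import dataclass

# Illustrative only: how engagement signals might be combined into a feed score.
# The signal names and weights are assumptions, not any platform's real formula.

@dataclass
class VideoStats:
    watches: int              # how many videos on this topic the user opened
    avg_dwell_seconds: float  # average time spent watching them
    completion_rate: float    # fraction of the video watched on average

def engagement_score(stats: VideoStats,
                     w_watch: float = 1.0,
                     w_dwell: float = 0.1,
                     w_complete: float = 2.0) -> float:
    """Weighted sum of engagement signals; higher means 'push more of this'."""
    return (w_watch * stats.watches
            + w_dwell * stats.avg_dwell_seconds
            + w_complete * stats.completion_rate)

# A user who lingers on cooking videos will see cooking rank ever higher.
topics = {
    "cooking": VideoStats(watches=12, avg_dwell_seconds=45.0, completion_rate=0.9),
    "news":    VideoStats(watches=3,  avg_dwell_seconds=8.0,  completion_rate=0.2),
}
ranking = sorted(topics, key=lambda t: engagement_score(topics[t]), reverse=True)
print(ranking)  # ['cooking', 'news']
```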
More generally, however, filtering technology tends to narrow information, making users more inclined to hear like-minded opinions. Communication within like-minded discussion communities constantly reinforces people's existing ideas and prejudices, while competing views are suppressed and ignored, ultimately leading to group polarisation. The development of digital media has reduced the cost and time of producing and transmitting information, including political information. A functioning democracy relies on the media to enable different groups and interests in society to participate in democratic debate and to create opportunities for citizens to encounter different viewpoints (Owen & Smith, 2015).
Thus, more and more citizens access political information and participate in political discussion through media platforms. The 2016 US presidential election raised further concerns about the impact of filter bubbles on democracy.

2016 was a remarkable year in which many felt they were witnessing history: the Brexit referendum and the election of Donald Trump as US President. Strikingly, some people thought these outcomes could not happen while others were certain they would, and neither side had any idea what the other was thinking. Filter bubbles helped create the news that shocked the world.
During the presidential election, media platforms such as Google, Facebook and Twitter, along with their users, were caught up in the political fallout from the filter bubble. Because these platforms use algorithms to sift what users see, searches for political news are inevitably filtered through bubbles, leaving users increasingly unable to see opinions and information from the opposing party. In the US, where Facebook is now one of the key election battlegrounds, 60 per cent of users are unaware that filtering algorithms shape the opinions shown to them (Hern, 2017). This affects the voting public and may affect the outcome of the election. Filter bubbles fuelled deep ideological conflict between supporters of the two parties in the 2016 US presidential election, and even political unrest organised through social media platforms, resulting in vicious social disputes.
In general, people tend to favour information that supports their beliefs or biases and to reject information that does not fit their belief system, a tendency known as confirmation bias. When we are exposed only to information we already believe, our convictions deepen even when the opposite may be true, and opinion polarisation increases. The filter bubble phenomenon thus narrows people's thinking, sometimes to a single viewpoint, and locks them firmly into their existing, familiar frame of mind, generating opinion polarisation.

Ongoing polarisation of opinion creates an information cocoon. Living in such a cocoon for a long time easily makes people blindly confident and narrow-minded: they assume their own views are correct and reject or attack the rest. Especially after gaining the approval of like-minded people, these views increasingly evolve into extreme ideas and extreme behaviour. At the same time, similar information and identical views accumulate on social media, so people's original attitudes are constantly confirmed and strengthened, shutting out information from other fields and expressions of dissent. Instead of hearing the complete and authentic voice of cyberspace, people hear only the amplified echo of an enclosed space. This echo chamber effect also makes fake news more common. Disinformation is defined as the dissemination of demonstrably false or misleading information; it is associated with a larger propaganda strategy to manipulate a target population by influencing their beliefs, attitudes or preferences in order to obtain behaviour consistent with the propagandist's political goals (Benkler et al., 2018). The US election is a telling example. Benkler et al. (2018) pointed out that American politics and media culture faced an epistemic crisis that struck at the foundations of democratic society. This illustrates the negative impact of filter bubbles on opinion polarisation and political events.
Governance and Response to Filter Bubbles
Improving the transparency of algorithm design helps users understand how the algorithm operates: which of their behaviours drive the information pushed to them, and which information is filtered out.
In this way, users can adjust their behaviour according to the algorithm's operating rules and limit how bubbles form and how much influence they exert. For instance, we can protect our privacy by deleting browser cookies, prevent browsing data from being logged by using private browsing modes, and be careful about agreeing to platform terms and rules. Bozdag et al. (2014) argued that personalised recommendation algorithms affect the moral value of information, especially trending information. Algorithmic recommendation creates a new reading experience and a new distribution mechanism for news, but it cannot by itself avoid the narrowing of pushed content or supply the capacity to pursue truth and understand society. Therefore, engineers should not only meet users' demand for personalised, diversified information but also provide users with the means to understand and control the algorithm's influence.
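As a sketch of what such transparency could look like in practice, the toy recommender below reports not only a score but also which interests caused an item to be pushed and which interests it leaves unserved. The structure and field names are purely illustrative assumptions, not any platform's API:

```python
# Illustrative only: a recommender that can explain its own output,
# a rough sketch of the kind of transparency discussed above.

def explain_recommendation(user_interests: dict[str, float],
                           item_topics: dict[str, float]) -> dict:
    """Score an item and report which topics contributed and which were left out."""
    contributions = {
        topic: user_interests.get(topic, 0.0) * weight
        for topic, weight in item_topics.items()
    }
    return {
        "score": sum(contributions.values()),
        "pushed_because": [t for t, c in contributions.items() if c > 0],
        "unserved_interests": [t for t in user_interests if t not in item_topics],
    }

# Hypothetical profile and article.
user = {"politics": 0.8, "sports": 0.1, "science": 0.4}
article = {"politics": 0.9, "economy": 0.3}

print(explain_recommendation(user, article))
# {'score': 0.72, 'pushed_because': ['politics'],
#  'unserved_interests': ['sports', 'science']}
```

Surfacing even this much, why an item was pushed and what was left out, would let users see the outline of their own bubble.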
In the face of filter bubbles, users should improve their media literacy.
When media platforms provide personalised, customised services, users should consciously follow different categories of topics, expanding the scope of the information pushed to them and weakening the impact of filter bubbles. Diversity tools for "bursting bubbles" should conform to the inherent spirit of democracy, reflecting pluralism and liberalism (Bozdag & van den Hoven, 2015).
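One simple form such a diversity tool could take is a re-ranker that penalises topics the user has already been shown, as in the greedy sketch below (an assumed heuristic for illustration, not any platform's actual feature):

```python
# Illustrative only: a greedy re-ranker that trades relevance for topic diversity.
# Items are (name, topic, relevance) tuples; the penalty is an assumed parameter.

def diversify(items: list[tuple[str, str, float]],
              k: int = 3,
              repeat_penalty: float = 0.5) -> list[str]:
    """Pick k items, discounting each topic every time it is shown again."""
    shown: dict[str, int] = {}   # topic -> times already selected
    remaining = list(items)
    selected = []
    for _ in range(min(k, len(remaining))):
        best = max(remaining,
                   key=lambda it: it[2] * (repeat_penalty ** shown.get(it[1], 0)))
        selected.append(best[0])
        shown[best[1]] = shown.get(best[1], 0) + 1
        remaining.remove(best)
    return selected

feed = [
    ("story_a", "politics", 0.9),
    ("story_b", "politics", 0.8),
    ("story_c", "science",  0.6),
    ("story_d", "sports",   0.5),
]
# Pure relevance ranking would show two politics stories first; the penalty
# lets other topics through: ['story_a', 'story_c', 'story_d'].
print(diversify(feed))
```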
Media platforms themselves have also made efforts to address the filter bubble problem caused by algorithms. For example, The Guardian launched a "Burst your bubble" column to provide readers with a different perspective (Owen, 2017). BuzzFeed launched "Outside Your Bubble", a feature that displays a range of opinions and comments from Twitter, Facebook and Reddit at the bottom of some widely shared news articles (Smith, 2017). The Chrome plug-in "Escape Your Bubble", built by an independent developer, counters information imbalance by inserting political content that differs from the user's reading habits, allowing the user to understand other points of view (Sanchez, 2016). Media platforms should therefore make their algorithms more transparent to reduce the filter bubbles that develop over time (Flaxman et al., 2016). They should also provide diverse access to information and help users identify misinformation, ensuring quality content and reducing opinion polarisation.
Conclusion
In conclusion, this paper has defined filter bubbles and examined their causes. Their formation is complex, driven by algorithmic recommendation technology, by audiences' internal needs and emotions, and by their entanglement with social power and political opinion. Filter bubbles bring certain benefits, but for the most part filtering leads to information narrowing and opinion polarisation. The negative impact of this phenomenon, and the concern it causes among online citizens, can be seen in the example of Trump's election. To eliminate bubbles, we not only need to optimise algorithmic technology so that users can access diverse information, but also need to create a more democratic, free and equal space for dialogue, so that we can see a more honest and comprehensive world. It is hoped that more and more Internet users will view the "filter bubble" phenomenon critically. While reducing the harmful effects of filter bubbles, we can also learn to use their advantages to broaden our access to information and serve our own interests.
Understanding that what we see is not all there is helps us recognise that we live in a filtered, distorted reality and reminds us to take off the glasses that distort it.
Therefore, how media platforms and users can escape the ties of filter bubbles, step out of the comfort zone of personalisation, and better govern the use of algorithms remains a task still before us.
References
Andrejevic, M. (2019). Automated Media. Routledge. https://doi.org/10.4324/9780429242595
Benkler, Y., Farris, R., & Roberts, H. (2018). Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Oxford University Press. https://doi.org/10.1093/oso/9780190923624.001.0001
Bozdag, E., Gao, Q., Houben, G.-J., & Warnier, M. (2014). Does offline political segregation affect the filter bubble? An empirical analysis of information diversity for Dutch and Turkish Twitter users. arXiv:1406.7438. http://arxiv.org/abs/1406.7438
Bozdag, E., & van den Hoven, J. (2015). Breaking the filter bubble: Democracy and design. Ethics and Information Technology, 17(4), 249–265. https://doi.org/10.1007/s10676-015-9380-y
Carah, N., & Louw, E. (2015). Media and Society: Production, Content and Participation. SAGE Publications Ltd.
Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter Bubbles, Echo Chambers, and Online News Consumption. Public Opinion Quarterly, 80(S1), 298–320. https://doi.org/10.1093/poq/nfw006
Hern, A. (2017, May 22). How social media filter bubbles and algorithms influence the election. The Guardian. https://www.theguardian.com/technology/2017/may/22/social-media-election-facebook-filter-bubbles
Möller, J., Trilling, D., Helberger, N., & van Es, B. (2018). Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. Information, Communication & Society, 21(7), 959–977. https://doi.org/10.1080/1369118X.2018.1444076
Nguyen, T. T., Hui, P.-M., Harper, F. M., Terveen, L., & Konstan, J. A. (2014). Exploring the filter bubble: The effect of using recommender systems on content diversity. Proceedings of the 23rd International Conference on World Wide Web, 677–686. https://doi.org/10.1145/2566486.2568012
Owen, D., & Smith, G. (2015). Survey Article: Deliberation, Democracy, and the Systemic Turn. Journal of Political Philosophy, 23(2), 213–234. https://doi.org/10.1111/jopp.12054
Owen, L. H. (2017). A new feature in The Washington Post's Opinion section will alert readers to opposite viewpoints (with the help of AI). Nieman Lab. https://www.niemanlab.org/2017/11/a-new-feature-in-the-washington-posts-opinion-section-will-alert-readers-to-opposite-viewpoints-with-the-help-of-ai/
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin.
Pasquale, F. (2015). The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press. https://www.jstor.org/stable/j.ctt13x0hch
Sanchez, C. (2016). A software developer created a way for you to escape your political bubble on Facebook. Business Insider. https://www.businessinsider.com/escape-your-bubble-facebook-news-feed-2016-12
Smith, B. (2017). Helping You See Outside Your Bubble. BuzzFeed. https://www.buzzfeed.com/bensmith/helping-you-see-outside-your-bubble

