Pop that Filter Bubble and Avoid Opinion Polarization: TikTok is at War

Introduction

Each year, trusting the news seems to become more difficult. Remember when Donald Trump shut down a CNN reporter by saying “you are fake news”? What about those absurd conspiracy theories linking Covid-19 to 5G? Over the past two decades, social media platforms have established themselves as essential tools for communication, but they have also become a popular means of information seeking and news consumption. This worries many observers, because the rapid development of social media has contributed to the spread of problematic information. You might even find yourself stuck in a filter bubble.

Flew (2021) argues that two main factors contribute to the spread of misinformation and disinformation. The first is the growing distribution of information on social media. Undoubtedly, the affordances and fast-paced nature of social media platforms contribute to the widespread dissemination of information. Some argue that these affordances, namely algorithms, create filter bubbles and echo chambers. While the two terms often overlap, a subtle distinction separates them. The term “filter bubble”, coined by Eli Pariser, is widely used to describe an online environment in which people are exposed only to particular opinions and information. Bruns (2019) specifies that a filter bubble emerges when “a group of participants chooses to preferentially communicate with each other, to the exclusion of outsiders (e.g., by comments on Facebook, @mentions on Twitter, etc.)”. Echo chambers, by contrast, emerge when users choose to preferentially connect with each other, for example by friending or following. Both concepts, however, raise the same concern: users are exposed to a limited range of views and opinions.
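To make the distinction concrete, here is a deliberately simplified Python sketch. The users, the two toy graphs, and the “insularity” measure are our own illustrative assumptions, not anything from Bruns (2019); the point is only that the same group can be insular along two different dimensions: who it connects with (echo chamber) and who it communicates with (filter bubble).

    # Toy illustration of the connect/communicate distinction, using made-up
    # users and a simple insularity score of our own devising.
    follows = {   # who connects with whom (the echo-chamber dimension)
        "ana": {"ben", "cal"}, "ben": {"ana", "cal"}, "cal": {"ana", "ben"},
        "dee": {"eli"}, "eli": {"dee"},
    }
    replies = {   # who communicates with whom (the filter-bubble dimension)
        "ana": {"ben"}, "ben": {"ana"}, "cal": {"ana"},
        "dee": {"ana"}, "eli": {"dee"},
    }
    group = {"ana", "ben", "cal"}  # a cluster of like-minded users

    def insularity(graph, members):
        # Share of the group's outgoing ties that stay inside the group.
        ties = [(u, v) for u in members for v in graph.get(u, ())]
        inside = [t for t in ties if t[1] in members]
        return len(inside) / len(ties) if ties else 0.0

    print(insularity(follows, group))  # 1.0: the group only connects inward
    print(insularity(replies, group))  # 1.0: and only communicates inward

A group can score high on one measure and low on the other; when it scores high on both, as here, it is both an echo chamber and a filter bubble in Bruns’ terms.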

The second factor is the growth of political polarization. In the United States, for example, there is a growing divide between Democrats and Republicans, especially since the 2016 presidential election (Azzimonti and Fernandes, 2018). Streams of fake news have slipped through the internet. Gu, Kropotov, and Yarochkin (2016) defined fake news as “…the promotion and propagation of news articles via social media…The news stories distributed are designed to influence or manipulate users’ opinions on a certain topic towards certain objectives” (as cited in Azzimonti and Fernandes, 2018). While propaganda is nothing new, social media has made it extremely easy for ideas to reach a wide audience, and at the same time difficult for platforms to moderate and govern their circulation.

Keeping these characteristics in mind, we will explore how TikTok’s affordances and features influence the spread of problematic information. We will then examine how filter bubbles and echo chambers contribute to further political polarization in the context of the Russia-Ukraine war. Finally, we will consider several solutions that could help mitigate the spread of problematic information.

TikTok at war

In the wake of the Russian invasion of Ukraine on February 24, 2022, content related to the war increased substantially: by March 17, TikTok videos tagged #Ukraine had surpassed 30 billion views (Paul, 2022). Because TikTok is a content-heavy platform on which users can easily post and recreate videos, this inevitably contributed to an enormous surge of mis- and disinformation about the war. For example, videos of injured civilians and explosions were uploaded as if they were live reports from Ukraine. Some non-Ukrainian users even abused TikTok’s live-streaming feature for financial gain, attempting to “mimic some of the scenes in Ukraine, with users pointing cameras at nondescript scenes or looping video and playing sirens in the background” (Tenbarge and Collins, 2022) in order to solicit donations.

Multiple viral videos featured an audio recording of Ukrainian border guards on Snake Island confronting a Russian military unit. In the exchange, the guards were told to “lay down your weapons” or “be hit”, to which they responded, “Russian warship, go to hell”. The guards were reported killed, and President Volodymyr Zelenskyy of Ukraine personally announced that each would be awarded the title Hero of Ukraine. A few days later, however, Ukrainian officials confirmed in a Facebook post that the men were alive and had been taken prisoner by Russian forces. The TikTok videos reporting the guards’ deaths were never corrected or taken down (Frenkel, 2022).

Features such as audio and video editing are central to TikTok creation, but they can also strip content of its context. For example, Figure 1 shows a screenshot of a TikTok video that used audio of people screaming, followed by an explosion in the background (Richards 2022, as cited in Tenbarge and Collins, 2022). The audio, however, was taken from a video of the 2020 explosion in Beirut, Lebanon. It was placed over shaky footage of a man anxiously running away from a balcony. At first glance the audio and visuals seem to match, so it is easy for viewers to quickly accept the video as genuine. Ioana Literat of Columbia University notes that “emotive videos, emotional videos, can really make people skip the verification stage and not give enough thought to the accuracy of the video and not really exercise their media literacy” (as cited in Rosenblatt and Tenbarge, 2022).

Figure 1. Screenshot image of TikTok Video. Retrieved from Media Matters.

Additionally, TikTok provides a plethora of editing tools, including something as simple as adding text. Figure 2 shows how users placed text reading “UKRAINE LIVE” over footage of military units in uniform, lines of military vehicles, and explosions. Simple phrases like these easily mislead viewers into thinking the footage was taken recently (Nilsen et al., 2022). TikTok videos generally lack context, leaving users to interpret the content as best they can.

Figure 2. Screenshot image of TikTok Video. Retrieved from The Media Manipulation Casebook.

All these features help videos reach a wider audience; at the same time, simple interactions such as commenting and sharing feed the recommendation algorithm. TikTok’s main feed, the “For You Page”, is where users find videos curated specifically for them. As Pariser (2011) writes, “your identity shapes your media, and your media then shapes what you believe and what you care about. You click on a link, which signals an interest in something, which means you’re more likely to see articles about that topic in the future, which in turn prime the topic for you”.

It’s worth noting that TikTok’s audience is considerably younger than that of other media platforms. In the United States, as of March 2022, 32.5% of TikTok’s active users were aged 10 to 19 and 29.5% were aged 20 to 29 (Doyle, 2022). While TikTok allows young adults to initiate political discussions, it also exposes them to other users who may or may not share their beliefs. As the examples above show, the recommendation algorithm has the potential to trap users, often young ones, in a filter bubble. Just and Latzer (2017) note that the main goal of algorithms is the personalization of processes and results: algorithm-based recommendation systems shape “daily lives and realities, affect the perception of the world, and influence behavior” (Just and Latzer, 2017). Users who rely too heavily on the algorithm may not even realize that they are being drawn into a confined circle on the Internet. Simply put, things escalate when users unknowingly stumble into echo chambers, where contact with contradicting views is filtered out, further promoting political polarization online.
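To make this feedback loop concrete, here is a deliberately simplified Python sketch of an engagement-driven feed. The topics, weights, and boost factor are all illustrative assumptions on our part; TikTok’s actual recommendation system is proprietary and far more complex.

    import random
    from collections import Counter

    # Toy model of an engagement-driven feed: every interaction nudges the
    # user's inferred interest profile, which biases what is shown next.
    # Topics, weights, and the boost factor are invented for illustration.
    TOPICS = ["news", "comedy", "music", "war", "sports"]

    def recommend(weights, n=10):
        # Sample a feed of n videos, biased by the user's inferred interests.
        topics = list(weights)
        return random.choices(topics, weights=[weights[t] for t in topics], k=n)

    def simulate(rounds=20, boost=1.5):
        weights = {t: 1.0 for t in TOPICS}      # start with a neutral profile
        for _ in range(rounds):
            for topic in recommend(weights):
                if topic == "war":              # assume the user lingers on war content
                    weights[topic] *= boost     # engagement amplifies future exposure
        print(Counter(recommend(weights, n=100)))

    simulate()

On a typical run, the final 100-video sample is dominated by “war” content. Nothing was censored and no one intervened: the user’s own engagement, fed back through the ranking weights, narrowed the feed, which is exactly the dynamic Pariser describes.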

Are we trapped? 

Let us further examine how filter bubbles (and echo chambers) enabled political polarization during the Russia-Ukraine war. The two governments are fighting an information war on the Internet. Ukraine, for example, has strategically used social media to win over the hearts of the West. On February 27, 2022, the official Ukrainian Twitter account uploaded a video montage of a fighter plane coursing through the sky as enemy planes exploded around it, claiming the pilot had single-handedly shot down multiple Russian fighter planes. Within days, the pilot’s nickname, the “Ghost of Kyiv”, went viral across social media platforms. Impressive as it may seem, the original video was found on YouTube and was in fact realistic footage created with a digital combat simulator. While opinions were split (as they always are), most people perceived this as a form of fake-news propaganda. Thompson and Davey (2022) quote Laura Edelson, a computer scientist studying misinformation at New York University, who explains that Ukraine is “telling stories that support their narrative” and that “sometimes false information is making its way in there, too, and more of it is getting through because of the overall environment”. Ukrainian leaders have built their messaging around evoking emotion, and by doing so the government is succeeding in reaching its own people as well as international audiences.

Russia, by contrast, has carried its narratives through China. On February 24, for example, “the Chinese Communist Party’s Global Times posted a video saying that a large number of Ukrainian soldiers had surrendered, citing the Russian state-controlled media network RT” (Goldenziel, 2022). Chinese state television programs also used social media to report that Ukrainian president Volodymyr Zelenskyy had fled Kyiv (which was, in fact, not true). Despite China’s support, Russia is struggling to push its narratives to the West, mainly because social media companies have blocked or restricted access to their platforms.

TikTok, for example, suspended all new content from Russia after a Russian law took effect under which anyone intentionally spreading fake news about the military faces up to 15 years in jail (Mellor, 2022). The company also released a statement introducing a new in-app section of digital literacy tips “to help our community evaluate and make decisions about the content they view online” (as cited in Mellor, 2022). Clearly, this was not enough: a good chunk of pro-Russian content was still being posted on social media in the United States. As noted above, the remixability of TikTok videos makes it difficult to differentiate problematic information from reliable content, and companies like TikTok have a long way to go before their initiatives can reliably filter such material.

So, how do we pop the bubble?

We have explored how the affordances of social media contribute to the spread of mis- and disinformation. During the Russia-Ukraine war, users turned to TikTok to post content. Whether that content was posted innocently or for financial or political gain, TikTok’s remixing features, such as overlaid audio and added text, enabled problematic information to spread across the platform. Although these tools are easy to use, they play a huge role in misleading thousands, if not millions, of viewers. We also explored how interacting with content on TikTok shapes a user’s recommendations; this personalization can place an individual in a filter bubble, which promotes political polarization. Finally, we saw how both Ukraine and Russia used media platforms to write their own narratives, in hopes of shaping what people believe about the war.

To avoid further polarization, users need to seek out news and opinions actively and on their own; they should not rely on social media, which depends heavily on algorithms, as their primary source of news. And while technical solutions for mitigating the spread of mis- and disinformation are hard to build, we as users can remain critical about how we interpret content on social media. As Pariser (2011) puts it: “…I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it’s not going to do that if it leaves us all isolated in a Web of one”.

References

Azzimonti, M., & Fernandes, M. (2018, April 2). Social Media Networks, Fake News, and Polarization. SSRN. Retrieved from https://ssrn.com/abstract=3154245

Bruns, A. (2019, November 29). Filter bubble. Internet Policy Review. Retrieved from https://doi.org/10.14763/2019.4.1426

Doyle, B. (2022, March 8). TikTok Statistics – Updated March 2022. Wallaroo Media. Retrieved from https://wallaroomedia.com/blog/social-media/tiktok-statistics/

Flew, T. (2021). Fake news, trust, and behaviour in a digital world. In Research Handbook on Political Propaganda. Cheltenham, UK: Edward Elgar Publishing. Retrieved from https://doi-org.ezproxy.library.sydney.edu.au/10.4337/9781789906424.00009

Frenkel, S. (2022, March 6). TikTok is gripped by the violence and misinformation of Ukraine war. New York Times. Retrieved from http://ezproxy.library.usyd.edu.au/login?url=https://www.proquest.com/newspapers/tiktok-is-gripped-violence-misinformation-ukraine/docview/2636121497/se-2?accountid=14757

Goldenziel, J. (2022, March 31). The Russia-Ukraine information war has more fronts than you think. Forbes. Retrieved from https://www.forbes.com/sites/jillgoldenziel/2022/03/31/the-russia-ukraine-information-war-has-more-fronts-than-you-think/?sh=6d29d053a1e2

Just, N., & Latzer, M. (2017, March 1). Governance by algorithms: reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. Retrieved from https://doi.org/10.1177/0163443716643157

Mellor, S. (2022, March 22). TikTok slammed for videos sharing false information about Russia’s war on Ukraine. Fortune. Retrieved from https://fortune.com/2022/03/21/tiktok-misinformation-ukraine/

Nilsen, J., Fagan, K., Dreyfuss, E., & Donovan, J. (2022, March 10). TikTok, the war on Ukraine, and 10 features that make the app vulnerable to misinformation. The Media Manipulation Casebook. Retrieved from https://mediamanipulation.org/research/tiktok-war-ukraine-and-10-features-make-app-vulnerable-misinformation

Pariser, E. (2011, March). Beware online “filter bubbles” [Video]. TED Conferences. Retrieved from https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin Press.

Paul, K. (2022, March 20). TikTok was ‘just a dancing app’. Then the Ukraine war started. The Guardian. Retrieved from https://www.theguardian.com/technology/2022/mar/19/tiktok-ukraine-russia-war-disinformation

Richards, A. (2022, February 25). TikTok is facilitating the spread of misinformation surrounding the Russian invasion of Ukraine. Media Matters. Retrieved from https://www.mediamatters.org/russias-invasion-ukraine/tiktok-facilitating-spread-misinformation-surrounding-russian-invasion

Rosenblatt, K., & Tenbarge, K. (2022, March 5). Ukraine fights back on TikTok, where war is fought with memes and misinformation. NBC News. Retrieved from https://www.nbcnews.com/tech/tech-news/tiktok-ukraine-war-misinformation-propaganda-rcna18146

Tenbarge, K., & Collins, B. (2022, February 26). War in Ukraine sparks new wave of misinformation. NBC News. Retrieved from https://www.nbcnews.com/tech/tech-news/war-ukraine-sparks-new-wave-misinformation-rcna17779

Thompson, S. A., & Davey, A. (2022, March 3). Fact and mythmaking blend in Ukraine’s information war. New York Times. Retrieved from http://ezproxy.library.usyd.edu.au/login?url=https://www.proquest.com/newspapers/fact-mythmaking-blend-ukraine-s-information-war/docview/2635253112/se-2?accountid=14757