What really stops YouTube from promoting vaccination is the filter bubble created by YouTube itself!

In recent years, social media has played an increasing role in facilitating news and information consumption (Spaans, 2021, p. 132). Have you noticed that people who like watching variety shows are shown the latest variety-show news first, while people who like shopping online are the first to receive discount offers? When I was discussing the topic of marriage with my friends, I searched the term “ring” on eBay on a whim. Almost immediately afterwards, I started receiving all kinds of push messages about marriage or rings from eBay and numerous other social media platforms.

 

Although algorithmic recommendation technology solves the problem of information flooding to some extent, it easily produces “information cocoons” and “echo chamber” effects, creates “filter bubbles”, and leads to “group polarization” (Spaans, 2021). We used to believe that what we see on the Internet is what we need, but in fact, what the Internet shows us is what it believes we want to see (Andrejevic, 2019). Because social media produces filter bubbles, ideological polarisation has intensified (Spaans, 2021, p. 132).

 

What is a filter bubble?

 

Eli Pariser, an Internet activist, coined the term “filter bubble” (Pariser, 2011). A filter bubble is the intellectual isolation that can arise when a website uses an algorithm to selectively guess what information a user wants to see and then serves that information to the user. These guesses are based on user signals such as past click activity, browsing history, search history, and location. As a result, websites tend to show only information that matches the user’s previous behaviour (Pariser, 2011). Moreover, filter bubbles reduce users’ exposure to opposing viewpoints, which can lead to an increasingly narrow view of the information available online (Holone, 2016).

 

However, the outcome is influenced by more than just your own actions. The algorithm also takes into account the interests and preferences of people in your social network, making you more likely to receive the search results your network would recommend. This becomes a concern when those choices contain components that tilt search results towards misinformation (Holone, 2016, p. 299).
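To make this mechanism concrete, here is a minimal sketch in Python of how such personalisation can seal a bubble. The scoring function, topic tags, and social weight are all hypothetical illustrations, not any platform’s real algorithm.

```python
# Toy illustration of how a filter bubble can emerge: items similar to what
# a user (and their network) already clicked get ranked higher, so opposing
# viewpoints gradually disappear from the feed. Hypothetical scoring scheme.
from collections import Counter

def rank_items(items, user_history, friend_histories, social_weight=0.5):
    """Score each candidate by topic overlap with the user's own click
    history, plus a weighted overlap with friends' histories."""
    own = Counter(topic for item in user_history for topic in item["topics"])
    social = Counter(
        topic
        for history in friend_histories
        for item in history
        for topic in item["topics"]
    )
    def score(item):
        return sum(own[t] + social_weight * social[t] for t in item["topics"])
    return sorted(items, key=score, reverse=True)

candidates = [
    {"title": "Vaccines are safe: what the data shows", "topics": ["vaccine", "science"]},
    {"title": "10 shocking vaccine secrets", "topics": ["vaccine", "conspiracy"]},
]
history = [{"title": "Hidden truths they won't tell you", "topics": ["conspiracy"]}]
print(rank_items(candidates, history, friend_histories=[history]))
# The conspiracy-tagged video outranks the factual one: past clicks and the
# social network reinforce the bubble rather than widening it.
```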

 

Case study

 

As the COVID-19 vaccine has been distributed all over the world, anti-vaccine activists are using social media to propagate misinformation and fuel vaccine hesitancy (Jennings et al., 2021, p. 593). Fear of a COVID-19 vaccine stems from misconceptions that herd immunity alone gives protection, anxiety about the speed of vaccine development and possible side effects, and the belief that the virus was man-made and used for population control (Holone, 2016, p. 298). Those who get their information from unregulated online sources with personalised recommendations based on viewing history, and who believe in conspiracy theories, are less likely to get vaccinated.

 

Figure 1. YouTube and COVID-19 (Source: Amaan, 2021).

 

The role of the Internet in providing health information to the public

The Internet’s rise as a general-purpose medium has resulted in a well-documented shift in how people perceive their health status (Baker et al., 2003). The Internet can be used to search for anything from self-diagnosis to information about medicines, potential treatments for medical conditions, and epidemic outbreaks. Especially since the COVID-19 pandemic, low trust in institutions and conspiracy and anti-vax beliefs have gone hand in hand with increased reliance on social media for health information (Jennings et al., 2021, p. 593). Not only are large search engines like Google used for this kind of ‘research’; social platforms like Facebook, with its groups and pages, are frequently used to find like-minded people or others with similar health conditions (Baker et al., 2003, p. 2400).

 

As more online resources are used to obtain health information, people’s health choices will be shaped by them, so it is critical to examine the role filter bubbles play in that process. During the pandemic, whether to receive a COVID-19 vaccine became a cultural debate on social media, sparking a wave of anti-vaccination fanaticism. Many people came to regard vaccine hesitancy as ignorant or selfish, while anti-vaccine protestors questioned the safety and hygiene of early vaccination methods (Stolle et al., 2020, p. 4482). One of the major contributors to these cognitive deviations is the filter bubble in social media.

 

Algorithms on YouTube

Information dissemination has evolved from traditional manual gatekeeping to algorithmic gatekeeping as a result of the practical use of personalised recommendations (Andrejevic, 2019). When watching a video on YouTube, viewers see a list of recommended videos displayed on the right side of the page. The composition of this list is determined by two factors: the user’s search history, on YouTube itself and elsewhere, and the user’s digital profile, which YouTube builds using algorithms written by the platform’s developers. YouTube’s recommendation algorithm is designed to increase user engagement. Because of their provocative “clickbait” titles and focus on contentious issues that encourage engagement, the algorithm frequently promotes sensational and conspiracy-related videos (Spaans, 2021, p. 150). By recommending videos based on these signals, YouTube appears to know exactly what each viewer likes, is interested in, and would want to watch. When you choose a video, you’ll be taken to others similar to the first (Spaans, 2021, p. 135).
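The following sketch illustrates why such an engagement objective favours sensational content. The per-video click probabilities and watch-time estimates are invented for the example; this is not YouTube’s actual system.

```python
# Minimal sketch of an engagement-maximising ranker, assuming hypothetical
# per-video engagement estimates; illustrative only, not YouTube's system.
def recommend(videos, top_n=3):
    """Order videos by expected engagement: predicted click probability
    times predicted watch time. Titles that drive clicks and long sessions
    dominate, regardless of accuracy."""
    return sorted(
        videos,
        key=lambda v: v["p_click"] * v["expected_watch_minutes"],
        reverse=True,
    )[:top_n]

videos = [
    {"title": "WHO vaccine safety briefing", "p_click": 0.02, "expected_watch_minutes": 4.0},
    {"title": "SHOCKING: what's REALLY in the vaccine", "p_click": 0.09, "expected_watch_minutes": 9.0},
]
for video in recommend(videos):
    print(video["title"])
# The clickbait video wins: optimising for engagement rather than accuracy
# is what pushes sensational and conspiratorial content to the top.
```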

 

Anti-vaccine movement data on YouTube

COVID-19 is not only an epidemic disease but also an “infodemic” of complex and dynamic information, both factual and inaccurate, which fuels vaccine hesitancy (Loomba et al., 2021). YouTube has been the most influential social media platform in promoting anti-vaccination and COVID-19 denial campaigns since the launch of the COVID-19 vaccination programme (Spaans, 2021, p. 129). Anti-vaccination videos are spreading rapidly on YouTube: 7.8 million individuals have subscribed to such accounts since the pandemic began, and anti-vaccination channels have attracted at least 17 million followers globally (Spaans, 2021, p. 129).

 

Figure 2. The top 50 YouTube creators posting videos about vaccine hesitancy (Source: Clark, Robison & Ledwich, 2021).

Furthermore, between January 2020 and April 2021, some 3,634 videos concerning vaccination hesitancy and the pandemic were published on YouTube, garnering a total of 72 million views (Clark, Robison & Ledwich, 2021)! The top 50 YouTube creators with the most subscribers have declared their aversion to vaccination. People who have searched for anti-vax topics tend to be shown more of the same in future searches, producing a filter bubble of artificial confirmation bias that limits their exposure to other viewpoints and makes it increasingly difficult for them to recognise that such videos may be fraught with danger (Stolle et al., 2020, p. 4482).

 

Why finding correct vaccine information online isn’t easy

 

Filter bubbles promote the spread of fake news

Figure 3. YouTuber Léo Grasset was appalled by attempts to recruit him (Source: Charlie Haynes and Flora Carmichael, 2021).

 

The “filter bubble” mechanism of personalised recommendation makes it easy for fake news to be generated and spread, while burying a great deal of equally valuable, diverse information (Bozdag & Van Den Hoven, 2015, p. 249). Léo Grasset, a French YouTuber, and Mirko Drotschmann, a German journalist, both received emails from Fazze, a marketing agency that was secretly offering generous sponsorship deals to spread misinformation about the COVID-19 vaccine. Grasset revealed that the agency wanted him to share a story about a purported data leak of mortality statistics at the European Medicines Agency. The information he was asked to contribute had been culled from several sources and taken out of context.

 

The original intention of an intelligent recommendation algorithm is to counter “information overload”: to help users filter information, recommend content and support their decision-making with their consent (Spaans, 2021, p. 134). At present, however, most algorithmic mechanisms on the market have become the arbiters of what content a platform surfaces, pushing specific content to targeted users in order to attract traffic and capture greater commercial benefits. In fact, in the early days of the vaccines, many YouTubers were promoting false vaccine information. Fazze exploited YouTube’s algorithmic mechanism to attract attention and spread fake news and fabricated social-focus events. In the process, YouTube itself becomes a spreader of fake news.

 

Controversy Hidden in Algorithmic Mechanisms

Figure 4. A screenshot of Chinese vaccines as described by Western media on YouTube (Source: YouTube, 2022).

Figure 5. A screenshot of Western vaccines as described by Chinese media on YouTube (Source: YouTube, 2022).

 

Algorithms help consumers screen content while also screening which users receive which information, reflecting circles of disagreement, such as political disagreement. On July 17, 2021, White House Press Secretary Jen Psaki said at a news conference that the State Department had determined that Russia and China had been promoting vaccine misinformation through social media platforms, including “information that undermines Western vaccine development programs.” According to Chinese news sources, the Western media’s smear campaign against Chinese vaccines has never stopped. YouTube divides users with polarised opinions into different groups based on their past search behaviour and habits, and, most importantly, because people often organise themselves into groups that share their opinions, it is easy to fall into one-sided arguments within the information bubble (Spaans, 2021). This reflects the disputes hidden in the algorithmic mechanism.

 

The development of information technology also increases the risk of user information leakage. Internet big data collects users’ data, which not only provides the preconditions for “filter bubbles” but also breeds a hotbed for privacy leaks (Spaans, 2021). Because algorithmic systems operate in complex ways and their automated decision-making is opaque, it is difficult for users to exercise the right to know how their data is collected or the right to supervise how it is shared (Spaans, 2021, p. 134). This leaves the privacy terms signed between platform and user incapable of actually protecting user privacy.

 

What bubble-bursting strategies does YouTube implement?

 

In 2019, YouTube announced changes to its recommendation algorithm, saying it would stop recommending conspiracy videos, such as those discouraging people from vaccinating their families. YouTube also blocked some anti-vaccine videos from showing ads and earning money, and started displaying information about the danger of vaccine hesitancy in a panel below anti-vaccine videos. In September 2021, YouTube announced a blanket ban on vaccine misinformation and terminated the accounts of several prominent anti-vaccine influencers, including Joseph Mercola and Robert F. Kennedy Jr., citing the need to remove “shockingly harmful content”. These actions show that removing such content can stop misinformation-filled filter bubbles from forming and prevent individuals from being misled by false vaccine information when using the YouTube platform.

 

Filter bubbles are something we can choose

While filter bubbles do have negative effects, algorithms can’t account for everything. The open-ended nature of social media allows us to subscribe to like-minded people and follow only those who agree with us, which creates an echo chamber of information. When individuals are preoccupied with their personal filter bubbles, this can lead to a lack of critical discussion, divided opinion, and political discourse confined to echo chambers (Panke & Stephens, 2018, p. 248).

 

At the same time, filter bubbles are also a matter of human nature. A human weakness is that we do not want our ideas challenged; therefore, we all tend to read what we agree with. Filter bubbles, then, are something we can choose, and it is the same on YouTube. For example, next to recommended videos YouTube offers the options “Not interested” and “Don’t recommend channel”. People can build an information barrier around a filter bubble of their own choosing, blocking other ideas and creating the impression that our narrow self-interest is all there is (Pariser, 2011).

 

Conclusion

 

In conclusion, filter bubbles are caused by the combined effect of algorithms and our own choices. As a result, eliminating “filter bubbles” requires a combination of technical improvement, balanced media coverage and the prevention of social conflict. It will also require learning to take advantage of efficient distribution and accurate promotion of information, so that individuals can continuously broaden their horizons for obtaining information from YouTube channels in their own best interest.

 

 

References

 

Abul-Fottouh, D., Song, M., & Gruzd, A. (2020). Examining algorithmic biases in YouTube’s recommendations of vaccine videos. International Journal of Medical Informatics, 140, 104175.

 

Andrejevic, M. (2019). Automated culture. In Automated media (pp. 44-72). Routledge.

 

Baker, L., Wagner, T., Singer, S., & Bundorf, M. (2003). Use of the Internet and e-mail for health care information. JAMA, 289(18), 2400-2406. https://doi.org/10.1001/jama.289.18.2400

 

Bozdag, E., & Van Den Hoven, J. (2015). Breaking the filter bubble: democracy and design. Ethics and information technology, 17(4), 249-265.

 

Clark, S., Robison, M., & Ledwich, M. (2021). COVID vaccine hesitancy on YouTube [Blog post]. Retrieved 8 April 2022, from https://www.pendulumfn.com/reports/covid-vaccine-hesitancy-on-youtube?end=2021-05-30T23%3A00%3A00.000Z&start=2020-01-01T00%3A00%3A00.000Z

 

Holone, H. (2016). The filter bubble and its effect on online personal health information. Croatian Medical Journal, 57(3), 298-301.

 

Jennings, W., Stoker, G., Bunting, H., Valgarðsson, V., Gaskell, J., Devine, D., et al. (2021). Lack of trust, conspiracy beliefs, and social media use predict COVID-19 vaccine hesitancy. Vaccines, 9(6), 593.

 

Loomba, S., de Figueiredo, A., Piatek, S., de Graaf, K., & Larson, H. (2021). Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA. Nature Human Behaviour, 5(3), 337-348.

 

Panke, S., & Stephens, J. (2018). Beyond the echo chamber: Pedagogical tools for civic engagement discourse and reflection. Journal of Educational Technology & Society, 21(1), 248-263.

 

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.

 

Spaans, D. (2021). Conspiring algorithms: Tracing the anti-vaccination and COVID-19 conspiracy movement on YouTube. Leiden Elective Academic Periodical, 1(1), 129-154.

 

Stolle, L., Nalamasu, R., Pergolizzi, J., Varrassi, G., Magnusson, P., LeQuang, J., & Breve, F. (2020). Fact vs fallacy: The anti-vaccine discussion reloaded. Advances in Therapy, 37(11), 4481-4490.