Algorithmic governance and the anti-vaxxer during the COVID-19 Pandemic

Interpretation of networks (Kindler, 2019).

You’re searching for a dress! You scour the internet for the perfect one, but you can’t seem to find what you’re looking for, so you close your browser and open up Facebook. Suddenly you’re inundated with ads for dresses from the same websites you just visited, and now you’re wondering how Facebook knows you so well.

Have you ever thought about what the purpose of an algorithm is? Or what datafication or automation really entails? Well, I’m about to explain it to you, but for you to really grasp what I’m talking about, let’s take ourselves back to the dark month of March 2020, when we were at the beginning of what we now know as the COVID-19 pandemic.

While we were learning how to make sourdough starters and whipped coffee, and binging Tiger King on Netflix, the anti-vaccination movement was already in full swing, even before the vaccines had been developed. Facebook groups sprang up rife with COVID-19 misinformation, and YouTube videos about the dangers of vaccines were trending worldwide (De Vynck & Lerman, 2021).

Now, two years later, we have the knowledge to work out exactly how the anti-vaxxer movement, supercharged by the pandemic, was able to take hold.

Before I delve into the mechanics of the power of anti-vaxxers on social media, let’s have an overview of what algorithms are, how they work, and why they are fundamental to the running of the internet.

Understanding these definitions is important if we want to explore and examine the ways organisations can use the internet to influence us, so let’s break them down.

What is an algorithm?

An academic definition of an algorithm is “the set of programs that implement or express that algorithm” (Yanofsky, 2011, p. 253). While that definition isn’t incorrect by any means, it is a little difficult to parse. A simpler way to put it: an algorithm is a sequence of instructions and statements that takes an input and produces an output.
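To make that concrete, here is a minimal sketch of an algorithm in that sense, written in Python purely for illustration: a fixed sequence of steps that takes an input and produces an output.

```python
def average_rating(ratings):
    """A tiny algorithm: takes a list of ratings (the input) and
    returns their average (the output) by following fixed steps."""
    if not ratings:              # step 1: handle the empty case
        return None
    total = sum(ratings)         # step 2: add the ratings together
    return total / len(ratings)  # step 3: divide by how many there are

print(average_rating([4, 5, 3, 5]))  # -> 4.25
```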

In a recent real-world scenario, we can think of TikTok as the perfect example of an algorithm at work. For those who aren’t familiar with TikTok, it’s an app where users upload short videos, roughly 15 seconds to 3 minutes long, often using humour to relate to other users.

TikTok, much like YouTube’s recommended page, has a ‘For You’ page, where the algorithm observes the videos you’ve already watched, collects similar ones, and places them on your ‘For You’ page (Matsakis, 2020). So, while you may be sitting there thinking, “wow, TikTok really gets me!”, it’s their precise algorithm that is in charge of perfectly curating the content to your likes and interests.
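TikTok’s real model is far more sophisticated and not public, but a heavily simplified, hypothetical sketch of that ‘watch history in, similar videos out’ idea could look like this (the videos and tags are invented for illustration):

```python
# Hypothetical 'For You'-style recommender: rank unseen videos by how many
# tags they share with videos the user has already watched.
watched = [
    {"id": "v1", "tags": {"cooking", "sourdough"}},
    {"id": "v2", "tags": {"comedy", "pets"}},
]
candidates = [
    {"id": "v3", "tags": {"cooking", "pasta"}},
    {"id": "v4", "tags": {"fitness"}},
    {"id": "v5", "tags": {"comedy", "cooking"}},
]

liked_tags = set().union(*(v["tags"] for v in watched))

def overlap(video):
    # More shared tags with the watch history means a higher score.
    return len(video["tags"] & liked_tags)

for_you = sorted(candidates, key=overlap, reverse=True)
print([v["id"] for v in for_you])  # -> ['v5', 'v3', 'v4']
```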

TikTok sidebar menu (TikTok, 2022).

What is datafication?

We can think of datafication as a way that different “subjects, objects, and practices” are converted into a “digital data” format (Southerton, 2020, p. 1).
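As a toy illustration (the fields are invented, not any platform’s real schema), datafication is the step that turns an everyday practice, such as watching an episode, into a structured record that software can store and analyse:

```python
from datetime import datetime

# Hypothetical example: the everyday practice of "watching an episode"
# becomes a structured, machine-readable record.
viewing_event = {
    "user_id": "user_123",
    "title": "Tiger King",
    "episode": 1,
    "watched_at": datetime(2020, 3, 25, 21, 30).isoformat(),
    "completed": True,
    "device": "smart_tv",
}

print(viewing_event["title"], viewing_event["watched_at"])
```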

A great example of datafication is streaming services, notably Netflix. Netflix started off similarly to companies like Blockbuster, except that instead of waiting in line for new releases, you could order a DVD that was sent straight to you.

In this day and age, the thought of waiting for a TV show to arrive in the mail seems unfathomable, which is why, as technology advanced, Netflix converted all those movies and television shows into a digital format.

What is automation? 

It’s no surprise that in 2022 automation is a natural part of our lives, from streaming services and ‘self-driving’ cars to even your washing machine. When we think of automation, we often equate it with the loss of human labour, and in some ways, that’s exactly what it is.

Through algorithms and datafication, automation has transformed into a form of “hyper customisation” (Andrejevic, 2019, p. 45), with internet users able to filter the content they’re exposed to.

Let’s think of Facebook! It started off as a way for users to communicate without physically seeing each other (Barr, 2018), but now many users rely on it for their news consumption (Somaiya, 2014). In Australia, for example, approximately 60% of users get their news via social media (Watson, 2022).

Much like buying a newspaper from a specific news organisation, users are able to filter which news sites they want to consume from (Rodriguez, 2021). But this is where misinformation and fake news thrive: users who lack media literacy fall into the trap of accidentally distributing fake content (Brashier & Schacter, 2020), and the more users who share a post, the more highly the algorithm prioritises it.
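A hypothetical sketch of that dynamic (not Facebook’s actual ranking code) might combine a user’s chosen sources with a share count, so heavily shared posts float to the top regardless of their accuracy:

```python
# Hypothetical feed ranking: keep only posts from sources the user follows,
# then sort by share count. Nothing here checks whether a post is true.
posts = [
    {"source": "reliable_news", "shares": 120, "headline": "Local election results"},
    {"source": "fake_news_site", "shares": 5400, "headline": "Miracle cure revealed"},
    {"source": "reliable_news", "shares": 300, "headline": "New health guidelines"},
]
followed = {"reliable_news", "fake_news_site"}

feed = sorted(
    (p for p in posts if p["source"] in followed),
    key=lambda p: p["shares"],
    reverse=True,
)
print([p["headline"] for p in feed])  # the most-shared post comes first
```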

Facebook user engagement features (Facebook, 2022).

What is algorithmic governance?

Now that we’ve looked at what algorithms, datafication, and automation are, it’s time to see how they all work together to shape the way we, as users, consume the internet.

The idea that big data and “algorithmic selection” are working together for “reality construction” is not new (Just & Latzer, 2017, p. 239). For context, big data refers to “massive data sets” containing a “complex structure” (Sagiroglu & Sinanc, 2013, p. 42).

The leading theory involved with algorithmic governance is that these algorithms control the way we perceive the world, observe our society, and ultimately influence our thoughts and opinions (Just & Latzer, 2017).

Going back to our news consumption example, users can develop a warped perception of society through those filtering devices and by following unreliable sites. The U.S. 2020 election illustrates this phenomenon: exposed to misinformation campaigns amplified by algorithms, users incorrectly believed that President Joe Biden’s election was rigged and that Donald Trump was the rightful winner (Hern, 2020). This belief ultimately contributed to the January 6th Capitol riot.

As you can see, algorithms and automation have the power and ability to influence our lives. From curated feeds to filter bubbles, our digital selves are vulnerable to fake news and misinformation campaigns.

What is the anti-vaxxer movement? 

In order to understand the current movement, let me take you on a journey back in time!

Before the COVID-19 pandemic, and even before the Spanish Flu, smallpox ran wild in the 18th century, particularly in Britain (Grignolio, 2018). Despite smallpox having been around since the time of the ancient Egyptians, the 18th century saw a sudden epidemic crisis in which children were the main victims.

At the time, inoculation or variolation, the predecessor of what we now know as vaccines, was the preferred method of protection against the virus. The process involved taking a small sample of the virus and ‘infecting’ oneself to trigger an immune response (Grignolio, 2018).

Satirical painting of Edward Jenner vaccinating patients, titled ‘The Cow-Pock-or-the Wonderful Effects of the New Inoculation!’ (Gillray, 1802).

In the 1790s, Edward Jenner created the smallpox vaccine, and with it came the birth of the anti-vaxxer movement (North, 2022).

When the vaccine was mandated in the 19th century, resistance was immediate, and protests not dissimilar to today’s soon followed. Opponents also spread vaccine hesitancy and misinformation through pamphlets, which we could see as the equivalent of modern-day Facebook posts.

Over time, and with the emergence of new viruses, the anti-vaccination movement ebbed and flowed in popularity (North, 2022).

The movement we have in contemporary society can be traced back to 1998 and Andrew Wakefield, a doctor who used his cultural capital to suggest that a link existed between the measles, mumps, and rubella (MMR) vaccine and autism spectrum disorder (Watson, 2019).

Despite being thoroughly debunked, this supposed correlation has been treated by anti-vaxxers as a monumental piece of evidence for promoting their ideology.

Now that we have had our overview, how does the movement of the past influence the movement of today? How has today’s movement used the internet to its advantage?

To recap:

  • Algorithms are sequences of instructions that take an input and produce an output
  • Datafication is a process where objects are converted into a digital data format
  • Automation is when we replace human actors with technical actors

So, how does what we now know apply to the COVID-19 anti-vaxxer movement? 

Although it’s easy to dismiss anti-vaxxers as misinformed ‘Karens’, the proliferation of the movement is much more complex: it utilised social media algorithms and automation to influence users who had not previously been against vaccinations (Weinberg & Dawson, 2021).

As mentioned before, digital literacy is essential when looking for reputable sources online. Without knowing how to distinguish between what’s real and what’s fake, you make yourself vulnerable to that influence.

A study from 2019 (so before the current pandemic) showed that anti-vaccine sentiment was growing rapidly (Burki, 2020, p. 504):

  • Approximately 7.8 million users followed anti-vaccination accounts across social media platforms
  • Around 31 million users on Facebook followed anti-vax groups

These statistics indicate that the movement was strong right before COVID-19 gained control over our lives. That means the users who were already clicking, liking, sharing, and retweeting this content, and thereby feeding the algorithms and automation, were primed to be exposed to new misinformation about COVID-19 (Weinberg & Dawson, 2021).

Audience engagement is the biggest money-maker for companies like Meta, formerly Facebook. Likes, shares, and comments are some of the strongest indicators of audience participation, and social media platforms’ algorithms are designed to promote engagement (Weinberg & Dawson, 2021).

We have to remember that automation and algorithms on platforms like Facebook aren’t human. They’re a form of artificial intelligence (AI) and lack a moral compass, so they cannot determine what is morally or ethically right or wrong. All they’re programmed to do is look at the numbers and promote content based on engagement statistics.
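In other words, the ranking logic is blind to truth. A deliberately crude sketch (with invented weights, not any platform’s actual formula) makes the point: the only inputs are engagement counts.

```python
# A crude, hypothetical engagement score: likes, shares, and comments go in,
# a number comes out. There is no input for "is this claim accurate?".
def engagement_score(likes, shares, comments):
    return likes + 3 * shares + 2 * comments  # invented weights

accurate_post = engagement_score(likes=80, shares=10, comments=15)       # 140
misleading_post = engagement_score(likes=500, shares=400, comments=250)  # 2200
print(accurate_post, misleading_post)  # the misleading post wins on numbers alone
```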

And this is precisely how the anti-vaxxer movement was allowed to proliferate during the early years of the COVID-19 pandemic, mainly because thousands of the human moderators who filtered harmful content were sent home at the start of the pandemic due to lockdowns in the United States, leaving the algorithms in charge (Dwoskin & Tiku, 2020).

Through Facebook’s algorithms, if a user were to follow one or two groups that promoted anti-vaccination sentiment, the “you may like” feature would suggest similar groups to follow. So the pool of “new recruits” keeps growing, and soon enough thousands, then millions, are following these groups (Weinberg & Dawson, 2021, p. 2).
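A simplified, hypothetical version of that “you may like” mechanic could suggest groups purely from overlapping membership, which is how following one anti-vax group can quietly lead to several more (the groups and members below are invented):

```python
from collections import Counter

# Hypothetical group suggestions based on overlapping membership: recommend
# the groups most often joined by members of the groups the user is already in.
group_members = {
    "natural_parenting": {"ann", "ben", "cara", "dev"},
    "vaccine_questions": {"ann", "ben", "eli"},
    "holistic_health": {"ben", "cara", "eli", "fay"},
    "local_gardening": {"gus", "hana"},
}

def suggest_groups(user, joined):
    counts = Counter()
    for group in joined:
        for member in group_members[group]:
            if member == user:
                continue
            for other, members in group_members.items():
                if other not in joined and member in members:
                    counts[other] += 1
    return [name for name, _ in counts.most_common()]

print(suggest_groups("ann", {"vaccine_questions"}))
# -> ['holistic_health', 'natural_parenting'] (the gardening group never appears)
```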

Facebook search page (Facebook, 2022).

Algorithms are designed to reinforce content that will amp up engagement, and when that happens, misinformation can spread like wildfire, which is exactly what happened during the COVID-19 pandemic.

According to Facebook whistleblower Frances Haugen, algorithms are helping spread extremist views; as she states, they are “unquestionably making it worse” (Chan, 2021).

Can we stop the spread of these kinds of views? 

The short answer is not really, but that doesn’t mean things aren’t already changing.

Deplatforming has been discussed as a way to tackle vaccine hesitancy and the growth of anti-vaccination groups. Deplatforming is the process of removing groups and individuals who hold extremist views or values from a platform, making it harder for them to communicate and gather (Romano, 2021). The primary issue with deplatforming is that it arguably encroaches on freedom of speech and could be viewed as a form of censorship (Armitage, 2021).

While Facebook has previously threatened a ban, it has yet to follow through, and instead has created policies that remove COVID-19 misinformation that “contributes to the risk of imminent violence or physical harm” and redirect users to reliable content (Scott, 2020).

Making the algorithms public is another idea circulating as a way to prevent further misinformation campaigns, but with Meta owning large, popular social media platforms like Facebook and Instagram, it is unlikely its algorithms or AI technology will ever be released.

It’s easy to feel bombarded by talk of algorithms controlling our lives, and easy to worry about privacy and the rise of AI technology, but in the modern world, being digitally literate is the key to avoiding being pulled into misinformation campaigns.

Before you follow a Facebook group or share a post, check whether it comes from a reliable source and whether the information is accurate. And remember: with every like, share, and post, the algorithms will target others who share your ideas.

References:

Andrejevic, M. (2019). Automated media. Routledge.

Armitage, R. (2021). Online ‘anti-vax’ campaigns and COVID-19: Censorship is not the solution. Public Health, 190, 29–30. https://doi.org/10.1016/j.puhe.2020.12.005

Barr, S. (2018, August 23). When did Facebook start? The story behind a company that took over the world. The Independent. Retrieved from https://www.independent.co.uk/life-style/gadgets-and-tech/facebook-when-started-how-mark-zuckerberg-history-harvard-eduardo-saverin-a8505151.html

Brashier, N. M., & Schacter, D. L. (2020). Aging in an era of fake news. Current Directions in Psychological Science, 29(3), 316–323.

Burki, T. (2020). The online anti-vaccine movement in the age of COVID-19. The Lancet Digital Health, 2(10), 504–505.

Chan, K. (2021, October 25). Facebook whistleblower says the platform is making online hate worse. The LA Times. Retrieved from https://www.latimes.com/world-nation/story/2021-10-25/whistleblower-haugen-facebook-making-online-hate-worse

De Vynck, G., & Lerman, R. (2021, July 22). Facebook and YouTube spent a year fighting covid misinformation. It’s still spreading. The Washington Post. Retrieved from https://www.washingtonpost.com/technology/2021/07/22/facebook-youtube-vaccine-misinformation/

Dwoskin, E., & Tiku, N. (2020, March 24). Facebook sent home thousands of human moderators due to the coronavirus. Now the algorithms are in charge. The Washington Post. Retrieved from https://www.washingtonpost.com/technology/2020/03/23/facebook-moderators-coronavirus/

Facebook. (2022). Search menu [Screenshot]. https://m.facebook.com/?soft=search

Facebook. (2022). User engagement features [Screenshot]. https://www.facebook.com/

Gillray, J. (1802). The Cow-Pock-or-the Wonderful Effects of the New Inoculation! [Painting]. British Museum. https://www.britishmuseum.org/collection/object/P_1851-0901-1091

Grignolio, A. (2018). Vaccines: are they worth a shot?. Springer.

Hern, A. (2020, November 11). Trump’s vote fraud claims go viral on social media despite curbs. The Guardian. Retrieved from https://www.theguardian.com/us-news/2020/nov/10/trumps-vote-claims-go-viral-on-social-media-despite-curbs

Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157

Kindler, M. (2019). Science netherland Amsterdam [Photo]. Unsplash. https://unsplash.com/photos/G66K_ERZRhM

Matsakis, L. (2020, June 10). TikTok finally explains how the ‘for you’ algorithm works. Wired. Retrieved from https://www.wired.com/story/tiktok-finally-explains-for-you-algorithm-works/

North, A. (2022, March 4). The long, strange history of anti-vaccination movements. Vox. Retrieved from https://www.vox.com/the-goods/22958419/covid-vaccine-mandate-pandemic-history

Rodriguez, S. (2021, March 31). Facebook will let users control more of what they see rather than forcing them to rely on algorithms. CNBC. Retrieved from https://www.cnbc.com/2021/03/31/facebook-gives-users-more-control-over-content-with-feed-filter-bar.html#:~:text=Facebook%20said%20it%20will%20introduce,friends%20and%20pages%20they%20follow

Romano, A. (2021, January 21). Kicking people off social media isn’t about free speech. Vox. Retrieved from https://www.vox.com/culture/22230847/deplatforming-free-speech-controversy-trump

Sagiroglu, S., & Sinanc, D. (2013). Big data: A review. International Conference on Collaboration Technologies and Systems (CTS), 42–47. https://doi.org/10.1109/CTS.2013.6567202

Scott, M. (2020, March 30). Facebook’s private groups are abuzz with coronavirus fake news. Politico. Retrieved from https://www.politico.eu/article/facebook-misinformation-fake-news-coronavirus-covid19/

Somaiya, R. (2014, October 26). How facebook is changing the way its users consume journalism. The New York Times. Retrieved from https://www.nytimes.com/2014/10/27/business/media/how-facebook-is-changing-the-way-its-users-consume-journalism.html

Southerton, C. (2020). Datafication. In L. Schintler & C. McNeely (Eds.), Encyclopedia of big data. Switzerland: Springer. https://doi.org/10.1007/978-3-319-32001-4_332-1

TikTok. (2022). Sidebar menu [Screenshot]. https://www.tiktok.com/foryou?lang=en

Watson, A. (2022, February 15). Social media news worldwide – statistics & facts. Statista. https://www.statista.com/topics/9002/social-media-news-consumption-worldwide/#dossierKeyfigures

Watson, J. (2019, May 31). The anti-vaccination movement’s history dates back to the very first vaccine. ABC News. https://www.abc.net.au/news/2019-05-31/anti-vaccination-movement-history-dates-back-to-first-vaccine/11153102

Weinberg, D., & Dawson, J. (2021). From Anti-Vaxxer Moms to Militia Men: Influence Operations, Narrative Weaponization, and the Fracturing of American Identity. Foreign Policy at Brookings, 1–33.

Yanofsky, N. S. (2011). Towards a definition of an algorithm. Journal of Logic and Computation, 21(2), 253–286.