Is there any connection between coronavirus and 5G technology? Is Bill Gates the actor behind the pandemic? Is Covid-19 a hoax? For each question, there are self-proclaimed online experts whose answer is yes. The 5G theory, claiming that the coronavirus outbreak was caused by 5G (fifth-generation broadband technology) electronic radiation, is among the most popular Covid-19 conspiracy theories (Shahsavari et al. 279). Other prominent conspiracies include claims that Bill Gates is using the pandemic as “a cover to launch a broad vaccination program to facilitate a global surveillance”, that “the virus is a bio-weapon released deliberately by the Chinese”, and that “the pandemic is a hoax perpetrated by a global cabal” (279). Over the last two years, these conspiracy theories and other conspiratorial narratives around Covid-19 have been proliferating on social media.
Scholars and media pundits have increasingly expressed concerns over social media’s role in spreading conspiracy theories online. Many journalists blamed social media not only for spreading conspiracies but also for serving as “a source for news and a breeding ground for pandemic conspiracies”. Before the Covid-19 pandemic, researchers had examined key factors contributing to the spread and stickiness of conspiracy theories (Douglas, Ang, and Deravi). Some argue that the Internet does not affect, and can even inhibit, the proliferation of conspiracy theories (Clarke; Uscinski and Parent). Others contend that online communities play a vital role in disseminating them (Douglas, Ang, and Deravi; Stecula and Pickup). In line with the latter, recent studies on coronavirus conspiracies argue that social media play a central role in disseminating conspiracy theories (Bodner, Welch, and Brodie; Enders et al.). Further, they argue that exposure to social media is associated with stronger conspiratorial beliefs than exposure to traditional media.
There is a common assumption that we now live in “the golden age of conspiracy theories”, primarily due to the extensive use of social media. However, it is important to note that, although it is easier to find and disseminate them nowadays due to their online presence, belief in conspiracy theories has been widespread throughout human history. Conspiracy theories have fuelled many wars and crimes against humanity (Pipes). Each era has its crises, and, accordingly, the content of specific conspiracy theories has varied substantially over time. In an extensive research project, Uscinski and Parent (54-72) analysed 104,803 randomly selected published letters sent to The New York Times and the Chicago Tribune between 1890 and 2010. They found that the extent to which these letters contained conspiracy theories fluctuated but did not increase over time. Uscinski and Parent contend that the Internet’s role in conspiracy theorising was restricted to replacing other means of communication such as word-of-mouth.
My article does not try to prove or disprove social media’s role in propagating conspiracy theories. As social media have become integrated into our everyday life, and even more so as we were forced to retreat to our homes during the pandemic, they naturally became our primary platforms for communication and information-related activities. The spread of pandemic conspiracies, too, would take place on social media. Covid-19-related online communication, of course, does not materialise solely on social media. However, due to their ubiquity, here, I focus my discussion on what unravels on social media, namely “Internet-based channels that allow users to interact opportunistically and selectively self-present … with both broad and narrow audiences who derive value from user-generated content and the perception of interaction with others” (Carr and Hayes 50). In particular, I am interested in analysing the relationship between social media algorithms and human users in disseminating conspiracy theories and how they assemble a fertile habitat for a conspiratorial environment to grow. In developing an analytical framework that captures the complexity and dynamics of this relationship, I rely on three analytical points: first, the relationship between social media affordances and the salient crisis event created by Covid-19; second, social media algorithms’ biases towards conspiracy theories; and lastly, the formation of algorithmic enclaves within the context of binary discourse around the pandemic.
Societal Crises and Social Media Affordances
Research suggests that societal crises have accelerated the dissemination of conspiracy theories and stimulated belief in such theories (van Prooijen and Douglas 323). Historically, every major crisis event has inspired substantial conspiracy theorising (325). A health crisis, such as Covid-19, represents unknown risks to most people as it “involves organisms that cannot be seen and diseases and symptoms that have not before been evident in the general population” (Reynolds and Quinn 44). The unfamiliarity, invisibility, and involuntary exposure to a novel highly infectious disease such as coronavirus evoke a sense of fear, uncertainty, and insecurity for the broad public (Reynolds and Quinn). In such a crisis, individuals generally are more encouraged to seek information to make sense of the situation (van Prooijen and Douglas 327). Conspiracy theories address this need by providing people “with a simplified answer, specifically to questions of how a certain situation emerged, and which social actors can and cannot be trusted” (327). Major societal crises are essentially complex events that usually cannot be explained by a simple formula or a simple causality. Conspiracy theories offer simplified explanations of complex events that otherwise are difficult to digest by attributing these events to powerful evil actors (Pipes; van Prooijen and Douglas). The Covid-19 pandemic is one of those significant societal crises, a monumental global crisis that stimulates substantial uncertainty and fear. The pandemic lays out a generative platform for belief in conspiracy theories to grow.
Further, the Covid-19 pandemic has transformed many aspects of people’s lives globally. The pandemic shut down schools and workplaces, forced billions to stay at home for extended lengths of time, and, thus, moved activities that would otherwise have taken place offline onto the screen. In addition, since many people could not connect with their family and friends in person, social media became the primary tool to maintain these connections. As a result, social media platforms reported a sharp increase in usage during the pandemic. For example, Facebook reported a 50% increase in total messaging in the countries hit hardest by the virus. In these places, voice and video calls more than doubled on Messenger and WhatsApp.
Social media platforms do not cause the proliferation of coronavirus conspiracies. However, features of social media provide the grounds for affordances. Here, affordances can be described as “action possibilities and opportunities that emerge from actors engaging with a focal technology” (Faraj and Azad 238). The concept of affordances is relational. It is dependent neither on the technology itself nor simply the users alone, but on the relationship between the users and the material features of the technology. So, in this context, social media platforms may offer some affordances for some users, but these affordances for action may or may not be actualised in use.
There is a long list of social media affordances discussed in previous studies. Here, I identify four that are central to users’ engagement in information-sharing. First, communicability/interactivity as social media enable users to communicate and interact (Stamati, Papadopoulos, and Anagnostopoulos 17). Second, association/connectivity as social media allow connections between disparate individuals and groups, facilitating associational networks to emerge (Malsbender, Hofmann, and Becker 7). Third, metavoicing as social media enable users to react to each other's content (7). And lastly, distributivity as social media facilitate users to distribute, share, and reshare content with ease.
In general, these affordances facilitate rapid information and mass content sharing. However, research also reveals that these affordances are not translated into equal opportunities for all types of content to proliferate rapidly and widely, to be viral. Viral events are an exception to the rule, as the vast majority of content only reaches a few users (Nahon and Hemsley). On social media, users face large volumes of information and a scarcity of attention, a condition called the economics of attention (Lanham). Hence, the propensity for virality and popularity does not correlate with content’s quality, accuracy, and truthfulness. Instead, it correlates with memeability, which is a content’s proximity to a meme, a compressed package of information that can grab users’ attention in a short timeframe (Jenkins and Huzinec; Lanham). In the world of memes dominated by sensational narratives around renowned personalities, it is unlikely for complex narratives with logical reasoning to be virally disseminated (Lim). Memeable messages, or simplified/oversimplified and sensational ones, have a much higher likelihood of going viral (Lanham; Lim). Additionally, a low level of digital media literacy skills among some social media users makes it difficult for them to reliably evaluate the quality of information they encounter online. Moreover, in the Covid-19 crisis, we deal with infodemics, an over-abundance of information “including false or misleading information in the digital and physical environment during a disease outbreak”, rendering it difficult for social media users to find trustworthy sources of information and reliable guidance.
Conspiracy theories, as discussed earlier, provide simplified explanations of complex crisis events and clear identification of the enemy that caused the crisis, such as 5G technology, Bill Gates, or Big Pharma. Coronavirus-related conspiracy theories are thus more memeable than any complex scientific explanations around the Covid-19 crisis. In the pre-Internet era, conspiracy theories circulated locally, primarily among small communities. In the social media era, conspiracy theories can be propagated rapidly, on a massive scale, nationally, and even globally. The monumental crisis, intensified social media usage, and infodemics instigated by the Covid-19 pandemic, in combination with social media affordances, together created an opening and possibility for widespread conspiracy theories and the belief in such stories.
Social Media Algorithms’ Biases towards Conspiracy Theories
Beyond affordances, we also need to look at the centrality of algorithms in social media usage. Social media are algorithmic media whose functions depend on algorithmic procedures. In theory, an algorithm is simply “a process or set of rules to be followed in calculations or other problem-solving operations”. In practice, however, algorithms are “opinions embedded in mathematics” (O’Neil 21). Combining rules and data captured from users’ consumption patterns, algorithms calculate an opinion about what a given technology should do, such as recommending a new YouTube video, a group to join on Facebook, whom to follow on Twitter, or the ranking of Google search results.
What do social media algorithms look like? There are many different algorithms utilised by social media platforms, and they keep on changing. However, in a nutshell, the social media algorithm’s underpinning concept is machine learning, where an algorithm uses historical data from users’ past behaviours to predict and influence their future behaviours. Social media algorithms are not primarily designed to provide quality experiences for users. Rather, they are meant to maximise user engagement—defined as likes, reactions, comments, shares, and click-throughs—to better target users for brand advertising. Over the years, social media algorithms have evolved to cater to the economics of emotion where “emotions are leveraged to generate attention and viewing time, which converts to advertising revenue” (Bakir and McStay 154). Hence, algorithmic biases privilege content that evokes emotion. This observation is confirmed by a recent exposé from a Facebook whistle-blower, which reveals that starting in 2017, the platform’s ranking algorithm treated emoji reactions (love, haha, wow, sad, and angry) as five times more valuable than generic ‘likes’. Simplified and sensationalised content with superlative values—the most sensational, the most controversial, the cutest, the funniest—measured quantitatively by the number of reactions it draws, therefore tends to be more visible than mundane, moderate, comprehensive, informative, and complex content (Lim 191).
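The logic of engagement-weighted ranking described above can be illustrated with a minimal, hypothetical sketch. The code below is not any platform’s actual algorithm; it simply models the reported weighting in which emoji reactions count five times as much as generic likes, and shows how an emotionally charged post can outrank a more informative one. All names, weights beyond the reported five-to-one ratio, and data are illustrative assumptions.

```python
# Illustrative sketch only (not Facebook's actual code): ranking a feed
# by weighted engagement, where emoji reactions are worth five times a
# generic 'like', per the whistle-blower account cited in the text.

REACTION_WEIGHTS = {
    "like": 1,
    # Emoji reactions weighted 5x, as reported:
    "love": 5, "haha": 5, "wow": 5, "sad": 5, "angry": 5,
}

def engagement_score(post):
    """Sum each interaction count multiplied by its weight."""
    return sum(REACTION_WEIGHTS.get(kind, 0) * count
               for kind, count in post["interactions"].items())

def rank_feed(posts):
    """Order a feed so the most emotionally engaging posts surface first."""
    return sorted(posts, key=engagement_score, reverse=True)

# Hypothetical data: a measured explainer vs. a sensational claim.
posts = [
    {"id": "measured-explainer", "interactions": {"like": 120}},
    {"id": "sensational-claim", "interactions": {"like": 10, "angry": 40}},
]
ranked = rank_feed(posts)
# The sensational post scores 10 + 40*5 = 210, outranking the
# explainer's 120 despite receiving far fewer total interactions.
```

Under this weighting, content that provokes anger or amazement systematically outranks content that merely informs, which is the bias toward emotion the passage describes.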
While providing simplified answers to complex events, conspiracy theories are often supported by a range of elaborate arguments that may appear analytical. Empirical evidence, however, suggests that belief in conspiracy theories is highly associated with intuition; conspiracy beliefs are highly emotional (van Prooijen and Douglas 329). Conspiratorial narratives are primarily crafted not to conjure logical and analytical reasoning but to arouse emotional experiences (329-330). Research demonstrates that the most popular online conspiratorial texts tend to have highly reduced complexity, high certainty, and signal high emotion (Visentin, Tuan, and Di Domenico). Here, we see indirect consequences and unintentional impacts of social media algorithms. By catering to the economics of emotion, social media algorithms yield biases toward conspiracy theories.
Algorithmic dynamics also reinforce the personalisation of information, where a user’s feed is tailored algorithmically to their interests and beliefs, causing them to stay in a filter bubble (Pariser). Hence, the filter bubble effect may prevent users who exhibit conspiratorial thinking from seeing information that challenges their beliefs. Further, algorithmic dynamics also allow users with a similar tendency toward conspiratorial beliefs to connect and share information. This can cause pockets of social media spaces to become conspiratorial echo chambers, communities of like-minded people who reinforce their conspiratorial beliefs by repeated exposure to information that confirms their existing beliefs.
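The personalisation mechanism behind the filter bubble can likewise be sketched in a few lines. This is a deliberately simplified, hypothetical model (no real platform works this crudely): a feed is drawn from the topics a user has engaged with most in the past, so content that contradicts their interests rarely surfaces. The topic labels and data are invented for illustration.

```python
# Minimal, illustrative sketch of personalisation-driven filtering: the
# user's history determines topic affinity, and the feed is filled with
# the highest-affinity candidates. Dissenting content is crowded out.

from collections import Counter

def personalised_feed(history, candidates, size=3):
    """Rank candidate posts by how often the user viewed their topic."""
    topic_counts = Counter(history)  # past behaviour as topic frequencies

    def affinity(post):
        # Predicted interest = prior engagement with the post's topic.
        return topic_counts.get(post["topic"], 0)

    return sorted(candidates, key=affinity, reverse=True)[:size]

# Hypothetical viewing history of a conspiratorially inclined user.
history = ["5g-conspiracy", "5g-conspiracy", "anti-vax", "5g-conspiracy"]
candidates = [
    {"id": "a", "topic": "5g-conspiracy"},
    {"id": "b", "topic": "fact-check"},
    {"id": "c", "topic": "anti-vax"},
    {"id": "d", "topic": "5g-conspiracy"},
]
feed = personalised_feed(history, candidates)
# The three recommended posts all match past conspiratorial viewing;
# the fact-check (affinity 0) never makes it into the feed.
```

The self-reinforcing loop is visible even in this toy version: each recommendation the user accepts adds to the history that drives the next round of recommendations, tightening the bubble.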
Conspiratorial Algorithmic Enclaves and Binary Discourse
While social media affordances and algorithms are biased towards the circulation and distribution of conspiracy theories, it does not mean that human users have no agency in the process. People do not simply believe conspiracy theories because they are displayed on their feeds. Conspiracy theories are most likely to translate into beliefs for those already attracted to conspiratorial explanations for major events (Enders et al.). Conversely, there is an absent or very weak relationship between users who exhibit the lowest level of conspiracy thinking and beliefs in conspiracy theories. Even when exposed to them online, this latter group of users will reject such conspiracy ideas. In the context of the coronavirus pandemic, for example, users who developed positive feelings toward QAnon, an American far-right political conspiratorial movement, also exhibit strong belief in conspiracy theories (Enders et al.). On the contrary, those who feel otherwise evidently reject such ideas (Enders et al.).
Algorithms do not have the absolute power to convert people into believers of conspiracy theories. However, algorithms can amplify users’ exposure and propensity to conspiratorial content based on their past patterns of behaviours. On salient issues, social media algorithms can facilitate the formation of algorithmic enclaves (Lim 194):
a discursive arena where individuals, afforded by their constant interactions with algorithms, interact with each other and collectivise based on a perceived shared identity online for defending their beliefs and protecting their resources from both real and perceived threats, usually from a common enemy.
This enclave conveys a type of grouping that is voluntary in nature. While facilitated by algorithms that bind users with a similar inclination toward (or against) conspiratorial thinking, social media users have agency and play their role in forming and joining this enclave (Lim). Social media users are not necessarily clustered into segregated conspiratorial echo chambers just because algorithms divide them into filter bubbles. Algorithmic enclaves can be formed through both the confirmation and the contradiction of opinions and information (Lim 195). Users within the highest and the lowest conspiratorial thinking groups are, to a certain degree, exposed to opposing viewpoints. Disagreeable information and conversations may perpetuate extreme positions, making users hold on more tightly to their existing beliefs and intensifying the antagonistic relationship.
The susceptibility to develop such an enclave is high for issues that represent binary discourse, where conversations about the main issue and any relevant issues around it can be best or only expressed as a binary on a single axis, either X or anti-X (Lim 196). This binary typically stems from a binary political system, where political choices are effectively limited to two polarised options. On issues that represent binary discourse, such as Brexit, (anti)immigration in Europe, and Duterte’s war against drugs in the Philippines, the construction of common enemies generally characterises conversations and interactions among social media users in algorithmic enclaves (Lim 196). Coronavirus is one such issue, with the binary discourse of anti- vs. pro-mask, anti- vs. pro-vaccine, anti- vs. pro-Dr. Fauci, etc.
The binary discourse is typically accompanied by labelling to portray the enemies in terms of difference, deviance, and threat (Lim). In many progressive movements worldwide, social media were used to galvanise the sense of unity against corrupt leaders, authoritarian governments, and other powerful oppressive actors by mobilising the binary discourse of “we, the people” versus “them, the enemy of people”. However, such binary mechanisms can also be easily applied to mobilise disinformation and conspiratorial discourse, which typically revolves around the construction of the malevolent enemy. Consequently, the social media landscape can be a fertile ground to cultivate the binary discourse around coronavirus (Bodner, Welch, and Brodie) and cluster individuals into conspiratorial algorithmic enclaves.
The coronavirus pandemic affecting the global population is a historic and monumental crisis with profound societal effects. In such a crisis, conspiracy theories provide people who are overwhelmed by the subjective feeling of uncertainty with an oversimplified answer to a complex event that is difficult to understand. Meanwhile, social media, which have become the primary tool for people’s connectivity during the pandemic, offer affordances for propagating such theories. Further, by catering to the “economics of attention” and “economics of emotion”, social media algorithms yield biases toward conspiracy theories, making them more visible and, consequently, more likely to be circulated. However, we also learn that social media algorithms do not have an absolute hegemony in translating the high visibility or even the virality of conspiracy theories into a belief in them. Human users still retain their agency, but their choices and preferences on what to view and consume are highly influenced by algorithmic dynamics. The colossal societal crisis caused by the pandemic, social media affordances, and algorithmic dynamics together assemble a fertile habitat for the emergence of algorithmic enclaves that cultivate binary discourse. Within many algorithmic enclaves, human users who exhibit the highest levels of conspiratorial thinking converse with each other, produce, consume, circulate, and re-circulate conspiracy theories that, in turn, confirm and amplify their existing beliefs.
The research was undertaken, in part, thanks to funding from the Canada Research Chairs program and SSHRC Insight Grant 435-2017-1470.
Bakir, Vian, and Andrew McStay. "Fake News and the Economy of Emotions: Problems, Causes, Solutions." Digital Journalism 6.2 (2018): 154-175.
Bodner, John, Wendy Welch, and Ian Brodie. COVID-19 Conspiracy Theories: QAnon, 5G, the New World Order and Other Viral Ideas. Jefferson: McFarland, 2020.
Carr, Caleb T., and Rebecca A. Hayes. "Social Media: Defining, Developing, and Divining." Atlantic Journal of Communication 23.1 (2015): 46-65.
Clarke, Steve. "Conspiracy Theories and the Internet: Controlled Demolition and Arrested Development." Episteme 4.2 (2007): 167-180.
Douglas, Karen, Chee Siang Ang, and Farzin Deravi. "Reclaiming the Truth." The Psychologist 30 (2017): 36-42.
Enders, Adam M., et al. "The Relationship between Social Media Use and Beliefs in Conspiracy Theories and Misinformation." Political Behavior (2021): 1-24. 9 Mar. 2022 <https://link.springer.com/article/10.1007/s11109-021-09734-6>.
Faraj, Samer, and Bijan Azad. "The Materiality of Technology: An Affordance Perspective." Materiality and Organising: Social Interaction in a Technological World. Eds. Paul M. Leonardi, Bonnie A. Nardi, and Jannis Kallinikos. Oxford: Oxford UP, 2013. 237-258.
Nahon, Karine, and Jeffrey Hemsley. Going Viral. Cambridge: Polity Press, 2013.
Jenkins, Eric S., and Monica Huzinec. "Memeability in an Attention Economy: On the Form of the Nike Kaepernick Meme." Southern Communication Journal 86.4 (2021): 402-415.
Lanham, Richard A. The Economics of Attention: Style and Substance in the Age of Information. Chicago: U of Chicago P, 2006.
Lim, Merlyna. "Algorithmic Enclaves: Affective Politics and Algorithms in the Neoliberal Social Media Landscape." Affective Politics of Digital Media: Propaganda by Other Means. Eds. Megan Boler and Elizabeth Davis. London: Routledge, 2020. 186-203.
Malsbender, Andrea, Sara Hofmann, and Jörg Becker. "Aligning Capabilities and Social Media Affordances for Open Innovation in Governments." Australasian Journal of Information Systems 18.3 (2014): 317-330.
O'Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Broadway Books, 2016.
Pariser, Eli. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. London: Penguin, 2011.
Pipes, Daniel. Conspiracy: How the Paranoid Style Flourishes and Where It Comes From. New York: Simon and Schuster, 1999.
Reynolds, Barbara, and Sandra Crouse Quinn. "Effective Communication during an Influenza Pandemic." Health Promotion Practice 9.4 (2008): 13S-17S. <https://doi.org/10.1177/1524839908325267>.
Shahsavari, Shadi, et al. "Conspiracy in the Time of Corona: Automatic Detection of Emerging COVID-19 Conspiracy Theories in Social Media and the News." Journal of Computational Social Science 3.2 (2020): 279-317.
Stamati, Teta, Thanos Papadopoulos, and Dimosthenis Anagnostopoulos. "Social Media for Openness and Accountability in the Public Sector: Cases in the Greek Context." Government Information Quarterly 32.1 (2015): 12-29.
Stecula, Dominik A., and Mark Pickup. "Social Media, Cognitive Reflection, and Conspiracy Beliefs." Frontiers in Political Science 3 (2021): 62.
Uscinski, Joseph E., and Joseph M. Parent. American Conspiracy Theories. Oxford: Oxford UP, 2014.
Van Prooijen, Jan-Willem, and Karen M. Douglas. "Conspiracy Theories as Part of History: The Role of Societal Crisis Situations." Memory Studies 10.3 (2017): 323-333.
Visentin, Marco, Annamaria Tuan, and Giandomenico Di Domenico. "Words Matter: How Privacy Concerns and Conspiracy Theories Spread on Twitter." Psychology & Marketing 38.10 (2021): 1828-1846.