Scholars, journalists, conspiracists, and public-facing groups have employed a variety of analogies to discuss the role that misleading content (conspiracy theory, disinformation, malinformation, and misinformation) plays in our everyday lives. Terms like the “disinformation war” (Hwang) or the “Infodemic” (United Nations) attempt to summarise the issues of misleading content to aid public understanding. This project studies the effectiveness of these analogies in conveying the movement of online conspiracy theory in social media networks by simulating them in a game. Building from growing comparisons likening conspiracy theories to game systems (Berkowitz; Kaminska), we used game design as a research tool to test these analogies against theory. This article focusses on the design process, rather than implementation, to explore where the analogies succeed and fail in replication.
Background and Literature Review
Conspiracy Theories and Games
Online conspiracy theories reside in the milieu of misinformation (unintentionally incorrect), disinformation (intentionally incorrect), and malinformation (intentionally harmful) (Wardle and Derakhshan 45). They are puzzled together through the vast amount of information available online (Hannah 1), creating a “hunt” for truth (Berkowitz) that refracts information through deeply personal narratives, producing paradoxical interpretations (Hochschild xi). Modern social media networks offer curated but fragmented content distribution where information discovery involves content finding users through biased sources (Toff and Nielsen 639).
This puzzling together of theories gives conspiracy theorists agency in ‘finding the story’, a process with underlying goals (Kaminska). A contemporary example is QAnon, where the narrative of a “secret global cabal”, large-scale pedophile rings, and overstepping government power is pieced together through Q-drops, or cryptic clues that users decipher (Bloom and Moskalenko 5). This puzzle paints a seemingly hidden reality for players to uncover (Berkowitz) and offers gripping engagement which connects “disparate data” into a visualised conspiracy (Hannah 3).
Despite their harmful impacts, conspiracy theories are playful (Sobo). They can be likened to playful acts of make-believe (Sobo), reality-adjacent narratives that create puzzles for exploration (Berkowitz), and community building through playful discovery (Bloom and Moskalenko 169). Not only do conspiracies “game the algorithm” to promote content, but they put players into a self-made digital puzzle (Bloom and Moskalenko 17, 18).
This array of human and nonhuman actors allows for truth-spinning that can push people towards conspiracy through social bonds (Moskalenko). Mainstream media and academic institutions are seen as biased and flawed information sources, prompting these users to “do their own research” within these spaces (Ballantyne and Dunning). However, users inhabit fragmented worldviews, not binaries of right and wrong, which leaves journalism and fact-checkers in a digital world that requires complex intervention (De Maeyer 22).
Analogies are one method of intervention. They offer explanation for the impact conspiracy has had on society, such as the polarisation of families (Andrews). Both conspiracists and public-facing groups have commonly used an analogy of war. The recent pandemic has also introduced analogies of virality (Hwang; Tardáguila et al.).
A war analogy places truth on a battleground against lies and fiction. “Doing your own research” is a combat manoeuvre for conspiracy proliferation through community engagement (Ballantyne and Dunning). Similarly, those fighting digital conspiracies have embraced the analogy to explain the challenges and repercussions of content. War suggests hardened battlelines, the need for public mobilisation, and a victory where truth prevails, or defeat where fallacy reigns (Shackelford).
Comparatively, a viral analogy, or “Infodemic” (United Nations), frames misleading content as moving through a network like an infectious system, spreading through paths of least resistance or effective contamination (Scales et al. 678; Graham et al. 22). Battlelines are replaced with paths of invasion, where the goal is to infect the system or construct a rapid-response vaccine that can stymie the ever-growing disease (Tardáguila et al.).
In both cases, victorious battles or curative vaccinations frame conspiracy and disinformation as temporary problems. The idea of the rise and fall of a conspiracy’s prominence as linked to current events emulates Byung-Chul Han’s notion of the digital swarm: fragmented communities that coalesce, bubble up into volatile noise, and then dissipate without addressing the “dominant power relations” (Han 12). For Han, swarms arise in digital networks with intensive support before disappearing, holding an influential but ephemeral life.
Recently, scholarship has applied a media ecology lens to recognise the interconnection of actors that contribute to these swarms. The digital-as-ecosystem approach suggests a network that needs to be actively managed (Milner and Phillips 8). Tangherlini et al.’s work on conspiracy pipelines highlights the various actors that move information through them, rendering the digital ecosystem healthy or unhealthy. Seeing the Internet, and the movement of information on it, as an ecology posits a consideration of processes that are visible (i.e., conspiracy theorists) and invisible (i.e., algorithms) and is inclusive of human and non-human actors (Milner and Phillips).
With these analogies as frames, we answer Sobo’s call for a playful lens towards conspiracy alongside De Maeyer’s request for serious interventions by using serious play. If we can recognise both conspiracy and its formation as game-like and understand these analogies as explanatory narratives, we can use simulation game design to ask: how are these systems of conspiracy propagation being framed? What gaps in understanding arise when we frame conspiracy theory through the analogies used to describe it?
Research-Creation and Simulation Gaming
Our use of game design methods reframed analogies through “gaming literacy”, which considers the knowledge put into design and positions the game as a set of practices relating to the everyday (Zimmerman 24). This process requires constant reflection. In both the play of the game and the construction of its parts we employed Khaled’s critical design framework (10-11). From March to December 2021 we kept reflective logs, notes from bi-weekly team meetings, playtest observations, and archives of our visual design to consistently review and reassess our progression. We asked how the visuals, mechanics, and narratives point to the affordances and drawbacks of these analogies.
Visual and Mechanical Design
Before designing the details of the analogies, we had to visualise their environment – networked social media. We took inspiration from existing visual representations of the Internet and social media under the hypothesis that employing a familiar conceptual model could improve the intelligibility of the game (figs. 1 and 2). In usability design, this is referred to as "Jakob's law" (Nielsen), in which, by following familiar patterns, the user can focus better on content, or in our case, play.
Fig. 1: “My Twitter Social Ego Networks” by David Sousa-Rodrigues. A visual representation of Sousa-Rodrigues’s social media network. <https://www.flickr.com/photos/11452351@N00/2048034334>.
We focussed on the networked publics (Itō) that coalesce around information and content disclosure. We prioritised data practices that influence community construction through content (Bloom and Moskalenko 57), and the larger conspiracy pipelines of fragmented data (Tangherlini et al. 30).
Fig. 2: "The Internet Map" by Ruslan Enikeev. A visual, 2D, interactive representation of the Internet. <http://internet-map.net/>.
Our query focusses on how play reciprocated, or failed to reciprocate, these analogies. Sharp et al.’s suggestion that obvious and simple models are intuitively understood allowed us to employ simplification in design in the hopes of paring down complex social media systems. Fig. 3 highlights this initial attempt, where social media platforms became “networks” that formed proximity to specific groups or “nodes”.
Fig. 3: Early version of the game board, with a representation of nodes and networks as simplified visualisations for social networks.
This simplification process guided the scaling of design as we tried to make the seemingly boundless online networks accessible. Colourful tokens represented users, placed on the nodes (fig. 4). Tokens represented portions of the user base, allowing players to see the proliferation of conspiracy through the network. Unfortunately, this simplification ignores the individual acts of users and their ability to bypass these pipelines, as well as the discovery-driven collegiality within these communities (Bloom and Moskalenko 57). To help offset this, we designed an overarching scenario and included “flavour text” on cards (fig. 5), which offered narrative vignettes that grounded player actions in a dynamic story.
Fig. 4: The first version for the printed playtest for the board, with the representation of “networks” formed by a clustering of "nodes". The movement of conspiracy was indicated by colour-coded tokens.
Fig. 5: Playing cards. They reference a particular action which typically adds or removes tokens. They also reference a theory and offer text to narrativise the action.
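The board structure described above, with networks as clusters of nodes and colour-coded tokens marking conspiracy's presence, can be sketched as a simple data structure. The following is a hypothetical illustration rather than the game's actual implementation; all names, numbers, and connections are our own assumptions.

```python
# A minimal, hypothetical sketch of the board state: "networks" cluster
# "nodes", and tokens on each node stand in for the share of users
# exposed to conspiracy content. Names and values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    tokens: int = 0          # conspiracy tokens currently on this node

@dataclass
class Network:
    name: str                # simplified platform stand-in, e.g. "Chrpr"
    nodes: list = field(default_factory=list)

    def total_tokens(self):
        return sum(n.tokens for n in self.nodes)

# A toy board: two networks joined by a shared-audience "pipeline".
chrpr = Network("Chrpr", [Node("memes"), Node("politics")])
shreddit = Network("Shreddit", [Node("qanon_board")])
pipelines = {("Chrpr", "Shreddit")}   # connections content can travel along

# A conspiracist card action: place one token on a node.
chrpr.nodes[1].tokens += 1
print(chrpr.total_tokens())  # 1
```

Player cards like those in fig. 5 would, in this framing, simply add or remove tokens on nodes, while the pipeline set records which networks content can move between.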
The design also had to demonstrate that information transmission is not static. In the most recent version (fig. 6), this meant having the connections between nodes be subverted through player actions. Game mechanics, such as playing cards (fig. 5), make these pipelines interactive and visible by allowing players to place and move content throughout the space in response to each other’s actions.
Fig. 6: The most updated version of the board, now named "Lizards and Lies". Red regions are initial starting points for conspiracy to enter mainstream social media (purple).
Design adaptations focussed on making conspiracy theory dynamic. Player choice (i.e. where to add conspiracy) had to consider a continuously changing board created by other actors to reflect the adaptive nature of conspiracy theories. In this way, analogies came alive or died through the actions of players within a visually responsive system. This meant that each game had different swarms of conspiracy, where player decisions “wrote” a narrative through play. By selecting how and where conspiracy might be placed or removed, players created a narrative distinct to their game. For example, a conspiracy theorist player (one playable character) might explain their placing of conspiracy theory within the Chrpr/Twitter network as a community response to fact-checking (second playable character) in the neighbouring Shreddit/Reddit community.
Initial design took inspiration from wargaming to consider battlelines, various combatants, and a simulated conflict. Two player characters were made: conspiracy theorists were pitted against fact-checkers, where nodes and networks functioned as battlelines of intervention.
The war narrative was immediately challenged by its end-state: either conspiracy overtook the networks, or fact-checkers completely stymied conspiracy’s ability to exist. Both end-states felt wrong to players. Battle consistently felt futile, as conspiracists could always add more content and fact-checkers could always remove something.
Simply put, war fell flat. While the game could depict communities and spaces of combat, it struggled to represent how fragmented conspiracy theories are. In play, conspiracy theory became stagnant, the flow of information felt forced, and the actors entered uneven dynamics. Utopia was never achieved, and war always raged on. Even when players did overtake a network, the victory condition (needing to control the most networks) made this task, which would normally be compelling, feel lacklustre.
To address this, we made changes. We altered the win condition to offer points at the end of each turn depending on what the player did (i.e., spreading conspiracy into networks). We expanded the number of networks and connections between them (fig. 3 and fig. 6) to include more fluid and fragmented pipelines of conspiracy dissemination. We included round-end events which shifted the state of the game based on other actors, and we pushed players to focus on their own actions more than those of the others on the board. These changes naturally shifted the battleground from hardened battle lines to a fragmented amorphous spread of disinformation; it moved war to virality.
As we transitioned towards the viral, we prioritised the reflexive, ephemeral movements of conspiracy proliferating through networks. We focussed less on adding and removing content and shifted to the movement of actors through the space. Some communities became more susceptible to conspiracy content, fact-checkers relied on flagging systems, and conspiracy theories followed a natural but unexpected pipeline of content dissemination.
These changes allowed players to feel like individual actors with specific goals rather than competing forces. Fact-checkers relied on mitigation and response while conspiracists evaluated the susceptibility of specific communities to conspiracy content. This change illuminated a core issue with fact-checking; it is entirely responsive, endless, and too slow to stop content from having an impact. While conspiracists could play one card to add content, fact-checkers had to flag content, move their token, and use a player card to eliminate content – all of which exacerbated this issue. In this manner, the viral approach rearticulated how systems themselves afford the spread of conspiracy, where truly effective means to stop the spread relied on additional system actors, such as training algorithms to help remove and flag content.
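The action-economy imbalance described here can be illustrated with a toy calculation. Assuming, purely for illustration, three actions per turn, one token added per conspiracist action, and one token removed per complete fact-checking sequence (flag, move, play a card), the gap compounds every round:

```python
# A toy illustration (not the published rules) of the action-economy
# asymmetry: adding conspiracy costs one action, while removing it costs
# three. All numbers are assumptions chosen only to show the imbalance.
ACTIONS_PER_TURN = 3

def play_round(board_tokens):
    # Conspiracist: each of three actions adds one token.
    board_tokens += ACTIONS_PER_TURN * 1
    # Fact-checker: three actions (flag, move, card) remove one token.
    board_tokens -= ACTIONS_PER_TURN // 3
    return max(board_tokens, 0)

tokens = 0
for _ in range(5):
    tokens = play_round(tokens)
print(tokens)  # conspiracy outpaces removal: 10 tokens after five rounds
```

Even in this crude arithmetic, removal never catches up with placement, which is precisely the responsive, endless quality of fact-checking the playtests surfaced.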
While it produced a more effective simulation, the viral analogy struggled in its presentation of conspiracy theory within social media. Play had a tipping point where, given enough resources, those stopping the spread of conspiracy could “vaccinate” it and clean the board. To alter this, our design began to consider actions and reactions, creating a push and pull of play focussed on balancing or offsetting the system. This transition naturally made us consider a media ecology analogy.
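The tipping point can be shown with a minimal, hedged simulation. The rates below are illustrative assumptions, but they capture how, once removal capacity exceeds growth, the viral model lets the board be wiped entirely clean:

```python
# A hedged sketch of the tipping point observed in the viral version:
# if removal capacity meets or exceeds growth, conspiracy is eventually
# "vaccinated" off the board. Rates and round counts are illustrative.
def simulate(growth_per_round, removal_per_round, start=6, rounds=20):
    tokens = start
    for _ in range(rounds):
        tokens = max(tokens + growth_per_round - removal_per_round, 0)
    return tokens

# Under-resourced fact-checking: conspiracy persists and grows.
print(simulate(growth_per_round=2, removal_per_round=1))  # 26
# Past the tipping point: the board is wiped clean, a utopic end-state.
print(simulate(growth_per_round=2, removal_per_round=3))  # 0
```

It is this second outcome, total elimination, that the viral analogy permits and that pushed the design towards balancing actions and reactions instead.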
Replacing utopic end-states with a need to maintain network health reframed the nature of engagement within this simulation. An ecological model recognises that harmful content will exist in a system and aims not at elimination, but at maintaining a sustainable balance. It is responsive. It considers the various human and non-human actors at play and focusses on varied actor goals. As our game shifted to an ecological model, the homogenous actors of conspiracists and fact-checkers were expanded. We transitioned the two-player game into a four-player variant, testing options like literacy educators, content-recommending algorithms, and ‘edgelords’. Rather than defeating or saving social media, play became focussed on actors in the system.
Play and design demonstrated how character actions shaped player decisions. Characters were seen as network actors rather than enemies, changing interaction. Those spreading conspiracy began to focus less on “viral paths”, or lines of battle, and instead on where or how they could impact system health. In some cases, conspiracists would build one network of support; in others, they created pockets around the board from which they could run campaigns. Those stopping the spread came to see their job as management. Rather than trying to eliminate all conspiracy, they determined which sites to engage with, what content held the greatest threat, and which tools would be most effective. Media ecology play focussed less on outsmarting opponents and instead on managing an actor’s, and other players’, goals within an evolving system.
Challenging Swarms and a Turn to Digital Ecology
Using games to evaluate analogies illuminates clear gaps in their use, and the value of a media ecology lens. A key issue across the two main analogies (war and virality) was a utopic end-state. The idea that conspiracy can be beaten back, or vaccinated against, fails to consider the endless supply of conspiracy content that can be produced, or the impossibility of vaccinating the entire system. As our transitionary design process shows, the notion of winners and losers misplaces the intent of various actor groups, where conspiracy is better framed as community-building rather than “controlling” a space (Bloom and Moskalenko 57).
In design, while Han’s notion of the swarm was helpful, it struggled to play out in our simulations because fragments of conspiracy always remained on the board. This lingering content suggests that fact-checking does not actually remove ideological support. Swarms could quickly regrow around lingering support, suggesting they are not as ephemeral as Han argued. As design transitioned towards ecology, these “fragments” were seen as part of a system of actors. Gameplay shows a deep interplay between the removal of content and its spread, arguing that removing conspiracy is a band-aid solution to a larger problem.
Our own simplification of analogy into a game is not without limitations. Importantly, the impact of user-specific acts for interpreting a movement (Toff and Nielsen 640), and the underlying set of networks that create “dark platforms” (Zeng and Schäfer 122), were lost in the game’s translation. Despite this, our work provides directions for scholarship and those engaging with the public on these issues to consider. Reframing our lens to understand online conspiracy as an aspect of digital ecological health asks us to move away from utopic solutions and instead focus on distinct actors as they relate to the larger system.
Employing serious play as a lens on our framing of digital conspiracy, this project emphasises a turn towards media ecology models. Game design functioned as a tool to consider the actors, behaviours, and interactions of a system. Our methodological approach for visualising war and viral analogies demonstrates how playful responses can prompt questions and considerations of theory. Playing in this way offers new insights into how we think about and grapple with the various actors associated with conspiracy theory; scholarship should continue to embrace ecological models to weigh this assemblage of actors.
Andrews, Travis. “QAnon Is Tearing Families Apart.” Washington Post, 2020. <https://www.washingtonpost.com/technology/2020/09/14/qanon-families-support-group/>.
Ballantyne, Nathan, and David Dunning. “Skeptics Say, ‘Do Your Own Research.’ It’s Not That Simple.” The New York Times, 3 Jan. 2022. <https://www.nytimes.com/2022/01/03/opinion/dyor-do-your-own-research.html>.
Berkowitz, Reed. “QAnon Resembles the Games I Design. But for Believers, There Is No Winning.” Washington Post, 2021. <https://www.washingtonpost.com/outlook/qanon-game-plays-believers/2021/05/10/31d8ea46-928b-11eb-a74e-1f4cf89fd948_story.html>.
Bloom, Mia, and Sophia Moskalenko. Pastels and Pedophiles: Inside the Mind of QAnon. Stanford University Press, 2021.
De Maeyer, Juliette. “Taking Conspiracy Culture Seriously: Journalism Needs to Face Its Epistemological Trouble.” Journalism 20.1 (2019): 21–23. <https://doi.org/10.1177/1464884918807037>.
Graham, Timothy, et al. Like a Virus: The Coordinated Spread of Coronavirus Disinformation. The Australia Institute, 2020. <https://apo.org.au/node/305864>.
Han, Byung-Chul. In the Swarm: Digital Prospects. Trans. Erik Butler. MIT Press, 2017.
Hannah, Matthew N. “A Conspiracy of Data: QAnon, Social Media, and Information Visualization.” Social Media + Society, 7.3 (2021). <https://doi.org/10.1177/20563051211036064>.
Hochschild, Arlie Russell. Strangers in Their Own Land: Anger and Mourning on the American Right. The New Press, 2016.
Hwang, Tim. “Deconstructing the Disinformation War.” MediaWell, Social Science Research Council 1 June 2020. <https://mediawell.ssrc.org/expert-reflections/deconstructing-the-disinformation-war/>.
Itō, Mizuko. “Introduction.” Networked Publics. Ed. Kazys Varnelis. MIT Press, 2008.
Kaminska, Izabella. “The ‘Game Theory’ in the Qanon Conspiracy Theory.” Financial Times 16 Oct. 2020. <https://www.ft.com/content/74f9d20f-9ff9-4fad-808f-c7e4245a1725>.
Khaled, Rilla. “Questions over Answers: Reflective Game Design.” Playful Disruption of Digital Media. Ed. Daniel Cermak-Sassenrath. Singapore: Springer, 2018. 3–27. <https://doi.org/10.1007/978-981-10-1891-6_1>.
Milner, Ryan M., and Whitney Phillips. You Are Here. MIT Press, 2020. <https://you-are-here.pubpub.org/>.
Moskalenko, Sophia. “Evolution of QAnon & Radicalization by Conspiracy Theories.” The Journal of Intelligence, Conflict, and Warfare 4.2 (2021): 109–14. <https://doi.org/10.21810/jicw.v4i2.3756>.
Nielsen, Jakob. “End of Web Design.” Nielsen Norman Group, 2000. <https://www.nngroup.com/articles/end-of-web-design/>.
Scales, David, et al. “The Covid-19 Infodemic — Applying the Epidemiologic Model to Counter Misinformation.” New England Journal of Medicine 385.8 (2021): 678–81. <https://doi.org/10.1056/NEJMp2103798>.
Shackelford, Scott. “The Battle against Disinformation Is Global.” The Conversation 2020. <http://theconversation.com/the-battle-against-disinformation-is-global-129212>.
Sharp, Helen, et al. Interaction Design: Beyond Human-Computer Interaction. 5th ed. Wiley, 2019.
Sobo, Elisa Janine. “Playing with Conspiracy Theories.” Anthropology News 31 July 2019. <https://www.anthropology-news.org/articles/playing-with-conspiracy-theories/>.
Tangherlini, Timothy R., et al. “An Automated Pipeline for the Discovery of Conspiracy and Conspiracy Theory Narrative Frameworks: Bridgegate, Pizzagate and Storytelling on the Web.” PLoS ONE 15.6 (2020). <https://doi.org/10.1371/journal.pone.0233879>.
Tardáguila, Cristina, et al. “Taking an Ecological Approach to Misinformation.” Poynter 5 Dec. 2019. <https://www.poynter.org/fact-checking/2019/taking-an-ecological-approach-to-misinformation/>.
Toff, Benjamin, and Rasmus Kleis Nielsen. “‘I Just Google It’: Folk Theories of Distributed Discovery.” Journal of Communication 68.3 (2018): 636–57. <https://doi.org/10.1093/joc/jqy009>.
United Nations. “UN Tackles ‘Infodemic’ of Misinformation and Cybercrime in COVID-19 Crisis.” 2020. <https://www.un.org/en/un-coronavirus-communications-team/un-tackling-%E2%80%98infodemic%E2%80%99-misinformation-and-cybercrime-covid-19>.
Wardle, Claire, and Hossein Derakhshan. “Thinking about ‘Information Disorder’: Formats of Misinformation, Disinformation, and Mal-Information.” Journalism, ‘Fake News’ & Disinformation. Eds. Cherilyn Ireton and Julie Posetti. Paris: Unesco, 2018. 43–54.
Zeng, Jing, and Mike S. Schäfer. “Conceptualizing ‘Dark Platforms’. Covid-19-Related Conspiracy Theories on 8kun and Gab.” Digital Journalism 9.9 (2021): 1321–43. <https://doi.org/10.1080/21670811.2021.1938165>.
Zimmerman, Eric. “Gaming Literacy: Game Design as a Model for Literacy in the Twenty-First Century.” The Video Game Theory Reader 2. Eds. Bernard Perron and Mark J.P. Wolf. Routledge, 2009. 23–31.