How Google Autocomplete Algorithms about Conspiracy Theorists Mislead the Public

How to Cite

Al-Rawi, A., Celestini, C., Stewart, N., & Worku, N. (2022). How Google Autocomplete Algorithms about Conspiracy Theorists Mislead the Public. M/C Journal, 25(1). (Original work published March 16, 2022)
Vol. 25 No. 1 (2022): conspiracy
Published 2022-03-16 — Updated on 2022-03-21

Introduction: Google Autocomplete Algorithms

Despite recent attention to the impact of social media platforms on political discourse and public opinion, most people find their news through search engines (Robertson et al.). When a user conducts a search, millions of outputs, in the form of videos, images, articles, and Websites, are sorted to present the most relevant search predictions. Google, the most dominant search engine in the world, expanded its search index in 2009 to include the autocomplete function, which provides suggestions for query inputs (Dörr and Stephan). Google’s autocomplete function also allows users to “search smarter” by reducing typing time by 25 percent (Baker and Potts 189). Google’s complex algorithm is shaped by factors such as search history, location, and keyword searches (Karapapa and Borghi), and policies exist to ensure the autocomplete function does not contain harmful content.

In 2017, Google implemented a feedback tool to allow human evaluators to assess the quality of search results; however, the algorithm still provides misleading results that frame far-right actors as neutral. In this article, we use reverse engineering to understand how these algorithms produce their descriptive output, and to illustrate how autocomplete subtitles label conspiracists in three countries. According to Google, these “subtitles are generated automatically”, further stating that the “systems might determine that someone could be called an actor, director, or writer. Only one of these can appear as the subtitle” and that Google “cannot accept or create custom subtitles” (Google). We focused our attention on well-known conspiracy theorists because of their influence and audience reach.

In this article we argue that these subtitles are problematic because they can mislead the public and amplify extremist views. Google’s autocomplete feature is misleading because it does not highlight what is publicly known about these actors. The labels are neutral or positive but never negative, reflecting primary jobs and/or the actor’s preferred descriptions. This is harmful to the public because Google’s search rankings can influence a user’s knowledge and information preferences through the search engine manipulation effect (Epstein and Robertson). Users’ preferences and understanding of information can be manipulated based upon their trust in Google search results, thus allowing these labels to be widely accepted instead of providing a full picture of the harm these actors’ ideologies and beliefs cause.

Algorithms That Mainstream Conspiracies

Search engines give order and visibility to Web pages, operationalising and stabilising meaning for particular queries (Gillespie). Google’s subtitles and black box operate as a complex algorithm for its search index and offer a mediated visibility into aspects of social and political life (Gillespie). Algorithms are designed to perform computational tasks through an operational sequence that computer systems must follow (Broussard), but they are also “invisible infrastructures” that Internet users consciously or unconsciously follow (Gran et al. 1779). The way algorithms rank, classify, sort, predict, and process data is political because it presents the world through a predetermined lens (Bucher 3) decided by proprietary knowledge – a “secret sauce” (O’Neil 29) – that is not disclosed to the general public (Christin). Technology titans, like Google, Facebook, and Amazon (Webb), rigorously protect and defend intellectual property for these algorithms, which are worth billions of dollars (O’Neil). As a result, algorithms are commonly defined as opaque, secret “black boxes” that conceal the decisions that are already made “behind corporate walls and layers of code” (Pasquale 899). The opacity of algorithms is related to layers of intentional secrecy, technical illiteracy, the size of algorithmic systems, and the ability of machine learning algorithms to evolve and become unintelligible to humans, even to those trained in programming languages (Christin 898-899). The opaque nature of algorithms alongside the perceived neutrality of algorithmic systems is problematic. Search engines are increasingly normalised, and this leads to a socialisation in which suppositions are made that “these artifacts are credible and provide accurate information that is fundamentally depoliticized and neutral” (Noble 25).

Google’s autocomplete and PageRank algorithms, however, do not live up to this veil of neutrality. In 2015, Google’s photos app, which uses machine learning techniques to help users collect, search, and categorise images, labelled photos of two Black people as ‘gorillas’ (O’Neil). Safiya Noble illustrates how media and technology are rooted in systems of white supremacy, and how these long-standing social biases surface in algorithms, showing how racial and gendered inequities become embedded in algorithmic systems. Google actively fixes algorithmic biases with band-aid-like solutions, which means the errors remain inevitable constituents of the algorithms. Rising levels of automation correspond to rising levels of errors, which can lead to confusion and misdirection in the algorithms that people use to manage their lives (O’Neil). As a result, software, code, machine learning algorithms, and facial/voice recognition technologies are scrutinised for producing and reproducing prejudices (Gray) and promoting conspiracies – often described as algorithmic bias (Bucher). Algorithmic bias occurs because algorithms are trained on historical data already embedded with social biases (O’Neil), and, if that were not problematic enough, algorithms like Google’s search engine also learn and replicate the behaviours of Internet users (Benjamin 93), including conspiracy theorists and their followers.

Technological errors, algorithmic bias, and increasing automation are further complicated by the fact that Google’s Internet service uses “2 billion lines of code” – a magnitude that is difficult to keep track of, including for “the programmers who designed the algorithm” (Christin 899). Understanding this level of code is not critical to understanding algorithmic logics, but we must be aware of the inscriptions such algorithms afford (Krasmann). As algorithms become more ubiquitous it is urgent to “demand that systems that hold algorithms accountable become ubiquitous as well” (O’Neil 231). This is particularly important because algorithms play a critical role in “providing the conditions for participation in public life”; however, the majority of the public has a modest to nonexistent awareness of algorithms (Gran et al. 1791). Given the heavy reliance of Internet users on Google’s search engine, it is necessary for research to provide a glimpse into the black boxes that people use to extract information especially when it comes to searching for information about conspiracy theorists.

Our study fills a major gap in research as it examines a sub-category of Google’s autocomplete algorithm that has not been empirically explored before. Unlike the standard autocomplete feature that is primarily programmed according to popular searches, we examine the subtitle feature that operates as a fixed label for popular conspiracists within Google’s algorithm. Our initial foray into our research revealed that this is not only an issue with conspiracists, but also occurs with terrorists, extremists, and mass murderers.

Methods

Using a reverse engineering approach (Bucher) from September to October 2021, we explored how Google’s autocomplete feature assigns subtitles to widely known conspiracists. The conspiracists were not geographically limited, and we searched for those who reside in the United States, Canada, the United Kingdom, and various countries in Europe. Reverse engineering stems from Ashby’s canonical text on cybernetics, in which he argues that black boxes are not a problem in themselves; the challenge lies in how one can discern their contents. As Google’s algorithms are not disclosed to the general public (Christin), we use this method as an extraction tool (Eilam) to understand how these algorithms apply subtitles. To systematically document the search results, we took screenshots for every conspiracist we searched in an attempt to archive the Google autocomplete algorithm. Relying on previous literature, reports, and the figures’ public statements, we identified and searched Google for 37 Western-based and influential conspiracy theorists. We initially experimented with other problematic figures, including terrorists, extremists, and mass murderers, to see whether Google applied a subtitle or not. Additionally, we examined whether subtitles were positive, neutral, or negative, and compared this valence to personality descriptions for each figure. Using the standard procedures of content analysis (Krippendorff), we focus on the manifest or explicit meaning of text to inform subtitle valence in terms of positive, negative, or neutral connotations. These manifest features refer to the “elements that are physically present and countable” (Gray and Densten 420), or what is known as the dictionary definitions of items.

Using a manual query, we searched Google for subtitles ascribed to conspiracy theorists, and found the results were consistent across different countries. Searches were conducted on Firefox and Chrome and tested on an Android phone. Regardless of language input or the country location established by a Virtual Private Network (VPN), the results remained stable, regardless of who conducted the search. The conspiracy theorists in our dataset cover a wide range of conspiracies, including historical figures like Nesta Webster and John Robison, who were foundational in Illuminati lore, as well as contemporary conspiracists such as Marjorie Taylor Greene and Alex Jones. Each individual’s name was searched on Google with the VPN set, in turn, to each of the three countries.
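The coding step lends itself to a compact illustration. The sketch below is ours, not part of the study’s toolchain: the figures, subtitles, and valence codes are a hypothetical excerpt, and it only shows how manually coded subtitles might be tallied into the positive, neutral, and negative categories described above.

```python
from collections import Counter

# Hypothetical excerpt of the dataset: figure -> Google autocomplete subtitle.
# None stands for a search that returned no subtitle at all.
subtitles = {
    "Alex Jones": "American radio host",
    "Gavin McInnes": "Canadian writer",
    "James O'Keefe": "American activist",
    "Richard B. Spencer": None,
}

# Manual valence codes assigned during content analysis (illustrative only);
# in the study, valence was coded against dictionary definitions of each label.
valence = {
    "American radio host": "neutral",
    "Canadian writer": "neutral",
    "American activist": "positive",
}

def code_subtitle(subtitle):
    """Map a subtitle to its coded valence, or flag a missing subtitle."""
    if subtitle is None:
        return "no subtitle"
    return valence.get(subtitle, "uncoded")

# Tally the valence categories across the sample.
tally = Counter(code_subtitle(s) for s in subtitles.values())
print(tally)
```

A tally of this kind makes the article’s central observation easy to check at a glance: the coded categories that appear are neutral or positive ones, with negative codes simply absent.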

Results and Discussion

This study examines Google’s autocomplete feature and the subtitles it associates with conspiratorial actors. We first tested Google’s subtitling system with known terrorists, convicted mass shooters, and controversial cult leaders like David Koresh. Garry et al. (154) argue that “while conspiracy theories may not have mass radicalising effects, they are extremely effective at leading to increased polarization within societies”. We believe that the neutral subtitling of conspiracists reflects the integral role conspiracies play in contemporary politics and right-wing extremism. The sample includes contemporary and historical conspiracists to establish consistency in labelling. For historical figures, the labels are less consequential and simply reflect the reality that Google’s subtitles are primarily neutral.

Of the 37 conspiracy theorists we searched (see Table 1 in the Appendix), seven (18.9%) do not have an associated subtitle, and the other 30 (81.1%) have distinctive subtitles, but none of them reflects the public knowledge of the individuals’ harmful role in disseminating conspiracy theories. In the list, 16 (43.2%) are noted for their contribution to the arts, 4 are labelled as activists, 7 are associated with their professional affiliation or original jobs, 2 with the journalism industry, one is linked to his sports career, another is listed as a researcher, and 7 have no subtitle. The problem here is that when white nationalists or conspiracy theorists are not acknowledged as such in their subtitles, search engine users may encounter content that sways their understanding of society, politics, and culture. For example, a conspiracist like Alex Jones is labelled an “American Radio Host” (see Figure 1), despite losing two defamation lawsuits for declaring that the shooting at Sandy Hook Elementary School in Newtown, Connecticut, was a ‘false flag’ event. Jones’s actions on his InfoWars media platforms led to parents of shooting victims being stalked and threatened. Another conspiracy theorist, Gavin McInnes, the creator of the far-right, neo-fascist Proud Boys organisation, designated a terrorist entity in Canada and a hate group in the United States, is listed simply as a “Canadian writer” (see Figure 1).

Fig. 1: Screenshots of Google’s subtitles for Alex Jones and Gavin McInnes.
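The proportions reported above follow directly from the sample counts, as a quick arithmetic sketch (using the article’s own totals) shows:

```python
# Totals reported in the study: 37 conspiracy theorists searched,
# 7 of whom had no Google autocomplete subtitle.
total = 37
no_subtitle = 7
with_subtitle = total - no_subtitle  # 30 figures with a subtitle

print(f"No subtitle: {no_subtitle / total:.1%}")      # prints "No subtitle: 18.9%"
print(f"With subtitle: {with_subtitle / total:.1%}")  # prints "With subtitle: 81.1%"
```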

Although subtitles under an individual’s name are not audio, video, or image content, the algorithms that create these subtitles are an invisible infrastructure that could cause harm through their uninterrogated status and pervasive presence. They can act as a conduit to media that causes harm and fosters distrust in electoral and civic processes, or in institutions altogether. Examples from our list include Brittany Pettibone, whose subtitle states that she is an “American writer” despite being one of the main propagators of the Pizzagate conspiracy, which led Edgar Maddison Welch (whose subtitle is “Screenwriter”) to travel from North Carolina to Washington, D.C., to violently threaten and confront those who worked at Comet Ping Pong Pizzeria. The same misleading label can be found by searching for James O’Keefe of Project Veritas, who is positively labelled an “American activist”. Veritas is known for releasing audio and video recordings that contain false information designed to discredit academic, political, and service organisations. In one instance, a 2020 video released by O’Keefe accused Democrat Ilhan Omar’s campaign of illegally collecting ballots. The same dissembling of distrust applies to Mike Lindell, whose Google subtitle is “CEO of My Pillow”, as well as Sidney Powell, who is listed as an “American lawyer”; both are propagators of conspiracy theories relating to the 2020 presidential election.

The subtitles attributed to conspiracists on Google do not acknowledge the widescale public awareness of the negative role these individuals play in spreading conspiracy theories or causing harm to others. Some of the selected conspiracists are well-known white nationalists, including Stefan Molyneux, who has been banned from social media platforms like Twitter, Twitch, Facebook, and YouTube for the promotion of scientific racism and eugenics; however, he is neutrally listed on Google as a “Canadian podcaster”. In addition, Laura Loomer, who describes herself as a “proud Islamophobe”, is listed by Google as an “Author”. These subtitles can pose a threat by normalising individuals who spread conspiracy theories, sow dissension and distrust in institutions, and cause harm to minority groups and vulnerable individuals. After clicking on a selected person, the results, although influenced by the algorithm, did not provide information that aligned with the associated subtitle. Rather, the search results skew toward the actual conspiratorial nature of the individuals and the associated news articles. In essence, the subtitles do not reflect the subsequent search results, and provide a counter-labelling to the reality of the information presented to the user.

Another significant example is Jerad Miller, who is listed as an “American performer” despite being the Las Vegas shooter who posted anti-government and white nationalist 3 Percenters memes on his social media (SunStaff); the majority of search results connect him to the mass shooting he orchestrated in 2014. The subtitle “performer” is certainly not the common characteristic that should be associated with Jerad Miller.

Table 1 in the Appendix shows that individuals who are not within the contemporary milieu of conspiracists, but who have had a significant impact, such as Nesta Webster, Robert Welch Junior, and John Robison, were listed by their original profession or sometimes without a subtitle. David Icke, infamous for his lizard people conspiracies, has a subtitle reflecting his past football career. In all cases, Google’s subtitle was never consistent with the actor’s conspiratorial behaviour. Indeed, the neutral subtitles applied to conspiracists in our research may reflect some aspect of the individuals’ previous careers, but they are not an accurate reflection of the individuals’ publicly known role in propagating hate, which we argue is misleading to the public. For example, David Icke may be a former footballer, but the 4.7 million search results predominantly focus on his conspiracies, his public fora, and his deplatforming by mainstream social media sites. The subtitles are not only neutral; they are also not based on the actual search results, and so mislead the searcher about what they will discover. Most importantly, they do not provide a warning about the misinformation concealed by the autocomplete subtitle.

To conclude, algorithms automate the search engines that people use in the functions of everyday life, but they are also entangled in technological errors and algorithmic bias, and have the capacity to mislead the public. Through a process of reverse engineering (Ashby; Bucher), we searched 37 conspiracy theorists to decode the Google autocomplete algorithms. We found that the subtitles attributed to conspiracy theorists are neutral or positive, but never negative, which does not accurately reflect the widely known conspiratorial discourse these individuals propagate on the Web. This is problematic because the algorithms that determine these subtitles are invisible infrastructures acting to misinform the public and to mainstream conspiracies within larger social, cultural, and political structures. This study highlights the urgent need for Google to review the subtitles attributed to conspiracy theorists, terrorists, and mass murderers, to better inform the public about the negative nature of these actors, rather than always labelling them in neutral or positive ways.

Funding Acknowledgement 

This project has been made possible in part by the Canadian Department of Heritage – the Digital Citizen Contribution program – under grant no. R529384. The title of the project is “Understanding hate groups’ narratives and conspiracy theories in traditional and alternative social media”.

References

Ashby, W. Ross. An Introduction to Cybernetics. Chapman & Hall, 1961.

Baker, Paul, and Amanda Potts. "‘Why Do White People Have Thin Lips?’ Google and the Perpetuation of Stereotypes via Auto-Complete Search Forms." Critical Discourse Studies 10.2 (2013): 187-204.

Benjamin, Ruha. Race after Technology: Abolitionist Tools for the New Jim Code. Polity, 2019.

Bucher, Taina. If... Then: Algorithmic Power and Politics. OUP, 2018.

Broussard, Meredith. Artificial Unintelligence: How Computers Misunderstand the World. MIT P, 2018.

Christin, Angèle. "The Ethnographer and the Algorithm: Beyond the Black Box." Theory and Society 49.5 (2020): 897-918.

D'Ignazio, Catherine, and Lauren F. Klein. Data Feminism. MIT P, 2020.

Dörr, Dieter, and Juliane Stephan. "The Google Autocomplete Function and the German General Right of Personality." Perspectives on Privacy. De Gruyter, 2014. 80-95.

Eilam, Eldad. Reversing: Secrets of Reverse Engineering. John Wiley & Sons, 2011.

Epstein, Robert, and Ronald E. Robertson. "The Search Engine Manipulation Effect (SEME) and Its Possible Impact on the Outcomes of Elections." Proceedings of the National Academy of Sciences 112.33 (2015): E4512-E4521.

Garry, Amanda, et al. "QAnon Conspiracy Theory: Examining its Evolution and Mechanisms of Radicalization." Journal for Deradicalization 26 (2021): 152-216.

Gillespie, Tarleton. "Algorithmically Recognizable: Santorum’s Google Problem, and Google’s Santorum Problem." Information, Communication & Society 20.1 (2017): 63-80.

Google. “Update your Google knowledge panel.” 2022. 3 Jan. 2022 <>. 

Gran, Anne-Britt, Peter Booth, and Taina Bucher. "To Be or Not to Be Algorithm Aware: A Question of a New Digital Divide?" Information, Communication & Society 24.12 (2021): 1779-1796.

Gray, Judy H., and Iain L. Densten. "Integrating Quantitative and Qualitative Analysis Using Latent and Manifest Variables." Quality and Quantity 32.4 (1998): 419-431.

Gray, Kishonna L. Intersectional Tech: Black Users in Digital Gaming. LSU P, 2020.

Karapapa, Stavroula, and Maurizio Borghi. "Search Engine Liability for Autocomplete Suggestions: Personality, Privacy and the Power of the Algorithm." International Journal of Law and Information Technology 23.3 (2015): 261-289.

Krasmann, Susanne. "The Logic of the Surface: On the Epistemology of Algorithms in Times of Big Data." Information, Communication & Society 23.14 (2020): 2096-2109.

Krippendorff, Klaus. Content Analysis: An Introduction to Its Methodology. Sage, 2004.

Noble, Safiya Umoja. Algorithms of Oppression. New York UP, 2018.

O'Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown, 2016.

Pasquale, Frank. The Black Box Society. Harvard UP, 2015.

Robertson, Ronald E., David Lazer, and Christo Wilson. "Auditing the Personalization and Composition of Politically-Related Search Engine Results Pages." Proceedings of the 2018 World Wide Web Conference. 2018.

Staff, Sun. “A Look inside the Lives of Shooters Jerad Miller, Amanda Miller.” Las Vegas Sun 9 June 2014. <>.

Webb, Amy. The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity. Hachette UK, 2019.


Table 1: The subtitles of conspiracy theorists on Google autocomplete

Conspiracy Theorist | Google Autocomplete Subtitle | Character Description
Alex Jones | American radio host | InfoWars founder, American far-right radio show host and conspiracy theorist. The SPLC describes Alex Jones as "the most prolific conspiracy theorist in contemporary America."
Barry Zwicker | Canadian journalist | Filmmaker who made a documentary claiming fear was used to control the public after 9/11.
Bart Sibrel | American producer | Writer, producer, and director of work that falsely claims the Apollo moon landings between 1969 and 1972 were staged by NASA.
Ben Garrison | American cartoonist | Alt-right and QAnon political cartoonist.
Brittany Pettibone | American writer | Far-right political vlogger on YouTube and propagator of #pizzagate.
Cathy O’Brien | American author | Claims she was a victim of a government mind-control project called Project Monarch.
Dan Bongino | American radio host | Stakeholder in Parler, radio host, former Secret Service agent, and conspiracist (Spygate, MAGA election fraud, etc.).
David Icke | Former footballer | Reptilian humanoid conspiracist.
David Wynn Miller | (No subtitle) | Conspiracist, far-right tax protester, and founder of the Sovereign Citizens Movement.
Jack Posobiec | American activist | Alt-right, alt-lite political activist, conspiracy theorist, and Internet troll. Editor of Human Events Daily.
James O’Keefe | American activist | Founder of Project Veritas, a far-right company that propagates disinformation and conspiracy theories.
John Robison | | Foundational Illuminati conspiracist.
Kevin Annett | Canadian writer | Former minister and writer, who wrote a book exposing atrocities against Indigenous communities, and is now a conspiracist and vlogger.
Laura Loomer | Author | Far-right, anti-Muslim conspiracy theorist and Internet personality. Republican nominee in Florida’s 21st congressional district in 2020.
Marjorie Taylor Greene | United States Representative | Conspiracist, QAnon adherent, and U.S. representative for Georgia’s 14th congressional district.
Mark Dice | American YouTuber | Right-wing conservative pundit and conspiracy theorist.
Mark Taylor | (No subtitle) | QAnon minister and self-proclaimed prophet of Donald Trump, the 45th U.S. President.
Michael Chossudovsky | Canadian economist | Professor emeritus at the University of Ottawa, founder of the Centre for Research on Globalization, and conspiracist.
Michael Cremo (Drutakarmā dāsa) | American researcher | Self-described Vedic creationist whose book, Forbidden Archeology, argues humans have lived on Earth for millions of years.
Mike Lindell | CEO of My Pillow | Business owner and conspiracist.
Neil Patel | English entrepreneur | Founded The Daily Caller with Tucker Carlson.
Nesta Helen Webster | English author | Foundational Illuminati conspiracist.
Naomi Wolf | American author | Feminist turned conspiracist (ISIS, COVID-19, etc.).
Owen Benjamin | American comedian | Former actor/comedian, now a conspiracist (Beartopia), banned from mainstream social media for using hate speech.
Pamela Geller | American activist | Conspiracist, anti-Islam blogger, and host.
Paul Joseph Watson | British YouTuber | InfoWars co-host and host of the YouTube show PrisonPlanetLive.
QAnon Shaman (Jake Angeli) | American activist | Conspiracy theorist who participated in the 2021 attack on the U.S. Capitol.
Richard B. Spencer | (No subtitle) | American neo-Nazi, antisemitic conspiracy theorist, and white supremacist.
Rick Wiles | (No subtitle) | Minister and founder of the conspiracy site TruNews.
Robert W. Welch Jr. | American businessman | Founded the John Birch Society.
Ronald Watkins | (No subtitle) | Administrator of 8kun.
Serge Monast | | Creator of the Project Blue Beam conspiracy.
Sidney Powell | (No subtitle) | One of former President Trump’s lawyers, and a renowned conspiracist regarding the 2020 presidential election.
Stanton T. Friedman | Nuclear physicist | Original civilian researcher of the 1947 Roswell UFO incident.
Stefan Molyneux | Canadian podcaster | Irish-born, Canadian far-right white nationalist, podcaster, blogger, and banned YouTuber who promotes conspiracy theories, scientific racism, eugenics, and racist views.
Tim LaHaye | American author | Founded the Council for National Policy, led in the Moral Majority movement, and co-authored the Left Behind book series.
William Guy Carr | Canadian author | Illuminati and World War III conspiracist.

Google searches conducted as of 9 October 2021.

Author Biographies

Ahmed Al-Rawi, Simon Fraser University

Ahmed Al-Rawi (Ph.D. University of Leicester) is an assistant professor in the School of Communication at Simon Fraser University, where he also manages The Disinformation Project. His research interests include social media, news, and global communication with an emphasis on Canada and the Middle East. 

Carmen Celestini, Simon Fraser University

Carmen Celestini (Ph.D. University of Waterloo) is a researcher of disinformation with the Disinformation Project at Simon Fraser University. Her research interests include conspiracy, apocalyptic thought, extremism, and nationalism/populism.

Nicole Stewart, Simon Fraser University

Nicole K. Stewart (M.A. Simon Fraser University) is a doctoral candidate in the School of Communication and a researcher with The Disinformation Project at Simon Fraser University. Her research interests include platforms, digital skills, new media, and communication technology.

Nathan Worku, Simon Fraser University

Nathan Worku is a Master of Public Health student at Simon Fraser University where he works as a part of The Disinformation Project. His research interests include health equity, health communication, and knowledge translation.