A Comparative Algorithm Audit of Conspiracies on the Net: Abstract and Introduction


This paper is available on arXiv under a CC 4.0 license.


(1) Aleksandra Urman, corresponding author, Department of Informatics, University of Zurich, Switzerland;

(2) Mykola Makhortykh, Institute of Communication and Media Studies, University of Bern, Switzerland;

(3) Roberto Ulloa, GESIS - Leibniz-Institut für Sozialwissenschaften, Germany;

(4) Juhi Kulshrestha, Department of Politics and Public Administration, University of Konstanz, Germany.


Web search engines are important online information intermediaries that are frequently used and highly trusted by the public, despite ample evidence that their outputs are subject to inaccuracies and biases. One form of such inaccuracy, which has so far received little scholarly attention, is the presence of conspiratorial information, namely pages promoting conspiracy theories. We address this gap by conducting a comparative algorithm audit examining the distribution of conspiratorial information in search results across five search engines: Google, Bing, DuckDuckGo, Yahoo and Yandex. Using a virtual agent-based infrastructure, we systematically collect search outputs for six conspiracy theory-related queries (“flat earth”, “new world order”, “qanon”, “9/11”, “illuminati”, “george soros”) across three locations (two in the US and one in the UK) and two observation periods (March and May 2021). We find that all search engines except Google consistently displayed conspiracy-promoting results and returned links to conspiracy-dedicated websites in their top results, although the share of such content varied across queries. Most conspiracy-promoting results came from social media and conspiracy-dedicated websites, while conspiracy-debunking information was shared by scientific websites and, to a lesser extent, legacy media. The consistency of these observations across locations and time periods highlights the possibility that some search engines systematically prioritize conspiracy-promoting content and thus amplify its distribution in online environments.


Web search engines (SEs) are crucial information gatekeepers in contemporary high-choice information environments (Van Aelst et al., 2017) with internet users turning to them on a daily basis (Urman and Makhortykh, 2021). At the same time, as demonstrated by a mounting body of evidence, search results can be inaccurate or biased (Kay et al., 2015; Kulshrestha et al., 2017; Makhortykh et al., 2020; Noble, 2018; Otterbacher et al., 2017). Still, search outputs are highly trusted by people and can influence their opinions on matters ranging from commercial brands to elections (e.g., Fisher et al., 2015; Nichols, 2017). Thus, malperformance of SEs can cause societal problems by leading, for example, to the spread of misinformation or of racial stereotypes (Noble, 2018).

While explorations of bias in search results are increasingly common (see below), other forms of SE malperformance, in particular those related to inaccurate search results, remain under-studied, with a few notable exceptions (Bernstam et al., 2008; Bradshaw, 2019; Cooper and Feder, 2004). Unlike biased outputs, which tend to disproportionately amplify a particular point of view - e.g., by associating modern technology with Whiteness (Makhortykh et al., 2021a) - inaccurate outputs contain factually incorrect information (e.g., that the Earth is flat). Consequently, inaccurate outputs have a higher potential for misinforming the users of SEs, which in some cases can pose a threat to their individual well-being as well as to society. This is particularly true for outputs promoting conspiracy theories[1], which, unlike other forms of incorrect or biased search outputs, have so far received meager attention from the scholarly community. As shown by the ongoing COVID-19 crisis (European Commission, 2021), conspiracy theories diminish trust in authorities and the scientific community, which can undermine societal cohesion and lead to radicalization, particularly in times of crisis.

In this paper, we address the above-mentioned gap by investigating the presence of content promoting conspiracy theories in web search results through a systematic comparative algorithm impact audit. We rely on a virtual agent-based infrastructure to systematically collect search outputs for six conspiracy theory-related queries on the five most popular SEs across three locations and two waves (in March and in May 2021). Of the six queries, three correspond to specific conspiracy theories (“flat earth”, “new world order”, “qanon”) and are likely to be used by users interested in the respective theories. The other three broadly refer to subjects around which many conspiracy theories circulate (“9/11”, “illuminati”, “george soros”) and may be used by users broadly interested in related topics without a specific interest in conspiracy theories. We then conduct a qualitative analysis of all retrieved results to establish their stance on conspiracy theories (e.g., promoting/debunking) and their sources (e.g., social media or scientific websites), and compare our observations across locations and time periods. With this paper, we contribute, first, to the body of research on the spread of conspiracy theories through online platforms by analyzing their presence in web search results, which had not been studied in this context before; and second, to the literature on algorithm auditing and the quality of information provided by web search engines.
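The audit design described above is a full cross of engines, queries, locations, and waves. A minimal sketch of how that grid could be enumerated is shown below; the location labels and dictionary structure are illustrative assumptions, and the browser-automation details of the virtual agents are omitted entirely.

```python
from itertools import product

# Study parameters as reported in the paper. The location labels are
# assumptions for illustration (the paper specifies only "two in the US
# and one in the UK").
QUERIES = ["flat earth", "new world order", "qanon",
           "9/11", "illuminati", "george soros"]
ENGINES = ["google", "bing", "duckduckgo", "yahoo", "yandex"]
LOCATIONS = ["us-location-1", "us-location-2", "uk"]
WAVES = ["2021-03", "2021-05"]  # March and May 2021 observation periods

def audit_grid():
    """Yield every engine/query/location/wave cell a virtual agent would cover."""
    for engine, query, location, wave in product(ENGINES, QUERIES, LOCATIONS, WAVES):
        yield {"engine": engine, "query": query,
               "location": location, "wave": wave}

cells = list(audit_grid())
print(len(cells))  # 5 engines x 6 queries x 3 locations x 2 waves = 180 cells
```

Enumerating the grid explicitly makes the comparative claims checkable: any difference observed between engines holds the query, location, and wave constant, since each engine is queried under identical conditions in every cell.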

The rest of the paper is organized as follows: we first review the state of research on inaccurate and biased information in web search and on conspiracy theories online. Then, we build on this review to formulate concrete research questions and describe the methodology of our study in detail. Finally, we summarize our results and discuss their implications as well as the limitations of the current research.

[1] We adhere to the formal definitions of “conspiracy theory” provided by the Merriam-Webster dictionary: “a theory that explains an event or set of circumstances as the result of a secret plot by usually powerful conspirators”; “a theory asserting that a secret of great importance is being kept from the public” (Definition of CONSPIRACY THEORY, n.d.).