A Comparative Algorithm Audit: Conspiracy Theories in the Online Environments

by @browserology

Too Long; Didn't Read

A comparative algorithm audit of the distribution of conspiratorial information in search results across five search engines.

This paper is available on arXiv under a CC 4.0 license.


(1) Aleksandra Urman, corresponding author, Department of Informatics, University of Zurich, Switzerland;

(2) Mykola Makhortykh, Institute of Communication and Media Studies, University of Bern, Switzerland;

(3) Roberto Ulloa, GESIS - Leibniz-Institut für Sozialwissenschaften, Germany;

(4) Juhi Kulshrestha, Department of Politics and Public Administration, University of Konstanz, Germany.

Conspiracy theories in online environments

Multiple studies suggest that the rise of online platforms has been conducive to the broader circulation and proliferation of conspiracy theories (Stano, 2020). While this suggestion has been criticized (e.g., by arguing that conspiracy theories can convince only those who already have predispositions to believe in them, Uscinski et al., 2018), even critics agree that the internet amplifies the spread of conspiratorial information and facilitates exposure to it, even for individuals who do not hold conspiratorial ideas (Uscinski et al., 2018).

While the spread of conspiratorial content by itself is unlikely to make society as a whole believe in conspiracy theories, it is nevertheless concerning, as it can increase belief in conspiracy theories among those parts of the population that have relevant predispositions. The spread of conspiratorial content might not lead to striking increases in the share of people who believe in conspiracy theories (Douglas et al., 2019), but it can have other problematic societal consequences, such as decreasing prosocial behaviour and acceptance of science (van der Linden, 2015).

Broad circulation and “normalization” of conspiratorial information (Aupers, 2012) in online environments have prompted increased scholarly interest in the phenomenon (for an overview, see Douglas et al., 2019). Over the last decade, a range of studies has explored the ways conspiracy theories are spread and discussed online (e.g., Bessi et al., 2015; Harambam, 2021; Lewandowsky et al., 2013; Mahl et al., 2021; Metaxas and Finn, n.d.; Mohammed, 2019; Röchert et al., 2022; Samory and Mitra, 2018; Uscinski and Parent, 2014; Wood and Douglas, 2013). Most of this research has focused on social media platforms, which are deemed a favourable environment for the spread of conspiratorial content (Stano, 2020): not only do the (conspiracy theory-sharing) users cluster together there (Bakshy et al., 2015), but false information also tends to spread rapidly on social media (Vosoughi et al., 2018). Despite the importance of social media-focused research on how conspiracy theories are spread, we suggest that other online information retrieval and curation platforms - such as search engines (SEs) - can contribute to the proliferation of conspiracy theories online. At the same time, to our knowledge, no systematic analysis of the presence of conspiratorial information in web search results has been conducted yet[2], and we aim to address this gap.

[2] One recent study examined the proliferation of conspiracy theories related to COVID-19 in Google’s autocomplete suggestions (Houli et al., 2021), but not search results.