This paper is available on arxiv under CC 4.0 license.
Authors:
(1) Aleksandra Urman, Department of Informatics, University of Zurich, Switzerland (corresponding author);
(2) Mykola Makhortykh, Institute of Communication and Media Studies, University of Bern, Switzerland;
(3) Roberto Ulloa, GESIS - Leibniz-Institut für Sozialwissenschaften, Germany;
(4) Juhi Kulshrestha, Department of Politics and Public Administration, University of Konstanz, Germany.
We have analyzed the presence of conspiratorial content in top (desktop) web search results across the five most popular search engines (SEs), three locations, and two time periods for six conspiracy-related queries. Our findings contribute both to the scholarship on inaccuracies and bias in web search and to research on the spread of conspiracy theories online. We also expect our observations to be robust, given that there were only minor differences between locations and observation rounds.
Our analysis demonstrates that, while there are large query-based differences, conspiracy-promoting content routinely surfaces on the first page of results on all SEs except Google. Furthermore, Google’s results also contain the highest proportion of scientific sources debunking the conspiracy theories. We suggest that the absence of conspiratorial information in Google’s results can be attributed to the company actively trying to reduce the share of misinformation and bias in its search outputs (Kayser-Bril, 2020). The company even explicitly covers the issue of conspiratorial content (including a specific example related to the 9/11 conspiracies) in its guidelines for search quality raters (Google, 2021).
Another finding concerns the types of web resources prioritized by different search engines. Apart from Google, all the search engines we examined link to websites dedicated to conspiracy theories, with Yandex having the highest share of links to conspiratorial content. At the same time, most content on all engines (except Yandex) comes from media and reference websites, which tend to contain little conspiracy-promoting content. In line with previous findings (e.g., Douglas et al., 2019), we observe that conspiratorial content is found mainly on dedicated niche websites or social media, but is rare on broad-reaching sites such as media, scientific, and reference websites. The latter observation suggests that one way to curb the dissemination of conspiratorial information on search engines could be to implement domain-based filters de-prioritizing resources known to promote such information, even though the long-term resilience of this solution can be questioned.
Filtering out conspiracy-promoting websites from the top search outputs is important for (at least) two reasons. While it can be assumed that only users who are already interested in conspiratorial content purposefully navigate to the dedicated niche websites (Douglas et al., 2019), the appearance of such websites in top search results, especially for queries that do not denote conspiracy theories per se (e.g., “9/11” or “george soros”), can lead to incidental exposure to conspiracy theories. This is troubling: given people’s high trust in search results (Edelman Trust Barometer, 2021), conspiratorial information encountered in top search results by users merely exploring a topic can induce the formation of conspiratorial beliefs. This might especially be the case if users have limited knowledge of the topic - which is partially implied by their turning to SEs to explore it - as lower knowledge of a subject is associated with stronger belief in related conspiracy theories (Sallam et al., 2020).
In cases where the exposure to conspiratorial content via search engines is not incidental - i.e., when a person already interested in a conspiracy theory searches for related content online (e.g., “flat earth” or “illuminati”) - the appearance of conspiracy-promoting information in top search results is also concerning. If such a person is merely curious about - but does not yet believe in - conspiracy theories, conspiratorial content coming from a highly trusted source can foster the development of conspiratorial beliefs. If that person already, at least partially, believes in a conspiracy theory, their belief can be reinforced by conspiracy-promoting web search results due to high public trust in SE results coupled with confirmation bias (Knobloch-Westerwick et al., 2015; Nichols, 2017; Suzuki and Yamamoto, 2020). Because of the latter, the presence of even a single conspiracy-promoting item in top results might be enough to reinforce conspiracy beliefs.