Background and Related Work on Dark Patterns in UX Design

Written by feedbackloop | Published 2024/01/16

TL;DR: This article unveils a comprehensive exploration of dark patterns in UX design, offering insights from co-design workshops and a technology probe study. From Brignull's coining of the term to the formation of a SIG at CHI 2023, the study delves into the prevalence and evolving nature of dark patterns. It analyzes efforts to address dark patterns, from designers' ethical considerations to policymaking endeavors. The article emphasizes end users' often-overlooked autonomy and introduces a human-centric intervention approach covering awareness, action, and the integration of web augmentation. Gain valuable insights into user perception, web augmentation, and the ongoing battle against deceptive design choices.

Authors:

(1) Yuwen Lu, who contributed equally to this work, University of Notre Dame, USA;

(2) Chao Zhang, who contributed equally to this work; work done as a visiting researcher at the University of Notre Dame;

(3) Yuewen Yang, work done as a visiting researcher at the University of Notre Dame;

(4) Yaxin Yao, Virginia Tech, USA;

(5) Toby Jia-Jun Li, University of Notre Dame, USA.

Table of Links

Introduction

Background and Related Work

Co-Design Workshops

Technology Probe Study

Results

Scaling Up: A Research Agenda

Limitations and Future Work

Conclusions and References

Appendix

2 BACKGROUND AND RELATED WORK

2.1 Studies of Dark Patterns

Brignull coined the term “dark pattern” (also known as “deceptive design pattern”), which refers to “a user interface that has been carefully crafted with an understanding of human psychology to trick users into doing things that they did not intend to” [38]. Such dark patterns are prevalent—a previous study analyzed 240 popular mobile apps and found that 95% of them contained at least one instance of a dark pattern [26]. They commonly come in a variety of types across web and mobile platforms [44] and in different cultural contexts [48], exploiting users’ attention [43], time [20, 86], money [81], privacy [16], and autonomy in outright or subtle ways [24, 121]. At CHI 2023, a new SIG was formed to combat the growing issue of dark patterns in tech design through research, regulation, and interdisciplinary collaboration [36, 41].

The prevalence and “dark” nature of dark patterns have been documented in a wide range of work published in recent years by HCI and CSCW researchers. Brignull established a site[2] to collect examples of dark patterns and divided them into different types [15]. Gray et al. [38] introduced “dark patterns” as an ethical phenomenon in design and identified five manipulative design strategies: nagging, obstruction, sneaking, interface interference, and forced action. This foundational work has led researchers to uncover dark patterns in games [1, 66, 132], robotics [61], IoT devices [58], and social platforms [83, 85–87, 107], thereby establishing both generic [24, 38] and domain-specific [16, 20, 43, 66, 81] taxonomies of dark patterns. To unify these diverse taxonomies, Mathur et al. [82] proposed six design attributes that characterize dark patterns at a high level of generality. They described how dark patterns modify the information disclosed to users and the underlying choice architecture, helping us disclose manipulative mechanisms and provide targeted alternatives in our study.

Previous work has also examined designers’ perspectives regarding the intents and stakeholder values that lead to “dark” designs [21]. Moreover, studies on user attitudes and perceptions have expanded the dark pattern literature [11, 20, 35], investigating users’ accounts of felt manipulation [35], unintended behaviors [20], and the perceived nuances between dark patterns and “asshole design” [37], foregrounding the need for users to have agency over their online experience [82]. Our work therefore builds on previous efforts to address dark patterns (Section 2.2) and to understand user awareness (Section 2.3), using co-design methods to explore a new intervention approach that empowers users against online manipulation.

2.2 Efforts in Addressing Dark Patterns

Previous work investigated how designers, educators, and regulators can contribute to addressing dark patterns’ adverse influence on end users [11, 21, 35, 77, 101].

From the perspective of designers, a growing number of researchers have called for the incorporation of ethics into the design process [92, 114]. Chivukula et al. [21] revealed that designers, even when sensitive to user values, often harbor tacit “dark” intentions to persuade users in pursuit of stakeholders’ business goals. Academics have therefore proposed design methods to foster better alignment with user values, such as value-centered design [37]. From an education standpoint, Gray et al. [38] encouraged UX professional organizations to build ethical education into the fabric of HCI/UX education and practice. Educators can also offer courses to deepen users’ understanding of dark patterns [26], train users to identify them [76], and increase their resistance through long-term boosts [47, 130]. In terms of policymaking, investigations into how dark patterns harm users (Section 2.1) have opened a space for new policies to be formed. Regulators can implement economic incentives and regulatory interventions to push companies to reduce dark patterns in their services [7, 65, 80]. For example, recently published official reports from the European Commission [23], the European Data Protection Board (EDPB) [8], and the Federal Trade Commission (FTC) [22] specifically outline taxonomies of dark patterns, examples of violations, and opportunities for characterization and governance interventions.

However, most of these efforts have overlooked end users’ autonomy in protecting themselves [2, 13]. End users have strong incentives and the desire to protect themselves from threats in their online experiences, but they often lack the capacity to do so [53, 75, 112, 136]. Meanwhile, dark patterns are shape-shifting and continuously evolving, making it hard to ban them completely through policy.

There is no one-size-fits-all solution. Previous studies have proposed many intervention techniques to counter individual types of dark patterns, such as enforcing consent [45], hiding or disabling elements [57, 75], adding friction [89], and using “bright patterns” [42]. Given the diversity of dark patterns, no single technique can be effective against all of them. Adding to the challenge, users’ preferences for intervention techniques can change as their understandings and perceptions evolve [35, 75]. Therefore, we need to better understand end users’ expectations of interventions and their spontaneous approaches to self-protection; the sketch below illustrates one such technique.
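As a concrete illustration of the “adding friction” technique, the following minimal sketch intercepts a hypothetical one-click purchase button and asks for confirmation before the site’s own handler runs. The `#one-click-buy` selector is an assumption for illustration, not taken from any cited tool.

```typescript
// A minimal sketch of "adding friction": pause a hypothetical
// one-click purchase until the user explicitly confirms it.
const buyButton = document.querySelector<HTMLButtonElement>("#one-click-buy");

buyButton?.addEventListener(
  "click",
  (event) => {
    // window.confirm introduces a deliberate pause before the purchase.
    if (!window.confirm("Confirm this purchase?")) {
      event.preventDefault();
      event.stopImmediatePropagation(); // block the site's own click handler
    }
  },
  { capture: true } // run during the capture phase, before page handlers
);
```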

In this work, inspired by previous studies of user awareness of dark patterns [11, 20, 35, 78] and end-user web augmentation [56, 60, 71, 118, 133], we take a human-centered approach to support end users by disclosing the presence and impact of dark patterns and empowering them to “fix” the undesired ones with pre-defined UI alternatives.

2.3 User Perception of Dark Patterns

Several researchers have conducted empirical studies to understand users’ perception of dark patterns [11, 20, 73, 78]. For example, Gray et al. [35] identified qualitatively supported insights describing end users’ experiences of being manipulated. They found that users broadly sense that something is “off” or “not correct” but lack the ability to precisely describe what drives the feeling of being manipulated [35]. Maier and Harr [78] suggested that users’ perception of dark patterns goes through four stages—impression, assessment, balance, and acceptability. Users first form an impression of a dark pattern, assess its convenience and manipulativeness, balance the trade-off, and then accept or reject it. However, the obscurity of design intents and the abuse of cognitive biases [81] make it difficult for users to comprehensively understand dark patterns (impression), hindering the subsequent assessment and balance processes. We therefore conducted the first-phase co-design workshops, seeking to answer what information users need to make up for this lack of transparency. Based on the findings, we propose the awareness intervention, which aims to empower the end users of an interface to recognize dark patterns (impression), understand the potential effects on their choice architecture and welfare (assessment), and balance the tension between user values and manipulation (balance).

Furthermore, Bongard-Blanchy et al. [11] discovered that awareness of dark patterns does not necessarily confer the ability to oppose their manipulative influence. A single “transparency” intervention may be insufficient to help users counteract dark patterns, implying a dual goal of raising users’ awareness and empowering them to take action. We therefore propose a second intervention, action, which is based on an end-user web augmentation approach.

2.4 Web Augmentation

Web augmentation [100] allows end users to customize existing web interfaces for personalized user experiences. GreaseMonkey[3] is among the earliest browser extensions for managing user scripts that augment websites, many of which adapt web UIs. Since GreaseMonkey requires users to write scripts, it is mostly used by people with programming skills. Later, many low-code and no-code web augmentation tools were designed to lower this technical barrier [10, 18, 71, 104, 133], allowing end users to change websites by directly interacting with UI elements [56, 93, 94, 102] or by replacing components with defined alternatives [33, 60, 64]. This interaction paradigm is easier for end users without programming expertise due to its naturalness [91].
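To make the user-script paradigm concrete, the following is a minimal sketch of a GreaseMonkey-style script that hides a countdown timer, a common urgency pattern. The target site and the `.countdown-banner` selector are hypothetical assumptions for illustration.

```typescript
// ==UserScript==
// @name     Hide urgency countdown (illustrative sketch)
// @match    https://shop.example.com/*
// @run-at   document-idle
// ==/UserScript==

// Hypothetical selector: assumes the site renders its countdown
// timer inside an element with the class "countdown-banner".
const banner = document.querySelector<HTMLElement>(".countdown-banner");

if (banner !== null) {
  // Hide rather than remove the node, so page scripts that still
  // reference it do not throw errors.
  banner.style.display = "none";
}
```

Low-code tools implement essentially the same transformation, but let users select the element visually instead of writing the selector by hand.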

Many of these tools adopt a community-driven approach, in which users share their web augmentations for re-use by others (e.g., GreaseSpot[4] and the Arc Boosts Gallery[5]). However, these communities and their dynamics seem to be under-studied in CSCW, with only a few papers from adjacent research communities [4, 31, 95, 103].

Previous work has investigated the use of web augmentation to address specific dark patterns. For example, Nouwens et al. [100] designed a browser extension, Consent-O-Matic, that automatically responds to consent pop-ups based on the user’s preferences. Kollnig et al. [57] proposed an approach named GreaseDroid, enabling Android users to remove dark patterns from mobile applications with “patches.” While these two technically centered studies demonstrated the feasibility of dark pattern interventions through web/mobile augmentation, they did not fully investigate end users’ needs for such interventions through a user-centered lens. In this work, our two-phase co-design study seeks to complement this technical UI augmentation work by examining end users’ needs, preferences, and expectations for interventions through UI augmentation. Our insights provide inspiration for community creators in designing alternatives to unethical design patterns in the future.
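As a rough illustration of the automatic-response idea, the sketch below watches for a consent dialog being injected into the page and activates its decline option. This is not Consent-O-Matic’s actual implementation, which relies on curated rules for known consent-management platforms; the selectors here are hypothetical placeholders.

```typescript
// Illustrative sketch only: watch the DOM for a consent pop-up and
// decline it on the user's behalf. Assumed markup: a dialog with the
// class "consent-dialog" containing a button whose data-action
// attribute marks the "reject all" choice.
const observer = new MutationObserver(() => {
  const rejectButton = document.querySelector<HTMLButtonElement>(
    '.consent-dialog button[data-action="reject-all"]'
  );
  if (rejectButton !== null) {
    rejectButton.click();  // decline on the user's behalf
    observer.disconnect(); // stop watching once the dialog is handled
  }
});

// Observe the whole document so dynamically injected pop-ups are caught.
observer.observe(document.body, { childList: true, subtree: true });
```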


[3] https://addons.mozilla.org/en-US/firefox/addon/greasemonkey/

[4] https://wiki.greasespot.net/

[5] https://arc.net/boosts

This paper is available on arXiv under a CC 4.0 license.

