Authors:
(1) Yuwen Lu, University of Notre Dame, USA (contributed equally to this work);
(2) Chao Zhang, work done as a visiting researcher at the University of Notre Dame (contributed equally to this work);
(3) Yuewen Yang, work done as a visiting researcher at the University of Notre Dame;
(4) Yaxin Yao, Virginia Tech, USA;
(5) Toby Jia-Jun Li, University of Notre Dame, USA.
The engagement validates the feasibility of our probe (engineering goal). In total, participants created 48,368 action logging entries (mean=3224.53, std=2836.14, min=352, max=9112) and 115 diary note entries (mean=7.67, std=4.84, min=1, max=17). They visited a total of 13,611 distinct web pages (mean=907.4, std=819.88, min=58, max=3220) on browsers instrumented with Dark Pita, where our probe was triggered 4,834 times (mean=322.27, std=316.65, min=28, max=1188). Ten participants (66.7%) visited all three types of online services, four participants (26.7%) visited video streaming platforms and social media platforms, and one participant visited only video streaming platforms over the course of the study. Participants set up UI enhancements for dark pattern instances 280 times (mean=18.67, std=8.93, min=1, max=34). During the two weeks, these UI enhancements were triggered 14,355 times (mean=957, std=1178.48, min=2, max=4621) in total; on average, each UI enhancement was triggered 463.06 times. Two UI enhancements (counterfactual thinking for the discount price on Amazon and reflection for the remaining time on Netflix; Appendix A.4) could not be used successfully because of technical problems caused by website updates on Amazon and Netflix. Fig. 6 shows the daily engagement of participants with our probe. Overall, the logs indicate that most participants were actively engaged with our probe: they visited example websites containing instances of dark patterns, modified the interfaces of these websites with UI enhancements to mitigate the impact of dark patterns, and submitted diary entries.
Our findings from the entry, mid-study, and exit interviews with 15 participants, together with their 115 diary entries, provide key insights (KIs) into end users' reactions and needs regarding awareness of and action against dark patterns (social science goal), and demonstrate the usefulness of our end-user-empowerment intervention approach (engineering goal).
KI1: Providing information about specific instances of dark patterns allows users to gain transferable knowledge. Participants found the information presented about dark patterns on the awareness panel educational. PB12 described the attribute tags as “great information” that helped him better understand the designers' rationale hidden behind dark patterns. PB16 added that Dark Pita helped him recognize “what are the certain things that kind of triggered me into going down a hole”. Equipped with such knowledge, participants developed new perspectives on online services. PB3, PB6, and PB12 mentioned that Dark Pita helped them see more explicitly the large number of dark pattern instances on Twitter, which reduced their trust in the platform. PB6 became more critical of disguised ads on some platforms, and PB10 behaved more cautiously to avoid dark patterns. By seeing the dislike count on YouTube again (restored by a Dark Pita UI enhancement; Appendix A.4), PB7 gained a more comprehensive view of the videos they watch. This finding shows that improved awareness through information disclosure on dark patterns can change users' perception of digital platforms, extending previous survey results on end users' mistrust [35].
Importantly, users were able to transfer their newly learned knowledge of dark patterns to platforms that Dark Pita did not yet support. For example, PB12 planned to investigate “how the similar designs (of dark patterns) might be applied to other interfaces that I use”. PB2, PB5, PB9, and PB12 started to think about dark patterns in mobile apps and wanted to use Dark Pita on their phones. The probe even inspired PB9 to ponder “textual” dark patterns (e.g., misleading and manipulative language on interfaces). In summary, these findings demonstrate that providing information about dark patterns not only raises users' awareness but also inspires them to transfer the learned knowledge to dark patterns on other platforms.
KI2: The capability to modify existing interfaces boosts users' perception of empowerment and autonomy. In our deployment study, 7 participants (46.7%) explicitly mentioned a feeling of empowerment from being able to change interfaces, as they were no longer just passive consumers of decisions made by designers. PB3 noted that “the most empowering was being able to highlight algorithm-recommended content on Twitter... it provided a level of consciousness (during browsing)”. Previous work on dark patterns in HCI and CSCW has primarily regarded end users as passive receivers of these deceptive practices [11, 35]. These findings show the benefit of our end-user-empowerment approach, which treats users as engaged actors in mitigating dark patterns.
Notably, participants emphasized the importance of receiving this support from a third-party tool. PB5, PB6, and PB12 mentioned that some platforms also allow users to make interface changes; for example, on Facebook, users can remove disguised ads and indicate why they are not interested. While this allows users to hide the ad, it also serves the company's business interest; some participants realized this and chose not to use it. PB12 pointed out that users and companies have contradictory goals: users want to see fewer targeted ads, while the company wants to make ads more personalized to generate more revenue. In contrast, a third-party tool like Dark Pita presents no conflict of interest with users, leading to enhanced user trust.
KI3: Users' dynamic goals and usage contexts when using online services determine their desired UI enhancements. Users have diverse goals online: for example, PB10 wanted to reduce their time on Twitter, while PB3 and PB5 did not mind spending more time browsing Twitter. This is in line with our workshop finding on users' perceptions of dark patterns (WF2). In the deployment study, these differences in goals shaped users' choices of UI enhancements.
Even a single user's goals can change across usage scenarios, which in turn changes their choice of UI enhancements. For instance, PB2, PB3, and PB14 separately reported that they did not mind the “video autoplay on hover” feature on the YouTube homepage but disabled it on individual video-watching pages, because the feature was helpful for previewing content on the homepage but distracting and time-wasting when watching individual videos. PB14 also toggled the focus mode on YouTube between “when I want to focus... and when I just want to have fun or just relax... I think they get changed based on what I wanted to do at the time.”
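To make this concrete, below is a minimal, hypothetical sketch of how a context-dependent UI enhancement could be scoped to one usage scenario but not another. The selector, function names, and URL check are illustrative assumptions, not Dark Pita's actual implementation.

```typescript
// Hypothetical content-script sketch: suppress YouTube's hover-preview autoplay only on
// individual video-watching pages, where participants found it distracting, while leaving
// it active on the homepage, where it helps with previewing content.
// The selector below is an assumption; YouTube's real markup may differ.
const AUTOPLAY_PREVIEW_SELECTOR = "ytd-video-preview";

function onWatchPage(): boolean {
  // Watch pages live under /watch; the homepage is "/".
  return window.location.pathname.startsWith("/watch");
}

function applyEnhancement(): void {
  if (!onWatchPage()) return; // keep the feature enabled on the homepage
  document
    .querySelectorAll<HTMLElement>(AUTOPLAY_PREVIEW_SELECTOR)
    .forEach((el) => { el.style.display = "none"; });
}

// YouTube is a single-page app, so re-apply the enhancement as the page mutates.
new MutationObserver(applyEnhancement).observe(document.body, { childList: true, subtree: true });
applyEnhancement();
```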
Users' goals often reflect and shape their personal relationship with a service platform. During our study, the ability to modify dark patterns reminded users of their long-term goals and counteracted impulsive behaviors. In the exit interview, PB12 mentioned that seeing so many dark patterns explicitly marked on Twitter made him want to use the platform less. He described the feeling as “someone nagging you should stop doing this”; although sometimes annoying, he still found it helpful for his long-term goal of reducing Twitter usage. This also relates to prior studies on self-control [6, 49].
Previous discussions have mostly viewed the “darkness” level of deceptive patterns as an objective attribute [38]; this insight extends that narrative by showing the personal and dynamic nature of such “darkness”. End users' goals and usage contexts are dynamic and individualized, so a one-size-fits-all approach, or a designer- or policymaker-initiated approach, cannot fully accommodate them. By offering users a choice of multiple UI enhancements, our approach enables them to customize their intervention for a dark pattern based on individual, contextual preferences.
Our findings offer design implications (DI) for future techniques, strategies, and interfaces of end-user-empowerment interventions for dark patterns (design goal).
DI5: Design non-intrusive, minimally interrupting UI enhancements. Participants greatly appreciated visually non-intrusive UI enhancements that do not interfere with their normal browsing experience. For example, PB15 shared that highlighting disguised ads with thick red borders (Appendix A.4) became annoying, so he chose to directly hide the ads instead (Appendix A.4). In contrast, blocking the previews of recommended videos (Appendix A.4) is a “gentle” way to remove distracting content while avoiding users' fear of missing out on information [75].
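As a rough illustration of this trade-off, the hypothetical sketch below contrasts an attention-grabbing treatment of disguised ads with a quieter one. The selector and function names are illustrative assumptions, not Dark Pita's actual enhancements.

```typescript
// Hypothetical sketch: two treatments of disguised ads in a feed.
// The selector is an assumption about how promoted items might be marked in the DOM.
const AD_SELECTOR = "[data-promoted='true']";

// Intrusive variant: flag each disguised ad with a thick red border
// (the style PB15 eventually found annoying).
function highlightDisguisedAds(): void {
  document.querySelectorAll<HTMLElement>(AD_SELECTOR).forEach((el) => {
    el.style.outline = "4px solid red";
  });
}

// Non-intrusive variant: remove disguised ads from the flow with no added visual noise.
function hideDisguisedAds(): void {
  document.querySelectorAll<HTMLElement>(AD_SELECTOR).forEach((el) => {
    el.style.display = "none";
  });
}
```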
DI6: Provide fine-grained control over the modification of dark patterns. Participants expected future interventions to give them more fine-grained control over dark patterns. For example, PB13 wanted to “control the quantity” of promoted content on Twitter, keeping roughly one-third of it in their feed instead of removing all of it. Similarly, PB2 envisioned a filter to “control content” based on personal interests, i.e., automatically identifying information potentially beneficial to her and removing the rest.
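A minimal, hypothetical sketch of the kind of quantity control PB13 described might look like the following; the selector, names, and default ratio are illustrative assumptions rather than planned Dark Pita features.

```typescript
// Hypothetical sketch: instead of hiding all promoted content, keep a user-chosen fraction.
const PROMOTED_SELECTOR = "[data-promoted='true']"; // assumed marker for promoted items
const KEEP_RATIO = 1 / 3; // user-configurable fraction of promoted content to keep

function thinPromotedContent(keepRatio: number = KEEP_RATIO): void {
  document.querySelectorAll<HTMLElement>(PROMOTED_SELECTOR).forEach((el) => {
    // Hide each promoted item with probability (1 - keepRatio),
    // so roughly a keepRatio share of promoted content remains visible.
    if (Math.random() >= keepRatio) {
      el.style.display = "none";
    }
  });
}

thinPromotedContent();
```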
DI7: Improve transparency and provide global control for UI enhancements. In addition to the “design transparency” of dark patterns, our participants also wanted transparency in the design of the UI enhancements themselves. They expected to see the intentions and mechanisms of these UI enhancements so that they could understand how each one works against dark patterns and select the ones that fit their needs. For example, PB5 and PB15 suggested explaining how Dark Pita calculates the time or money spent shown by the reflection enhancement (Appendix A.4). PB15 also wanted a global control panel and dashboard for all active UI enhancements, where they could quickly get explanations, view status, and change configurations.
DI8: Contemplate the boundary between UI enhancements and dark patterns. UI enhancements themselves usually involve a certain degree of persuasive design or nudging, similar to dark patterns. Based on what we learned from our studies, it is important to align the goals of the user with the goals of the intervention tool (KI2 and KI3) to protect users' welfare (e.g., their private data) [82]. In addition, according to Hansen and Jespersen's framework [46], the dividing line between manipulative and beneficial nudges is transparency, i.e., whether the user can perceive the intentions and means behind the nudge (DI7). Such goal alignments should be clearly explained to users to help them make informed decisions about adopting UI enhancements. With a carefully set boundary between UI enhancements and dark patterns, achieved through goal alignment and transparency, we can meaningfully prevent further manipulation against end users' will.
This paper is available on arXiv under a CC 4.0 license.