Privacy is a common concern. While there are severe issues with broken platforms such as Facebook and sneaky techniques such as fingerprinting, other methods are used to manipulate users' choices. It's not uncommon for websites and applications to use dark patterns to trick users into granting their consent.
"Privacy by design" is certainly better than "privacy by policy", as privacy is built into the system at every layer, and protection is enabled by default.
Instead of reacting to a data breach, the architecture proactively protects users by minimizing data collection, lowering the potential damage during an incident.
In addition, users benefit from end-to-end protection, with authentication and encryption everywhere the data travels.
In such user-centric systems, users can either grant or revoke consent to use their data: users own their data.
Websites have to obtain consent to track users.
There are multiple ways to influence the final choice, though. A widespread technique, used by almost all platforms, consists of describing requested permissions in legalese, so users cannot tell whether the request is legitimate.
Technically speaking, it's consent, and it's perfectly legal. Still, I've sometimes struggled to spot the subtle differences between a typical scam and an app's permissions screen.
The details vary from one country to another, but websites are often legally required to provide a way to opt out of tracking.
In practice, the accept button is often given a large font, a prominent color, and a recognizable icon, while the opt-out link is buried below a long block of text and left unstyled, making it very easy to miss.
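The asymmetry can be sketched in a few lines. Here is a hypothetical TypeScript snippet (all names and CSS classes are invented for illustration, not taken from any real banner) that renders the two consent choices with deliberately unequal styling:

```typescript
// Minimal sketch of an asymmetric consent banner.
// Names and CSS classes below are hypothetical, for illustration only.

interface ConsentChoice {
  label: string;
  cssClass: string; // styling hook: prominent vs. deliberately plain
}

// The "accept" path gets a large, colorful, icon-bearing button...
const accept: ConsentChoice = {
  label: "✓ Accept all cookies",
  cssClass: "btn btn-primary btn-lg",
};

// ...while the opt-out is a bare text link buried after long copy.
const optOut: ConsentChoice = {
  label: "manage preferences",
  cssClass: "", // no styling at all: easy to miss
};

// Render a choice as an HTML string, applying styling only when present.
function renderChoice(choice: ConsentChoice): string {
  const attr = choice.cssClass ? ` class="${choice.cssClass}"` : "";
  return `<a href="#"${attr}>${choice.label}</a>`;
}

console.log(renderChoice(accept)); // prominent, styled button
console.log(renderChoice(optOut)); // plain, unstyled link
```

Both links do technically exist, which is why the banner passes a superficial compliance check; the manipulation lives entirely in the presentation.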
While many websites only display a newsletter form at the end of posts and in some frequently visited areas, others trigger a popup inviting you to subscribe as soon as you arrive.
A pattern similar to the one described in the previous section can be used: in the popup, the email input is enormous, with a big button to validate the subscription, and there's no close button.
Instead, when you just want to dismiss the popup, you get a tiny unstyled link with an unclear message such as "No, I do not want to enjoy the latest news." Sometimes the popup even shames users who decline with messages such as "No, I don't want to be smart," a manipulation technique known as confirmshaming.
These dark patterns may seem crude, but they are effective at harvesting email addresses in the short term, which is why so many websites use them.
Some practices with ads are questionable. Many websites display them near or over the content, but the close button is deliberately so small that users trying to dismiss these unwanted windows end up clicking through to the ad instead.
This technique is heavily used to artificially increase impressions, especially with touch screens and mobile devices. It's a striking example of dark patterns, as the user has to close the ad to see the content.
In addition, advertisers and marketers now use appealing terms such as "customized experience" or "personalized ads," as more and more people suspect there could be bad practices behind the word "tracking".
Platforms such as YouTube claim they help fight against fake news by censoring suspicious content. I seriously doubt it.
Their AI collects all kinds of data to make personalized recommendations and, above all, to keep the user on the platform. People susceptible to conspiracy theories and dark thoughts are served even more of the same biased content.
YouTube's algorithms aren't meant to make people think outside the box. Quite the contrary, users get content they already approve of and like, which might be palatable for entertainment but not for political information.
Such platforms would likely amplify misinformation rather than deconstruct it.
I mentioned a few patterns you might already know from other contexts. While websites and platforms did not invent those tricks, technology has taken the scam to an unprecedented scale, automating data abuse and deception.
Even if authorities try to regulate the phenomenon, it is difficult to stop, and dark patterns allow data brokers to circumvent legal duties and harm users' privacy.
First published here.
Featured image by Nick Fewings on Unsplash.