
What About The Illusion Of Choice?

by Diego Lopez Yse, September 20th, 2020

Too Long; Didn't Read

In a deterministic world, digital users become more predictable and monetizable than ever. Companies can exploit the fact that human behaviour is hardwired to choose the path of least resistance: designers can build choice environments that make it easier for people to pick what the designer wants. A nudge is any aspect of the choice architecture that alters people’s behaviour in a predictable way without forbidding any options or significantly changing their economic incentives. Dark patterns are subtle ploys many companies use to manipulate you into buying or signing up for something, or into disclosing personal or financial details.


Do you think your actions are the result of your own free choices? What if those actions are the inevitable and necessary consequence of antecedent states of affairs? What does this mean for your free will?

In a deterministic world where there’s a single, fixed future for all our actions, digital users become more predictable and monetizable than ever. In fact, by using creative designs and deceptive strategies, companies can build such deterministic worlds and exploit the fact that human behaviour is hardwired to choose the path of least resistance.

Site and app design becomes highly relevant in this scenario, because if you know how people think, you can design choice environments that make it easier for people to choose what you want. Many companies collect and analyse an enormous amount of information about user behaviour in order to refine what are called “choice architectures”: discrete design elements intended to influence human behaviour through how decisions are presented.

There are choice architectures all around you, and they are never neutral: they always influence user behaviour, even when they fail to accomplish their objective or there’s no explicit strategy behind them.

Nudging

“Nudging” refers to how users can be driven towards making certain choices by appealing to psychological biases. A nudge is any aspect of the choice architecture that alters people’s behaviour in a predictable way without forbidding any options or significantly changing their economic incentives. It represents a small feature of the environment that alters people’s behaviour in a non-coercive way.

It’s true that choice architectures influence behaviour by default (even when there’s no apparent intention behind them), so a nudge is best understood as an intentional attempt to influence choice.

(Example of nudging: the pre-selected amount is suggested as the ‘right’ one. Source: The Internet Patrol)
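To make the mechanism concrete, here is a minimal sketch (in TypeScript, with hypothetical names and values, not taken from any real product) of how a pre-selected default works as a nudge: every option remains available, but inaction leads straight to the suggested one.

```typescript
// Minimal sketch of a default-option nudge (hypothetical names and amounts).
// All amounts remain selectable; the nudge is only the pre-selected default.

interface DonationChoice {
  amounts: number[];      // every option stays available (no choice is forbidden)
  defaultAmount: number;  // the "suggested" option the form pre-selects
}

const donationForm: DonationChoice = {
  amounts: [5, 10, 25, 50],
  defaultAmount: 25, // set higher than most users would pick spontaneously: the nudge
};

// If the user never interacts with the selector, the default goes through.
function selectedAmount(userPick: number | undefined, form: DonationChoice): number {
  return userPick ?? form.defaultAmount;
}

console.log(selectedAmount(undefined, donationForm)); // 25 — path of least resistance
console.log(selectedAmount(5, donationForm));         // 5  — still freely available
```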

Nudges work without limiting the user’s original set of choices, and can produce positive results like:

Improving health care. Sending people a simple reminder to schedule a dental check-up proved to double the rate of people who signed up for an appointment.
Improving financial decisions. Sending students a few personalized text messages helped many of them remember to refile their application for student aid.
Increasing choices that benefit others. Using an opt-out system for organ donations, where people are automatically registered as organ donors, can significantly increase the number of donors.

Whether through reminders, personalized notifications, rewards or default settings, nudges can steer people to make better choices. But the question is: better for whom? What if there were opposing interests behind the act of driving users’ behaviour?

The dark side

There are tricks that go beyond nudges to influence decision making and cause users to do things they would not otherwise do. Dark patterns are subtle ploys many companies use to manipulate you into doing something you didn’t intend, such as buying a product, signing up for a service, or disclosing personal or financial details. A design that delivers the best conversion rate might not be the one that delivers the best user experience.

(Example of a dark pattern applied to hide extra costs. Source: Shopify)

Dark patterns may exploit timing to make it harder for users to be fully in control of their faculties during a key decision moment. In other words, the moment when you see a notification can determine how you respond to it. Or whether you even notice it.

Dark patterns also leverage a very human characteristic: we’re lazy by design, so adding friction is always an effective strategy. Designs that require many more clicks, taps and interactions discourage users from engaging with that content.

(In this example, the user must actively unselect the extra product; otherwise it will appear in the checkout. Source: Econsultancy)
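A second sketch, again with hypothetical names and prices, shows how this “sneak into basket” pattern works: the add-on flag defaults to checked, so doing nothing means paying for the extra product.

```typescript
// Sketch of a pre-checked add-on ("sneak into basket") dark pattern.
// Names and prices are hypothetical; the point is the opt-out default.

interface CartItem {
  name: string;
  price: number;
}

// The checkbox for the add-on ships pre-checked: doing nothing means buying it.
function buildCart(mainItem: CartItem, addOn: CartItem, addOnChecked = true): CartItem[] {
  return addOnChecked ? [mainItem, addOn] : [mainItem];
}

const phone: CartItem = { name: "Phone", price: 499 };
const insurance: CartItem = { name: "Insurance", price: 79 };

// User never touched the checkbox: the extra cost slips into checkout.
console.log(buildCart(phone, insurance).reduce((sum, i) => sum + i.price, 0)); // 578

// Only by actively unchecking does the user avoid the charge.
console.log(buildCart(phone, insurance, false).reduce((sum, i) => sum + i.price, 0)); // 499
```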

The site Darkpatterns.org details some of these deceptive mechanisms, like the “roach motel” (a design that makes it simple to sign up but hard to cancel), “disguised ads” (ads masquerading as content that isn’t trying to sell you something), and “privacy Zuckering” (named after Facebook CEO Mark Zuckerberg: the trick of getting you to share more data than you intended).

These strategies are everywhere on the internet, and although some countries try to regulate this behaviour, the truth is that not all governments seem to be taking action:

Google has been accused of using manipulative design practices to trick users into clicking on ads, making it nearly impossible to tell ads apart from organic search results.
TurboTax, a US software service for filing taxes online, has been accused of making it ridiculously complicated to find the actual free part of the service, and even of hiding it from Google results.
Microsoft has also been accused of using dark patterns to push users into opening cloud accounts, making the offline-account option harder to find.

As digital users, we are in a very hard spot. The fact that many US banking sites are harder to read than Moby-Dick (leaving 58% of US bank content unreadable for the average American), and that about 11% of retail websites contain dark patterns, reveals that we are continuously under siege. Is there any way out?

Final thoughts

Manipulation through behavioural techniques can occur quietly and leave no trace. Since companies can drive customers’ decisions through heavy analytics and user interfaces, it’s easy to imagine a digital future in which social platforms employ algorithms to predict the virality and monetizability of each post, only accepting or highlighting the ones that could generate sufficient revenue.

Companies are intensely focused on testing and experimenting with different techniques to get the most desirable responses, and since they have become experts in the discipline, it seems hard to avoid being deceived. The good news is that education is a powerful tool, and by knowing some of their strategies and trying to be aware of our cognitive biases, we might be able to sidestep some of the traps.

Sites like Darkpatterns.org or Princeton’s Web Transparency project have plenty of examples of dark patterns and other great material. I suggest you take some time to go through them; it will pay off. You can also explore @darkpatterns or use the hashtag #darkpatterns on Twitter to call out what you’ve found or discover what others have found. Social media engagement is a good way to put pressure on companies to stop using these practices.

If you can detect deceptive patterns, you’re more likely to avoid them.

Interested in these topics? Follow me on LinkedIn or Twitter