Work with teams to build tech, ethically. Born in Pakistan, based in London, working in Rwanda.
As our lives become mediated by screens, our behaviours become shaped more and more by their user interface. Here, I want to talk about what it means for a user interface - or UI - to be ethical.
The user interface brings a product's features together in one place where you can access them. And it channels and manipulates your behaviour. Both these things bring huge value to users, and both bring ethical questions for us to think through.
To start thinking through them, let's start with this old UX Hero article, called Quality is Fractal (2009).
Its main point is this: Windows 7 has a terrible interface, and one of the main reasons is the lack of paths through it.
This sentiment has defined 'good UI' for the last 10 years. Minimal design. Lots of handrails. Fewer options. Recognising Kierkegaard's old maxim that anxiety is the dizziness of freedom, user interfaces optimise to make you less dizzy by giving you less choice. In 2000, researchers showed that shoppers were more likely to buy a jar of jam when presented with fewer varieties. So it's logical that in the virtual world, fewer options also means less paralysis.
Minimalism is what makes UI slick. Gone are the 50+ icons of the Windows Control Panel. Instead, here are the five (or three, or one) things you really need to know or decide. Less cognitive overload. More simplicity, ease, and joy.
It can come across as an objectively good thing, but there's a flipside to this minimalism. It's this: as UI becomes minimal, it becomes opinionated. The interface - and the people that built it - decide what you see and set the bounds within which you can act. All UI channels behaviour. The more minimal the UI, the stronger the manipulation.
I've argued before that any tech that claims to make people's lives better should give them more agency over their lives. For all it might be overwhelming, the Windows Control Panel of the 2000s assumes competence and tells the user: "here's everything on offer, do what you want". It gives you your freedom, even as it makes your life harder.
Now, here's a personal story about what happens when a company doesn't give you everything on offer.
A few months ago, I was the victim of a social engineering hack on WhatsApp. No long-term damage, but it was hugely stressful. Later, I realised that the whole thing could have been avoided if I'd set a PIN. Why hadn't I? Because the option was buried, four clicks from the home screen. I didn't know setting a PIN was something I could do.
Now, say WhatsApp had the 'set PIN' option on the screen you see when you load up the app. A lot of people might think it clutter. But at least they'd know the option was there. And I'd wager most of them would, at some point, decide to use it.
I don't know why WhatsApp doesn't bring 'set PIN' to the front of the app. Maybe a PIN would stop people from spending as much time on the app, which runs directly counter to their parent company's business model¹? Maybe the tech-savvy designer just doesn't think the PIN is an important option, because who on earth would ever fall prey to a hack? Or maybe designers see users not setting PINs, conclude that they don't want to, and so keep the option buried. Not considering that people don't set PINs because the option isn't visible². Or maybe it is just a commitment to a de-cluttered look and feel.
All this brings us to our central question. When UI aims to minimise your options, who decides what those options are? What are their incentives, their biases, their reasoning? In short: who is channeling your behaviour?
So there are two ways UI can make people's lives better. One is by making the interface less paralysing, by giving users fewer options. The other is by making sure users can shape their own experience. Dizziness, or freedom? It's a tricky trade-off, but then that's what ethics in technology often is.
For designers of modern, minimal UI, my wish is that they think less like Kierkegaard, and more like Aristotle. The Greek philosopher's doctrine of the mean asks us to weigh all relevant considerations, without paying too much or too little heed to any one of them. If you're a judge, find the middle ground between being too sceptical and too credulous.
And if you're a designer: find the middle ground between minimalism and clutter. Between overwhelming your users, and taking away their autonomy. In this deliberation, sits the ethical user interface.
¹ Any company that buries privacy and data options, or hides the ability to mute notifications, or promotes "sponsored content" is doing the same thing. Prioritising what makes them money on the interface, over giving you autonomy.
² This hypothetical example is part of the very real problem: 1) people choose what is presented to them. 2) metrics-driven tech companies prioritise that thing on their interface. 3) people keep choosing that thing. 4) an endless feedback loop. Here's an example from Spotify.
If you enjoyed this, you can read the rest of my writing on asadrahman.io
Title image from Wikimedia Commons, the free media repository.