
The Dark Systems Manual

by walo, the underscore. October 9th, 2024

Too Long; Didn't Read

This article examines the interplay between human irrationality and systems design, focusing on how user input validates systems. It highlights the role of pain in guiding decision-making and the psychological factors influencing our acceptance of certain systems over others. Ultimately, it encourages designers to understand these dynamics to create effective, user-friendly systems.


A system is a concert of self-sustaining rules that reduce uncertainty.


Reflect on your earliest interaction with a toilet. Oh shut up. How did you make it work? One decisive action.


But you didn’t build it, did you? You wish. You may not even know what a valve is, but you became a part of the toilet, as you do with every other system you interact with.


Without your input, the system is purposeless. Everything valuable to you exists because of systems, from the weave of your clothes to the oxygen in the air. Your relationship with everything is systemic, even with yourself.


So why do some systems work and others don’t?


Despite our best intentions, sick patients will abandon their medicine, children will hide their vegetables, crime pays quickly, and you’ll return to your terrible ex or habits. Jung hated it, Freud tried to treat it, and most religions try to cure it.


Humans aren’t rational

No sane man takes a shit in his bed and goes to sleep beside it. So why do some systems draw out our submission—and even loyalty—but not others?


Why do we deliberately pay for harmful experiences, yet struggle to accept healthy ones? Why do most African countries still face crippling corruption amidst so much wealth? Why do we stick with the awful job yet never apply for a better one? Why do we complain about our lives and change nothing?


Human irrationality is a response to uncertainty. So when seeking systems for certainty, why do we ignore the healthy options and pledge ourselves to cycles of pain? There must be some hidden forces outside of logic and reason responsible. Darker forces.


And if there are, what are their implications for the systems designer who wants his systems accepted?


Pay attention to the Dark Systems Manual.


The System Creates Itself

A system is incomplete without a willing user’s input. Until someone decides to opt in, you might as well be playing at whatever you’re building. If a system is a concert of self-sustaining rules that reduce uncertainty, you’re at the mercy of the user just as much as they are at yours.


Input validates the system. But put a button in front of people that simply says "Press", and you'll hear all manner of excuses, debates, and protests—with hardly any pressing.


If we took you apart, down to the smallest cell, and reassembled you after, we couldn’t bring you back or create a new you. Different systems require different validating inputs, but they are always fundamental to the system and irreplaceable.


Until a user sees the need, there will be no input, especially if the system is presented upfront. Even when the need is clear, logical, and reasonable, a user may choose not to input or to stall.


For the sake of this piece, we’ll focus on monetary inputs: a user is not in the system till they’ve spent money. Opinions and debates are cute, but money is finite, tangible, and a clear indicator of decision-making.


So, do we create our systems, or do they create themselves?



Think back to the last thing you spent your money on. Did someone need to convince you, or did you convince yourself it was worth paying for? Even if someone talked you into it, it worked because you wanted it to. It wasn’t about features or components.


The systems designer was also limited: in their understanding of you, your needs, and how to reach you. Yet they did.


The product may not even have been satisfactory. It may even have been completely useless to you, or an outright scam. Yet they got your input.


The customer is always right where they need to be.


What inside us pushes us to opt into some systems above others?


Irrationality Rules

The system that gets accepted never considers irrationality a hindrance.


In fact, if we expect that each member of the system will be irrational in their pursuit of certainty, there are fewer variables to be concerned about. It will also help us design more accurately from failure to acceptance.


Certainty is a single destination. By providing multiple routes there, we take control of the journey and dissolve objections upfront.


"But I thought-"


If a system is a concert of self-sustaining rules that reduce uncertainty, the roles of its designer are threefold:


  1. To reduce uncertainty
  2. To establish the self-sustaining rules to achieve (1)
  3. To incorporate these rules into a simple, repeatable experience for the user


And what you get is a button or lever you push to flush your poop far away to a magical land. No education is required. Sit however you like, just do your business inside the bowl, and we’ll handle the rest.
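

If it helps to see those three roles at once, here’s a toy sketch in Python. The Toilet class and everything inside it are made-up, illustrative names rather than anyone’s real design; the point is simply that the self-sustaining rules live inside the system, while the user’s entire experience is one repeatable action.

```python
class Toilet:
    """A toy system: self-sustaining rules hidden behind one simple input."""

    def __init__(self):
        self.tank_full = True  # the system maintains its own readiness

    def flush(self):
        """The user's single, decisive action. Everything else is the system's job."""
        if not self.tank_full:
            return "nothing happens - the rules weren't ready"
        self._open_valve()    # rule: release the tank
        self._refill_tank()   # rule: restore readiness for the next use
        return "waste sent to a magical land"

    # The self-sustaining rules: invisible to the user, essential to the system.
    def _open_valve(self):
        self.tank_full = False

    def _refill_tank(self):
        self.tank_full = True


# The user never learns what a valve is. They press; the system does the rest.
print(Toilet().flush())
```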


If we accept irrationality as a component and not a bug in every system, we arrive at its fundamental trigger:


Pain guides our irrational decision-making

In a world without pain, we are doomed to our ignorance. It’s in studying our pains and the pains of others that we’ve learned, innovated, and developed new systems and technologies for ourselves over generations. And in our daily decision-making, we’re constantly guarding against pain.


Is the system that gets accepted always the one with the least pain?

A Flavour Explosion

There’s nothing more difficult than trying to change someone’s mind.


Once a person is comfortable in certain systems or habits, changing their “quantum state” even in the face of directly contradicting evidence is almost impossible. In those moments, you find just how irrational the mind can be.


You can’t change a mind that doesn’t want to change. A mind that does will hold your hand of its own accord and ask for directions.


I once emailed the author of a book that really improved my business accounting to ask which was harder: selling his book, or getting people to adopt the systems he taught in it.


Unsurprisingly, getting people to pay for the book was the easy part. His years of pain, failure, and frustration were things they wanted to avoid in theory, and paying the cost of the book was an easy way to feel like they were doing something about the problem.


He was, and still is, selling valuable wisdom, but you can only lead a horse to water. There are always more snake oil buyers than sellers, because half-measures are always easier and more attractive. And having someone to blame is far easier than taking responsibility for our own lives.


So we fall for the scams, the cons, and the too-good-to-be-trues—not because we didn’t know any better, but because we didn’t want to. We even scam ourselves for the same reason, like in the case of the book.


Just believe!


Most complain about their status quo yet accept as little change as possible. But if pain guides our irrational decision-making, there will be a turning point.


At this point, the pain is enough to force a mind out of comfort or apathy and into input mode, beginning its interaction with the system in the desired direction. Without that pain, the system does not tick. It is of no relevance to the user, and therefore neither user nor system exists.


Without this particular explosion of pain, everything the system sets out to achieve, everything it is capable of achieving, and even its subsequent feedback loops are futile.


Therefore, the first and most fundamental question in systems design is:


Where must it hurt?

Take 2 pains and call me in the morning.


By locating this pivotal moment and setting up camp around it, your system feeds off its relevance. If pain is the true fuel of any system, the designer must identify which events work best to trigger input and start the loop. These become your self-sustaining rules.


In oppressive systems, the pain is typically artificial, introduced at a crucial point in the user’s life to force their behavior. It’s simply not enough to hope or expect that humans will follow the order or obligations set because they exist. As we’ve seen, even when they are for our own benefit, we still ignore systems.


At what point did you get tired of wearing masks during COVID? Did you sanitize religiously? Did you keep away from all loved ones?


No. At some threshold of inconvenience, you said “fuck it” and opted out.


Apart from where it must hurt, two other crucial factors affect a user’s input:

  • Their psychological/emotional state
  • Their knowledge about the system or its components


The combination of these three factors decides if a user will validate a system through input or not.
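

As a back-of-the-napkin sketch (the weights and threshold below are invented for illustration, not measured from anything), you could picture the decision to opt in as pain crossing a bar that the user’s emotional state and knowledge of the system raise or lower:

```python
def will_opt_in(pain: float, emotional_readiness: float, knowledge: float) -> bool:
    """Toy model: a user provides input when felt pain clears an effective bar.

    All inputs are on a 0..1 scale. The numbers are arbitrary guesses -
    just the essay's three factors expressed as one inequality.
    """
    threshold = 0.6
    # Emotional readiness and knowledge of the system lower the effective bar;
    # apathy and ignorance raise it.
    effective_bar = threshold + 0.2 * (1 - emotional_readiness) + 0.2 * (1 - knowledge)
    return pain >= effective_bar


# A frustrated user who understands the system opts in; a comfortable stranger doesn't.
print(will_opt_in(pain=0.8, emotional_readiness=0.7, knowledge=0.9))  # True
print(will_opt_in(pain=0.3, emotional_readiness=0.2, knowledge=0.1))  # False
```

The exact numbers don’t matter; the shape does: raise the pain, meet the user in the right state, or teach them the system, and input becomes more likely.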


Technology & Systems

In converting self-sustaining rules into a simple experience for a user, there’s only one thing to keep in mind:


That the user is never at any point unnecessarily strained

To begin, opt for preexisting software and systems rather than building from scratch. We’ve come too far as a species to labor on new architecture just to see if a system works. As distinguishing as owning proprietary technology may be, you lose time and labor figuring out integrations and maintenance.


When your system is mature enough to support itself, you can begin to introduce such measures where necessary. While we may *coughs will* introduce artificial pain to guide a user’s behavior, too much baseless discomfort will cause any user to forfeit the process.


Spent 5 years building an ecosystem no one uses.


If your systems operate remotely, you want to ensure the user feels confident that there’s someone available to support them a message away. I’ve seen digital businesses fail just because their users couldn’t get in touch with people when it mattered most.


State your terms upfront and stick to them. A system exists only because users agree to trust it. When users go through the loop and find you trustworthy, they can recommend your system to others.


To reduce the strain on yourself and the user, keep the touchpoints to a minimum, and work hard to underpromise.


Software is only a digital representation of our physical behavior. This means that even if you thought web3 was a place of safety, you can still get scammed there. Nothing manmade is foolproof. The internet is only a digital outlet; human nature remains the same.


Systems design principles remain the same as well. If the hardware or real-world systems aren’t in place, there’s nothing software can do to replace them.


Destroying Systems

The greatest strength of a system is always its greatest weakness. This applies to you too.


Good and evil are merely agendas in a system, perpetrated by the agents and designers maintaining it. If a system no longer serves or represents a user, they have the power to deactivate it at different levels.


And I would've gotten away with it too, if not for you meddling kids.


Fundamentally, because we’ve established that the input of the user is the validator of the system, they hold clear sway over the system as long as they understand its nature, components and the value of their input.


If a system is a concert of self-sustaining rules that reduce uncertainty, each of the three conditions can be altered to destroy it.


First, by creating discord within the components or communications necessary to keep the rules functioning.

Second, by altering the rules so that they no longer allow the feedback loop to continue the system.

Third, by increasing uncertainty in such a way that invalidates the existence of the previous two conditions.


Since systems are so easily created and destroyed, the incorruptible system is therefore the one with the power to corrupt all. So we preserve it, to preserve ourselves.


But we cannot expect perfection in any system. Efficiency and accuracy are more important, because a system’s functional purpose is to keep running.


Go forth, and build…


or destroy!