When you think of the future, what does it look like? Maybe flying cars? Glass cities? Holographic advertisements? Going to school or work in virtual reality? Robots and AI assistants?
You likely didn’t imagine the people behind those innovations.
All of the technology we have today is a result of human invention and ingenuity. We have computers to help, but ultimately, even those tools were created by us. We cannot escape the simple fact that tech is, fundamentally, a human enterprise.
How that human enterprise develops to help or harm us is largely a question of ethics. But on a more individual level, we can start to analyze these ideas by taking a closer look at our biases.
Everyone has biases, no matter how empathetic or caring we try to be. And that’s ok. These quick judgments are built into our brains as shortcuts for managing vast amounts of information. They prevent cognitive overload, make us more efficient, and helped our ancestors survive dangerous situations in our evolutionary past.
But therein lies the flaw. These instincts were built by evolution for survival in a fight-or-flight environment, not for navigating our nuanced and intellectual modern world. In situations where critical thinking and open-mindedness are needed most, this inherent human trait can fail us if we’re not careful. And in our increasingly technical society, these failures, left unchecked, can (and already do) lead to tangible harm.
While we cannot eliminate these biases completely, we can make strides to systematically weed some of them out in our work. Here’s how to do it.
According to Jennifer Eberhardt, Stanford professor and author of “Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do,” the first step in managing your bias is to become aware of the biases you currently have.
But if these biases are unconscious, how can we possibly become more aware of them?
Luckily, researchers at Harvard University have already thought of that: their Project Implicit initiative offers a free test to uncover your implicit biases. The test is split into categories including (but not limited to) religion, age, race, sexuality, and disability, with the intent of breaking your biases down into manageable pieces.
As part of the assessment, you have the option to report your attitudes and beliefs about the category before taking the test. These responses are anonymized and help the researchers understand the kinds of people willing to self-reflect on these issues. Even if you do not wish to share that information with the researchers, it can serve as a useful baseline to compare against your results. I recommend taking your own notes if you feel comfortable.
You may find with this test that your conscious opinion of a subject does not match your results. If this is the case, don’t be alarmed; this is exactly the purpose of the test. We all grow up exposed to various viewpoints, and our implicit biases might not always match what we consciously believe.
Remember: doing this at all is already a step in the right direction. Focus on progress.
Now that you know what your biases are, the next step is to catch yourself in the act.
In cognitive behavioral therapy and zen meditation alike, a technique called grounding is used to bring an individual’s attention back to the present moment. When it comes to biased decision-making, grounding deserves a place in your bias-prevention toolkit.
Before you make a decision, stop yourself and take a deep breath. Take that moment to slow down and reflect.
Am I being objective with this decision?
Are there any biases I can identify that might be clouding my judgment?
Are there other options I haven’t yet considered?
These questions are simply a guide to get started.
What’s important here is that, especially in the beginning, you try to do this as often as you can remember, because the situations where we most need to evaluate our bias are likely the ones we wouldn’t even think to question.
Consider leaving a cryptic sticky note on your desk as a visual reminder to yourself if you think you’ll forget.
In order to make sure you do this as often as possible, we want to create a systematic habit that sticks. There are many ways to do this (and plenty of articles on Medium which cover the topic), but here I will mention a few that are particularly useful in this context.
If you’re a project manager who already has a detailed schedule of tasks for your project, explicitly incorporate bias reflection into the early phase of your project. This could start as your own internal reflection, but consider making it a team effort, not just so you can bounce ideas off each other and work around each other’s biases, but because it helps you stay accountable.
Research from the American Society of Training and Development (ASTD) discovered that having at least one other person to keep you accountable for achieving a goal makes you 65% more likely to achieve it. The more the merrier. But you may also choose to start small with a single trusted colleague who shares your values first.
If you don’t have that level of influence or produce mostly independent work, try creating behavior chains. This technique, also known as if-then planning, is a surprisingly simple yet powerful tool. Hundreds of studies, in domains from diet and exercise to negotiation and time management, have shown that deciding in advance when you will take a specific action can double or triple your chances of success.
As an example,
If I am about to implement a new feature in my code,
Then I will pause to reflect on any biases that might affect my implementation decisions.
If you’re working on a facial recognition algorithm, this might mean checking your dataset to ensure it represents a spectrum of ages, gender presentations, skin tones, and even differences in musculature before you start creating a machine learning model.
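That dataset check can itself become part of your if-then routine. As a minimal sketch (the `check_representation` helper, the metadata field names, and the 10% threshold are all hypothetical, not from any specific library), an audit of a dataset’s demographic labels before training might look like:

```python
from collections import Counter

def check_representation(samples, attribute, min_share=0.10):
    """Flag attribute values that fall below a minimum share of the dataset.

    `samples` is a list of metadata dicts; `attribute` is the key to audit
    (e.g. "skin_tone"). The 10% threshold is illustrative only.
    """
    counts = Counter(sample[attribute] for sample in samples)
    total = sum(counts.values())
    # Return each underrepresented value with its actual share.
    return {value: n / total for value, n in counts.items()
            if n / total < min_share}

# Toy face-dataset metadata, heavily skewed toward lighter skin tones.
dataset = (
    [{"skin_tone": "light"}] * 80
    + [{"skin_tone": "medium"}] * 15
    + [{"skin_tone": "dark"}] * 5
)

underrepresented = check_representation(dataset, "skin_tone")
# → {'dark': 0.05}
```

A check this simple won’t catch every imbalance (intersections of attributes matter too), but running it as the “then” in your behavior chain makes the reflection concrete rather than aspirational.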
Lastly, keep track of your progress for 30 days (I offer a workbook that can help you through this process). Habits typically start to set in around the 30-day mark, so make a resolution to focus on this for an entire month and before you know it, it will already be a habit.
Now that you know what you’re dealing with and have strategic ways to implement changes in your own behavior, we have to come to our final realization:
Our work here is never done.
We’ve only scratched the surface of what can truly be achieved here. As we learn more as a society about how bias works, which biases we hold, and how to combat them, we will keep finding new ways to improve.
Take the time to dive into publications from tech ethics organizations like All Tech is Human.
Watch documentary films that expose major ethics issues in tech, such as Coded Bias by Shalini Kantayya, which explores the fallout of MIT Media Lab researcher Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately.
Take time to consider the perspective of others around you, whether it is listening to a colleague’s account of bias against them or quietly gauging the biases of those around you so you can act to make a difference.
You might also consider following people on social media from the groups your implicit biases run against, so you can gain more empathy for their perspectives.
We cannot completely remove our biases, but working towards a system that actively considers them in our everyday work is a crucial first step toward creating a society that can respect and better serve those who are currently falling between the cracks of our consideration.