
Algorithms aren’t racist. Your skin is just too dark.

by Joy Buolamwini, May 29th, 2017


Lately, I have been in the press discussing the need for more inclusive artificial intelligence and more representative data sets. The associated headlines and comments frame the conversation in a number of ways that at times overshadow the main message:


The rise of artificial intelligence necessitates careful attention to inadvertent bias that can perpetuate discriminatory practices and exclusionary experiences for people of all shades.


Still, commenters have raised two valid points that warrant further discussion:


1.) Default camera settings do not properly expose dark skin.

2.) Algorithms are not intentionally developed to be racist.


To address these questions and others related to bias in artificial intelligence, I am starting the Illuminate Series. The goal of the Illuminate Series is to broaden the public discourse on artificial intelligence and its impact on everyday people. Please let me know if you would like to participate by submitting a question, guest posting, leading a discussion group, creating educational demos, or in some other capacity.

ILLUMINATE PART 1 — Cameras, Contrast, and The Coded Gaze.

Since sharing my TED Talk demonstrating facial detection failure, I have been asked variations of this hushed question:


Isn’t the reason your face was not detected due to a lack of contrast given your dark complexion?


This is an important question. In the field of computer vision, poor illumination is a major challenge. Ideally, you want to create systems that are illumination invariant and can work well in many lighting conditions. This is where training data can come in. One way to deal with the challenges of illumination is to train a facial detection system on a diverse set of images captured under a variety of lighting conditions.
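
To make that idea concrete, here is a minimal sketch, assuming a PyTorch/torchvision training pipeline rather than any particular production system; the transform names are real torchvision APIs, but the jitter strengths and the FaceImageDataset class are illustrative assumptions.

```python
# A minimal sketch, assuming a PyTorch/torchvision training pipeline; the
# jitter strengths are illustrative, not tuned values.
import torchvision.transforms as T

# Randomly vary brightness and contrast so the detector sees faces across
# a wide range of exposures during training.
lighting_augmentation = T.Compose([
    T.ColorJitter(brightness=0.6, contrast=0.6),
    T.RandomHorizontalFlip(),
    T.ToTensor(),
])

# Hypothetical usage: pass the transform to whatever dataset class feeds
# the face detector, e.g.
# dataset = FaceImageDataset(image_paths, transform=lighting_augmentation)
```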


There are of course certain instances where we reach the limits of the visible light spectrum. (Infrared detection systems also exist.) My focus here is not on the extreme case as much as the everyday case. The demo in the TED talk shows a real-world office environment. My face is visible to a human eye, as is the face of my demonstration partner, but the human eye and the visual cortex that processes its input are far more advanced than a humble web camera. Still, even using the web camera, you can see in the demo that my partner’s face is not so overexposed as to be inscrutable, nor is my face so underexposed that there is significant information loss.


Though not as pertinent to my demo example, there are extreme failure cases where there is overexposure or underexposure and of course the case of no light at all in the image. Cameras, however, are not as neutral as they may seem.
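
For readers who want a rough way to quantify “overexposed” and “underexposed,” here is an illustrative check of my own (not part of the TED demo) that measures how much of a grayscale face crop is clipped toward pure black or pure white; the thresholds and the file name are assumptions.

```python
# An illustrative exposure check, assuming NumPy and Pillow are available.
import numpy as np
from PIL import Image

def exposure_report(path, low=10, high=245):
    """Report mean luminance and the fraction of pixels clipped near black/white."""
    gray = np.asarray(Image.open(path).convert("L"))  # 8-bit grayscale array
    total = gray.size
    return {
        "mean_luminance": float(gray.mean()),                     # 0 = black, 255 = white
        "fraction_near_black": float((gray < low).sum() / total),
        "fraction_near_white": float((gray > high).sum() / total),
    }

# Hypothetical usage on a cropped face image:
# print(exposure_report("face_crop.jpg"))
```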

Are Cameras Objective?

The default settings for digital cameras are influenced by the history of color film, which was itself optimized for lighter skin. This Vox video provides an informative overview of how camera technology evolved to privilege lighter skin: https://www.vox.com/2015/9/18/9348821/photography-race-bias

Defaults are not neutral


I was amused to learn that complaints from chocolate and furniture companies who wanted their products better represented led to improved representation of darker tones in the earlier days of photography. In the digital era, Philips developed the LDK series of cameras, which explicitly handled skin tone variation with two chips: one for processing darker tones and another for processing lighter tones. The Oprah Winfrey Show used the LDK series for filming because there was an awareness of the need to better expose darker skin.


With inclusion in mind, we can make better sensor technology as well as better training data and algorithms.


We have to keep in mind that default settings are not neutral. They reflect the Coded Gaze, the preferences of those who have the opportunity to develop technology. Sometimes these preferences can be exclusionary.

Exclusion Overhead

More than a few observers have recommended that instead of pointing out failures, I should simply make sure I use additional lighting. Silence is not the answer. The suggestion to get more lights to increase illumination in an already lit room is a stopgap solution. Suggesting people with dark skin keep extra lights around to better illuminate themselves misses the point.


Should we change ourselves to fit technology or make technology that fits us?


Who has to take extra steps to make technology work? Who are the default settings optimized for?


One of the goals of the Algorithmic Justice League is to highlight problems with artificial intelligence so we can start working on solutions. We provide actionable critique while working on research to make more inclusive artificial intelligence. By speaking up about my experiences, I have encouraged others to share their stories. The silence is broken. More people are aware that we can embed bias in machines. This is only the beginning as we start to collect more reports.


One bias-in-the-wild report we received at AJL shares the following story.

“A friend of mine works for a large tech company and was having issues being recognized by the teleconference system that uses facial recognition.

While the company has some units that work on her dark skin, she has to make sure to reserve those rooms specifically if she will be presenting. This limits her ability to present and share information to times when these rooms are available.”

This employee is dealing with the exclusion overhead that can result when we do not think through or test technology to account for our differences.

Full Spectrum Inclusion Mindset

Questioning how different skin tones are rendered by cameras can help us think through ways to improve computer vision. Let’s also keep in mind that there is more at stake than facial analysis technology. The work of AJL isn’t just about facial detection or demographic bias in computer vision. I use the mask example as a visible demonstration of how automated systems can unintentionally lead to exclusionary experiences.


I know we can do better as technologists once we see where problems arise and commit to addressing the issues. We can also be better about gaining feedback by asking for and listening to people’s real-world experiences with the technology we create. We can build better systems that account for the variety of humanity. Let’s not be afraid to illuminate our failures. We can acknowledge physical and structural challenges, interrogate our assumptions about defaults, and move forward to create more inclusive technology.

Action Step: Share Your Story

Your voice matters. Speaking up for inclusion is the first step towards change. It took me five years to muster up the courage to share my truth. Even though I was afraid, I am glad I did. The response has been overwhelming. I have witnessed so many people who want to be part of creating a better future. My inbox is overflowing with messages from others who thought they were alone until they heard my story. If you have a story to share please consider submitting it to AJL.


Illuminate bias and fight the Coded Gaze here at http://fight.ajlunited.org


What question should the Illuminate Series address next? Share your suggestions in the comments below.

Joy Buolamwini is a poet of code on a mission to show compassion through computation. She writes about incoding and creates learning experiences to develop social impact technology. Find her on Twitter @jovialjoy and @ajlunited, or connect on LinkedIn or Facebook.