In 1878, Christopher Sholes was granted the patent on the QWERTY keyboard layout. He'd been granted the patent on the typewriter itself in 1868, ten years earlier. His original typewriter had a problem with jamming mechanical keys, and the QWERTY layout was his solution to broken machines and frustrated typists. The QWERTY keyboard favored left-handed typists (at a time when most technology favored right-handed users) and placed less commonly used letters under a typist's resting fingers. Sholes, thinking very much like a founder who wanted to sell a lot of product, also made sure the word TYPEWRITER could be typed entirely on the top row, so salespeople could easily demo the machine. That one sales decision played a big part in shaping the layout we still type on today.
You are probably reading this article on a computer with the same QWERTY keyboard layout as a typewriter from 1878, as is everyone who owns one of the ~270M computers sold last year. Other layouts exist, but QWERTY has become the default.
Joy Buolamwini finally woke up to the problem of accepting defaults when generic facial recognition software, running on video captured from her webcam, could not detect her face. The same software detected her roommate's face without trouble. Joy had experienced the same issue playing peek-a-boo with social robots in her undergraduate days at MIT, but she'd let it slide back then as someone else's problem to fix.
On a trip to Singapore a few years after the undergrad incident, another social robot, this one built by a Singaporean startup, again failed to detect Joy's face while detecting the faces of her fellow travelers. Thousands of miles from home, Joy realized that the Singaporean startup had built its robot on the same generic facial recognition software she'd used at MIT. The problem was the training data: Black faces were missing from the data set used to train the machine learning software, so the resulting model simply didn't recognize faces like hers. Joy had found algorithmic bias: a flaw embedded in a default code base that was being used to build facial recognition software around the world.
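To make the idea concrete, here is a minimal sketch of the kind of audit that surfaces this bias: run a detector over a labeled evaluation set and compare detection rates across groups. Everything here is hypothetical, the `detect_face` stub, the file names, and the group labels stand in for a real detector and a real, balanced data set; this is an illustration of the auditing idea, not AJL's actual methodology.

```python
from collections import defaultdict

def detect_face(image_path: str) -> bool:
    """Hypothetical stand-in for any off-the-shelf face detector
    (e.g. an OpenCV Haar cascade or a cloud vision API). It always
    returns True so the sketch runs as-is; swap in a real detector
    to audit real software."""
    return True

# Hypothetical evaluation set: (image, skin-tone group) pairs,
# ideally balanced across groups.
eval_set = [
    ("img_001.jpg", "lighter-skinned"),
    ("img_002.jpg", "darker-skinned"),
    # ... many more labeled images
]

detected = defaultdict(int)
total = defaultdict(int)

for image_path, group in eval_set:
    total[group] += 1
    if detect_face(image_path):
        detected[group] += 1

for group, n in total.items():
    print(f"{group}: {detected[group] / n:.1%} detection rate")

# A large gap between groups is the algorithmic bias Joy observed:
# the default model works well only for faces like those it was trained on.
```

If the detector was trained mostly on lighter-skinned faces, the gap between those two printed rates makes the bias visible and measurable, which is the first step to fixing the default.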
In Originals, Adam Grant highlights a study by Michael Housman showing that customer service and call center employees who changed the default browser (Internet Explorer) on their work computers to Chrome or Firefox performed better than peers who didn't. The browser itself wasn't the cause; the point is that people who reject a default and take the deliberate step of installing something better tend to show the same initiative elsewhere in their lives. If they can improve things for themselves by questioning one default, might they improve things for all of us by questioning the other defaults we live with?
There are defaults all around us, guiding and molding our lives without our realizing it. Embedded in them are the biases and quirks of the people who set up those structures. Some we are starting to fix, like the ones above; some we've only just begun to question. We live in a time when we can use the technology available to us to change many of the defaults we now take for granted.
Thankfully, Joy is doing the work of removing the biases inherent in the defaults of machine learning software. She's launched the Algorithmic Justice League (AJL), and we can all sign up to help reconfigure the defaults. As facial recognition software becomes a bigger part of policing and of our societal decision-making structures, we all need to work with organizations like the AJL to ensure the defaults do not disenfranchise a whole swathe of society.
It’s time for us to look around. What defaults have you discovered that need to be changed?
Please like, tweet, share, and heart the article if you enjoyed reading it. Sign up for my Polymathic Monthly Newsletter here; you'll love it. Also, check out HarperJacobs for compelling content creation for your company.