Network Theory Predicts Bias

Written by Thematizer | Published 2017/11/08
Tech Story Tags: women-in-tech | probability | feminism | female-founders | mathematics

The real reason for gender disparities in tech

This week while buying fall boots at a local vintage store in my neighborhood, I happened to mention to the shop owner that I worked in tech. She immediately asked me if I had personally experienced sexism.

What does that tell you? Nothing good about our industry.

The truth is that most male programmers I know try hard not to be sexist. But they have not lived through the same kinds of experiences women have, and sometimes don’t quite believe those experiences are real.

At the end of August, I started looking for ways to predict and explain the effects of bias using a model based on statistics and probability theory. I was surprised by the starkness of the results.

I am working with Joe Benson, Ph.D., Professor of Mathematics at Macalester College, to publish in the academic press. However, given the level of polarization that former Google employee James Damore has brought to discourse on this topic, we felt it would be worthwhile to share our preliminary findings sooner rather than later.

The following statistical model explores what happens when all members of a group are equally qualified, but certain members “vote out” others based on hidden biases. Bias does not have to be conscious, and everybody harbors at least some bias, pro or con. It is clear to me that not all people are equally biased. Moreover, scientific evidence like this Yale study shows that bias is real and that individuals often act on it.

Our model looks at the consequences of those actions.

It assumes that 20% of the people in an organization exceed some threshold of bias, and that 40% of the organization belongs to a disadvantaged group (in this example, women).

Agreeable Alice starts her first job. She attends her first meeting. Somebody who is influential (Biased Brian) says she is “not technical” and “doesn’t seem confident.” Biased Brian is not her direct supervisor, but his opinion carries great weight. She gets passed over for the opportunity to work on a more challenging project.

Meanwhile, Blameless Bob presents at the next meeting. Biased Brian has his back. Biased Brian says only good things about Bob. Bob gets the opportunity to work on the more challenging, high-visibility project. Because Bob, like Alice, is highly qualified, he quickly moves through the organization’s ranks.

Alice, meanwhile, is stuck in a role with little opportunity for advancement or personal growth. She leaves the organization. If Alice had gotten positive feedback from somebody other than Biased Brian, she would have been fine. But because Biased Brian spoke up first, she never got that opportunity.

After just two iterations, 40% of women in the organization have disappeared.

Why such a steep decline? The explanation is simple and intuitive.

At each iteration, unbiased individuals leave the voting process, while those with bias remain entrenched. Groups quickly polarize, and the “unbiased middle” drops out of the picture.
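
For readers who want to experiment with this dynamic, here is a minimal Python sketch. To be clear, this is not our working model, just a toy interpretation of the assumptions above: 20% of people exceed the bias threshold, 40% belong to the disadvantaged group, and each round a member’s first feedback comes from one randomly chosen colleague, who “votes out” a disadvantaged member if that colleague is biased. The rule, the parameter names, and the population size are illustrative assumptions, not our published results.

```python
import random

random.seed(42)  # reproducible illustration

# Assumptions stated in the article: 20% of people exceed the bias
# threshold, and 40% of the organization is in the disadvantaged group.
P_BIASED = 0.20
P_DISADVANTAGED = 0.40
N = 10_000   # organization size (illustrative)
ROUNDS = 2   # the article traces two iterations

# Each person is a pair: (in disadvantaged group?, exceeds bias threshold?).
# The two attributes are drawn independently.
people = [
    (random.random() < P_DISADVANTAGED, random.random() < P_BIASED)
    for _ in range(N)
]
initial = sum(1 for in_group, _ in people if in_group)

for round_no in range(1, ROUNDS + 1):
    survivors = []
    for in_group, is_biased in people:
        # Toy rule: each member's first feedback comes from one randomly
        # chosen colleague; a biased colleague "votes out" a member of
        # the disadvantaged group, who then leaves the organization.
        _, reviewer_is_biased = random.choice(people)
        if not (reviewer_is_biased and in_group):
            survivors.append((in_group, is_biased))
    people = survivors
    remaining = sum(1 for in_group, _ in people if in_group)
    print(f"Round {round_no}: {remaining / initial:.0%} of the "
          f"disadvantaged group remains")
```

Because bias and group membership are drawn independently here, the biased share of the reviewer pool stays roughly constant, so attrition compounds at about 20% per round. Modeling the “unbiased middle” dropping out of the voting process, as described above, would make the biased share grow each round and steepen the decline.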

The model can be used to explain behavior in settings as diverse as the Google peer review process or upvotes and retweets on social media services like Reddit and Twitter. Our first draft did not even look at gender as a factor, only at the interplay between more biased and less biased individuals. Our goal is to provide a model that explains how the majority of people within a system can be relatively unbiased, yet the effects of bias remain prevalent. Our source files and poster are available upon request.

Excerpt from poster depicting preliminary models.

If programmers have a single redeeming character trait, it is not being smart. Big whoop! Plenty of people are smart, in plenty of other professions.

Rather, their best trait is knowing how to handle criticism. When Neha Narula of MIT Media Lab pointed out a security flaw in the IOTA hash function, the response of the IOTA team was not to trash her or attack her personally. Rather, IOTA explained how it had fixed the problem and what it would do to improve security in the future.

Whether you are reporting a bug or disclosing a serious vulnerability, the response of an experienced coder is not to dig in and become more defensive. Rather, it is to listen, learn, and try to fix the problem.

Why do I choose to share a model and not my own personal experience?

Because if programmers have a universal language, it is not C or JavaScript or Python.

It is math.

I floated an early draft of this model on a hacker forum I frequent. I was relieved that most of the participants immediately grasped how permutations of selection bias in a decentralized, non-hierarchical network would produce the type of results described.

The more disturbing response I got was, “Why go to all that trouble? Isn’t it easier to just assume that women aren’t as good as men at computer science?”

The answer is that I love the tech community, in particular the world of open source. I love its sense of humor, its quirkiness, the in-jokes, and above all, the passion and enthusiasm of the men and women who are part of it. In my role as CEO of a tech startup, I found that the people we chose for our teams were as important as the technologies we employed. With that said, this work wasn’t done to convince white male programmers that women are their intellectual peers. They’ll either get that, or they won’t.

I spent time on this question because people tend to become what they believe.

I wanted to reach young women who might unfortunately come to believe the narrative that they have no place in STEM. Very specifically, I wanted to reach young women who might, in a year or two, be deciding whether or not to take a computer science elective in high school, and who might worry that it would be too hard and hurt their chances of getting into college.

My advice is to take that course.

Girls, we’ve got this one.

Tess Gadwa is founder and former CEO of Yesexactly.com. She is also responsible for creating Zappen, the first fully functional, open source cross-platform mobile visual search solution, licensed under the LGPL 3.0. Her current projects include Beerious? and ROSECODE: The Interactive Cyberpunk Thriller. Follow her on Twitter at twitter.com/thematizer.

Licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

