Apple Card "Sexism:" A Real Technical Blunder, or Dirty Marketing?

Written by FrederikBussler | Published 2019/11/12
Tech Story Tags: ai | algorithms | bias | apple-card | apple | apple-card-sexism | hackernoon-top-story | sexism | web-monetization

TLDR: Apple Card's creditworthiness algorithm appears to discriminate against women. The question is not whether it's biased, but why, and why the bias wasn't caught when Apple's engineers were testing the algorithm. Some would suggest that Apple intentionally released a biased algorithm, but Amazon previously deployed a biased hiring algorithm and Google shipped biased facial recognition software, and no one would claim those were good things for the companies. The Apple Card is now under investigation, so we'll find the answer in the coming months. What really happened is up to the regulators to decide.

Unless you've been living under a rock, you've probably heard all about #applecard. It's Apple's latest innovation, bringing the simplicity and design of its traditional products to the credit card space. But in the last few days, a shitstorm erupted on Twitter over apparent discrimination by the Apple Card's creditworthiness algorithm.
It started with a few viral tweets, notably one by David Heinemeier Hansson, a Danish programmer, who noticed that the algorithm behind the Apple Card offered his wife a far lower credit limit than his own, despite her better credit score. Apple co-founder Steve Wozniak jumped on board and retweeted Hansson's tweet, noting that something similar had happened to him.
The Apple Card seems to discriminate against women, so the question is not if, but why. Why was this bias not caught when Apple's engineers were testing the algorithm? Or was the bias actually caught, and if so, why did they deploy the algorithm anyway?
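Catching this class of bias before launch is, in principle, straightforward: audit the model's decisions across protected groups before shipping. Below is a minimal sketch of such a check in Python. This is not Apple's actual pipeline; the data, group labels, and the disparate_impact_ratio helper are all hypothetical, illustrating the widely used "four-fifths rule" for flagging disparate impact.

```python
# Minimal sketch of a pre-deployment fairness audit (hypothetical data,
# not Apple's actual process): compare positive-outcome rates across a
# protected attribute and flag disparate impact.

def disparate_impact_ratio(decisions, groups, positive=True):
    """Ratio of positive-outcome rates between groups.
    Under the four-fifths rule, a ratio below 0.8 is a common red flag."""
    rates = {}
    for g in set(groups):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(1 for d in outcomes if d == positive) / len(outcomes)
    return min(rates.values()) / max(rates.values()), rates

# Hypothetical model outputs: True = approved for a high credit limit.
decisions = [True, True, False, True, True, False, False, True,
             False, False, True, False, False, False, True, False]
groups    = ["m", "m", "m", "m", "m", "m", "m", "m",
             "f", "f", "f", "f", "f", "f", "f", "f"]

ratio, rates = disparate_impact_ratio(decisions, groups)
print(f"approval rates by group: {rates}")
print(f"disparate impact ratio:  {ratio:.2f}")
if ratio < 0.8:
    print("WARNING: potential disparate impact -- investigate before shipping")
```

Note that a check like this only works if the protected attribute is retained for auditing: dropping gender from the model's inputs does not remove it from the data, since correlated features can act as proxies for it.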
Given that biased algorithms have made big waves in the media before, some would suggest that Apple intentionally released a biased algorithm: they knew it would bring massive publicity to the Apple Card, and they could then quickly release a fix and shift the focus to equality for women, garnering greater support. What really happened is up to the regulators to decide, but there's no doubt the story is drumming up massive awareness of the Apple Card.
Of course, intentionally deploying a biased algorithm as a PR tactic would be incredibly dangerous if anyone found out. Such malice would be out of character for Apple, which makes ignorance the more likely explanation. Amazon previously deployed a biased hiring algorithm and Google shipped biased facial recognition software, and no one would claim those were good things for the companies. This reminds me of Hanlon's Razor:
"Never attribute to malice that which can be adequately explained by stupidity."
But even that is unlikely. Apple's engineers are highly trained and must have been aware of the risks of bias in algorithms, and yet they failed. The Apple Card is now under investigation by New York's Department of Financial Services, so we'll find the answer in the coming months.
