Facial Recognition Tech has a Worryingly High Failure Rate in People of Color

Written by mosesconcha | Published 2023/01/26
Tech Story Tags: technology | facial-recognition-tech | biometric-technology | for-non-tech-people | algorithmic-bias | hackernoon-tech-news | facial-recognition | hackernoon-top-story

TLDR: The recent use of facial recognition systems in Louisiana resulted in the mistaken-identity arrest of Randall Reid, a black man from Georgia – yet another in an [ongoing trend](https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/) of major misidentifications of people of color by facial recognition technology.

The recent use of facial recognition systems in Louisiana has resulted in the mistaken-identity arrest of Randall Reid, a black man from Georgia. Local authorities used facial recognition technology to charge him with stealing purses from a store in a New Orleans suburb – a crime committed in an entirely separate state that Reid claims never to have visited.

Reid is yet another in an ongoing trend of similar major misidentifications of fellow people of color by facial recognition technology (FRT) within the past few years.

After police in Woodbridge, New Jersey ran a fake ID belonging to a suspected shoplifter through FRTs in early 2019, Nijeer Parks, who lived and worked 30 miles away in Paterson, N.J., served 10 days in jail and spent thousands of dollars defending himself against a crime he was not involved in. Proof that he was sending money at a Western Union at the time of the incident ultimately got him off the hook.

https://www.youtube.com/watch?v=nGStQVeCYuw&embedable=true

Michael Oliver was wrongfully accused in May 2019 of attempting to destroy a teacher’s phone while being recorded. Based on the video evidence captured by the teacher, Detroit Police used FRTs to link Oliver to the crime, though obvious physical differences noted by his attorney – such as his forearm tattoos and lighter skin tone – ultimately absolved him of any wrongdoing.

In January 2020, Robert Williams spent more than a day in jail after allegedly being caught on video stealing nearly $4,000 worth of luxury watches from a Shinola store in Detroit. His charges were dropped two months later, after new evidence revealed he had been singing on Instagram Live 50 miles away at the time of the crime.

These cases rank among the United States’ most significant misidentifications of people of color within the past five years. They are a direct reflection of the state of facial recognition technologies and of their limited ability to effectively discern and differentiate individuals of color.

Recognizing The Problems

Facial recognition technology thrives, or falters, on the vital biometric data – photos of various faces and other physical characteristics – it is given to assess. The set of data the system receives is what ultimately determines the overall effectiveness of the system as a whole.

As a result, these systems cannot recognize faces belonging to those of a specific race if the datasets used to support and train them contain minimal data on the race in question.

Yashar Behzadi, CEO and Founder of Synthesis AI, says, “Certain demographics are often underrepresented in these datasets, whether they were collected from images on the internet or other conventional means. The result is that the training data used to power the AI becomes imbalanced, resulting in model bias.”

In other words, the less biometric data there is on people of color, the less likely facial recognition technology is to successfully identify people of color.
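That relationship is easy to see in miniature. The sketch below uses entirely hypothetical counts – not figures from any real FRT dataset – to show how a skewed collection of training images translates into a skewed share of the data each group gets:

```python
from collections import Counter

def group_shares(labels):
    """Return each demographic group's share of a training set."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Hypothetical training-set composition mirroring the imbalance
# described above -- not counts drawn from any real dataset.
training_labels = ["lighter"] * 800 + ["darker"] * 200
print(group_shares(training_labels))  # {'lighter': 0.8, 'darker': 0.2}
```

A model trained on this collection sees four times as many lighter-skinned faces, so its learned features are tuned to that majority.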

Until recently, FRTs have been “primarily developed and tested on datasets that had a majority of fair-skin individuals,” according to Tatevik Baghdasaryan, Content Marketer at SuperAnnotate. This greatly limits their scope of analysis, yielding far more errors when attempting to identify people of color compared to their well-recorded, fairer-skinned counterparts.

https://www.youtube.com/watch?v=TWWsW1w-BVo&t=298s&embedable=true

“As a result, the algorithms used in facial recognition technology perform worse on people with darker skin tones and specific facial features such as broader noses and fuller lips,” says Baghdasaryan. “This leads to higher rates of false positives and false negatives.”
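Those two error types are exactly what an audit of an FRT system would measure per group. The function below is a minimal illustration assuming a made-up audit-log format of (group, actually a match, system said match) records – it is not output from, or an interface to, any real FRT system:

```python
def error_rates(records):
    """Per-group false-positive and false-negative rates.

    Each record is (group, actual_match, predicted_match) -- a
    hypothetical audit-log format used only for illustration.
    """
    stats = {}
    for group, actual, predicted in records:
        s = stats.setdefault(group, {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
        if actual:
            s["pos"] += 1
            if not predicted:
                s["fn"] += 1  # false negative: a real match was missed
        else:
            s["neg"] += 1
            if predicted:
                s["fp"] += 1  # false positive: the wrong person was flagged
    return {
        group: {
            "fpr": s["fp"] / s["neg"] if s["neg"] else 0.0,
            "fnr": s["fn"] / s["pos"] if s["pos"] else 0.0,
        }
        for group, s in stats.items()
    }
```

Comparing `fpr` across groups is how disparities like the ones in the arrests above show up as numbers: a higher false-positive rate for one group means more innocent people from that group get flagged.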

For instance, a landmark 2018 study by Joy Buolamwini and Timnit Gebru found that algorithms responsible for analyzing key facial features in FRTs misidentified black women more than 33% of the time.

FRTs Around the World

Facial recognition technology has become ubiquitous in the tech world and is now used by nearly 100 countries across the globe.

Singapore, known for its Smart Nation initiative, is no stranger to emerging technologies and has been at the bleeding edge of technological innovation for the past few decades.

In late 2020, Smart Nation added a facial recognition feature to SingPass, the country’s primary personal authentication system where users can access various government services online. Since then, Singapore has also installed self-service kiosks throughout the country that employ FRTs in an effort to make access to public services as convenient and seamless as possible.

However, while the use of facial recognition technologies has become widely accepted, a handful of countries still limit their use or, in some cases, refuse them outright. Belgium and Luxembourg fall into the latter category, having opted to ban FRTs entirely, with other European countries starting to follow suit.

Argentina serves as a unique example: a country that initially adopted the technology with open arms, then changed its stance after a series of misidentifications led to the wrongful detainment of several people.

What Can Be Done?

By now, it is clear that facial recognition technology’s biggest issues stem from the quality and type of data its systems receive.

If the system’s data does not represent a diverse body of demographics – for example, if it only includes data on those with lighter skin – or if the images the system assesses are of poor quality – blurry, dimly lit, taken from non-optimal angles, etc. – errors like false positives in people of color become far more likely to occur.

Thus, the simplest solution to this long-standing problem with FRTs is to incorporate higher volumes of data that represent those with a variety of skin tones and facial features.
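One common rebalancing technique, sketched below, is to oversample underrepresented groups until every group matches the size of the largest one. This is a generic illustration of oversampling on a made-up labeled dataset, not a procedure drawn from any particular FRT vendor – and in practice teams also collect genuinely new images rather than only duplicating existing ones:

```python
import random

def rebalance(dataset, key, seed=0):
    """Oversample underrepresented groups until every group is the
    same size as the largest one. `key` maps an item to its group."""
    rng = random.Random(seed)  # fixed seed keeps the result reproducible
    groups = {}
    for item in dataset:
        groups.setdefault(key(item), []).append(item)
    target = max(len(items) for items in groups.values())
    balanced = []
    for items in groups.values():
        balanced.extend(items)
        # Duplicate randomly chosen items to close the gap to `target`.
        balanced.extend(rng.choices(items, k=target - len(items)))
    return balanced

# Hypothetical, deliberately imbalanced "dataset" of (skin_tone, image_id) pairs.
faces = [("lighter", i) for i in range(8)] + [("darker", i) for i in range(2)]
balanced = rebalance(faces, key=lambda face: face[0])
print(len(balanced))  # 16: both groups now contribute 8 items
```
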

If we, as a people, must trust and rely on this technology to aid us in matters of equal and fairly dispensed justice, the least we can do is learn more about facial recognition’s core issues and how they affect its ability to properly identify people of color.


Written by mosesconcha | Journalist, copywriter and passionate storyteller with a lifelong love of video games.
Published by HackerNoon on 2023/01/26