A man wearing eyeglasses with green code reflecting on his face.

The Coded Bias in Facial Recognition Technology

Facial recognition software has been struggling to save face for a while. So it wasn’t a good look when, this week – in the midst of the protests, no less – the ACLU accused the Detroit Police Department of what it’s calling the first known wrongful arrest involving facial-recognition technology.

Robert Williams was arrested in his driveway and detained for 30 hours under suspicion of theft. Images of the suspect, stealing from a watch store in downtown Detroit, were run through facial recognition software, and Robert Williams was a match. It wasn’t until officers interrogated him that they realized his face didn’t match the pictures – at all (CNN Business).


Sign the petition banning law enforcement from using facial recognition. (The bill referenced on the petition is this one, which I discuss at the end of this email).

Learn how facial recognition is being used in your local community. Here is a map with countless examples across the U.S.

It might not be surprising to know that Robert Williams is Black. If you’ve been following the facial recognition conversation over the past few years, you might have guessed from the beginning. Because there have been dozens of studies that show that facial recognition software can be disproportionately inaccurate when it tries to identify Black people and other people of color.

Joy Buolamwini, a researcher at the M.I.T. Media Lab and founder of the Algorithmic Justice League, published one of the first comprehensive studies on facial recognition bias in 2018 after her firsthand experience (more via the NYTimes). The study found that software was far more likely to misidentify the gender of Black women than of white men. Her work, including her popular TED Talk, paved the way for a larger discussion of the flaws of facial recognition.

Facial recognition is increasingly penetrating our lives, but there is still time to prevent it from worsening social inequalities. To do that, we must face the coded gaze.

Joy Buolamwini in her op-ed for the NYTimes

More reports were quick to follow, including one from the National Institute of Standards and Technology that found that African American people were 10 to 100 times more likely to be misidentified than Caucasians, and that Native Americans had the highest error rates of all (full study on the NIST website). It also found that women were more likely to be misidentified than men, and that the elderly and children were more likely to be misidentified than those in other age groups. Middle-aged white men generally benefited from the highest accuracy rates (Washington Post). Another study, by the University of Colorado Boulder, found that facial analysis services are also “universally unable to classify non-binary genders” (EurekAlert).

A main reason for these discrepancies is that facial recognition software can only be as smart as the data that feeds it. And most facial recognition training data sets are estimated to be more than 75% male and more than 80% white (Brookings). Unsurprisingly, the lack of diversity in tech also means that there are few women or people of color that are on the teams building this software, and increasing representation is likely to create a more equitable product (USA Today).

Have you ever tried unlocking your iPhone while wearing a face mask, only to have it fail? That type of facial recognition error is a slight annoyance. But consider its application in policing, especially given the systemic racism that persists even without technology. And then consider that other algorithms used in criminal justice are also biased, like this one that tries to assess the likelihood of future crimes (ProPublica). I don’t think we need another way to discriminate against those systemically marginalized. More on the dangers of its use in policing in this article in The Week.

And its applications extend beyond just dangerous policing to nearly everything we do. It’s being used to monitor fans at concerts, authorize us at the airport, and even as security in schools. It’s also not just a tool, but a weapon: the stories of the Chinese government using advanced facial recognition technology to track and control the Uighurs, a Muslim minority, are bone-chilling (NYTimes).

Even if you haven’t seen the news around facial recognition software, it’s likely seen you: over half of all Americans are in a law enforcement face recognition network (Georgetown Law). So the next time the police run a grainy photo of a robbery suspect, they could arrest you in that person’s place.

The Facial Recognition and Biometric Technology Moratorium Act of 2020, a federal bill announced yesterday, is a step to introduce federal regulation to ensure the safety of everyone, particularly those systemically marginalized, even going so far as divesting funds from law enforcement that uses it inappropriately (The Verge). 

Facial-recognition technology doesn’t just pose a grave threat to our privacy, it physically endangers Black Americans and other minority populations in our country. As we work to dismantle the systematic racism that permeates every part of our society, we can’t ignore the harms that these technologies present.

Massachusetts Sen. Edward Markey via Fortune Magazine

Nicole Cardoza

Nicole is an entrepreneur, author, investor, speaker and magician passionate about reclaiming our right to be well.
