Amazon’s Facial Recognition Tech Is Racially Biased, Study Says

Amazon’s facial recognition technology, Amazon Rekognition, has been misidentifying women, especially those with darker skin, according to researchers.

CAMBRIDGE, Mass. — Despite having the potential to be a powerful security tool, facial recognition still has some kinks to work out.

Amazon’s facial-detection technology, Amazon Rekognition, has been misidentifying women, especially those with darker skin, according to a team of researchers at MIT and the University of Toronto.

Over the last two years, the service has been marketed to law enforcement as a way to identify objects, people, text, scenes and activities, as well as to detect inappropriate content, according to Amazon.

The company says the facial analysis feature can determine attributes such as age range, facial hair and emotions, among others.

Researchers found that the technology misidentified the gender of female faces and darker-skinned faces in photos, The New York Times reports. The service labeled darker-skinned women as men 31% of the time; lighter-skinned women were misidentified 7% of the time.

When it came to lighter-skinned men, however, the service made zero errors.

Citing this racial bias and discrimination against minorities, privacy and civil rights advocates such as the ACLU have demanded that Amazon stop marketing Rekognition and selling it to police. Some of the company’s investors have also asked Amazon to halt those sales to avoid potential lawsuits.

Using artificial intelligence (AI) for security and surveillance has its advantages, such as identifying faces in crowds or estimating ages. These capabilities could be particularly useful in helping law enforcement catch criminals or find missing children.

The latest debate, however, is whether this crosses the line when it comes to personal privacy, and whether and how Congress should regulate such powerful technologies.

The ACLU says the technology “is primed for abuse in the hands of governments, poses a grave threat to communities already unjustly targeted in the current political climate, and undermines public trust in Amazon.”

The new study, which will be presented at an artificial intelligence and ethics conference, warns of potential abuse and threats to privacy from the facial-detection technology as well.

Matt Wood, the general manager of AI at Amazon, says the study focused only on facial analysis, a technology that can spot features such as mustaches or expressions such as smiles, and not on facial recognition, a technology that can match faces in photos or video stills to identify individuals. Wood says Amazon markets both services.

“It’s not possible to draw a conclusion on the accuracy of facial recognition for any use case — including law enforcement — based on results obtained using facial analysis,” Dr. Wood said in a statement. He added researchers did not test the latest version of Rekognition.
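For readers who want to see the distinction Wood draws, the sketch below is a hypothetical illustration using AWS’s publicly documented Python SDK (boto3): DetectFaces performs facial analysis of attributes like gender and age range, while SearchFacesByImage performs facial recognition against a collection of known faces. The image file and collection name here are placeholders, not anything referenced in the study.

```python
# Illustrative sketch only: contrasting Rekognition's "facial analysis"
# (DetectFaces) with "facial recognition" (SearchFacesByImage).
# Assumes boto3 is installed, AWS credentials are configured, and a face
# collection named "example-collection" already exists (hypothetical).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local image
    image_bytes = f.read()

# Facial analysis: estimates attributes (gender, age range, smile, emotions)
# for each detected face -- the capability the researchers measured.
analysis = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)
for face in analysis["FaceDetails"]:
    print(face["Gender"], face["AgeRange"])

# Facial recognition: matches the face against a pre-built collection of
# known faces to identify an individual -- the capability sold to police.
matches = rekognition.search_faces_by_image(
    CollectionId="example-collection",  # hypothetical collection name
    Image={"Bytes": image_bytes},
    FaceMatchThreshold=90,
    MaxFaces=5,
)
for match in matches["FaceMatches"]:
    print(match["Face"]["FaceId"], match["Similarity"])
```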

According to the study, Microsoft’s facial recognition technology misclassified darker-skinned women as men one in five times, but Amazon seems to be catching most of the criticism.

One reason could be that Amazon has been less willing than other tech companies, such as Microsoft and IBM, to address the concerns voiced about its products.

California Representative Jimmy Gomez has been investigating Amazon’s facial recognition practices and believes the company needs to publicly address these issues.

“I also want to know if law enforcement is using it in ways that violate civil liberties, and what — if any — protections Amazon has built into the technology to protect the rights of our constituents,” he said.

Amazon responded to Gomez by saying all Rekognition customers must follow the company’s policies on civil rights and other laws. However, Amazon does not audit its customers, making it difficult to know for sure how the product is being used.

Wood confirmed that Amazon has updated its technology since the study, re-tested it and found “zero false-positive matches,” according to ABC News.

The outlet also credits Rekognition with helping the Washington County Sheriff’s Office speed up the process of identifying suspects from thousands of photo records and, ultimately, with catching a criminal.


This article first appeared on SSI sister publication Campus Safety.
