
Government Study Finds Facial Recognition Sorely Lacking in Accuracy

The study, conducted by the National Institute of Standards and Technology (“NIST”), was thorough: it evaluated 189 different facial recognition algorithms submitted by 99 companies.

The problem is exacerbated by the fact that the technology returns an inordinate number of false positives targeting minorities.

The study found that, depending on the algorithm used, African American people were up to 100 times more likely to be misidentified than white men.

African American women’s faces were falsely identified most often in the most common kind of search used by police investigators, in which an image is compared against thousands or millions of other images in hopes of a match.

However, the highest false-positive rate of any ethnicity, according to the study, was for Native Americans. Middle-aged men consistently had the highest accuracy rate. That’s also the group that promotes the use of facial recognition software in fighting crime.

Last year, the American Civil Liberties Union (“ACLU”) conducted an experiment using Amazon’s “Rekognition” facial recognition software and found that 28 members of Congress were falsely identified as criminals. The matches were based solely on facial recognition, not on anything the lawmakers had done.

In response, Amazon said the ACLU had used its software the wrong way. Yet when given the chance to participate in the NIST study and prove its software was accurate, Amazon elected to sit out.

Looking ahead, the Department of Homeland Security wants to use facial recognition software to screen all air travelers at international airports. While a few lawmakers want to pause until the error rates can be corrected before implementing this policy, many are pushing ahead at full speed, using the public as guinea pigs. For many innocent people, that means facing the real-world consequences of false alerts from facial recognition software.

Also noteworthy: With such a low accuracy rate, the goal of using facial recognition software to find the person who actually committed a crime becomes even more remote. When law enforcement focuses on the wrong person, the real perpetrator casually walks away.

 
