by Christopher Zoukis
A new report from the Electronic Frontier Foundation (“EFF”) has revealed some disturbing facts about facial recognition systems, which are becoming increasingly popular investigative tools for law enforcement.
According to the February 2018 report, one in every two American adults is already in a law enforcement facial recognition database. And the systems are known to misidentify African Americans, other ethnic minorities, young people, and women more often than older white men.
Facial recognition systems use computer algorithms to match a person’s face from a photo or video to an image stored in a database. According to the EFF, law enforcement has already used the technology at political protests and may soon combine facial recognition software with body cameras. There are few checks on the use of this powerful surveillance technology, and that has privacy advocates concerned.
“People should not have to worry that they may be falsely accused of a crime because an algorithm mistakenly matched their photo to a suspect,” said EFF senior staff attorney Jennifer Lynch, who authored the report. “They shouldn’t have to worry that their data will end up in the hands of identity thieves because face recognition databases were breached. They shouldn’t have to fear that their every move will be tracked if face recognition is linked to the networks of surveillance cameras that blanket many cities. Without meaningful legal protections, this is where we may be headed.”
What’s more, facial recognition systems amplify the racial bias entrenched in police practices. Minorities are already disproportionately represented in criminal databases. Add to that a facial recognition algorithm that misidentifies minorities more frequently than white men, and the result is a recipe for racial discrimination. Lynch said that law enforcement officials haven’t done enough to ameliorate this problem.
“The FBI, which has access to at least 400 million images and is the central source for facial recognition identification for federal, state, and local law enforcement agencies, has failed to address the problem of false positives and inaccurate results,” said Lynch. “It has conducted few tests to ensure accuracy and has done nothing to ensure its external partners—federal and state agencies—are not using face recognition in ways that allow innocent people to be identified as criminal suspects.”