
Groups Challenge Face Recognition Algorithm in Florida Case

by Kevin Bliss

The ACLU, the Electronic Frontier Foundation, the Georgetown Center on Privacy and Technology, and the Innocence Project have teamed up to battle the Florida court system over the constitutional right to confront a defendant’s accuser when that accuser is a computer algorithm. 

Willie Lynch, a black Florida resident, was tried and convicted in 2015 of selling crack cocaine to two undercover officers based solely on identification evidence from a facial-recognition program. Instead of immediately arresting the man who sold them the drugs, one officer tried to discreetly photograph him with an old cellphone held to his ear, as if he were having a conversation. The result: several poor-quality photos of a poorly framed subject. The officers could not identify the person in the photos on their own, so they sent them to a crime analyst, who ran them through the Face Analysis Comparison Examination System to compare them against the county’s mugshot database. The program returned several possible suspects, with Lynch rated a “one star,” or low-confidence, match.

During trial, Lynch’s attorney argued that Lynch was not the suspect and that the algorithm’s match was incorrect. He cross-examined the crime analyst about her understanding of how the program operated. She admitted that she had no idea how the algorithm or its star-rating system assessed matches. Yet the officers acted on her suggestive recommendation and arrested Lynch.

Lynch’s attorney also asked for the photos of the program’s other possible matches. The prosecution refused to supply this potentially exculpatory evidence to the defense, and the courts upheld that decision. Even the appeals court held that the failure to disclose was within the bounds of the law.

These advocacy organizations feared that this abuse of power flouted the U.S. Supreme Court decision requiring the prosecution to disclose any favorable evidence in its possession, so they filed a brief in support of Lynch with the Florida Supreme Court.

In the filing, known as an amicus curiae or “friend of the court” brief, the groups point out that the algorithms are inaccurate and biased. They state that the systems have been tested and shown to be less accurate when the photos differ in lighting, background, and expression, or when used to match black people, women, and young people. “This is partly because the data sets used to train the algorithms have historically been composed of face images that are not representative of the full diversity of the population,” stated an ACLU blog on the subject.

When Amazon’s Rekognition program was tested with photos of members of Congress, it incorrectly matched 28 of them to mugshots in its database, and people of color were almost four times as likely to be incorrectly matched.

Lynch’s is the first Florida case to challenge this technology, and the ACLU and the other advocacy organizations hope to win for Lynch, and others like him, a chance at a fair trial with all the evidence available to them for their defense.

Source: aclu.org
