
More Facial Recognition Failures

by Michael Dean Thompson

Misuses of Facial Recognition Technology (“FRT”) continue to pop up in the media. In August 2023, news broke that police had gone to the home of Porcha Woodruff and arrested her for carjacking in front of her daughters. The kicker in her story was that she was eight months pregnant. She pointed to her belly and asked, “Are you kidding?” The cops refused to comment on whether the actual carjacker was visibly pregnant. Instead, Woodruff spent 11 hours in jail before being released on a $100,000 bond. She was hospitalized immediately after her release for dehydration, sharp pain, and contractions. One would hope that cops who encounter a visibly pregnant carjacking suspect would revisit the information that led to the arrest before arraigning her. But no, she was held until she bonded out, solely on the basis of a positive match from a facial recognition system.

In 2019, the National Institute of Standards and Technology (“NIST”) released a study of FRT solutions. The report found a wide discrepancy in how the systems treat white men versus Black women. While Black and Asian people were up to 100 times more likely to be falsely identified than white men, Black women fared the worst in matching accuracy.

The problem is that cops continue to rely on the flawed technology to do their legwork. Is it too much to ask of cops that they check the details before they make an arrest?

It is not just Black, Indigenous, and other people of color who suffer under FRT false matches, though they are clearly at the greatest risk. Harvey Eugene Murphy Jr., a 61-year-old white man, was falsely arrested for the armed robbery of a Houston Sunglass Hut, a retail partner of Macy’s. While in jail, Murphy was beaten and raped by three men. Another little detail the cops missed: Murphy had been in California when the armed robbery was committed. It took a few days for cops and his court-appointed attorney to confirm his alibi, but he spent 10 days locked in a cell with predators, praying for release.

As it is, companies that provide facial recognition services do so with little oversight. The NIST study highlighted the challenges of even the best systems, yet many providers are far from the best, and some have not been tested at all. Nevertheless, some of them, such as DataWorks Plus, allow the user to manipulate the source image until the system generates the response the user wants. That sort of process injects unnecessary subjectivity into an already flawed technology and can only increase the number of false arrests. Yet, apart from a few bans and moratoriums in U.S. cities, the government has refused to act. Until federal law places some constraints on Facial Recognition Technology, the long-term negative effects of faulty technology and sloppy policing will continue to pile up.

 

Sources: TechDirt.com, StopSpying.org, The Guardian
