
One Year of New Orleans Police Department Facial Recognition Data

by Michael Dean Thompson

About a year after the New Orleans Police Department (“NOPD”) performed its first facial recognition scan under a new policy that reauthorized its use, the department has little to show for it. That is according to NOPD’s own data, which was analyzed by Politico. The new policy reintroducing automated facial recognition (“AFR”) was instituted in response to a jump in the crime rate. Businesses, police, and the mayor supported AFR as an “effective, fair tool to identifying criminals quickly,” according to Politico. Instead, the data show that AFR use is focused on Black people and has been associated with comparatively few arrests, making it largely ineffective.

It makes sense that police departments would reach for tools to help with recognizing faces, because people tend to be fairly bad at it, especially when identifying people of “races” other than their own. What the NOPD data show, however, is that AFR has amplified the very human biases it was supposed to correct.

New Orleans Councilmember At-Large J. P. Morrell, a Democrat who voted against using the technology, told Politico, “The data has pretty much proven that [civil rights] advocates were mostly correct” and added, “It’s primarily targeted towards African Americans and it doesn’t actually lead to many, if any, arrests.”

Nevertheless, a slim majority of New Orleans City Council members, all Democrats, support the use of AFR. Councilmember Eugene Green voted to lift the AFR ban that had been put in place in the wake of George Floyd’s murder. Although civil rights advocates have long pointed out AFR’s biases, Green, a Black councilmember representing a majority Black district, sides with Mayor LaToya Cantrell and a coalition of businesses in supporting its use. “If we have it for 10 years and it only solves one crime, but there’s no abuse, then that’s a victory for the citizens of New Orleans,” he said. It seems unlikely that the people whose hard-earned dollars pay for AFR would agree that such poor effectiveness passes a cost-benefit test. Nor is it clear that there has been no abuse, much less that there will be none for nine more years.

There were just 19 recorded facial recognition requests in the year after New Orleans lifted the ban. All of the recorded requests were for serious crimes, like murder and armed robbery. Two of the 19 were canceled because police identified the suspects by other means before the results came back. Two more were denied because the supporting investigations did not meet the required crime-severity threshold. Of the 15 remaining requests, nine returned no match at all. Six of the 15, less than half, returned matches, and half of those matches were wrong, i.e., false positives. So, out of 15 accepted requests, there were only three correct matches.
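
For readers keeping score, the arithmetic is simple enough to check. Below is a minimal sketch in Python using the counts from Politico’s reporting; the rate labels are ours, not Politico’s:

    # Outcomes of NOPD facial recognition requests in the first year
    # after the ban was lifted, per the counts reported by Politico.
    total_requests = 19
    canceled = 2   # suspects identified by other means first
    denied = 2     # investigations below the crime-severity threshold
    accepted = total_requests - canceled - denied   # 15
    no_match = 9
    matched = accepted - no_match                   # 6
    false_positives = 3
    correct = matched - false_positives             # 3

    print(f"match rate: {matched / accepted:.0%}")                             # 40%
    print(f"false positives among matches: {false_positives / matched:.0%}")   # 50%
    print(f"correct matches per accepted request: {correct / accepted:.0%}")   # 20%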

Three false positives out of six matches is remarkable. The first failure happened as police searched for a gunman in November 2022. The AFR returned a match on a suspect photo, but through monitored jail phone calls, police discovered that the actual suspect was someone else. In February 2023, police submitted an image to the state police for facial recognition. The records do not say why that match was wrong, only that police found the correct suspect through other means. In April 2023, NOPD received another bad match from a photo provided with a tip. The police later learned that the person the AFR identified in the photo was not in the area at the time of the murder.

NOPD claims the data show it followed policy and readily points out that no arrests were based solely on positive matches. Investigators instead sought corroborating evidence rather than relying on the technology alone.

Randal Reid of Georgia has a somewhat different story to tell. In late 2022, Louisiana police submitted an image of a suspect in a credit card theft. The AFR identified Reid, and an arrest warrant was sent to his local police in Georgia. No one told Reid’s hometown police that the warrant was the result of a positive match from Clearview AI’s facial recognition software, with which the police had contracted. Fortunately for Reid, his attorney had sharp ears and overheard an officer refer to Reid as a “positive match.” It turned out that the thief and the “positive match” had some glaring differences, including an obvious facial mole and about 40 pounds of weight. Reid spent six days in jail and thousands of dollars in attorney fees because Clearview AI, which sources its images from social media, news feeds, and other web sources as well as mugshot databases, found an image it deemed similar. As it turned out, Reid had never even been to Louisiana, much less New Orleans. No doubt he has a very different view of AFR than Councilmember Green.

Politico was able to retrieve information on the 19 AFR requests by NOPD because New Orleans had implemented a requirement that AFR use be reported to the city council. It was the first city in the U.S. to mandate transparent use of AFR. At the time of this writing, however, it is not clear why Reid’s case was not included in Politico’s coverage, though his case was covered at the time by the Associated Press and Criminal Legal News. Shortly after Politico’s piece, it was also discussed by The New Yorker.

There is not a lot of transparency around AFR’s use. An examination of federal law enforcement use of AFR by the Government Accountability Office (“GAO”) in 2023 found approximately 63,000 uses between October 2019 and March 2022. That number, however, was a known undercount. The FBI, by far the largest admitted consumer of commercial AFR services in federal law enforcement, could not account for its use of two of the three services it employed. Another agency, Customs and Border Protection (“CBP”), had no data on its use of two services. A previous GAO report from 2021 had found that many agencies could not even list which AFR services they used. All three of the largest police departments in the U.S. use AFR. The New York City Police Department (“NYPD”) has been using it since 2011. In 2019, the NYPD reported it had used AFR 9,850 times that year alone. Of that figure, the NYPD claimed 2,510 potential matches, just over 25%. However, it did not indicate how many of those were false positives.

NOPD’s use of AFR prior to the 2020 ban was no different. When the police asked the city to reauthorize its use, the city asked for data on its previous use. The NOPD admitted then that it had no data on how AFR had previously been used or even how successful it had been. Nonetheless, the city reauthorized its use, wrongly believing it could help with investigations, though it did mandate some data collection. Among the details to be collected were the officer making the request, the crime being investigated, a declaration of reasonable suspicion, the suspect’s demographic information, the supervisor who approved the search, any matches found, and the investigation’s result. Those requirements have probably discouraged excessive requests during the first year of NOPD’s return to the technology. “We needed to have significant accountability on this controversial technology,” said Helena Moreno, a council member who coauthored the 2020 ban on AFR.
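
To make the mandate concrete, here is a minimal sketch of what one such audit record might look like as a data structure. The field names and types are hypothetical; only the list of required details comes from the council’s mandate:

    from dataclasses import dataclass
    from typing import Optional

    # Hypothetical record mirroring the details the city requires NOPD
    # to log for each AFR request. Field names are illustrative only.
    @dataclass
    class AFRRequestRecord:
        requesting_officer: str                 # officer making the request
        crime_investigated: str                 # e.g., "murder", "armed robbery"
        reasonable_suspicion_declaration: str   # declaration of reasonable suspicion
        suspect_demographics: str               # suspect's demographic information
        approving_supervisor: str               # supervisor who approved the search
        matches_found: int = 0                  # number of matches returned
        investigation_result: Optional[str] = None  # outcome, if known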

As part of lifting the ban, the City Council and NOPD reached an unofficial agreement that quarterly reports would show how the NOPD is using AFR. The first quarterly report, which covered October through December 2022, likely did not give the council much hope. NOPD used AFR six times during that period. Of those, three returned no match. Another returned a match, but it was a false positive. The remaining two cases were still open when Politico published its article. The next report showed four more requests: three with no matches and one more false positive.

Corporations and police foundations have been pushing hard for police departments throughout the country to adopt unproven new technologies, the so-called “bleeding edge” in information technology circles. For example, the International Association of Chiefs of Police and the Integrated Justice Information Systems Institute, both of which are funded by private and corporate donations, produced a catalog instructing police departments to speak out about the effectiveness of AFR in supporting investigations. Based on what has been discussed here, that should be a tough sell. Jeff Asher, a criminal justice consultant hired by the New Orleans City Council, read the data and came to a different conclusion. “It’s unlikely that this technology will be useful in terms of changing the trend [of rising crime rates],” he said in a September 2023 interview. “You could probably point to this technology as useful in certain cases, but seeing it as a game changer, or something to invest in for crime fighting, that optimism is probably misplaced.”

A look at overall trends suggests even that statement was a bit too optimistic. Research at Georgia State University by Thaddeus Johnson took a bigger-picture view. He found that among the police departments that have adopted AFR, there has been a 55% increase in arrests of Black adults and a corresponding 21% decrease in arrests of white adults. According to Johnson, who had also been a Memphis police officer, there was not enough data to draw causal links for the skewed results. One possible contributing factor is that many AFR systems draw their photos from mugshot databases. Since Black people are substantially overrepresented in those databases, the systems would be more inclined to return matches of Black people. Johnson points out, “If you have a disproportionate number of Black people entering the system, a disproportionate number being run for requests for screenings, then you have all these disproportionalities all cumulatively building together.” It is worth noting that IDEMIA, the French software company that provides AFR technology to the Louisiana State Police (and therefore to the NOPD), sources its data from mugshots.

Politico notes that in nearly every publicized case of a false arrest based on AFR, the technology’s victim has been Black. That includes two arrests in Detroit, one of which involved a pregnant woman accused of robbery and carjacking. It also includes Georgia resident Randal Reid. Of the 15 fulfilled AFR requests by the NOPD, only one involved a white person. Yet an NOPD spokesperson, addressing a question about the disparity, said, “Race and ethnicity are not a determining factor for which images and crimes are suitable for facial recognition review.” Despite the rhetoric, the numbers seem to tell a different story.

Sources: politico.com; GAO, “Facial Recognition Services” (September 2023)
