Recent reports reveal that police use of facial recognition software may be far more pervasive than we’ve been led to believe.
Clearview AI markets its facial recognition software to law enforcement agencies around the country, boasting the ability to take a photo from a crime scene and match it to a known person.
This ability has been described as the “holy grail” of investigation, presumably because tracking down suspects on the basis of photographic evidence alone is often extraordinarily difficult.
When this technology has been used in police investigations, however, flaws in the system have reportedly resulted in serious abuses. Police in Michigan and New Jersey have arrested people simply because the program flagged them as likely culprits; only after an embarrassing, and potentially violent, arrest did a direct comparison of the “suspect” with the photographic evidence reveal the error to be self-evident. And research has shown that such programs are less reliable when attempting to match the faces of women and non-white people.
This has led oversight groups to take an interest in unregulated police use of facial recognition. In early April 2021, BuzzFeed published a leaked list of “1,803 publicly funded agencies whose employees are listed in the data as having used or tested the controversial policing tool before February 2020.” According to Reason.com, these include local and state police, U.S. Immigration and Customs Enforcement, the Air Force, state healthcare organizations, offices of state attorneys general, and even public schools.
This number likely underrepresents the number of agencies actually using Clearview AI’s product, since Regional Information Sharing Systems (“RISS”) was on the list, and such groups provide resources to over 9,000 smaller law enforcement agencies around the country. Charles Wynn, police chief of Chino Valley, Arizona, told BuzzFeed that his agency “does not have any type of facial recognition software” but instead submits such requests through the RISS for his region.
Widespread use may be driven in part by the ease of access Clearview AI provides. Any person may take advantage of a free 30-day trial by simply stating they are part of a federal, state, or local law enforcement organization and “have received authorization from your supervisor at that law enforcement organization to request trial access to Clearview AI,” according to the company.
In addition to documented police abuses of facial recognition, undocumented cases also give cause for worry. Not only are innocent people arrested for crimes they didn’t commit, but the Associated Press reported in 2016 that “police officers across the country misuse confidential law enforcement databases to get information on romantic partners, business associates, neighbors, journalists and for other reasons that have nothing to do with daily police work.”
It’s clear that, in the absence of public accountability, law enforcement has been quietly expanding its use of this reportedly flawed tool, and that Clearview AI would prefer to keep agencies’ use of it secret despite the resulting errors and abuses.