
Predictive Policing Doesn’t Reduce Crime but Does Increase Targeting of Vulnerable Communities

by Casey J. Bastian

Our society is increasingly reliant on data and technology in nearly every sector, both public and private. Law enforcement and public safety institutions are regular consumers of data offerings. Tech companies like Accenture, Microsoft, Oracle, HunchLab, ShotSpotter, and PredPol are maximizing financial opportunities by “improving” policing. In theory, PredPol uses crime data-driven algorithms to issue “future crime predictions” in an effort to decrease crime. In practice, there is no discernible decrease in areas using PredPol software. In fact, PredPol appears only to perpetuate biases and saturate vulnerable communities with policing.

PredPol’s name is derived from the words predictive and policing. The concept behind PredPol’s business model is known as “predictive analytics.” PredPol was created with over $1.7 million in grants from the National Science Foundation and was also funded by $3.7 million in investments from Plantronics between 2012 and 2014. PredPol has police departments create an automated feed of crime report data into its algorithm; that data frequently includes reports from both officers and the public. Police then choose which crimes they want “predicted.” The algorithm uses three variables to create its future crime predictions: the date and time, the location, and the type of past crime reports.
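PredPol’s actual model is proprietary, but the three inputs described above suggest the general shape of a place-based predictor. The following is a minimal sketch, in Python, of one way such a ranking could work; every name, field, and parameter here is hypothetical and illustrative, not PredPol’s code.

```python
# Hypothetical sketch of a place-based "prediction" built from only the
# three variables the article names: date/time, location, and crime type.
# PredPol's real algorithm is proprietary; nothing here is its actual code.
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

@dataclass
class CrimeReport:
    when: datetime            # date and time of the report
    cell: tuple[int, int]     # map grid cell the location falls in
    crime_type: str           # e.g. "burglary"

def rank_cells(reports, wanted_types, now, half_life_days=30.0):
    """Score each grid cell by a recency-weighted count of past reports
    of the crime types the department chose to have "predicted"."""
    scores = defaultdict(float)
    for r in reports:
        if r.crime_type not in wanted_types:
            continue  # departments choose which crimes get "predicted"
        age_days = (now - r.when).total_seconds() / 86400.0
        # Exponential decay: recent reports weigh more than old ones.
        scores[r.cell] += 0.5 ** (age_days / half_life_days)
    # Highest-scoring cells would become the day's patrol "boxes."
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Note that nothing in a model like this measures crime itself; it only measures reports, which is what makes the reporting pipeline so consequential.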

A PredPol prediction features a map on which 500-by-500-foot boxes are marked, along with the police shift during which the predicted crime is most likely to occur. The PredPol software suggests that officers “get in the box” during available time on shift. Certain city officials have revealed that officers will drive to prediction locations in anticipation and simply handle tasks such as paperwork while in the box. The intent is to be there before they are actually needed.
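Those 500-by-500-foot boxes amount to snapping each report’s location onto a coarse grid. A rough sketch of that binning, using a flat-earth approximation near a reference point (the constants and function names are illustrative assumptions, not PredPol’s actual geometry):

```python
import math

FEET_PER_DEG_LAT = 364_000   # rough conversion; varies slightly with latitude
CELL_FEET = 500              # the 500-by-500-foot box size from the article

def to_cell(lat: float, lon: float, ref_lat: float, ref_lon: float):
    """Map a coordinate to a (row, col) index on a 500 ft grid anchored
    at a reference point. Illustrative only, not PredPol's geometry."""
    feet_per_deg_lon = FEET_PER_DEG_LAT * math.cos(math.radians(ref_lat))
    north_ft = (lat - ref_lat) * FEET_PER_DEG_LAT
    east_ft = (lon - ref_lon) * feet_per_deg_lon
    return int(north_ft // CELL_FEET), int(east_ft // CELL_FEET)
```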

Reports examined by Gizmodo and The Markup reveal the extent of PredPol’s usage. These reports indicate that over three percent of all Americans were likely subjected to “police patrol decisions directed by crime-prediction software called PredPol” between 2018 and 2021. Analyses of the PredPol data verified that over 5.9 million of these predictions were sent to 38 law enforcement agencies nationwide during that period. PredPol has operated without scrutiny for years, according to American University law professor Andrew Ferguson. He says, “This is actually the first time anyone has done this, which is striking because people have been paying hundreds of thousands of dollars for this technology for a decade.”

PredPol is marketed as analytics software that uses reported crime data to help place law enforcement in the areas where they may be needed most. An examination of the reports of PredPol’s usage, however, appears to reveal an inherent bias. A 2018 paper by PredPol’s creators admitted that “place-based crime prediction algorithms” tend to focus on areas already heavily policed, creating a “feedback loop”: increased policing in an area leads to more arrests there, which in turn leads to more predictions. It truly becomes a corrosive cycle for poor, disadvantaged communities.
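That feedback loop is easy to reproduce in a toy simulation: if patrols go wherever recorded arrests are densest, and police presence is what produces new records, the same area keeps “winning” even when underlying crime is identical everywhere. A deliberately simplified sketch, with every number invented for illustration:

```python
import random

random.seed(0)
arrests = [10, 5]            # historical arrest records, not true crime rates
TRUE_CRIME = [0.2, 0.2]      # identical underlying daily crime probability

for day in range(365):
    # The "prediction" simply follows the existing arrest data.
    patrolled = arrests.index(max(arrests))
    for area in range(2):
        crime_occurred = random.random() < TRUE_CRIME[area]
        # Crimes only become records where police are present to observe them.
        if crime_occurred and area == patrolled:
            arrests[area] += 1

print(arrests)   # area 0 absorbs every new record despite equal true crime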

The reports examined more than 270,000 arrests in 11 cities that use PredPol. Areas that received the most predictions had markedly higher overall arrest rates than surrounding areas, confirming that PredPol typically recommends patrols for areas police already frequent. The reports also indicate that far fewer patrols are sent to areas considered “[w]hiter” or more middle-to-upper-income; years can elapse in which these areas do not receive a single prediction.

PredPol software targets neighborhoods for increased police presence that are typically home to “Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.” The areas PredPol recommends for increased patrols are usually in or near public and subsidized housing. The software seems to target the poorest of the poor. “Communities with troubled relationships with police — this is not what they need. They need resources to fill basic social needs,” says Jay Stanley, a senior policy analyst at the ACLU Speech, Privacy, and Technology Project.

It isn’t simply that there are more patrols in these particular neighborhoods; these communities are targeted relentlessly. In many, PredPol predicted crimes every day, sometimes multiple times a day, and sometimes in multiple locations in the same neighborhood at the same time on the same day. PredPol sent “thousands upon thousands of crime predictions” to the same disadvantaged neighborhoods in the last few years. According to the reports, multiple neighborhoods were each subjected to over 11,000 predictions during that period.

For example, half the residents of Birmingham, Alabama, are Black, yet the areas with the most predictions are neighborhoods where the Latino population is more than double the city average. The fewest predictions go to the white communities. Soto Garcia, a Birmingham-based activist, says, “It’s a reason to keep doing what they’re already doing, which is saying ‘This area sucks.’ And now they have the data to prove it.”

In Portage, Michigan, PredPol recommended extra patrols focused only on areas where the proportion of Black residents is nine times the city average. Local activist Quinton Bryant believes that this is “just giving them a reason to patrol these areas that are predominantly Black and Brown and poor folks.”

In Los Angeles, even when predictions went to “majority white” communities like Northridge, they were “clustered on the blocks that are almost 100% Latino.” Overall, these neighborhoods are disproportionately poorer and more heavily Latino than the city as a whole. “These are the areas of L.A. that have had the greatest issues of biased policing,” said Thomas A. Saenz, president and general counsel of the civil rights group MALDEF.

In Haverhill, Massachusetts, predictions went to neighborhoods with “three times the Latino population and twice the low-income population as the city average.” In the Chicago suburb of Elgin, the fewest crime predictions went to the neighborhoods where the average annual income exceeds $200,000. Again, PredPol regularly sent patrols to the Latino and poor areas. Elgin Police Department deputy chief Adam Schuessler called the use of PredPol “bias-by-proxy.” That department has already stopped using PredPol.

PredPol CEO Brian MacDonald was asked about the racial and income disparities within the predictions. MacDonald wouldn’t answer directly but instead argued that PredPol is used “to help direct scarce police resources to protect the neighborhoods most at risk of victimization.” The company’s assertion is that because PredPol does not include race or demographic information in its algorithms, it “eliminates the possibility for privacy or civil rights violations seen with other intelligence-led or predictive policing models.” However, that assertion does not seem to match the outcomes of its predictive policing software.

Elgin is not the only city growing disillusioned with PredPol’s outcomes. In Tracy, California, police chief of staff Sgt. Craig Koostra said, “As time went on, we realized that PredPol was not the program that we thought it was when we had first started using it.” Two other California cities have also stopped using the software. Milpitas Police Department lieutenant Greg Mack called it “time consuming and impractical” and found no evidence it helped reduce crime rates. The Los Angeles Police Department stopped using PredPol in 2021. The examined reports show that of the 38 agencies identified, only 15 are still PredPol customers; two of those said they are still paying for it but not using it.

PredPol has also lost luster in academic circles. Over 1,400 mathematicians signed an open letter urging colleagues not to work with law enforcement in such capacities, singling out PredPol. Ferguson, unfortunately, believes these companies are here to stay. “These big companies that are going to hold the contracts for police [data platforms] are going to do predictive analytics,” said Ferguson. That PredPol changed its name to Geolitica does not mean it will change its business model. Ferguson added, “They’re just not going to call it predictive policing. And it’s going to be harder to pull apart for journalists and academics.”

Source: gizmodo.com
