
Your Neighborhood Does Not Need an Automated License Plate Reader

by Anthony W. Accurso

Companies are targeting Homeowners Associations (HOAs) and Neighborhood Associations as potential customers for Automated License Plate Readers (ALPRs), promising that the devices will keep neighborhoods safe. However, there is no evidence these devices improve safety, while they have repeatedly been shown to compromise your privacy.

ALPRs, such as those sold by companies like Flock Safety and Vigilant Solutions, generally consist of one or more high-definition cameras attached to a computer running AI-based visual recognition software that reads the license plates of passing vehicles, plus a database that stores the gathered information.
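
To make concrete what such a device does, here is a minimal, purely illustrative sketch of that pipeline in Python. It assumes the open-source OpenCV and pytesseract libraries stand in for a vendor's proprietary camera firmware and recognition model, and the database name and table are invented for the example:

    # Purely illustrative sketch of an ALPR-style pipeline; not any vendor's code.
    # Assumes OpenCV (cv2) for camera frames and pytesseract for OCR in place of
    # the purpose-built plate-detection models commercial systems actually use.
    import sqlite3
    from datetime import datetime

    import cv2
    import pytesseract

    # Hypothetical database of gathered reads: one row per plate sighting.
    db = sqlite3.connect("plate_reads.db")
    db.execute("CREATE TABLE IF NOT EXISTS reads (plate TEXT, seen_at TEXT)")

    cap = cv2.VideoCapture(0)  # a real unit watches a fixed roadside camera

    for _ in range(1000):  # process a bounded number of frames for the sketch
        ok, frame = cap.read()
        if not ok:
            break
        # Crude preprocessing; real systems first localize the plate region.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        text = pytesseract.image_to_string(gray, config="--psm 7").strip()
        if text:  # whatever the OCR step reads gets logged with a timestamp
            db.execute("INSERT INTO reads VALUES (?, ?)",
                       (text, datetime.now().isoformat()))
            db.commit()

    cap.release()
    db.close()

The point of the sketch is that every passing vehicle becomes a timestamped row in a database, which is what makes the concerns discussed below possible.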

When these devices work correctly — which is most of the time, but not always — they can create a detailed list of all the vehicles that come and go in a community. It is easy to believe this information will be used to more easily catch a suspect after a crime has been committed, or that posting notices about an ALPR being in use will prevent crime by making would-be criminals think twice.

There is no evidence that either conclusion is true, the Electronic Frontier Foundation reports. Criminals can just as easily steal someone else’s car to commit a crime. And if a crime is committed, everyone who has driven through the neighborhood becomes a suspect, not only outsiders who happened to pass through that day.

Even when police use this information, it merely helps them get around Fourth Amendment protections meant to safeguard the privacy of innocent citizens; it doesn’t actually narrow the suspect pool.

Recently, a state audit in Vermont found that 11% of ALPR searches violated restrictions on when police can and cannot access such data, meaning officers are already abusing their access to this information. And because the Fourth Amendment restrains only the government, a private entity like an HOA can legally collect this information and hand it to police, circumventing those constitutional protections.

And this is the best-case scenario. Is this really the kind of information you want your HOA collecting? With this data, it becomes trivial to pick out variations in people’s routines and draw conclusions. If someone reliably leaves for work every weekday and then stops, did they lose their job? If a married couple’s vehicles are rarely home at the same time, are they preparing to divorce? If someone frequently has lots of visitors, could that data be used to infer they’re dealing drugs and lead to police searching the home? Do you really want your HOA committee members to know all these things?
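
To illustrate how little effort such inferences take, here is a hypothetical query against the kind of timestamped log sketched above (the plate number and database name are invented for the example):

    # Illustrative only: given a log like the hypothetical plate_reads.db above,
    # a few lines of SQL reconstruct when one household's car comes and goes.
    import sqlite3

    db = sqlite3.connect("plate_reads.db")
    rows = db.execute(
        # Count sightings of one plate by hour of day; changes in this pattern
        # can suggest a lost job, a trip away, a new overnight visitor, etc.
        "SELECT strftime('%H', seen_at) AS hour, COUNT(*) "
        "FROM reads WHERE plate = ? GROUP BY hour ORDER BY hour",
        ("ABC1234",),  # hypothetical plate number
    ).fetchall()

    for hour, count in rows:
        print(f"{hour}:00  seen {count} times")

No special expertise is required: anyone with access to the stored reads can profile a neighbor’s comings and goings in a few minutes.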

When errors are made, what other nightmare scenarios might arise? We know that when Black men and women are profiled as “dangerous” or “criminal,” interactions with the police often turn deadly. In August 2020, a mistaken ALPR hit led police to pull over a Black family, point guns at them, and have them — including two children, ages 6 and 8 — lie on their bellies in a parking lot. This was not the first time such a mistake was made due to “machine error,” but at least nobody died that time.

The software is largely confined to recognizing license plates for now, but it would be relatively easy to configure it to gather other data.

In summary, ALPRs provide no real benefit to a community other than to soothe vague fears about criminality. As crime rates drop around the country, such tools serve only to compromise the privacy of community members and enrich the big-data companies that prey on those fears.

 

Source: eff.org
