A New Age of Video Analytics

Video footage from security cameras is common fare in modern movies and television shows, but depictions in popular culture rarely offer an accurate portrayal of the state of surveillance technology or the way police and other security personnel use it. Gone are the days of VCRs in out-of-the-way cabinets and bored security guards sitting in front of a bank of screens. Nor is it safe to assume that images from camera networks simply sit on servers somewhere, accessed only after a crime by investigators willing to sift through hours of footage.

The world has changed. A study by the Electronic Frontier Foundation (“EFF”) has revealed the extent to which “advanced algorithms are watching every frame on every camera and documenting every person, animal, vehicle, and backpack as they move through physical space, and thus camera to camera, over an extended period of time.”

Modern video analytics has evolved in the hands of police and private security firms. The huge number of cameras that nearly blanket public spaces provides a digital puzzle that is being pieced together through machine learning, artificial intelligence, and software that ties different surveillance networks into a seamless whole.

Through its Atlas of Surveillance project, EFF has identified 35 law enforcement agencies that use advanced video analytics technology, and that number continues to grow. Use of the technology by private firms is harder to track because private companies are not required to report contracts and purchases the way government agencies are. To investigate the scope and capability of the latest video analytics technology, EFF acquired user manuals for two of the most common analytics systems on the market today. What it discovered was, in EFF’s words, “even scarier than we thought.”

Briefcam, often packaged with Genetec video technology, is popular with police agencies nationwide and is a particularly effective tool at real-time crime centers, where police combine and search footage from across a jurisdiction and communicate with field personnel as a situation develops. Briefcam can sift through hours of images from multiple cameras and perform a variety of routines, including zeroing in on a particular face, backpack, or article of clothing to track a surveillance subject.
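
To give a rough sense of the kind of attribute-based search these tools perform, the sketch below shows one plausible way detections from many cameras could be filtered to follow a single subject. It is a minimal illustration, not Briefcam’s actual software or API, and every class, field, and attribute name in it is hypothetical.

    # Hypothetical sketch of attribute-based filtering across a multi-camera
    # network; not Briefcam's implementation or API.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Set

    @dataclass
    class Detection:
        camera_id: str        # which camera produced the detection
        timestamp: datetime   # when the object was seen
        object_class: str     # e.g., "person", "vehicle", "backpack"
        attributes: Set[str]  # e.g., {"red_jacket", "backpack"}

    def track_subject(detections: List[Detection], required: Set[str]) -> List[Detection]:
        """Return every detection carrying all required attributes,
        ordered in time, regardless of which camera recorded it."""
        matches = [d for d in detections if required <= d.attributes]
        return sorted(matches, key=lambda d: d.timestamp)

    # Example: follow anyone seen with a red jacket and a backpack.
    # trail = track_subject(all_detections, {"red_jacket", "backpack"})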

The capabilities of Briefcam and similar products have raised concerns about police tracking and cataloging the identities of protesters or other citizens who have not committed a crime. There are, at present, no statutes prohibiting the use of video analytics in this way.

Private firms favor a different product: systems from Avigilon, a subsidiary of Motorola Solutions. Deployed in private security networks, Avigilon systems analyze footage to identify objects, read license plates, and recognize faces. In San Francisco, for example, Avigilon cameras and software are used in six special districts (business improvement districts and community benefit districts) created to revitalize urban areas. These districts are extensively surveilled, with the footage analyzed by private security personnel who relay their findings to the police.

Both these systems can perform a variety of functions that use machine learning. They can detect “unusual activity” involving cars and people after “training” for a week on a particular series of cameras. These systems can also detect “tampering,” which can involve changing a camera’s field of vision or altering lighting in an area under surveillance.
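
One simple way to picture what a week of “training” could produce is a per-hour baseline of activity against which later observations are compared. The sketch below illustrates that general idea with hypothetical counts; it is not a description of either vendor’s actual model.

    # Hypothetical sketch of "unusual activity" detection: build a per-hour
    # baseline from a week of observed counts, then flag hours that deviate
    # sharply. This illustrates the general idea only, not either vendor's
    # actual model.
    import statistics

    def build_baseline(week_of_counts):
        """week_of_counts maps an hour of day (0-23) to the list of
        person-or-vehicle counts observed at that hour during training.
        Returns the mean and standard deviation for each hour."""
        return {hour: (statistics.mean(c), statistics.pstdev(c))
                for hour, c in week_of_counts.items()}

    def is_unusual(baseline, hour, count, threshold=3.0):
        """Flag a count more than `threshold` standard deviations away
        from that hour's baseline mean."""
        mean, stdev = baseline[hour]
        if stdev == 0:
            return count != mean
        return abs(count - mean) / stdev > threshold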

These systems also have features that have drawn attention from public health officials. Briefcam can track proximity and duration of contact between a variety of individuals. Avigilon claims it can detect elevated skin temperature. Critics are concerned that features like these might, in the age of COVID-19, allow companies and police to rebrand invasive technology as a public health solution, a process known as “COVID-19 washing,” and consequently sow more distrust of government in situations where public collaboration could save lives.
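
As a rough illustration of how proximity-and-duration tracking could work in principle, the sketch below totals the time two tracked subjects spend within a fixed distance of each other. The distance threshold, sampling interval, and data layout are all assumptions made for the example, not Briefcam’s or Avigilon’s actual behavior.

    # Hypothetical sketch of contact-duration tracking; thresholds and data
    # layout are assumptions, not vendor behavior.
    import math

    def contact_seconds(track_a, track_b, max_distance_m=2.0, frame_seconds=1.0):
        """Each track maps a frame timestamp to an (x, y) position in meters,
        as estimated from camera calibration. Returns the total time the two
        subjects spend within max_distance_m of each other."""
        total = 0.0
        for t, (xa, ya) in track_a.items():
            if t not in track_b:
                continue
            xb, yb = track_b[t]
            if math.hypot(xa - xb, ya - yb) <= max_distance_m:
                total += frame_seconds
        return total

    # Example: flag any pair in close contact for more than five minutes.
    # if contact_seconds(track_a, track_b) > 300: ...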

There is almost no public oversight of the acquisition or deployment of surveillance technology or analytics tools like those described above. Even as the use of these tools grows, there are voices who question their efficacy. Dr. RaShall Brackney, chief of the Charlottesville, Virginia, police, asserted at a recent panel on technology in law enforcement that video analytics technology perpetuates racial bias and often “creates false positives in identifying suspects.”

Despite concerns about police reliance on secretive technology and the possibility of inappropriate monitoring of religious or political activity, video analytics is likely here to stay. More than 12 cities have banned police use of facial recognition, but without action at the federal level, this type of technology will continue to grow out of the public eye. 

 

Source: eff.org
