by Michael Dean Thompson
Over 50 years ago, governing bodies already associated fear of crime with Black and brown communities. An effort to combat crime rooted in that fear spurred the creation of software that has since grown into the predictive policing platforms known as fusion centers. No valid metrics justify the immense expenditure, yet billions are spent annually across the nation to enhance predictive policing with invasive surveillance: cameras, facial recognition, automated license plate readers, cell-site simulators, and more.
It all began in 1966, not quite a year after President Johnson established the President’s Commission on Law Enforcement and the Administration of Justice, when Johnson added the Science and Technology Task Force to the Commission in an effort to combat crime. To head the task force, the Commission appointed Saul I. Gass, who at that point managed all of IBM’s federal systems projects. Under Gass, the task force created the Police Beat Algorithm, software meant to divide a municipality along demographic and geographic lines in order to deploy police resources effectively. In direct response to recent events, including the Watts riots, Gass often pointed to the need to create “contingency riot and other emergency plans.”
The Police Beat Algorithm’s designers attempted to compute crime patterns using arbitrary weighting scales. Burglary, larceny, and auto theft received the same weight of 4 as homicide and rape, while traffic accidents scored 2 and drunkenness just 1. That the task force weighted a traffic accident, which presumably involved no intent, above drunkenness reflects its biases about the nature of criminality. Geographic boundaries were likewise weighted by the amount of crime within them, and criminality was normed within each boundary for particular groups, such as white and Black residents. That is, the designers tried to determine how much of each crime type was common for each group in each area and, therefore, what levels would represent unusual activity.
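To see how arbitrary such a scheme is, the weighting described above can be sketched in a few lines of modern code. The weights are the ones reported in the text; the beat data, function name, and crime labels are invented for illustration, not drawn from the original 1960s software:

```python
# Hypothetical sketch of the Police Beat Algorithm's weighted scoring.
# Weights match those described in the article; everything else is invented.
CRIME_WEIGHTS = {
    "homicide": 4, "rape": 4, "burglary": 4, "larceny": 4,
    "auto_theft": 4, "traffic_accident": 2, "drunkenness": 1,
}

def beat_score(incident_counts):
    """Sum weighted incident counts for one geographic beat."""
    return sum(CRIME_WEIGHTS[crime] * n for crime, n in incident_counts.items())

# Two invented beats: under these weights, a handful of property crimes
# easily outranks a beat containing a homicide.
beat_a = {"homicide": 1, "drunkenness": 3}       # 4*1 + 1*3 = 7
beat_b = {"burglary": 5, "traffic_accident": 2}  # 4*5 + 2*2 = 24
print(beat_score(beat_a), beat_score(beat_b))    # prints: 7 24
```

Because beats with higher scores would attract more police resources, the choice of weights, not any principled measure of harm, determined where attention flowed.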
Considering the technology available at the time, their goals were audacious. The task force wanted a system that would identify patterns within crime data, associate those patterns with given subjects, attach suspects to past crimes, and predict where to place police resources. These were tremendous demands on machines that pale beside a modern cellphone and were fed their data through punch cards.
Weighting geography and criminality meant that profiling was built into the system: Black and brown communities were preset to receive the lion’s share of police attention. Communities already suffering from aggressive policing were handed a scientific-seeming rationale for more police rather than a search for solutions. The Police Beat Algorithm merely justified a bigger hammer for the nail. It was intended as a proof of concept, but in 1968 the Kansas City Police Department put it to use, and the impact of the implicit racism became obvious.
The same biases that fed into the Police Beat Algorithm and lent it a veneer of scientific authority infect today’s predictive policing systems. And with the apparent backing of science, artificial intelligence, which is incapable of “thinking outside the box,” guides ever more hammers with devastating impact.
Source: Slate.com, “The 1960s Experiment Created Today’s Biased Police Surveillance”