The Orlando Police Department in Florida announced that it will continue to test facial recognition software developed by Amazon, despite concerns about privacy and potential abuse by police.
Amazon’s facial recognition program, called Rekognition, was designed to allow its customers to upload an image and quickly find a match with images in a database created by the customer.
However, law enforcement has found a new use for the technology.
Police and city officials in Orlando said in a statement that using Amazon’s software would allow police to locate a fugitive before he commits another crime, identify a sex offender hanging around schools, or find a missing child.
“The suspects in these examples would have their images entered into the system and perhaps could be spotted by one of the many cameras, and never allowed to get anywhere near the victims or a large gathering,” officials said. The database would compare an uploaded image to images captured by eight surveillance cameras around the city.
“Amazon Rekognition is primed for abuse in the hands of governments,” warned a letter sent to Amazon CEO Jeff Bezos by 34 civil rights groups, including the ACLU. “This product poses a grave threat to communities, including people of color and immigrants,” the letter said, adding that people should be “free to walk down the street without being watched by the government.”
“Amazon should be protecting its customers and communities,” ACLU attorney Matt Cagle said. “It should not be in the business of powering dangerous surveillance.”