SAN FRANCISCO — The city of Orlando’s police department has ended its test of a facial recognition program created by Amazon that has come under fire from privacy advocates. But other law enforcement organizations say they continue to use it to solve crimes.
Amazon’s Rekognition software works by comparing images provided by the customer to a database of images the customer has also provided. It searches for matches using the computing power of Amazon’s cloud computing platform, Amazon Web Services (AWS).
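In practice, that workflow maps onto a handful of calls in Rekognition’s API. The sketch below, written against Amazon’s boto3 Python SDK, is purely illustrative: the collection name, S3 bucket and image keys are placeholders, not details from any of the deployments described in this story.

```python
import boto3

# Rekognition runs in AWS; the client needs valid AWS credentials and a region.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# 1. Create a collection -- the customer-supplied "database" of known faces.
rekognition.create_collection(CollectionId="known-faces")

# 2. Index a customer-provided reference image into that collection.
#    The bucket and key here are hypothetical examples.
rekognition.index_faces(
    CollectionId="known-faces",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "reference/person_01.jpg"}},
    ExternalImageId="person_01",
)

# 3. Search the collection for faces matching a new image, such as a camera frame.
response = rekognition.search_faces_by_image(
    CollectionId="known-faces",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "frames/camera_frame.jpg"}},
    FaceMatchThreshold=80,
    MaxFaces=5,
)

# Each match includes the indexed face and a similarity score.
for match in response["FaceMatches"]:
    print(match["Face"].get("ExternalImageId"), match["Similarity"])
```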
It has come under heavy fire from privacy advocates, who fear it could be used to unfairly target protesters, immigrants and any person just going about their daily business. In May the ACLU and civil rights groups demanded that Amazon stop selling the software tool.
A joint statement issued Monday by the city of Orlando and the Orlando Police Department made clear the city isn’t backing down from using technology when it feels it is warranted.
“Partnering with innovative companies to test new technology — while also ensuring we uphold privacy laws and in no way violate the rights of others — is critical to us as we work to further keep our community safe,” the statement read.
On Monday, the American Civil Liberties Union of Florida sent a letter to the city calling use of the software a potential invasion of residents’ privacy, free speech and due process rights. The letter demanded that the city stop using Rekognition.
Orlando’s pilot test had ended last week.
The city had created a database of the faces of a handful of Orlando police officers who volunteered to participate in the test, then compared those faces against images from eight city-owned surveillance cameras to see whether the software could correctly identify the officers when they appeared on camera.
Orlando did not use the technology in an investigative capacity or use any images of members of the public in the test, Sgt. Eduardo Bernal told USA TODAY.
The ACLU of Florida’s letter said the restricted scope of the test didn’t mean it would stay restricted forever.
“No City policies or rules meaningfully restrict the Police Department from rapidly expanding the system in the near future by, for example, activating it across the City’s public-facing cameras or adding it to the many body cameras Orlando police officers use every day,” the letter said.
Other law enforcement agencies continue to use the program, though in ways much more limited than those privacy advocates have raised concerns about.
In Washington County, Oregon, the Sheriff’s Office has used Rekognition for a year and a half, said Deputy Jeff Talbot. The department confirms each match made through the Rekognition software by another method, and it uses the tool only in criminal investigations, he said.
It has used the software to help identify criminal suspects by comparing images against the department’s own jail booking photos, which are public record, he said.
“The Sheriff’s Office has not, and will not, utilize this technology for mass or real-time surveillance. That use is prohibited by both Oregon state law and our own policy,” said Talbot.
The software has been used for purposes other than crime-fighting as well. During the wedding of Prince Harry and Meghan Markle, for example, Sky News created a database of royals and celebrities, then compared it with photos of the people entering St. George’s Chapel at Windsor Castle to attend the ceremony.
Using Rekognition, Sky News was able to quickly identify who was in the images, allowing it to run their names as subtitles on the screen as they walked into the church.