AWS won't sell facial-recog tool to police for a year – other law enforcement agencies are in the clear

Cop block doesn't mention BLM, nods to legislative interest in writing rules for AI ethics

Updated Amazon Web Services has announced a one-year moratorium on police use of its facial-recognition technology.

"We've advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge," the company wrote in a Wednesday blog post. "We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested."

AWS's facial-recognition tech is called Rekognition and the company says it "provides highly accurate facial analysis and facial search capabilities that you can use to detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases."

Such tech is controversial for a host of reasons. Clearview AI has been roundly criticised for building a colossal database of facial images by scraping the web and then selling access to police. It's feared that Clearview's tech could be used to identify those who attended recent Black Lives Matter protests, and because facial-recognition models have often proved inaccurate and biased, their use is all the more problematic.

Rekognition has been found to misidentify members of Congress as criminals and to perform worst when identifying people of colour. It has also been found to misidentify women as men 19 per cent of the time.

The test of Congressional representatives was conducted by the American Civil Liberties Union (ACLU), which has campaigned against law enforcement use of facial-recognition tech and called out Amazon's offering as especially flawed.

Amazon's decision to deny police access to its tech for a year follows IBM's decision to quit facial recognition altogether and work with Congress to promote racial equality. But while IBM explicitly linked its decision to the technology's potential to "promote discrimination or racial injustice", the AWS statement mentions only US legislators' increasing interest in regulation and the company's desire to help devise new rules. It is silent on Amazon's intentions to sell Rekognition to other law enforcement agencies.

Police are, of course, the focus of anti-brutality and anti-racism protests around the world. Other law enforcement and intelligence agencies aren't quite under the same spotlight, yet could be just as susceptible to misuse of facial recognition as police.

The moratorium also lets Amazon wait out a year until the controversy has calmed down and fallen out of the news cycle before, presumably, quietly resuming its AI service.

AWS has named some users that will be allowed to continue using Rekognition: Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics, which use the tech "to help rescue human trafficking victims and reunite missing children with their families." ®

Updated to add

Microsoft has pledged to stop selling facial-recognition services to police until America passes a federal law regulating the technology.
