Politicians fume after Amazon's face-recog AI fingers dozens of them as suspected crooks
Everyone jokes congressfolk are crims but... sheesh, take it easy, AWS
Updated Amazon’s online facial recognition system incorrectly matched pictures of US Congress members to mugshots of suspected criminals in a study by the American Civil Liberties Union.
As a result, the ACLU, a nonprofit headquartered in New York, has called for Congress to ban cops and Feds from using any sort of computer-powered facial recognition technology due to the fact that, well, it sucks.
Amazon’s AI-powered Rekognition service was previously criticized by the ACLU when it revealed the web giant was aggressively marketing its face-matching tech to police in Washington County, Oregon, and Orlando, Florida. Rekognition is touted by the Bezos Bunch as, among other applications, a way to identify people in real time from surveillance camera footage or from officers' body cameras.
The results from the ACLU's latest probing showed that Rekognition mistook images of 28 innocent members of Congress for mugshots of cuffed people suspected of crimes. The incorrect matches were skewed towards people of color, including six members of the Congressional Black Caucus. Specifically, nearly 40 per cent of the wrong matches were of people of color, despite them making up only 20 per cent of Congress.
Did you see this? @amazon face surveillance technology FALSELY matched me w/ someone else’s mugshot. I’m outraged & worried by the impact this tool will have on #CommunitiesOfColor when put in the hands of law enforcement! @JeffBezos: We need to talk ASAP. https://t.co/xFOy8duef1— Rep. Jimmy Gomez (@RepJimmyGomez) July 26, 2018
“If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins,” ACLU officials detailed earlier today.
"Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.”
The test involved comparing images of every senator and representative – all 535 of them – against 25,000 publicly available photos of arrested Americans using Rekognition to see if Amazon's cloud-based system matched any. It only cost $12.33 to run – cheaper than the price of a large pizza.
Those 28 mismatches therefore represent a five per cent error rate. On the one hand, that sounds reasonably low. On the other hand, if you apply that to thousands upon thousands of faces pulled from security and body camera footage, that's a lot of innocent people being incorrectly fingered by software.
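The back-of-the-envelope arithmetic is easy to check – the daily face count below is an invented figure, purely to illustrate how a small error rate scales:

```python
# Check the ACLU's numbers from the study.
members_scanned = 535   # every US senator and representative
false_matches = 28      # innocent members wrongly matched to mugshots

error_rate = false_matches / members_scanned
print(f"False-match rate: {error_rate:.1%}")   # roughly five per cent

# Applied to a hypothetical day's worth of faces pulled from
# surveillance and body-cam footage, even that small rate adds up.
faces_per_day = 100_000   # invented volume, for illustration only
print(f"Expected false matches per day: {faces_per_day * error_rate:,.0f}")
```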
Accusations by machines can be verified by human officers, of course. However, we all know that cases of mistaken identity can quickly get out of hand, especially if it involves the automatic issuing of fines and other punishments. If a computer spots you and wrongly thinks you're an arrested crook breaking their conditions of bail, by the time you've proved you're innocent, it may be too late.
It’s not totally clear why Amazon’s face recognition technology is so inaccurate. The biz is rather secretive about how its code works. “Amazon hasn’t disclosed how it tests for bias or accuracy. The burden should be on Amazon to explain why these results are happening,” Jacob Snow, a technology and civil liberties attorney working on behalf of the ACLU in Northern California, told The Register on Thursday.
Biases in training data are known to trickle through to machine learning systems. It could be that Rekognition's training data and the mugshot dataset contained a disproportionate number of men and people of color.
“We don’t have access to Amazon’s training data,” Snow said. “We’re also not disclosing any of the images we used or any biographic or demographic details out of privacy concerns.”
The ACLU has called on Congress to “enact a moratorium” prohibiting law enforcement from using the technology. “I think Congress is taking this issue seriously,” said Snow.
He told us that Jimmy Gomez and John Lewis, Democratic House representatives for California and Georgia respectively, have sent a letter to Amazon demanding a meeting to discuss Rekognition. Senator Ed Markey (D-MA) has also written to ask for a formal meeting to learn more about how Rekognition really works.
“Companies should listen to their employees and the public before rolling out their technology,” Snow concluded.
A spokesperson for Amazon declined to comment. ®
Updated to add
When using AWS's Rekognition service to match faces, the software gives you a confidence score – 100 per cent meaning the AI is absolutely certain it is looking at the same person in two different photos. The ACLU set its code to require a confidence of 80 per cent or higher to declare a match, which is the default setting for Rekognition.
Amazon's Matt Wood has blogged in response to the ACLU study, saying the web giant now recommends 99 per cent confidence for things that are important, such as cops poring over mugshots – it previously recommended 85 per cent.
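The effect of raising that threshold can be sketched with a toy filter – the mugshot names and similarity scores below are invented for illustration, though the 0-to-100 score is the kind Rekognition attaches to each candidate face it returns:

```python
def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the similarity threshold.

    `candidates` is a list of (name, similarity) pairs, where similarity
    is a 0-100 confidence score of the sort Rekognition reports.
    """
    return [(name, score) for name, score in candidates if score >= threshold]

# Invented scores, purely for illustration.
candidates = [
    ("mugshot_1041", 99.4),
    ("mugshot_2210", 87.2),
    ("mugshot_0077", 81.5),
]

print(filter_matches(candidates, 80))  # default threshold: all three "match"
print(filter_matches(candidates, 99))  # the new advice: only one survives
```

At the 80 per cent default, all three invented candidates come back as matches; at 99 per cent, two of them are discarded – which is the whole point of Amazon's revised recommendation for law-enforcement use.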
PS: Speaking of crap AI, it was reported this week that IBM Watson's healthcare assistance service recommended "unsafe and incorrect" cancer treatments to doctors.
We'll be examining machine learning, artificial intelligence, and data analytics, and what they mean for you, at Minds Mastering Machines in London, between October 15 and 17. Head to the website for the full agenda and ticket information.