J'accuse! Amazon's Rekognition reckons 1 in 5 Californian lawmakers are crims in ACLU test
You gotta use 99% confidence setting before arresting anyone
Amazon's Rekognition system wrongly matched one in five Californian politicians with images from a database of 25,000 wanted criminals' mugshots in tests by the American Civil Liberties Union (ACLU).
If this story sounds familiar, it's because the ACLU ran a similar test last year – which matched 28 members of US Congress with crim mugshots.
The lobby group re-ran the tests to highlight its concerns over moves to add facial-recognition technology to police body-worn cameras. Of the 26 lawmakers identified as possible criminals, more than half were described as "people of color". The California Assembly and Senate have 120 members in total.
Among those wrongly fingered was Phil Ting, the Californian lawmaker pushing legislation to stop the use of biometric technology in body-worn cameras.
Ting said in a statement: "This experiment reinforces the fact that facial recognition software is not ready for prime time – let alone for use in body cameras worn by law enforcement. I could see innocent Californians subjected to perpetual police line ups because of false matches. We must not allow this to happen."
At a press conference, Ting noted that body cams were meant to increase trust in law enforcement and improve transparency, not be used as a surveillance tool. He also said that putting thousands of facial-recognition cameras on street corners across the state would require lengthy public discussion and legislation, but adding the software to police body cameras would achieve the same result without debate.
The bill, AB1215, would stop police analysing body-cam footage using facial-recognition software. Ting represents San Francisco, which has banned the use of facial-recognition technology by city departments.
Last month, police in Orlando, Florida, ditched Amazon's cloud-based recog system which they were trialling on live surveillance feeds because, despite a year of trying, they could not get it to work.
The Reg has not heard back from Amazon, but the etailer complained to other outlets that the test was not fair because it used default settings on the software. Amazon recommended using a 99 per cent confidence setting before actually arresting or shooting someone. Perhaps change the default then, lads?
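Amazon's gripe hinges on that threshold: Rekognition's face-search calls accept a similarity cutoff (80 per cent by default, per AWS's documentation), and raising it to 99 throws away weak matches. A minimal sketch of the effect, using made-up similarity scores rather than real Rekognition output:

```python
# Illustrative sketch only: the candidate scores below are invented,
# not real Rekognition results for any lawmaker.
def filter_matches(matches, threshold):
    """Keep only face matches at or above the given similarity threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

candidates = [
    {"name": "mugshot_0412", "similarity": 81.3},  # weak match
    {"name": "mugshot_1187", "similarity": 84.9},  # weak match
    {"name": "mugshot_2051", "similarity": 99.2},  # strong match
]

# At the assumed default 80 per cent cutoff, all three count as hits.
print(len(filter_matches(candidates, 80)))  # 3
# At Amazon's recommended 99 per cent, only the strong match survives.
print(len(filter_matches(candidates, 99)))  # 1
```

The point of contention, then, is not whether a stricter threshold helps, but that the software ships with the looser one.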
There's video of the press conference on Facebook here. ®