Just one problem – it seldom works
Two years ago, the Center on Privacy and Technology (CPT) at Georgetown Law School said that images representing about half the adults in the US already resided in facial recognition databases maintained by US authorities.
The CPT study suggests facial recognition accuracy isn't quite as good as the FBI claims. "Of the FBI’s 36,420 searches of state license photo and mug shot databases, only 210 (0.6 per cent) yielded likely candidates for further investigations," it said. "Overall, 8,590 (4 per cent) of the FBI’s 214,920 searches yielded likely matches."
Earlier this month, a study published by Big Brother Watch in the UK painted a dismal picture of facial recognition accuracy. It claims that when London's Metropolitan Police used facial recognition technology, it was accurate only two per cent of the time. The other 98 per cent of supposed facial recognition matches wrongly identified innocent members of the public.
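Hit rates that low are less surprising once base rates are considered. As a back-of-the-envelope illustration with invented numbers (not figures from the CPT or Big Brother Watch studies): even a system with a 99 per cent true-positive rate and a one per cent false-positive rate will produce mostly false alarms when genuine targets are rare in the scanned population.

```python
# Illustrative base-rate calculation. All rates here are hypothetical,
# NOT taken from the CPT or Big Brother Watch studies.

def match_precision(true_positive_rate, false_positive_rate, prevalence):
    """Fraction of flagged faces that are genuine matches (precision)."""
    hits = true_positive_rate * prevalence
    false_alarms = false_positive_rate * (1.0 - prevalence)
    return hits / (hits + false_alarms)

# Suppose 1 person in 10,000 scanned is actually on the watch list.
p = match_precision(true_positive_rate=0.99,
                    false_positive_rate=0.01,
                    prevalence=1 / 10_000)
print(f"{p:.1%} of flagged matches are real")  # prints "1.0% of flagged matches are real"
```

In other words, a seemingly excellent matcher still flags roughly 99 false alarms for every genuine hit at that prevalence.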
"The biggest question is why are agencies buying it?" said Jeff Bigham, associate professor at Carnegie Mellon's Human-Computer Interaction Institute, in an email to The Register. "It could be that they're simply buying into the hype around AI, like so many others."
That said, Bigham acknowledges that facial recognition software and related AI-oriented code can have some value to reduce the size of haystacks when looking for needles.
"For instance, if it has very few false negatives, then you've reduced your problem of finding thousands of potential suspects among millions of people to simply examining 20 photos to find the one that is a real match," he said. "Presumably, these technologies also work better if you have some idea about who you're looking for and where they might be."
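Bigham's haystack point can be made concrete. In the sketch below (the scores and threshold are invented, not output from any deployed system), a filter tuned for a near-zero false-negative rate keeps every genuine match while discarding the overwhelming bulk of the population, leaving a short list a human can examine.

```python
# Hypothetical demonstration of "haystack reduction": scores are made up
# and do not come from Rekognition or any real facial recognition system.
import random

random.seed(42)

# A million non-matching faces with low similarity scores in [0, 0.6)...
population = [random.random() * 0.6 for _ in range(1_000_000)]
# ...plus a few genuine matches that score high.
population += [0.90, 0.95, 0.97]

# A deliberately generous threshold: tolerate false positives in order
# to keep false negatives near zero, as Bigham describes.
THRESHOLD = 0.8
shortlist = [score for score in population if score >= THRESHOLD]

print(len(shortlist))  # prints 3 - a handful of photos to review, not a million
```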
Bigham said his concern is that if law enforcement agencies invest significantly in facial recognition technology, they may be inclined to trust it more than they should.
"We know that many of these technologies have biases, and often those biases match pretty well with the humans who are supposed to be the safeguards," he said.
"So, for instance, if the technology tends to produce more errors on black faces (false positives) and law enforcement also has a bias against black people (also false positives), which are both not only plausible but documented, then there's a very real chance this technology will amplify that bias."
In an email to The Register, Orlando Police Department Sergeant Eduardo Bernal stressed that the OPD's partnership with Amazon aims to test the viability of the technology.
"At this time in the pilot, as it is still very early on in this process, we have no data that supports or does not support that the Rekognition technology works," he said.
The City of Orlando, he said, has only provided facial imaging for a handful of OPD officers who volunteered to participate in the pilot test, which is limited to eight city-owned cameras.
"The Orlando Police Department is not using the technology in an investigative capacity or utilizing any images of members of the public for testing," he said. "All use of the testing and this pilot is being done and operated in accordance with current and applicable law."
Still needs the human touch
In a phone interview, Jeff Talbot, public affairs officer for the Washington County Sheriff's Office, which has also been testing Rekognition, said that while it's difficult to make a definitive statement about system accuracy, because every photo fed into the system is different, the technology has proven useful.
"Anecdotally, we have made a number of arrests based on this software in the last year," he said, citing various low-level offenses.
The Washington County Sheriff's Office uses the technology by feeding images – say, a video surveillance frame of a suspect who used a stolen credit card at a store – into the system, which compares them against the Washington County jail booking database.
Then if there's a possible match, a deputy is still required to independently verify that the lead is worth investigating, Talbot explained.
"We are not using it as mass surveillance and we are not using it as real-time surveillance," he said, noting that such uses are prohibited under Oregon state law and departmental policies.
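That flag-then-verify flow can be sketched roughly as follows. The matching function below is a local stand-in for a call such as Rekognition's `search_faces_by_image`; the booking records, similarity scores, and threshold are all invented for illustration and are not Washington County's actual code or data.

```python
# Simplified sketch of a flag-then-verify workflow. The matcher is a
# stand-in for an API call like Rekognition's search_faces_by_image;
# records, scores, and threshold are hypothetical.

BOOKING_DB = {
    "booking-001": 0.62,   # similarity of each booking photo to the probe
    "booking-002": 0.91,   # (invented scores, not real system output)
    "booking-003": 0.45,
}

def search_booking_photos(threshold=0.85):
    """Return booking IDs whose similarity clears the threshold."""
    return [bid for bid, score in BOOKING_DB.items() if score >= threshold]

def leads_for_deputy(matches):
    """Matches are only leads: a deputy must verify before acting on them."""
    return [{"booking_id": m, "status": "needs human verification"}
            for m in matches]

print(leads_for_deputy(search_booking_photos()))
```

The key design point, per Talbot, is that the software's output is never treated as an identification, only as a lead queued for independent human verification.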
Much of the appeal of Rekognition appears to be its cost. Talbot said the application using the Rekognition API was developed in-house at a cost of about $400. The AWS bill comes to something like $6 to $12 per month. Other options cost tens of thousands of dollars, he said.
Speaking by phone, Bigham said, "To construct good policy, we need to better understand how the technology works. I don't believe it's at the state where it should be tested in real use cases with law enforcement."
Bigham argues that facial recognition isn't a solved problem, which makes claims about accuracy very difficult. "The error rates are all over the map," he said.
"Since we don't know how the system performs in the real world under different conditions, it seems really odd to be marketing this as a tool that's ready for use by law enforcement," he said. ®