FBI recruits Amazon Rekognition AI to hunt down 'nudity, weapons, explosives'
Honestly, it sounds like a fun time
The FBI plans to use Amazon's controversial Rekognition cloud service "to extract information and insights from lawfully acquired images and videos," according to US Justice Department documents.
In its Agency Inventory of AI Use Cases, the DOJ lists the project, code-named Tyr, as being in the "initiation" phase for the FBI, which intends to customize and use the technology "to review and identify items containing nudity, weapons, explosives, and other identifying information."
The DOJ document doesn't mention a start date, and simply says the Feds will be using a Rekognition-based commercial off-the-shelf system purchased pre-built from a third party. The FBI declined to comment, and though Amazon promised The Register a statement in response to our inquiries, that has yet to arrive.
In addition to providing facial recognition and analysis services, Amazon says Rekognition can also search for objects in image and video libraries and detect "inappropriate, unwanted, or offensive content," among other capabilities.
Amazon previously pledged to indefinitely ban police from using Rekognition, though that pledge came with loopholes. It didn't pause selling the service to government agencies, nor to third parties that may then provide the technology to cop shops.
- Amazon Ring, Alexa accused of every nightmare IoT security fail you can imagine
- Four more months of Section 702 snooping slipped into $890B US defense budget bill
- Proposed US surveillance regime would enlist more businesses
So, to be fair, Project Tyr doesn't break any earlier promise by the cloud giant. It does, however, come at a time when concerns about warrantless surveillance seem to be growing, especially when the FBI is doing the snooping.
Earlier this week, Amazon said it would kill the easy button that allowed law enforcement to request Ring video footage without a warrant. Specifically, Amazon sunsetted the Request for Assistance feature in its Neighbors app, which allowed the plod to slurp Ring customers' video recordings. Now officers have to ask first.
The move was applauded by data privacy and civil liberties advocates.
"The ability for law enforcement to use the Neighbors app to mass-request footage from camera owners was always dangerous, and had a documented effect of exacerbating racial profiling," Fight for the Future Director Evan Greer told The Register in an earlier interview.
On the other hand, the news about the FBI using Rekognition prompted a very different response.
"I think it's important to look both at FBI and Amazon practices in this space," said Jake Laperruque, deputy director of the Center for Democracy and Technology's (CDT) Security and Surveillance Project.
"The FBI permits broad use of facial recognition in investigations (people don't even need to be designated suspects to be scanned), programs its systems to always return matches even if those matches are unreliable, and hides use of facial recognition from defendants," Laperruque told The Register.
Updated to add
Soon after this article was published, we heard back from Amazon. Spokesperson Duncan Neasham told us that as far as the web giant is concerned, the FBI's use of Rekognition doesn't break its moratorium on police use of the API's face-comparison features. He characterized the technology as "an image and video analysis service that has many non-facial analysis and comparison features."
"As we've said many times, and continue to believe strongly," Neasham continued, "companies and government organizations need to use existing and new technology responsibly and lawfully. We also believe that governments should put in place regulations to govern the ethical use of facial recognition technology, and we are ready to help them design appropriate rules, if requested."
On whether Amazon gave advice or directives to the Feds on how to safely and thoughtfully use Rekognition, the rep told us:
We provide guidance to all Rekognition customers, including law enforcement customers, on the proper and responsible use of Rekognition (such as in Rekognition’s developer guides and on our Responsible AI page), and we have a clear Acceptable Use Policy and Responsible AI Policy.
For example, the AWS Service Terms govern the use of our services, including Sections 50.8 (which applies to law enforcement use of Rekognition) and 50.9, which says, “Amazon has implemented a moratorium on use of Amazon Rekognition’s face comparison feature by police departments in connection with criminal investigations. This moratorium does not apply to use of Amazon Rekognition’s face comparison feature to help identify or locate missing persons.”