Axon, the largest supplier of body cameras to America's cops, will not add facial-recognition technology to its gear anytime soon, it announced Thursday.
Formerly known as Taser, Axon had asked its AI and Policing Technology Ethics Board – made up of engineers, social scientists, and lawyers – to mull over the impact of building face-scanning machine-learning systems into its products. After a year of consideration, the board recommended Axon avoid the tech. And so, astonishingly, it has.
“Face recognition technology is not currently reliable enough to ethically justify its use on body-worn cameras,” the panel of eggheads concluded in a report out this week. “At the least, face recognition technology should not be deployed until the technology performs with far greater accuracy and performs equally well across races, ethnicities, genders, and other identity groups.”
It’s well known that today's computer-vision software struggles to accurately identify people with darker skin, and it isn't terribly reliable when analyzing women, mainly due to biased training data sets and wonky image-processing algorithms. Today's algorithms and models are, therefore, bad news all round for women of color in particular.
The potential for misidentification by AI code sifting through police camera footage, whether in real time or offline, is thus a real threat, particularly for Black communities, which are already disproportionately targeted and harassed by law enforcement. No one wants to be wrongly fingered as a criminal by a computer program studying video footage, after all. Last month, San Francisco banned its local government departments, including its police department, from using facial recognition over concerns about the tech.
In the wake of all of this, Axon has decided to heed the advice of its board, and it will not add facial recognition capabilities to its equipment just yet.
“Current face matching technology raises serious ethical concerns. In addition, there are technological limitations to using this technology on body cameras. Consistent with the board's recommendation, Axon will not be commercializing face matching products on our body cameras at this time,” the biz said in a statement.
Axon defines “face matching” as software that automatically compares faces caught on camera against databases of wanted people and other persons of interest. The manufacturer will, however, deploy software that performs face detection, in which people's faces are highlighted with bounding boxes to help officers track their movements across a scene.
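The distinction Axon is drawing can be sketched in code. This is a conceptual illustration, not Axon's implementation: the function names, the dummy embedding vectors, the canned bounding box, and the similarity threshold are all hypothetical. Detection merely locates faces; matching goes further and searches a watchlist for an identity.

```python
# Conceptual sketch (not Axon's code) of face *detection* vs face *matching*.
# All names, vectors, and thresholds below are illustrative assumptions.
from math import sqrt

def detect_faces(frame):
    """Face detection: return bounding boxes (x, y, w, h) only.
    No identity is inferred -- this is the capability Axon says it will ship."""
    # A real detector (e.g. a Haar cascade or a neural network) would scan
    # the frame; here we return a canned box for illustration.
    return [(120, 80, 64, 64)]

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (dummy data here)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

def match_face(embedding, watchlist, threshold=0.9):
    """Face matching: search a database of known faces for the best match.
    This is the step Axon is declining to commercialize for now."""
    best_name, best_score = None, 0.0
    for name, ref in watchlist.items():
        score = cosine_similarity(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

watchlist = {"suspect_a": [0.9, 0.1, 0.2], "suspect_b": [0.1, 0.8, 0.5]}
print(detect_faces(None))                         # boxes only, no identities
print(match_face([0.88, 0.12, 0.21], watchlist))  # suspect_a
print(match_face([0.5, 0.5, 0.5], watchlist))     # None: below threshold
```

The ethics board's concern lives in `match_face`: a biased embedding model, or a threshold tuned on unrepresentative data, can return the wrong name with high apparent confidence, whereas `detect_faces` never asserts an identity at all.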
To be clear: the manufacturer has not outright banned face matching or facial recognition in perpetuity. It’s just not implementing it for now. Axon will continue to research this technology to reduce biases, improve accuracy, and track developments in academia and the private sector.
Although facial recognition is currently off the cards, Axon will implement software to perform things like gunshot detection “to ensure events involving firearm discharges are recorded.” If shots are detected, code can point officers to those particular moments in the footage.
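The bookmarking idea described above can be sketched simply: given a stream of per-second classifier confidences, flag the moments officers should jump to. The function name, scores, and threshold are hypothetical, assuming an audio classifier that scores each second of footage.

```python
# Hypothetical sketch of how detected gunshots could bookmark footage.
# Assumes an (unspecified) audio classifier has already scored each second.
def flag_gunshots(scores, threshold=0.8):
    """Return the second-offsets where gunshot confidence meets the threshold,
    so software can point officers to those moments in the recording."""
    return [t for t, score in enumerate(scores) if score >= threshold]

scores = [0.1, 0.05, 0.92, 0.88, 0.1, 0.3]  # dummy per-second confidences
print(flag_gunshots(scores))  # seconds 2 and 3 flagged for review
```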
“We thank the members of our board for the advice they have provided us, and we appreciate the time and effort that each member has dedicated to this effort,” the biz concluded. “Outside ethical advisory boards like this are a new concept among technology companies, and we are proud to embrace it and design an ethical roadmap that we hope other companies can emulate.” ®