Majority of Axon's AI ethics board resigns over CEO's taser drones

Flying weapons offered as solution to US school shootings

Nine members of non-lethal weapons-maker Axon's AI ethics board resigned Monday after the company's CEO announced plans to build drones equipped with tasers to prevent US school shootings. 

After an 18-year-old shot dead nineteen students and two teachers, and wounded several others, at an elementary school in Uvalde, Texas, Axon founder and CEO Rick Smith began thinking about how he could help stop mass shootings. His best idea: deploying taser-equipped drones in classrooms and public venues.

Axon develops body cameras and non-lethal weapons for law enforcement. Smith thought he could combine both capabilities in a drone that, in theory, could immobilize shooters, and last week he announced Axon had formally begun developing such a system.

"The only viable response to a mass shooter is another person with a gun," Smith said in a statement. "In the aftermath of these events, we get stuck in fruitless debates. We need new and better solutions. For this reason, we have elected to publicly engage communities and stakeholders, and develop a remotely operated, non-lethal drone system that we believe will be a more effective, immediate, humane, and ethical option to protect innocent people."

But not everyone at Axon supported Smith's vision. The company's AI ethics board – a group made up of legal and technical experts from industry and academia – voted against development of taser-equipped drones and did not approve Smith's announcement.

"Only a few weeks ago, a majority of this board – by an 8-4 vote – recommended that Axon not proceed with a narrow pilot study aimed at vetting the company's concept of Taser-equipped drones," nine members, who left the board, declared in a statement. "In that limited conception, the Taser-equipped drone was to be used only in situations in which it might avoid a police officer using a firearm, thereby potentially saving a life."

"We understood the company might proceed despite our recommendation not to, and so we were firm about the sorts of controls that would be needed to conduct a responsible pilot should the company proceed. We just were beginning to produce a public report on Axon's proposal and our deliberations."

Smith, however, decided to override the ethics board and continue working on what he believed could be a viable solution to school shootings. He launched a live Q&A session on Reddit, inviting netizens to post their questions about Axon's idea.

"I know drones in schools can sound nuts. Except that it can't be any crazier than another mass shooting in a school. Obviously, there are complicated issues at stake, but the best place to start is to listen to what people think and answer questions," he said

Days later, however, he announced the project had been paused. "Our announcement was intended to initiate a conversation on this as a potential solution, and it did lead to considerable public discussion that has provided us with a deeper appreciation of the complex and important considerations relating to this matter," Smith said.

"I acknowledge that our passion for finding new solutions to stop mass shootings led us to move quickly to share our ideas. However, in light of feedback, we are pausing work on this project and refocusing to further engage with key constituencies to fully explore the best path forward."

Although Smith said he'd pause work on the idea, the majority of Axon's AI ethics board resigned anyway, citing years-long efforts to thwart the company's "use of real-time, persistent surveillance in its products". The technology "will harm communities of color and others who are overpoliced, and likely well beyond that," they said, and "has no realistic chance of solving the mass shooting problem."

Axon declined to comment beyond Smith's latest statement.

"A remotely operated non-lethal taser-enabled drone in schools is an idea, not a product, and it's a long way off," Smith reiterated. "We have a lot of work and exploring to see if this technology is even viable and to understand if the public concerns can be adequately addressed before moving forward. Pursuing an extended research path is just one element of getting this right." ®
