Is computer vision the cure for school shootings? Likely not
Gun-detecting AI outfits want to help while root causes need tackling
Comment More than 250 mass shootings have occurred in the US so far this year, and AI advocates think they have the solution. Not gun control, but better tech, unsurprisingly.
Machine-learning biz Kogniz announced on Tuesday it was adding a ready-to-deploy gun detection model to its computer-vision platform. The system, we're told, can spot guns in security camera footage and respond by alerting those at risk, notifying police, locking down buildings, and performing other security tasks.
In addition to spotting firearms, Kogniz uses its other computer-vision modules to notice unusual behavior, such as children sprinting down hallways or someone climbing in through a window, which could indicate an active shooter.
If you're wondering about the code's false positive or error rate, Kogniz says it has a "multi-pass AI" and "a trained team of human verifiers" checking the results of its detection software. Either you welcome that extra level of confirmation, or see it as technology potentially falling back on humans right when the computers are needed most.
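That extra layer of human checking matters because of base rates: a detector that is almost never wrong per frame can still raise alarms constantly when it watches millions of frames a day. A minimal back-of-envelope sketch, using assumed numbers for illustration rather than any vendor's published figures:

```python
# Back-of-envelope arithmetic on false alarms from a gun-detection system.
# All figures below are assumptions for illustration, not Kogniz's numbers.

frames_per_day = 30 * 60 * 60 * 24   # one camera at 30 fps, running 24 hours
false_positive_rate = 1e-6           # assumed: one false alert per million frames
cameras = 100                        # assumed: a mid-sized campus deployment

false_alarms_per_day = frames_per_day * false_positive_rate * cameras
print(f"Expected false alarms per day: {false_alarms_per_day:.0f}")
# → Expected false alarms per day: 259
```

Even under these generous assumptions, that is hundreds of alerts a day for humans to triage, which is presumably why vendors pair the model with verification teams rather than piping raw detections straight to police.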
"[Our solution is] making it dramatically easier for companies, governmental agencies, schools, and hospitals to prepare for and then help reduce the harm done by an active shooter event," said Kogniz CEO Daniel Putterman.
Kogniz is not the first computer-vision company to get into the gun-recognition game – there is a considerable list of companies deploying similar technology and some, such as ZeroEyes, specialize in nothing but firearm detection.
"By spreading their attention across multiple offerings, developers are less able to provide the very best service in gun detection," ZeroEyes said in a blog post. ZeroEyes' technology has been deployed at schools in 14 states, including Oxford High School in metro Detroit, where a 15-year-old shooter killed four and injured seven last year.
Other vendors – such as Defendry, which has a security suite including a panic button app, audio gunshot sensors, first-responder drones and gun detection AI; and Omnilert – are in this depressing growth market. Additional companies in the AI gun detection field include Arcarith, Athena Securities, and Scylla.
Is there actually enough time?
In 2019, the police response to a mass shooting in Dayton, Ohio, took a mere 32 seconds, during which time nine people died. A 2021 shooting at an Indianapolis FedEx facility that killed nine people was likewise over before police could intervene, even though they arrived within minutes.
Both of those cases raise the question of whether AI gun detection can cut response times enough to save lives – particularly if officers are too scared to respond, or choose not to, as in the mass murder at Uvalde.
Several other AI-based mass shooting prevention methods have been proposed, such as smart guns that won't fire if they detect a human target. Others have proposed training AIs on data from past shootings, local gun purchases, and socio-economic data to find trends indicative of a planned shooting, as well as scanning social media for similar indicators.
There's also AI bias, a well-documented problem that even diverse datasets and balanced training can't seem to solve. Take a 2020 deployment of facial and gun recognition tech in a New York school district: emails between the school district and the company that deployed the system show concerns over it commonly misidentifying objects – mistaking broom handles for guns, for example.
Speaking to Utah publication Deseret News, ACLU senior policy analyst Jay Stanley said he was concerned that computer vision systems could lead to a world "where people avoid doing something as simple as skipping down a sidewalk for fear of setting off anomaly detectors and being questioned by the police."
One possible use of AI might have more promise, though: a 2018 study from Cincinnati Children's Hospital Medical Center found that AI analysis of therapy sessions agreed with the danger assessments of psychiatrists and counselors 91 percent of the time. Adding demographic and socioeconomic data further improved its accuracy in identifying young people at risk of committing a violent act.
With so many potential complications, is machine learning really ready to prevent mass murder?
Gun violence is the number one killer of children and teens in America, taking an average of 12 young lives and injuring 32 more each day. AI may help, but given its shortcomings, less technological approaches are perhaps also needed. Stronger, more widely available mental health care might be a good start.
Earlier this month, President Biden called for, among other things, an assault weapon ban, expanded background checks, and limits on magazine capacity, which he called "rational, commonsense measures." ®