Attorney Warns “False Positives” Could Create Risks for Suspects, Bystanders and Police in Transit Authority AI Gun Detection Program
The Southeastern Pennsylvania Transportation Authority, which oversees transit in the Philadelphia area, is piloting an artificial intelligence-based gun detection program to address a recent uptick in gun violence and ongoing police staffing shortages, said Charles Lawson, SEPTA’s acting transit police chief.
“It’s coming to a head,” Lawson said. “Moving forward, technology has to be part of the answer.”
Through the six-month pilot, Pennsylvania-based ZeroEyes will use SEPTA’s existing camera system and video analytics technology to spot any object in the shape of a gun.
According to the company, trained personnel monitor the system and verify any weapons detected to ensure accuracy.
After seeing a demonstration, Lawson said he was impressed with the system’s accuracy. “When it sends an alert, within seconds, a person has verified [the alert]. The process happens pretty seamlessly in seconds,” Lawson said.
The pilot “is the first step for us in wading into the pool of AI,” Lawson added.
In Pennsylvania, openly carrying a gun is legal, and Lawson said the new system “doesn’t change what we can do under the law. We need a reasonable suspicion to stop an individual. Just carrying a weapon in and of itself is not a violation of the law.”
JT Wilkins, ZeroEyes’ senior vice president of sales, said the pilot would use SEPTA’s “robust infrastructure” and trained analysts to verify each alert.
But Jake Wiener, an attorney for the Electronic Privacy Information Center in Washington, said threat detection systems “often work substantially worse in the field than in controlled testing.”
“The biggest concern with these systems is false positives, when the system wrongly identifies someone who isn’t actually holding a gun,” he said. A false positive in the case of weapon detection creates a “dangerous situation in which police believe that an innocent individual is a potential mass shooter. A false positive isn’t safe for anyone involved: the wrongfully identified suspect, bystanders or the police.”
He said the system might “provide marginally faster police response times to ongoing violence. But that limited benefit does not outweigh increased surveillance and heightened risks of wrongful identifications.”
Wiener also expressed concerns about giving a private company access to public surveillance cameras on mass transit. “Unless limited by contract and strict auditing, it raises the specter of location tracking, selling information to third parties and other forms of surveillance for profit.”
However, ZeroEyes’ Wilkins said that the system protects the public’s privacy because the company focuses “on the firearm and the firearm alone,” not on facial recognition. And while “no AI is perfect,” he said that ZeroEyes has done “extensive testing.”
John Hollywood, a senior operations researcher for the RAND Corporation who studies criminal justice, homeland security and information technology, said that rapid weapon detection could benefit mass transit.
“The basic technology to recognize weapons has been around for years,” Hollywood said. “As a pilot test, it’s worth experimenting with.”
Using an outside company to monitor the system does address the common problem “of not having enough people to watch cameras,” he said.
But that’s not enough. It’s important to have police officers stationed in the area who can respond quickly, Hollywood said. With this system, “we don’t know what the response procedure is. It could shave a few minutes off the response time, but we don’t know if that is enough.”