A relatively new startup, Truleo, claims to solve this problem with a platform that leverages AI to analyze body cam footage as it comes in. Truleo — which has raised $2.5 million in seed funding — converts the data into “actionable insights,” CEO and cofounder Anthony Tassone claims, using natural language processing and machine learning models to categorize incidents captured by the cameras.
The Seattle Police Department is one of the company’s early customers.
“Truleo analyzes the audio stream within body camera videos — we analyze the conversation between the officer and the public,” Tassone told VentureBeat via email. “We specifically highlight the ‘risky’ language the officer uses, which most often means surfacing directed profanity or using extremely rude language. However, we can also highlight officer shouting commands, so the command staff can evaluate the effectiveness of the subject compliance.”
Potentially flawed AI
Tassone says Truleo’s AI models were built by its data scientists and law enforcement experts to look for “de-escalation, auto-flagging of incidents, or early warning for volatile interactions” and generate searchable reports. The models can recognize whether a call involves drugs, theft, or a foot chase, and whether there’s profanity or shouting, he claims. Truleo quantifies the classifications as metrics, such as the percentage of “negative interactions” an officer has on a monthly basis and which police language is “effective.”
“Obviously, a call that ends in an arrest is going to be negative. But what if an officer has an overwhelming amount of negative interactions but a below-average number of arrests? Is he or she going through something in their personal lives? Perhaps something deeply personal such as a divorce or maybe the officer was shot at last week. Maybe they need some time off to cool down or to be coached by more seasoned officers. We want to help command staff be more proactive about identifying risky behavior and improving customer service tactics — before the officer loses their job or ends up on the news,” Tassone said.
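Truleo hasn’t published technical details of its models, but the workflow Tassone describes resembles standard supervised text classification. Below is a minimal sketch of that idea; the categories, toy transcripts, and “negative interaction” roll-up are illustrative assumptions, not Truleo’s actual taxonomy or data.

```python
# A minimal sketch of transcript classification along the lines Tassone
# describes. The categories, toy transcripts, and "negative interaction"
# metric below are illustrative assumptions, not Truleo's actual system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled snippets standing in for annotated body cam transcripts.
transcripts = [
    "suspect fled on foot toward the alley",
    "we recovered a bag of narcotics from the vehicle",
    "stop right there, get on the ground now",
    "thanks for your patience, have a safe night",
]
labels = ["foot_chase", "drugs", "shouted_command", "routine"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(transcripts, labels)

def negative_interaction_rate(predicted, negative=frozenset({"shouted_command"})):
    """Roll per-call labels up into a monthly metric like the one described."""
    return sum(label in negative for label in predicted) / len(predicted)

print(model.predict(["he took off running down the street"]))
```

In practice such models would be trained on large volumes of annotated audio, and the decision about what counts as “negative” is exactly the kind of judgment the platform’s critics question.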
But some experts are concerned about the platform’s potential for misuse, especially in the surveillance domain. “[Body cam] footage doesn’t just contain the attitude of the officer; it also contains all comments by the person they were interacting with, even when no crime was involved, and potentially conversations nearby,” University of Washington AI researcher Os Keyes told VentureBeat via email. “This is precisely the kind of thing that people were worried about when they warned about the implications of body cameras: police officers as moving surveillance cameras.”
Keyes also pointed out that natural language processing and sentiment analysis are far from perfect sciences. Outside of research prototypes, AI systems struggle to recognize sarcasm, particularly systems trained on text data alone. Natural language processing models can also exhibit prejudices along racial, ethnic, and gender lines, for example associating “Black-aligned English” with higher toxicity or with negative emotions like anger, fear, and sadness.
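One common way researchers surface this kind of bias is to score minimally different sentence pairs and compare the model’s outputs. The sketch below does this with an off-the-shelf sentiment model via Hugging Face’s transformers library; the default model and the sentence pairs are assumptions chosen for illustration, not the materials used in the studies Keyes refers to.

```python
# Probing a sentiment model for dialect bias by scoring paired sentences
# that differ mainly in dialect. Model and pairs are illustrative only.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # off-the-shelf English model

pairs = [
    ("I am not doing anything wrong.", "I ain't doin' nothin' wrong."),
    ("That officer keeps bothering me.", "That officer stay botherin' me."),
]

for standard, dialect in pairs:
    a, b = sentiment(standard)[0], sentiment(dialect)[0]
    print(f"{standard!r}: {a['label']} ({a['score']:.2f})")
    print(f"{dialect!r}: {b['label']} ({b['score']:.2f})")
```

Systematic gaps between the paired scores would be evidence of exactly the dialect sensitivity the research describes.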
Speech recognition systems of the kind Truleo uses can also be discriminatory. In a study commissioned by the Washington Post, popular smart speakers made by Google and Amazon were 30% less likely to understand non-American accents than those of native-born users. More recently, the Algorithmic Justice League’s Voice Erasure project found that speech recognition systems from Apple, Amazon, Google, IBM, and Microsoft collectively achieve word error rates of 35% for African American voices versus 19% for white voices.
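The Algorithmic Justice League figures are word error rates (WER), the standard speech recognition metric: the word-level edit distance between a reference transcript and the system’s output, divided by the number of words in the reference. A minimal implementation:

```python
# Word error rate: Levenshtein distance over words (substitutions,
# insertions, deletions) divided by the reference length. This is the
# standard definition, not anything specific to Truleo or the studies.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over word sequences.
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i
    for j in range(len(hyp) + 1):
        dist[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(
                dist[i - 1][j] + 1,         # deletion
                dist[i][j - 1] + 1,         # insertion
                dist[i - 1][j - 1] + cost,  # substitution
            )
    return dist[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("stop right there", "stop rite there"))  # 0.33
```

A 35% WER means roughly one word in three is transcribed wrong, a level at which downstream classification of “risky” language rests on a badly distorted record.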
“If it works, it’s dangerous. If it doesn’t work — which is far more likely — the very mechanism through which it is being developed and deployed is itself a reason to mistrust it, and the people using it,” Keyes said.
According to Tassone, Truleo consulted with officials on police accountability boards to define which interactions its models should flag when generating reports. To preserve privacy, the platform converts footage to an MP3 audio stream “in memory” during upload, analyzes it in AWS GovCloud, and then deletes the stream, writing nothing to disk.
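Truleo hasn’t detailed the pipeline beyond that description, but “in memory, nothing written to disk” processing might look something like the sketch below, in which transcribe_audio and classify_events are hypothetical stand-ins for whatever services actually run in AWS GovCloud.

```python
# A sketch of in-memory audio handling as Tassone describes it. Truleo's
# actual pipeline is not public; both helper functions are hypothetical.
import io

def transcribe_audio(buffer: io.BytesIO) -> str:
    """Hypothetical stand-in for the speech-to-text step run in GovCloud."""
    return "<transcript>"

def classify_events(transcript: str) -> list[str]:
    """Hypothetical stand-in for Truleo-style event classification."""
    return ["routine"]

def analyze_body_cam_audio(mp3_bytes: bytes) -> dict:
    buffer = io.BytesIO(mp3_bytes)        # audio held only in RAM, never on disk
    transcript = transcribe_audio(buffer)
    report = {"transcript": transcript, "events": classify_events(transcript)}
    buffer.close()                        # discard the stream after analysis
    return report                         # only the derived report is retained
```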
“Truleo’s position is that this data 100% belongs to the police department,” Tassone added. “We aim to accurately transcribe about 90% of the audio file correctly … More importantly, we classify the event inside the audio correctly over 99% of the time … When customers look at their transcripts, if anything is incorrect, they can make those changes in our editor and submit them back to Truleo, which automatically trains new models with these error corrections.”
When contacted for comment, Axon, one of the world’s largest producers of police body cameras, declined to comment on Truleo’s product but said: “Axon is always exploring technologies that have [the] potential for protecting lives and improving efficiency for our public safety customers. We gear towards developing responsible and ethical solutions that are reliable, secure, and privacy-preserving.”
In a recent piece for Security Info Watch, Anthony Treviño, the former assistant chief of police for San Antonio, Texas, and a Truleo advisor, argued that AI-powered body cam analytics platforms could be used as a teaching tool for law enforcement. “For example, if an agency learns through body camera audio analytics that a certain officer has a strong ability to de-escalate or control deadly force during volatile situations, the agency can use that individual as a resource to improve training across the entire force,” he wrote.
Given AI’s flaws and studies showing that body cams don’t reduce police misconduct on their own, however, Treviño’s argument would appear to lack merit. “Interestingly, although their website includes a lot of statistics about time and cost savings, it doesn’t actually comment on whether it changes the outcomes in any way,” Mike Cook, an AI researcher at Queen Mary University of London, told VentureBeat via email. “Truleo claims they provide ‘human accuracy at scale’ — but if we already doubt the existing accuracy provided by the humans involved, what good is it to replicate it at scale? What good is a 50% reduction in litigation time if it leads to the same amount of unjust, racist, or wrongful police actions? A faster-working unfair system is still unfair.”