How is AI being used by the police? Officials are raving about artificial intelligence that will create police reports from bodycam audio data

OKLAHOMA CITY – A body camera recorded every word and bark as Police Sgt. Matt Gilmore and his sniffer dog, Gunner, searched for a group of suspects for nearly an hour.

Normally, the Oklahoma City police sergeant would take his laptop and spend another 30 to 45 minutes writing a report on the search, but this time he had an artificial intelligence write the first draft.

Based on all the sounds and radio messages recorded by the microphone connected to Gilmore’s body camera, the AI tool created a report within eight seconds.

“It was a better report than I could have ever written, and it was 100 percent accurate. It was more fluid,” Gilmore said. It even documented a fact he didn’t remember: another officer’s mention of the color of the car the suspects fled from.

Oklahoma City police are one of a few experimenting with AI chatbots to create early drafts of incident reports. Officers who have tried it out are enthusiastic about the time-saving technology, while some prosecutors, police watchdogs and legal scholars are concerned that it could alter a fundamental document of the criminal justice system that plays a role in who is prosecuted or incarcerated.

The program is based on the same technology as ChatGPT and is distributed by Axon, the company best known for developing the Taser and as a leading US provider of body cameras. According to Gilmore, it could become another “game changer” for policing.

“They become police officers because they want to do police work, and spending half the day doing data entry is just a boring part of the job that they hate,” said Rick Smith, founder and CEO of Axon, describing the new AI product – called Draft One – as the product that has generated “the most positive response” of all the products the company has launched to date.

“There are certainly concerns,” Smith added. Prosecutors pursuing criminal cases in particular want to be sure that police officers – and not just an AI chatbot – are responsible for producing their reports, as they may have to testify in court about their observations.

“You never want to call an officer to the stand who says, ‘The AI wrote that, not me,’” Smith said.

AI technology is nothing new to police departments. They have introduced algorithmic tools to read license plates, recognize suspects’ faces, detect gunfire, and predict where crimes might occur. Many of these applications come with privacy and civil rights concerns, and there are attempts by lawmakers to put safeguards in place. But the introduction of AI-generated police reports is so new that there are little to no guardrails for their use.

Concerns that racial prejudice and bias in society could be baked into AI technology are just part of what Oklahoma City community activist Aurelius Francisco finds “deeply disturbing” about the new tool, he told The Associated Press. Francisco prefers to spell his name in lowercase as a way of rejecting professionalism.

“The fact that the technology is being used by the same company that supplies police with Tasers is alarming enough,” said Francisco, co-founder of the Foundation for Liberating Minds in Oklahoma City.

He said automating these reports would make it easier for police “to harass, surveil and inflict violence on community members. While this makes the police’s job easier, it makes the lives of Black and brown people more difficult.”

Before the tool was rolled out in Oklahoma City, police showed it to local prosecutors, who advised caution before using it in high-stakes criminal cases. For now, it is used only for reports on minor incidents that do not lead to an arrest.

“So no arrests, no capital crimes, no violent crimes,” said Captain Jason Bussert of the Oklahoma City Police Department, who is in charge of information technology for the 1,170-officer police department.

The situation is different in Lafayette, Indiana, where Police Chief Scott Galloway told the AP that all of his officers can use Draft One on any type of case. The tool has been “incredibly popular” since the pilot program began earlier this year, he said.

In Fort Collins, Colorado, police Sgt. Robert Younger said officers can use the system for any type of report, though they have found it doesn’t work well on patrols in the downtown bar district because of the “overwhelming noise.”

In addition to using artificial intelligence to analyze and summarize the audio recording, Axon also experimented with computer vision to summarize what can be “seen” in the video footage. However, the company quickly realized that the technology was not yet mature.

“Given the many sensitive aspects of policing, the race and other identities of the people involved, I think this is an area where we still have some work to do before we roll this out,” said Smith, Axon’s CEO, describing some of the responses tested as not “overtly racist” but insensitive in other ways.

These experiments led Axon to focus entirely on audio for the product it unveiled in April during the company’s annual police officer conference.

The technology is based on the same generative AI model as ChatGPT, developed by San Francisco-based OpenAI, a close business partner of Microsoft, Axon’s cloud computing provider.

“We use the same underlying technology as ChatGPT, but have access to more knobs and dials than an actual ChatGPT user,” said Noah Spitzer-Williams, who manages Axon’s AI products. Turning down the “creativity dial” helps the model stick to the facts so it “doesn’t exaggerate or hallucinate in the same way that you would if you were just using ChatGPT,” he said.
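The “creativity dial” Spitzer-Williams describes corresponds to the sampling temperature that most large-language-model APIs expose. A minimal sketch of the idea, assuming a chat-completions-style API: the system prompt, model name, and settings here are illustrative assumptions, not Axon’s actual configuration.

```python
# Sketch: assembling a low-temperature request so the model sticks to
# facts in the transcript rather than embellishing. Illustrative only --
# prompt, model name, and parameters are assumptions, not Axon's config.

def build_report_request(transcript: str) -> dict:
    """Assemble a chat-completions payload for drafting an incident report."""
    return {
        "model": "gpt-4o",   # placeholder model name
        "temperature": 0.0,  # the "creativity dial" turned all the way down
        "messages": [
            {
                "role": "system",
                "content": (
                    "Draft a police incident report using only facts "
                    "stated in the transcript. Do not speculate."
                ),
            },
            {"role": "user", "content": transcript},
        ],
    }

request = build_report_request("Officer: Vehicle is a red sedan, plate ABC-123.")
print(request["temperature"])  # 0.0 -> near-deterministic, fact-bound output
```

A temperature near zero makes the model pick its most probable next word at each step, which reduces (but does not eliminate) the embellishment and hallucination Spitzer-Williams mentions.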

Axon is not disclosing how many police departments are already using the technology. The company is not alone, with startups like Policereports.ai and Truleo offering similar products. But given Axon’s close relationships with police departments that buy its Tasers and body cameras, experts and law enforcement officials expect AI-generated reports to become more common in the coming months and years.

But before that happens, legal scholar Andrew Ferguson would like to see more public discussion of the benefits and potential harms. The large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could slip convincing, hard-to-detect untruths into a police report.

“I fear that the automation and ease of use of the technology could lead to police officers being less careful in their writing,” said Ferguson, a law professor at American University who is working on what is expected to be the first law journal article on the emerging technology.

Ferguson said a police report is important to determine whether an officer’s suspicions “justify the deprivation of a person’s liberty.” Sometimes it’s the only statement a judge sees, especially in minor offenses.

Human-written police reports have flaws too, Ferguson said, and it remains an open question which is more reliable.

For some officers who have tried it, the tool is already changing how they respond to a reported crime. They narrate aloud what is happening so the camera better captures what they want to put on the record.

Bussert expects that as the technology spreads, officers will become “increasingly verbal” in describing what is in front of them.

After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a conversational-style report that included the date and time, just as if an officer had typed it up from his notes, all based on the body camera audio.

“It literally took seconds,” Gilmore said, “and it was so advanced that I thought, ‘I don’t need to change anything anymore.'”

At the end of the report, the officer must check a box indicating that the report was created using AI.

Copyright © 2024 The Associated Press. All rights reserved.
