- An increasing number of companies are stepping up to help police departments ease the burden of administrative tasks with AI tools.
- Axon, widely recognized for its Taser devices and body cameras, was among the first companies to introduce AI for the common police task of report writing; its tool is being tested in California, Colorado and Indiana.
- Police officers have been impressed by the results, with draft reports generated in as little as 10 seconds. Yet legal experts are raising concerns over accuracy, transparency, and potential bias — challenges that could significantly shape the future of AI both in policing and in the courtroom.
With law enforcement juggling crime reduction, budget pressures and the recruiting and retention of staff, technology companies are having some early success selling artificial intelligence tools to police departments, especially tools that ease the burden of administrative work.
Axon, widely recognized for its Taser devices and body cameras, was among the first companies to introduce AI specifically for the most common police task: report writing. Its tool, Draft One, generates police narratives directly from Axon's bodycam audio. Currently, the AI is being piloted by 75 officers across several police departments, including Fort Collins, Colorado; Lafayette, Indiana; and East Palo Alto, California.
Axon CEO Rick Smith said it is restricted to drafting reports for only minor incidents so agencies can get comfortable with the tool before expanding to more complex cases. Early feedback, he added, indicates that Draft One reduces report-writing time by more than 60%, potentially cutting the average time for report completion from 23 minutes to just 8 minutes.
"The hours saved comes out to about 45 hours per police officer per month," said Sergeant Robert Younger of the Fort Collins Police Department, an early adopter of the tool. "When I first tested it myself, I was absolutely floored, because the draft report was incredibly accurate. There weren't any suppositions or guesses about what somebody was thinking or feeling or looked like or anything like that. The information it provided in that draft report was a very well-written, balanced report, chronological in order, based on facts, with an intro and an outcome," he said, adding that the draft was produced in under 10 seconds.
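The savings Axon and Sergeant Younger cite can be cross-checked with simple arithmetic. A minimal sketch using only the figures quoted above (the monthly report count is inferred, not stated in the article):

```python
# Back-of-envelope check of the time-savings figures quoted in the article.
avg_before_min = 23   # average report time before Draft One, per Axon
avg_after_min = 8     # average report time with Draft One, per Axon

saved_per_report = avg_before_min - avg_after_min   # 15 minutes per report
reduction = saved_per_report / avg_before_min       # fraction of time saved

hours_saved_per_month = 45                          # per Sgt. Younger
# Implied report volume consistent with both figures:
reports_per_month = hours_saved_per_month * 60 / saved_per_report

print(f"reduction: {reduction:.0%}")                     # reduction: 65%
print(f"implied reports/month: {reports_per_month:.0f}")  # implied reports/month: 180
```

The roughly 65% reduction is consistent with Axon's "more than 60%" claim, and the 45-hour figure implies an officer drafting on the order of 180 reports a month.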
Lawyers are concerned about AI reports in court
Yet as AI gains traction in police work, legal experts are raising concerns over accuracy, transparency, and potential bias — challenges that could significantly shape the future of AI both in policing and in the courtroom. Much of the impact, however, depends on how heavily these tools are relied upon and the ways in which they are implemented.
"For all of the potential issues that AI technology creates in terms of admissibility of evidence, in terms of being completely transparent, in terms of trying to mitigate the biases that can be introduced into the system, I just don't know that it's worth it," said Utah State Senator Stephanie Pitcher, a defense attorney with Parker & McConkie.
Though Pitcher and other experts agree that AI in police reporting can offer benefits, it must be used with clear protocols and careful oversight to ensure accuracy.
"If the police officer is going to rely on artificial intelligence [to draft the report], that report should be reviewed," said New York trial attorney David Schwartz. "The police officer should have to sign off and attest that the facts are truthful to the best of that police officer's knowledge. So, if you have all that, it should be admissible. But it could create many, many problems for the police officer and the prosecution at trial [during cross-examination]."
Axon's Draft One has built-in safeguards that require officers to review and sign off on each report before submission, according to Smith. The system also includes controls, like placeholders for key information that officers must edit, ensuring that no critical details are missed. And beyond the officer's review, the report undergoes multiple levels of human oversight by supervisors, report clerks, and others to ensure it meets agency standards before it's finalized.
Even so, some members of law enforcement — like Keith Olsen, a retired New York detective and president and CEO of consulting firm KO Solutions & Strategies, which advises police associations — don't see the benefits of using AI for police reports.
"It seems to be trying to solve a problem that just doesn't exist," Olsen said. "It doesn't take that long to write a police report, and I think it's going to miss the officer's perspective, and the officer still has to add stuff, delete stuff. I don't think there's a saving-of-time claim. And if you get a clever defense attorney, I can see all kinds of problems with it."
Axon competitors Truleo and 365 Labs are positioning their AI tools as quality-focused aids for officers rather than time savers.
Truleo, which launched its AI technology for auto-generated narratives in July, captures real-time recorded voice notes from the officer in the field rather than relying on bodycam footage like Axon. "We believe dictation and conversational AI is the fastest, most ethical, responsible way to generate police reports. Not just converting a body camera video to a report. That's just nonsense. Studies show it doesn't save officers any time," said Truleo CEO Anthony Tassone.
365 Labs, meanwhile, uses AI primarily for grammar and error correction, with CEO Mohit Vij noting that human judgment remains essential for reports involving complex interactions. "If it's burglary or assault, these are serious matters," said Vij. "It takes time to write police reports, and some who join the police force are there because they want to serve the communities and writing is not their strength. So, we focus on the formulation of sentences and grammar."
Accuracy in criminal investigations
Cassandra Burke Robertson, director of the Center for Professional Ethics at Case Western Reserve University School of Law, has reservations about AI in police reporting, especially when it comes to accuracy.
"Generative AI programs are essentially predictive text tools. They can generate plausible text quickly, but the most plausible explanation is often not the correct explanation, especially in criminal investigations," she said, highlighting the need for transparency in AI-generated reports.
Still, she said, "I don't think the genie is going back into the bottle. AI tools are useful and will be part of life going forward, but I would want more than just a simple reassurance that the reports are fully vetted and checked."
In the courtroom, AI-generated police reports could introduce additional complications, especially when they rely solely on video footage rather than officer dictation. Schwartz believes that while AI reports could be admissible, they open the door for intense cross-examination. "If there's any discrepancy between what the officer recalls and what the AI report shows, it's an opportunity for the defense to question the report's reliability," he said.
This potential for inconsistency could create a perception of laziness or lack of diligence if officers rely too heavily on AI and don't conduct thorough reviews.
New Jersey-based lawyer Adam Rosenblum said "hallucinations" — instances when AI generates inaccurate or false information — that could distort context are another issue. "Courts might need new standards to require detailed, transparent documentation of the AI's decision-making process before allowing the reports into evidence," he said. Such measures, he added, could help safeguard due process rights in cases where AI-generated reports come into play.
Axon and Truleo both confirmed their auto-generated reports include a disclaimer.
"I think it's probably a uniform opinion from many attorneys that if we're overcomplicating something or introducing potential challenges to admissibility, it's just not worth it," said Pitcher.
But Sergeant Younger at Fort Collins remains optimistic: "The thing that's crucial to understand with anything that involves AI is that it's a process," he said. "I've had officers tell me this makes the difference between deciding whether or not to continue in law enforcement, because the one thing they were not counting on when they became a cop was the incredibly huge amounts of administrative functions, and that's not necessarily what they signed up for."