
Police are using AI software to write police reports, but does it hold up in court?

Law enforcement agencies, often early adopters of new technology, are once again embracing artificial intelligence. Having already integrated AI-powered audio transcription tools, departments are now experimenting with a more ambitious application: software that drafts police reports.

Image: A futuristic red-and-blue, police-style glowing AI controlling a holographic touch screen with documents, inside a futuristic detective’s lab. (Grok2 AI)

Draft One is a generative AI tool from Axon, the company best known for its Tasers and body cameras, designed to speed up the laborious work of report writing. By running body camera audio through Microsoft’s Azure OpenAI platform, Draft One can produce draft narratives in minutes, which Axon says trims 30 to 45 minutes from an officer’s workday. According to The Associated Press, the software promises to cut paperwork by up to an hour per day, freeing officers to concentrate on mental health response and community involvement.
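Axon has not published Draft One’s internals, but the general pattern described in its marketing (transcribe the body camera audio, then prompt a model hosted on Azure OpenAI to draft a narrative) can be sketched in a few lines. The Python example below is a hypothetical illustration, not Axon’s code; the deployment names, environment variables, and prompt are assumptions.

```python
# Hypothetical sketch of a transcribe-then-draft pipeline on Azure OpenAI.
# NOT Axon's implementation; endpoint, deployment names, and prompt are assumptions.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

def draft_report(audio_path: str) -> str:
    # Step 1: transcribe the body-camera audio with a speech-to-text (Whisper) deployment.
    with open(audio_path, "rb") as audio_file:
        transcript = client.audio.transcriptions.create(
            model="whisper",      # assumed Azure deployment name
            file=audio_file,
        )

    # Step 2: prompt a chat model to turn the transcript into a draft narrative.
    response = client.chat.completions.create(
        model="gpt-4o",           # assumed Azure deployment name
        messages=[
            {
                "role": "system",
                "content": (
                    "Draft a factual incident-report narrative using only details "
                    "present in the transcript. Note any gaps or uncertainties."
                ),
            },
            {"role": "user", "content": transcript.text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(draft_report("bodycam_audio.wav"))
```

Even in a sketch like this, the output is only a starting point: the officer would still need to review, correct, and sign off on the narrative before it enters the record.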


Although the technology may offer real advantages, questions have been raised about its reliability and potential biases. The large language model technology behind Draft One, the same kind that powers ChatGPT, has drawn criticism for its tendency to produce false or misleading output. Axon says it has carefully tuned the system to address these problems, but errors and hallucinations remain possible.

Image: A futuristic ChatGPT logo with glowing green circuitry. (Grok2 AI)

There are also legitimate concerns about gender and racial bias in AI-generated reports. Experts have repeatedly documented such biases in large language models, and using them in law enforcement could deepen existing inequities.


Departments are using Draft One in different ways: some agencies restrict it to minor incidents, while others give officers broader access. Experts contend, however, that relying on AI alone is not viable, because mistakes in police reports can have disastrous consequences.


“The large language models underpinning tools like ChatGPT are not designed to generate truth. Rather, they string together plausible sounding sentences based on prediction algorithms,” Lindsay Weinberg, a Purdue clinical associate professor who focuses on digital and technological ethics, told Popular Science.


Weinberg, who directs the Tech Justice Lab, argues that “almost every algorithmic tool you can think of has been shown time and again to reproduce and amplify existing forms of racial injustice.” Experts have documented many instances of gender and racial bias in large language models over the years.

Weinberg concluded that “the use of tools that make it ‘easier’ to generate police reports in the context of a legal system that currently supports and sanctions the mass incarceration of marginalized populations should be deeply concerning to those who care about privacy, civil rights, and justice.”

The growing use of AI in law enforcement demands a careful weighing of its advantages and drawbacks. The technology can boost productivity and streamline procedures, but it must be deployed carefully so that accuracy, accountability, and justice are not sacrificed.

Jeffrey Childers

Journalist, editor, cybersecurity and computer science expert, social media manager, roofing contractor.
