At the top of the list of concerns about AI that led to the self-imposed delay is the growing use by police agencies in Connecticut and elsewhere of aggressively marketed software that generates police reports from audio recordings collected by officer-worn body cameras.
Supporters of the technology, many in law enforcement, predict it will increase police efficiency — and, by extension, public safety — by allowing officers to spend more time on patrol and less time at desks writing reports that often form the basis for prosecutions.
Others express skepticism over applying developing, arguably flawed and still relatively untested AI technology to criminal justice, with its far-reaching societal consequences. They point to highly publicized AI failures, including one involving a Utah traffic stop during which a police body cam recorded the movie “The Princess and the Frog” playing in the background and generated a report stating, among other things, that an officer “turned into a frog.”
Chief State’s Attorney Patrick Griffin, with support from the Connecticut Police Chiefs Association and the State Police, has imposed a moratorium on the use of AI programs “to draft, author and/or narrate criminal reports” in order to allow users to test the software, identify flaws and establish rules for its use.
“There can be little doubt that this technology will lead to increased efficiencies in operations for our police departments, ultimately resulting in cost-saving benefits to our communities,” Griffin said. “Nonetheless, the use of AI must be implemented in a manner that promotes public confidence in our criminal justice system. It is vital that we fully understand both the benefits and the shortcomings of the use of AI in policing before adopting policies for its use and implementing training for officers on the subject.”
The moratorium was imposed at least in part in response to concern among the hundreds of defense lawyers in the state Division of Public Defender Services, many of whom have doubts about whether a computer program can accurately match the impressions of a police officer when portraying often hectic, confused and emotionally fraught crime scenes.
The public defenders proposed legislation earlier this year — it was made moot by the moratorium — that would have closely regulated the use of AI report-writing technology. Among other things, it would have required police to clearly label AI-generated reports and required officers responsible for the reports to review, sign and attest to the accuracy of every page.
The proposed legislation also would have required police agencies to retain all drafts of AI-generated reports, a procedure that would allow examiners to track the correction of computer-generated errors. In addition, there was language in the proposal that would have limited the ability of the software developers to sell computer-generated police reports from Connecticut.
Senior state law enforcement officers were, for the most part, hesitant to discuss AI technology, saying whatever plans they have are on hold pending the results of any inquiry undertaken during the moratorium.
Groton Police Chief Louis J. Fusaro, president of the Connecticut Police Chiefs Association, said he believes fewer than five Connecticut departments have “explored this use of AI for report writing.”
One of those is the New Haven Police Department, which said in written responses to questions that it bought the report-writing AI program called Draft One from Axon Enterprise, the Arizona-based industry leader in law enforcement software. New Haven police spokesman Christian Bruckhart said New Haven is testing the software on low-priority calls where there is no arrest.
“I have not personally used it but in generalities, Draft One works by using audio from an officer’s body camera to generate a written draft, which an officer can use as a template for a final report,” Bruckhart said. “We are not ready to commit to AI report writing yet. This tech is still very new for policing in [Connecticut] and there are some state-level meetings of chiefs, along with input from the state’s attorneys, which are upcoming to discuss this issue.”
Meriden police are also said to have purchased AI report-writing software, but the department did not respond to inquiries.
The State Police are the exception among law enforcement when it comes to willingness to discuss AI. The department has signed a 10-year, $120 million contract with Axon that gives it options to purchase an array of high-tech gear, from drones to non-lethal Taser weapons that activate body cameras when pulled from holsters.
Draft One AI report writing software amounts to a relatively small portion of the state police contract.
Capt. Ryan Maynard, who explored a variety of purchase options for the department, said the Axon AI report-writing software reduces or eliminates computer-generated inaccuracies by requiring or permitting officer input at multiple points in the report writing process. He said it generates reports from data limited to that collected by the body camera audio recorder, preventing the sorts of fictional “hallucinations” that sometimes appear in popular consumer AI programs with access to unlimited data.
The state Supreme Court earlier this month heard a case in which commercially available AI legal brief-writing software created an appeal brief containing “hallucinated,” nonexistent citations.
“That is the incredibly important part,” Maynard said of the Axon report software. “It is not going to fill in gaps. If it didn’t exist, it is not getting documented in a report.”
Before a completed report can be uploaded into the state police evidence system, which Axon also developed and the state police bought under its contract, an officer must attest to having reviewed it.
Axon is said by analysts to be the largest provider of body-worn cameras to police departments in the United States and has been experiencing 30 percent year over year growth. Company officials would not agree to an interview request.
As artificial intelligence moves into all aspects of life, there is a belief that its inclusion in the law enforcement toolkit is only a matter of time.
“I think there is some inevitability,” said Michael Lawlor, former co-chairman of the Legislature’s Judiciary Committee and now associate dean of the Henry Lee College of Criminal Justice and Forensic Sciences at the University of New Haven. “It is being used in every aspect of commerce now. And so it would be I think foolish to not see the extent to which it can be integrated into criminal justice … It’s good to be cautious. But I can’t imagine people saying it is completely unacceptable to have AI writing anything in a police report.”
Griffin, as the state’s top prosecutor and head of the state Division of Criminal Justice, may have the last word on adoption of AI-generated police reports. And he did not sound persuaded last week.
“I can tell you that my concerns are fairly concrete,” he said. “An officer’s police report is not simply based on what they hear. It is based on what they hear, see, smell. It can be based upon their fear, their interpretation, many things.”
Report-writing technology, Griffin said, simply picks up sounds and creates a report that “is rife with potential problems” of inaccuracy.
“The proponents of this type of technology will say this will create or encourage efficiency on behalf of police officers because the typing of a police report takes time and it takes police officers off the road and out of service.
“My response is that we are in the business of accuracy and not merely efficiency. And under present conditions, under present technology, we would have police officers essentially swearing under oath to what a computer believes it heard.
“So that is really my concern. I brought that to the chiefs of police association and they agreed with me,” he said.
©2026 Hartford Courant, Distributed by Tribune Content Agency, LLC.