Connecticut AI Task Force Highlights Need for Transparency

During a meeting of the task force this week, lawmakers highlighted the importance of notifying people when they are interacting with artificial intelligence. The group is likely to propose new legislation to that effect.

(TNS) — Before the General Assembly reconvenes next month, the state's Artificial Intelligence Task Force will likely recommend proposed legislation to enhance transparency and accountability in the emerging technology, plus new training programs and the criminalization of AI-generated porn.

State Sen. James Maroney, D-Milford, is co-chairman of the task force, which includes computer and public policy experts as well as state agency heads and has been meeting over the past year. He plans to draft legislation to build on the group's work, with the goal of creating jobs and providing security assurances as the technology explodes, generating a range of opportunities, both positive and negative.

During a meeting of the task force Tuesday afternoon, Maroney, who is also co-chairman of the legislative General Law Committee, said it's important for people to know when they are interacting with artificial intelligence, which plays a growing role in daily life. He said the upcoming legislative recommendations will include the criminalization of so-called deepfake and revenge porn.

The task force faces a Feb. 1 deadline to suggest legislation before disbanding, but Maroney wants the group to continue in an advisory role. He and other members said there should be a point person through whom the AI industry can interface with government. "How do we not hamper innovation?" Maroney asked the 21-member group.

Data privacy, computer literacy for schoolchildren and fostering jobs are key to addressing the growing technology, which in Connecticut is expected to help the health, education and insurance industries. Connecticut is one of 29 states that have started to regulate the technology, said George Mathanool, a task force member who believes national legislation is also needed this calendar year.

"We do not have moats around Connecticut," he said, stressing that many companies in the state, particular in the health care industry, outsource their AI work out-of-state. Likely, they would not be subject to state rules, he said. "That becomes a difficult issue to mediate or control or adjudicate on how to regulate AI for health care. It's a black box that nobody has been able to identify what's in there. You can't regulate something that you just don't know what to regulate on."

Maroney said teachers complain they have been overwhelmed since the pandemic and need professional development or training on AI, likely through the state Department of Education. "It didn't seem that the districts had developed local models," he said of municipal and regional school systems. Parents should also be briefed on responsible uses of AI, he added. "This is a dynamic, not a static topic," Maroney said. "Until people fully trust AI, we're not going to see full adoption."

The group will meet next week to agree on a list of items to vote on. Maroney said he expects to use AI to generate a final report to the legislature before the Feb. 1 deadline.

Colleen Murphy, executive director and general counsel for the state Freedom of Information Commission, stressed the need for openness under the state's 2023 law. "Generally, our idea, which we have of course waved the flag of transparency and we're just trying to instill public trust and confidence, particularly in government, but of course the policy and thought process applies in the public sector as well," Murphy said.

"AI has the ability to radically transform the way we live and work," Maroney said in a statement. "In order to reap the full benefits of this rapidly evolving technology, we need to ensure that there is trust and we put in guardrails to keep citizens safe. One of the risks we have seen is in disinformation and spreading fake content. In Connecticut, we need to make certain that it is illegal to use AI to generate non-consensual intimate images. We also need to protect the integrity of our elections and prevent the use of deep fakes for campaign."

Maroney said one of the problems for the public is that about 27 percent of the population lacks Internet connectivity, according to a Pew Research Center study. "The goal of this task force is to come up with a legislative framework," Maroney said. "We're not going to be developing language here. Once we get a framework, the legislators will take that and make that into language."

©2024 the New Haven Register. Distributed by Tribune Content Agency, LLC.