The bill, which would apply during the 90-day periods preceding elections or primaries, next heads to the House of Representatives. It was approved along party lines during a rare, swift 10-minute meeting of the committee to dispose of an eight-item agenda. The bill would allow for punishment of certain electioneering tactics, including the distribution of so-called synthetic media by individuals, committees, companies and other organized legal entities.
It would prohibit deep fakes, meaning AI-generated likenesses of people, including manufactured quotes and images of candidates running for office, in campaign advertising from opposing candidates or political action committees.
"The bill actually doesn't mention artificial intelligence in particular, but with the advent of artificial intelligence, deep fakes have gotten easier to make, in a much-more convincing fashion," said state Rep. Matt Blumenthal, D-Stamford, a member of the Judiciary Committee who is also the co-chairman of the Government Administration and Elections Committee, where the bill originated.
He said that similar laws protecting candidates have been approved in about 18 other states, including Georgia, Texas and New Hampshire, where deep fakes were found portraying candidates saying and doing things they never did.
"These sorts of election lies can easily go viral," Blumenthal said. "They can distort our elections, deceive voters and compromise our democracy. It can be created and spread by foreign adversaries among other entities." Criminal cases would have to prove there was intent to affect elections. "Proving state of mind is something prosecutors do all the time," Blumenthal said during an interview in the Legislative Office Building. The person depicted as well as the state attorney general, would also have legal recourse to seek civil awards.
Criminal penalties could range from a class C misdemeanor — punishable by up to three months in jail and a $500 fine — to a class D felony, with penalties of up to five years in prison and a $5,000 fine, depending on the circumstances.
Under the legislation, "deceptive synthetic media" is defined as any image, audio or video that a "reasonable person" would believe depicts a person's speech or conduct when the person did not actually engage in that speech or conduct.
During the recent public hearing process, Paul Amarone, senior policy director for the Connecticut Business and Industry Association, warned of unintended consequences.
"The definition of 'deceptive synthetic media' may capture a broad range of common editing practices, creating hesitation around legitimate uses of emerging technologies," Amarone testified during a recent public hearing. "Media organizations may decline to publish or broadcast certain political content altogether rather than assume the legal risk. Finally, by layering complex liability standards onto hosting platforms, the bill shifts enforcement burdens onto lawful businesses rather than narrowly targeting malicious actors. This will disproportionately affect local broadcasters, small publishers, startups, and mid-sized employers."
© 2026 The Middletown Press, Conn. Distributed by Tribune Content Agency, LLC.