The Connecticut House on Friday approved a sweeping artificial intelligence bill that Democratic Gov. Ned Lamont is expected to sign.
The bipartisan vote represents a breakthrough victory for Sen. James Maroney, a Democrat and national leader on AI regulation, who has tried for three years to set rules for private industry use of AI in Connecticut.
In years one and two, he introduced comprehensive regulations focused on combating algorithmic discrimination in high-risk settings such as employment, health care and housing. The Senate designated them priority bills and passed them, including a heavily amended measure last year, but, under threat of a veto, the House never took them up.
Read more: Q&A: Conn. Sen. James Maroney, on the challenge of passing AI regulation
Maroney was able to incorporate some housing-related protections into a 2025 update of Connecticut’s consumer data privacy law.
This year, Maroney narrowed his focus to protecting workers from AI discrimination, regulating companion chatbots and providing whistleblower protections. The bill also includes incentives for the AI industry along with provisions to help students and workers adapt to an AI economy.
“We are still early innings in this transformation, and this is a solid foundation to make sure that Connecticut residents are protected and trained to participate in the new AI economy,” Maroney told Pluribus News before the House vote.
Maroney said he worked with Lamont’s office on the legislation and incorporated elements of an AI bill that Lamont had requested.
“We were able to get to a place where they were comfortable with the bill,” Maroney said, adding that Lamont has indicated he is likely to sign it.
Lamont communications director Cathryn Vaulman said in a statement to Pluribus News that the governor’s office “worked closely with Senator Maroney on this legislation and welcomes its focus on empowering Connecticut residents to use AI responsibly.”
“Workers should be able to benefit from greater efficiency on the job without fearing discrimination or displacement by AI,” Vaulman said.
Lamont, a former business executive, previously expressed concern that states were creating a patchwork of AI laws and that prior iterations of the bill could hurt Connecticut’s competitiveness.
The bill, which the House passed 131-17 with three members not voting, runs 71 pages and carries the title of an online safety measure but focuses primarily on the use of AI.
On the regulation front, it requires disclosure to workers and job applicants when AI is a “substantial factor” in making an employment-related decision about them.
A section requiring employers to provide a “high-level statement” explaining the reason for an adverse decision was removed at the request of the governor’s office, according to Maroney.
The bill also makes clear that the use of automated decision-making systems does not exempt employers from antidiscrimination laws.
Another section of the bill would give employees at companies building the most powerful AI systems — known as frontier models — protections from discipline if they report catastrophic risks.
It would also regulate companion chatbots, which mimic human interactions. Chatbots would have to remind users they are not communicating with a human and refer users in mental distress to crisis services. Chatbot operators would not be allowed to let minors use their service if the chatbot is capable of encouraging self-harm or offering mental health services.
Companies engaging in mass layoffs would have to disclose whether the attrition is related to AI.
The bill would establish an independent verification pilot program to test the use of third-party entities to verify companies’ compliance with state privacy and AI laws, a concept that’s been introduced in several states this year.
Read more: California nonprofit pushes voluntary AI safety certification legislation
Language from a social media regulation bill that Lamont requested was also incorporated into the AI bill. One of those provisions would require surgeon general-style warning labels on platforms, something California, Minnesota and New York have also adopted.
Tech trade group NetChoice sued Minnesota this week over its warning label law, calling it “compelled speech” in violation of the First Amendment.
Other provisions in Connecticut’s bill are designed to encourage AI development.
The bill would empower state officials to develop a regulatory sandbox program, similar to Utah’s, which allows AI developers to test their systems without fear of enforcement. This is a priority for Lamont and was included in his requested AI legislation.
It would also create a Connecticut AI Academy and an AI working group, establish a study on AI’s impact on the state workforce, and require that public schools teach computer science.
“It is a start, it’s not an end point,” Maroney said in the interview. “There’s additional work I’d like to do, but I think this bill positions us well — particularly with the workforce development and training programs that we’re building in here — to make sure our residents are prepared to compete in the AI economy.”
The tech industry isn’t sold on the bill.
The Computer & Communications Industry Association praised the inclusion of a provision that requires the Connecticut AI Academy to develop a course on “durable skills” for the AI economy, but otherwise stood by the critique of the legislation it issued last month.
“This measure remains overly broad and would negatively impact a wide range of common digital tools, including customer service chat functions, productivity software, and safety technologies,” Megan Stokes, CCIA’s state director, said in a statement.
NetChoice wrote in a March letter opposing the bill that it would contribute to “an unsustainable patchwork of state laws.” The letter noted that President Donald Trump has vowed to preempt state AI regulations and establish a national standard.
“Against this backdrop, enacting SB 5 creates significant risk for Connecticut,” said the eight-page letter from Patrick Hedger, NetChoice’s director of policy.
“Prudence counsels waiting to see the contours of the federal framework before committing to a comprehensive state regime,” Hedger continued.
Maroney defended the bill as primarily focused on transparency and disclosure, and said many of the requirements are taken from existing law.
“I don’t think it makes Connecticut any more of a target than it already is, being seen as a blue state,” he said.