
March 18, 2024 Politics & Policy

CT’s AI bill could be model for U.S., but businesses wary of reporting requirements

HBJ FILE PHOTO The state Capitol building.

With artificial intelligence increasingly working itself into the fabric of society and commerce, state lawmakers are attempting to create an AI regulatory framework that has received mixed reactions from the business community.

Senate Bill 2, which was introduced and approved by the state legislature’s General Law Committee, would implement several AI-related regulations while also ramping up investment in artificial intelligence education and workforce training.

The bill is the result of work done by the Connecticut Artificial Intelligence Working Group, which was created by the legislature last year. General Law Committee co-chair Sen. James Maroney (D-Milford) has been a leader in helping shape the regulations.

Maroney said Connecticut can be a leader on AI as states continue to grapple with how to manage the technology. He said similar bills have been proposed in Vermont and Virginia, and Connecticut officials have had discussions with legislators from at least 30 states in the last year.

Several parts of the bill have unanimous support from legislators, the working group and business leaders, while others, mostly related to reporting requirements for businesses, are getting some pushback.

Senate Majority Leader Bob Duff (D-Norwalk) said the legislature isn’t trying to curtail the use of AI, which can offer many positives, such as disease identification technology in the medical and biomedical industries. But the state must be prepared for its use and any potential negative consequences, he said.

“We’re not trying to slow its growth or anything else. I think what we’re trying to do is acknowledge that it’s here and that it’s moving very quickly,” Duff said.

Reporting requirements

The 58-page bill establishes a new regulatory framework for AI technology use in the private sector.

For example, it requires companies to create impact assessments on AI technology they’re developing or using, with the attorney general and other state agency heads acting as enforcers.

In particular, the bill singles out developers and users of “high-risk AI” that, when used, makes, or is a controlling factor in making, a consequential decision on employment opportunities or financial, loan, healthcare, housing, insurance or legal services.

The goal, proponents say, is to ensure AI systems don’t lead to discrimination or disadvantages for people based on ethnicity, race, religion, age or other factors.

Wyatt Bosworth, a lobbyist for the Connecticut Business & Industry Association, said that unlike workforce development initiatives, these measures didn’t gain consensus from the AI working group.

“If the state wants to regulate this industry and wants to set up rules and parameters for deployment and development, it’s really critical that the business community has buy-in and is part of the development of those regulations,” Bosworth said.

Bosworth said “burdensome reporting and constant fear of litigation and penalties” could hinder the development of new technology. He said 20% of CBIA’s recently surveyed members acknowledged they were using AI in some form, and 35% expect to use it within the next five years.

He also noted key industries in the state, such as insurance, financial services and health care, already adhere to strict regulatory frameworks governing data protection and consumer rights, accountability and the ethical uses of AI.

For example, the Connecticut Insurance Department in February issued guidance for insurers’ use of AI, making Connecticut one of the first states in the nation to do so.

Workforce development

Still, Bosworth said CBIA supports several workforce development initiatives in the bill, calling them “critical investments” in getting Connecticut workers ready to use artificial intelligence.

One section of the bill requires the state Office of Workforce Strategy to incorporate AI into its training programs, and the Board of Regents, which oversees the state’s public and community colleges, to establish AI-related engineering, marketing and operations certificate programs targeting small businesses.

The Board of Regents would also be tasked with creating an “AI Academy” at Charter Oak State College to develop and offer online courses on responsible AI use.

The bill tasks the state Department of Economic and Community Development with developing a plan to offer high-performance computing services to businesses and researchers and establishing a confidential “computing cluster” for them. Bosworth said the computing cluster could act as an incubator for businesses and foster AI-related collaboration.

Further, DECD would conduct a “CT AI Symposium” that could get businesses, AI experts and other stakeholders involved in sharing their insights and best practices.

Duff said strengthening Connecticut’s workforce is a crucial part of the bill to ensure “no one gets left behind” as the technology continues to advance.

“Connecticut has a highly educated workforce, one of the most productive in the nation, and we know that there’s going to be some job losses because of AI, but there’s also going to be, if you net it out, more job gains,” Duff said. “We want to make sure that we have the workforce that’s ready for those job gains, and there’s retraining or other means that we can make sure that folks are prepared and ready.”
