April 23, 2024

CT AI bill would set standards, but Lamont remains wary

MARK PAZNIOKAS / CTMIRROR.ORG: Sen. James Maroney with, from left, Sens. Norm Needleman, Martin Looney and Bob Duff.

Senate Democrats said Monday they are narrowing a groundbreaking proposal to set standards for the development and use of artificial intelligence in Connecticut, an effort to address Gov. Ned Lamont’s skepticism over a small state playing an outsized role in shaping technology.

Backed by the top leaders of the Senate, the bill’s lead sponsor, Sen. James Maroney of Milford, pushed back forcefully against an industry trade group intent on killing the legislation, while acknowledging a need for revisions to a bill addressing everything from discrimination to the production of deep fakes.

“Throughout the process, we’ve been taking stakeholder feedback from both advocates, industry, academia, meeting with people regularly to update the bill,” Maroney said. “I do anticipate a new draft of the bill will be out later today or tomorrow.”

Senate Bill 2 would have Connecticut fill a regulatory void left by congressional inaction over the rapidly growing field of AI, including its role in the decision-making algorithms used in hiring, student admissions, lending and identifying targets in criminal investigations.

Among other things, the bill would set a deadline of July 1, 2025, for developers to take “reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination.” It also would penalize the use of AI to place real people in realistic, deep-fake videos portraying them doing and saying things they never did.

Taylor Swift was the recent victim of faked pornographic photos, and the technology also has been used to harass and humiliate non-public figures, including high school students, by placing their images in sexually explicit photographs and videos, Maroney said.

The press conference came four days after House Speaker Matt Ritter, D-Hartford, made clear that the Senate bill was unlikely to get called for a vote in the House unless Maroney got the Lamont administration on board. It was unclear if the promised revisions would meet Lamont’s concerns.

“We are supportive of efforts to increase disclosure of the use of AI in elections, and to close loopholes around intimate images,” said Julia Bergman, a Lamont spokeswoman. “However, the governor remains concerned that this is a fast moving space and that we need to make sure we do this right and don’t stymie innovation.”

Bergman noted that discrimination in housing, employment, banking and other areas already is illegal, whether committed by AI or other means, and the governor was hopeful that AI regulation would be addressed more broadly, either by federal regulators or a multi-state effort.

Doug Johnson, the vice president of emerging technology policy for the Consumer Technology Association, offered similar concerns in an interview Monday and in previous comments, insisting that Senate Bill 2 was an exercise in regulatory overreach on a topic best left to federal players.

“There is bipartisan, bicameral attention to this in Congress,” he said.

Four federal agencies, including the Federal Trade Commission and Department of Justice, issued a policy statement last year asserting they already have jurisdiction to pursue discriminatory policies involving AI, Johnson said.

Senate President Pro Tem Martin M. Looney, D-New Haven, said it is a state issue because of Congress’ inability to act, regardless of what attention is being paid to AI in Washington.

“Expecting Congress to do something would be like asking a huge stone formation to become animated, get up and dance,” Looney said. “That’s not exactly going to happen anytime soon.”

Looney said a stable regulatory framework is preferable for business development to the current free-for-all. Maroney said the bill has been endorsed by Microsoft and IBM, and that smaller Connecticut companies welcome the legislation.

“We like the concept of regulation. We like the concept of certainty. We like the concept of understanding where the guidelines go,” said Matthew Wallace, the CEO of VRSim, a training company. “I will tell you that this will foster growth, not inhibit it.”

Senate Majority Leader Bob Duff, D-Norwalk, noted that the press conference had drawn a significant audience, mainly lobbyists.

“There’s a lot of money to be made here. But our job really is to protect consumers,” Duff said. The legislators were trying to learn from the failure of state or federal government to protect the privacy of consumers when the internet was developed, he said. “Our job is not to repeat the sins of the past.”

Maroney acknowledged, however, that the bill as originally drafted was too ambitious in a section that would have made Connecticut a regulator of how every general-purpose AI tool was developed. It would have forced developers to establish policies on federal and state copyright laws, which currently are the subject of litigation, and detail the content used to train the general-purpose models.

“I think they’re right. I think that may be too soon,” Maroney said of critics of that portion of the bill, known as section 4. “That section was what was causing the most angst among a lot of large companies I’ve been speaking with.”

The section has been stricken from the bill, and other changes are in the offing.

Johnson said the deletion does not address the basic concern that AI regulation should be done nationally.

“We have 1,300 or so companies in our membership. Eighty percent of those companies are small- and medium-sized businesses,” Johnson said. “So when we’re looking for optimum national policy concerning AI, it has to be policy that is workable across the larger ecosystem of companies that are out there, from startups to the largest multinational companies.”
