Some Concerned About Connecticut’s Extensive AI Use

A proposed bill would establish an Office of Artificial Intelligence and create a task force to study the emerging technology and establish an AI bill of rights. If passed, the legislation would be among the first of its kind.

(TNS) — In the global AI race, Connecticut lawmakers are looking to rein in the government’s relationship with technology before it advances too far.

A new bill in the General Law Committee would establish an Office of Artificial Intelligence and create a government task force to study the emerging technology and develop an AI bill of rights.

The target is not the AI chatbots that evoke science fiction fears, but the automated systems that government agencies use to decide which residents obtain benefits, advance in the hiring process, or receive state intervention.

State Sen. James Maroney, the chair of the General Law Committee, explained that automated decision-making impacts us all, yet few citizens are even aware of the scope of its use by state agencies or its potential to unwittingly discriminate against minorities, the elderly or those in poverty.

“Everyone is impacted by algorithms and algorithmic discrimination,” Maroney said. “While we understand the tremendous power of AI to bring efficiencies and help make our life better, we want things tested to ensure that our residents aren’t being disproportionately impacted or discriminated against, when and if we are going to use these different automated decision-making systems.”

The bill, SB 1103, would allow for government oversight, mandate the inventory and testing of state-used algorithms, close existing data privacy loopholes and enumerate citizen protections through the AI Bill of Rights.

If passed, the legislation would be among the first of its kind in the U.S.

At a public hearing Tuesday, the bill was applauded by Suresh Venkatasubramanian, a Brown University professor who co-authored the Biden-Harris administration’s Blueprint for an AI Bill of Rights while serving as the assistant director for science and justice in the White House Office of Science and Technology Policy.

“In putting this bill forward, Connecticut will be among the leaders in the nation and provide concrete guidance for the governance of automated systems,” Venkatasubramanian said. “It’s a very comprehensive approach to governing the technologies that are impacting our civil rights, our opportunities for advancement and access to critical services.”

Maroney said that he isn’t concerned about being first — he wants Connecticut to be adaptive and proactive, tackling technologies as they emerge instead of reacting when problems arise.

“It’s important that we get ahead because this is going to be impacting every facet of our lives as we move forward,” Maroney said. “Before we get to that point, we need to make sure that we have some guardrails in place. We don’t want to hamper innovation, but we also want to make sure we don’t unnecessarily cause any harms.”

The bill would ensure that the technology used by state agencies adheres to state and federal privacy and discrimination laws.

Jeffrey Daniels, the co-chair of the Legislative Committee of the Connecticut Council on Freedom of Information, explained that machine-based data systems currently organize information with little to no human oversight.

“In Connecticut, agencies today are using algorithms and doing so mostly in secret. We know little about how and when they’re used, how they are designed, whether they are fair and accurate,” Daniels said.

SB 1103 would bring these largely unknown processes to light — a piece that Daniels said is crucial.

“Access to information is critical to fairness and democracy,” he said. “The public has a right to know how the government makes this decision, and this new age of automated decision making makes stakes higher and the complexity greater.”

Tech Companies’ Objections

Tech companies are not thrilled with the proposal. They fear that too much transparency will expose trade secrets and code to competitors.

“The bill proposes a number of things that could severely restrict the ability of companies to offer services to state government agencies. First, the bill requires the Office of AI to produce detailed reports including how data was collected, generated, or processed. It’s unclear how such a report could be generated without compromising proprietary information,” said Christopher Gilrein, the executive director of the northeast branch of TechNet, a national network of “technology CEOs and senior executives.”

Gilrein added that the bill’s language could produce unintended restrictions due to the dynamic nature of the rapidly emerging AI field.

“It’s difficult to provide actionable intelligence right now. This technology is advancing faster than we have the policy language to describe it,” Gilrein said. “The scope of the bill could conceivably cover any decision that is aided in any way by technology. These artificial intelligence, automated decision-making systems, these are all tools that exist on a spectrum.”

But watchdog researchers of the Media Freedom and Information Access Clinic at Yale Law School say that the current transparency requirements are woefully inadequate.

Danny Haidar, a law student at the clinic, said the information researchers uncovered while studying the state’s AI was “disturbing.”

“The use of algorithms has spread throughout Connecticut’s government rapidly and largely unchecked,” Haidar said. “Algorithms are now used to make decisions on many consequential matters, including assigning students to magnet schools, allocating police resources, setting bail, and distributing welfare benefits.”

The 2022 report found the Connecticut Department of Children and Families had used a risk assessment algorithm to issue recommendations for child welfare action and prevent life-threatening episodes. It also determined that the State Department of Education paid more than $650,000 for an algorithm used to aid in the placement of Hartford students in suburban Open Choice schools.

Researchers also sought information on the Department of Administrative Services’ algorithm that sorts hundreds of job applications for state employee and contractor positions, but their FOIA request was blocked.

Haidar said that at the current level of supervision, the concern over state use of AI is fourfold.

“Algorithms can make mistakes that go unchecked without proper government oversight. Second, algorithms can amplify preexisting biases when they make decisions based on biased data. Third, government agencies lack the ability to investigate their own algorithms, which can lead to unforced errors and wasteful spending. And fourth, there is too little transparency about the algorithms used by Connecticut’s government, and that means that the people of Connecticut can’t hold their government accountable for its use of algorithms,” Haidar said.

Privacy Concerns

The other main concern is data privacy.

Claudio Antonini, a nuclear, aerospace, and financial engineer, explained to lawmakers Tuesday that certain AI models can be reverse-engineered to expose the massive data banks used to build them.

“One can extract emails, names, phone numbers, or copyrighted material as text or medical images or pictures of individuals as images just querying the models,” Antonini said.

Antonini also cautioned that the world may encounter autonomous AI that can think, act and develop systems within the next three to 15 years.

House Minority Leader Rep. Vincent Candelora said that he would like to see data protections encompass all government vendors through SB 1103.

“There’s more and more data that private individuals are handing over to government that we never envisioned before,” Candelora said. “Because we now have these interfaces, it’s giving our agencies the opportunity to sort of grab information that regulation or statute doesn’t necessarily require. And we need to make sure that data is protected.”

©2023 Hartford Courant. Distributed by Tribune Content Agency, LLC.