A policy working group has issued recommendations for modifying Colorado's pioneering artificial intelligence law. The proposals, which focus on transparency and liability, must now be drafted into a legislative bill and passed this session in order to resolve earlier disagreements and strike a balance between consumer protection and AI innovation.
Colorado enacted a groundbreaking artificial intelligence law in 2024, designed to regulate the use of AI systems in "consequential" decision-making. However, its implementation has been significantly delayed: the law was initially due to take effect last month but is now postponed until June 2026. The delay stemmed from ongoing disagreements over how to craft a framework that simultaneously protects consumers, fosters innovation, and remains practical for businesses to implement. Governor Jared Polis formed the working group specifically to address these complexities and propose viable modifications to the existing legislation.
A core aspect of the AI Policy Working Group's recommendations focuses on enhancing transparency around AI systems. The proposals would require AI developers to furnish their clients, referred to as "deployers," with comprehensive descriptions of the AI system, including its intended applications, the types of data used to train it, its known limitations, and instructions for monitoring it appropriately. Deployers, in turn, would be required to clearly and conspicuously notify consumers, in easily understandable language, about the role AI played in any significant decision affecting them. The goal is to ensure individuals know when and how AI impacts their lives.
The issue of liability for adverse outcomes caused by AI systems was a particularly complex challenge for the working group. Its recommendations propose a shared liability model in which responsibility would be apportioned between AI developers and deployers, with each party's share of liability depending on its contribution to the fault that led to the adverse outcome. This nuanced approach acknowledges that both the creators and the implementers of AI systems bear responsibility for ensuring ethical and safe usage, rather than assigning blame to a single party.
To further bolster consumer protection and regulatory oversight, the working group also recommended an expanded role for the Attorney General's office. Under the proposed changes, the Attorney General would be responsible for establishing specific rules regarding the disclosures that deployers must provide to consumers. These disclosures would be mandatory following any adverse outcome where an AI system was involved in a consequential decision. This measure is intended to ensure that consumers receive timely and clear information about negative impacts, potentially facilitating avenues for recourse or corrective action.
The recommendations are not yet law; they must first be drafted into a formal bill and pass the legislature within the current session. Governor Jared Polis expressed strong appreciation for the group's unanimous agreement, calling it a critical step forward for Colorado's AI policy. The Colorado Technology Association (CTA), a key participant in the discussions, also voiced support for the framework, praising its "targeted revisions" and expressing anticipation for legislation that balances consumer protection with innovation. Democratic State Representative Brianna Titone, an original sponsor of the 2024 law, called the recommendations a solid foundation but cautioned that the bill might change further during legislative debate. Similarly, Senate Majority Leader Robert Rodriguez stressed that the law must give consumers transparency into AI decisions and effective mechanisms for correction, emphasizing the need for robust enforcement.