What Is the Colorado AI Act?

This article is part of a seven-part series on the Colorado AI Act (SB 24-205). Over the coming weeks, I’ll walk through what this law means, who it affects, and what steps Colorado businesses should take to prepare. If you run a professional services firm in South Denver, this series is written for you.




A conversation I keep having

A few weeks ago, I sat down with the managing partner of a law firm in Greenwood Village. Smart. Careful. Runs a tight practice.

We were talking about his IT environment and I asked a simple question.

“Do you know how many AI tools your team is using right now?”

He paused.

“Probably a couple,” he said.

We looked together. The answer was closer to a dozen. Some were embedded in platforms his firm already paid for. Some were tools individual staff had signed up for on their own. One was being used to help draft client communications.

None of it was documented. None of it had been reviewed. And some of it touched areas that a new Colorado law is about to regulate.

He’s not careless. He’s busy. And until recently, there wasn’t a reason to look that closely.

There is now.


What is the Colorado AI Act?

The Colorado AI Act (SB 24-205) is a consumer protection law focused on artificial intelligence.

Its purpose is straightforward: to prevent algorithmic discrimination, meaning AI systems that produce unfair or biased outcomes in decisions significantly affecting people's lives.

The law covers decisions in areas like:

  • Employment and hiring
  • Lending and credit
  • Housing
  • Insurance
  • Healthcare
  • Education
  • Legal services

If your business uses AI in any of those areas, even through a vendor-provided tool, this law likely applies to you.

And here is the part that catches most business owners off guard: the law does not just regulate companies that build AI. It also regulates companies that use AI. The law calls those companies “deployers.”

Most professional services firms in South Denver will fall into the deployer category.

When does it take effect?

The requirements are currently scheduled to take effect on June 30, 2026.

That date was already pushed back once. The original deadline was February 1, 2026, but a follow-up bill (SB25B-004) extended it because of the complexity involved.

June 2026 is not far away. And the preparation work takes time.


Why this matters if you run a business in South Denver

If you run a law firm, a wealth management practice, a consulting firm, or any professional services business in the Denver Tech Center, Centennial, Littleton, Lone Tree, or Highlands Ranch, you are almost certainly using AI in some form today.

It might be:

  • AI features built into your Microsoft 365 environment
  • A hiring or screening tool with AI under the hood
  • A CRM or client management platform that uses AI for scoring or recommendations
  • Generative AI tools your staff are using for research, drafting, or analysis

Even “light” use can create obligations under this law, if the AI is involved in what the statute calls a consequential decision.

And here is the question I keep coming back to with business owners:

If a regulator asked you today to show how AI is being used in your firm, and what controls are in place, could you answer clearly?

Most firms I talk to cannot. That’s not a criticism. It’s just the reality of how fast AI has moved into everyday business tools.


Colorado is not alone in this

This is not a Colorado-only trend. And it is not a partisan issue.

AI regulation is moving forward across the country and around the world, including in states led by both parties.

The EU AI Act

The EU AI Act is now the global benchmark for AI regulation. It uses a risk-based approach with strict requirements for high-risk AI systems and penalties that can reach up to 7% of global annual turnover. If your business touches European customers or data, this is already in play.

Other U.S. states

Several states have enacted their own AI-related laws:

  • Utah enacted an AI transparency law requiring certain disclosures when AI is used. It is narrower in scope than Colorado’s law, but it signals the direction.
  • Texas passed AI legislation addressing algorithmic transparency in certain contexts.
  • California, New York, and Illinois have introduced or enacted AI laws targeting specific areas like hiring algorithms, deepfakes, and AI-generated content.

Here is the key point: Colorado’s AI Act is currently one of the broadest and most comprehensive state-level AI laws in the United States. If you do business in Colorado, this is likely the most demanding AI compliance framework you will face at the state level right now.

And I would expect this landscape to keep expanding. More states are working on AI legislation as I write this.


Who does the Colorado AI Act apply to?

The law draws a clear line between two roles. Both carry obligations.

Developers

A developer is a company that builds an AI system, or intentionally and substantially modifies one.

If you created the AI tool or materially changed how it works, you are a developer under this law.

Deployers

A deployer is a company that uses a high-risk AI system.

This is where most South Denver businesses will land. You did not build the AI. But you are using it. And if that use touches consequential decisions about Colorado residents, you are a deployer with real obligations.

Examples:

  • Using an AI-powered platform to screen job applicants
  • Using AI-driven tools for tenant evaluation
  • Using AI-assisted underwriting or risk scoring
  • Using AI for client segmentation that influences credit or service offers

You can be both a developer and a deployer. And using a vendor’s tool does not automatically transfer your compliance responsibilities to the vendor.

I will say that again because it is important: using a vendor-built tool does not mean you are off the hook.


The small business carve-out: real, but narrow

One of the first things business owners ask me is: “Does this apply to us? We’re small.”

There is a carve-out for smaller businesses. But it is not as broad as many people assume.

The conditions

To qualify, you need to meet all of the following:

  1. You employ fewer than 50 full-time equivalent employees
  2. You do not use your own data to train the high-risk AI system
  3. You use the system only for its intended purpose as described by the developer
  4. You rely on the developer’s documentation

Miss any one of those conditions, and the carve-out may not apply.
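The all-or-nothing logic of those four conditions can be sketched in code. This is a minimal illustration of the checklist above, not legal advice; the field names are my own invention, not statutory terms.

```python
# Hypothetical sketch of the small-deployer carve-out test under SB 24-205,
# based on the four conditions summarized above. Illustrative only.
from dataclasses import dataclass

@dataclass
class DeployerProfile:
    full_time_equivalents: int       # condition 1: fewer than 50 FTEs
    trains_with_own_data: bool       # condition 2: you must NOT train with your own data
    used_only_as_intended: bool      # condition 3: only the developer's stated purpose
    relies_on_developer_docs: bool   # condition 4: you rely on developer documentation

def qualifies_for_carve_out(p: DeployerProfile) -> bool:
    """All four conditions must hold; missing any one disqualifies you."""
    return (
        p.full_time_equivalents < 50
        and not p.trains_with_own_data
        and p.used_only_as_intended
        and p.relies_on_developer_docs
    )

# Example: a 20-person firm that fine-tuned the tool on its own client data
firm = DeployerProfile(20, trains_with_own_data=True,
                       used_only_as_intended=True, relies_on_developer_docs=True)
print(qualifies_for_carve_out(firm))  # → False: own-data training breaks condition 2
```

The point of the example is the `and` chain: size alone is never enough, and a single "no" on any condition takes the carve-out off the table.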

What the carve-out provides

If you qualify, you may get relief from:

  • Some formal risk management program requirements
  • Certain impact assessment obligations (you can lean on developer documentation instead)
  • Some website disclosure requirements

What the carve-out does NOT remove

Even if you qualify, you likely still need to:

  • Notify consumers when AI is used in consequential decisions
  • Explain adverse decisions influenced by AI
  • Offer processes for correction and appeal
  • Monitor for problems and report discovered discrimination

For more detail, the Future of Privacy Forum’s slides on the small business carve-out and the National Association of Attorneys General analysis are both worth reviewing.

Bottom line: Being small does not mean being exempt. The carve-out is real, but it is narrow. And the consumer-facing obligations remain.


What happens if you don’t comply?

I will cover enforcement in detail later in this series. But here is what you need to know now.

The Colorado Attorney General has exclusive enforcement authority over this law. Violations are treated as deceptive trade practices under the Colorado Consumer Protection Act.

This is not theoretical. Attorneys General in other states, including Pennsylvania and Massachusetts, have already pursued enforcement actions against companies whose AI systems caused consumer harm. And they did so using existing laws, before comprehensive AI statutes were even in place.

The penalties can be significant. And the reputational impact of an enforcement action can be far more damaging than the fine itself.

I will go deeper on enforcement, penalties, and real cases in Post 7 of this series.


What should you do now?

If you have read this far, you are already ahead of most business owners I talk to.

Here is what I would recommend as a starting point:

  1. Take an honest inventory of where AI is being used in your business. Include tools your staff may have adopted on their own.
  2. Clarify your role. Are you a deployer? A developer? Both?
  3. Look at your vendors. Can they provide the documentation you will need to show compliance?
  4. Assign someone to own this. Even in a small firm, someone needs to be responsible for AI risk.
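Step 1, the inventory, works best when it is written down in one place. Here is a minimal sketch of what an AI-tool inventory record might look like; the column names are my suggestion, not a requirement from the statute, and the sample row is hypothetical.

```python
# A minimal, hypothetical AI-tool inventory written out as CSV.
# Column names are illustrative, not statutory.
import csv
import io

FIELDS = ["tool", "vendor", "who_approved", "used_for",
          "touches_consequential_decisions", "owner"]

inventory = [
    {"tool": "Resume screener", "vendor": "ExampleHR", "who_approved": "unknown",
     "used_for": "candidate screening", "touches_consequential_decisions": "yes",
     "owner": "TBD"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(inventory)
print(buf.getvalue())
```

Even a spreadsheet this simple answers the regulator question posed earlier: what AI is in use, who approved it, and whether it touches consequential decisions. Rows marked "unknown" or "TBD" are exactly where the follow-up work starts.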

I cover these first steps in detail in Post 3: Your First 30 Days, AI Assessment, Inventory, and Policy Basics. If you want a practical starting point, that is where I would send you next.


How we can help

I work with professional services firms across South Denver, from Greenwood Village and the DTC to Centennial, Littleton, Lone Tree, and Highlands Ranch. AI compliance is new territory for most of these firms. But the underlying discipline is not new to me.

I spent 30 years in enterprise technology. I have led teams through large-scale incidents, managed platforms handling hundreds of billions in annual transactions, and delivered difficult news to executives and customers when things went wrong.

I left that world to bring enterprise-grade discipline to local businesses that deserve the same level of care.

If you are unsure whether the Colorado AI Act applies to your firm, or what to do about it, let’s talk.

We offer AI compliance assessments designed for Colorado businesses. We will help you understand where AI is in your operations, whether you have high-risk use cases, and what a practical compliance path looks like.

Book an AI Assessment →


Frequently Asked Questions About the Colorado AI Act

Does this law apply if we only use ChatGPT internally?

It depends on how you use it. If you are using it for general writing, research, or brainstorming, and it is not influencing decisions about employment, lending, housing, insurance, or similar areas, you are likely outside the high-risk scope. But if AI is touching decisions that affect people’s lives or livelihoods, even indirectly, you should assess your exposure. When in doubt, document what you are doing and have it reviewed.

Are small businesses exempt from the Colorado AI Act?

Not entirely. There is a limited carve-out for deployers with fewer than 50 employees, but it has multiple conditions. You must not use your own data to train the system, you must use it for its intended purpose, and you must rely on developer documentation. Even if you qualify, many consumer-facing obligations still apply. Do not assume you are exempt without checking.

What if our AI tool comes from a third-party vendor?

You may still have obligations as a deployer. Using a vendor-built tool does not transfer compliance responsibility to the vendor. If the tool is used in consequential decisions, you will likely need to provide consumer notices, explain adverse decisions, and offer correction and appeal processes. We cover vendor risk in depth in Post 6 of this series.

Is Colorado the only state with AI rules?

No. Utah, Texas, California, New York, Illinois, and other states have all enacted AI-related laws of varying scope. The EU AI Act is the broadest framework globally. Colorado’s law is currently one of the most comprehensive at the state level in the U.S., but the regulatory landscape is expanding quickly.

What happens if we ignore this?

Violations are treated as deceptive trade practices under the Colorado Consumer Protection Act. The Attorney General has enforcement authority and penalties can be significant. Regulators in other states are already acting on AI-related consumer harms. The businesses that prepare early will be in a much stronger position than those that wait.


Disclaimer: This article is provided for general informational purposes only and is not legal advice. Businesses should consult qualified legal counsel regarding their specific compliance obligations under SB 24-205 or any other applicable law.
