This article is part of a series on Colorado’s AI and automated decision-making rules (SB24-205 and the proposed update SB26-189). This is Post 3. In Post 1, I covered what the law is and why it matters. In Post 2, I covered what kinds of AI use are most likely to fall into scope. In this post, I want to make it practical. If you are a Colorado business owner wondering what to do first, this is where I would start.
Legislative update (as of May 2026): Colorado lawmakers have introduced SB26-189, which would repeal and replace SB24-205 with a framework focused on automated decision-making technology used in consequential decisions. The proposal would shift the effective date to Jan. 1, 2027. This article is for general information and reflects what is known at the time of writing. If your business uses AI in hiring, housing, lending, insurance, healthcare, or similar decisions, I recommend monitoring this bill’s progress and getting clarity on your tools and vendors now.
If you are wondering where to start, start here
A lot of business owners freeze when they hear the words “AI compliance.”
Not because they do not care. Usually it is the opposite. They care enough to know they do not want to get this wrong.
I was talking recently with a professional services firm in Greenwood Village. Smart leadership team. Careful people. They had read enough about the Colorado AI Act to know it might matter. But they were stuck on the first question.
“What do we actually do now?”
That is the right question.
You do not need to begin with a giant compliance project. You need a sensible first 30 days.
If you run a law firm, advisory firm, consultancy, or other professional services business in Centennial, Littleton, Lone Tree, Highlands Ranch, Greenwood Village, or the Denver Tech Center, this is the practical starting point I would recommend.
What you should have in place in 30 days
If you want a simple target, aim for this. By the end of the first month, you should have:
- A basic inventory of AI tools in use
- A short list of likely high-risk workflows
- Initial outreach to key vendors for documentation
- A simple AI use policy that your team can follow
- A named internal owner for next steps
- Basic documentation of what you reviewed and decided
If you have those six things, you are ahead of most businesses I talk to.
Step 1: Create an AI inventory
This is the first move because most firms do not actually know where AI is being used.
And I do not mean just ChatGPT.
I mean Microsoft 365 features, HR platforms, CRM tools, practice management software, note-taking apps, analytics tools, customer service tools, and industry-specific platforms that now include AI quietly in the background.
Your inventory does not need to be fancy. A spreadsheet is fine.
Start with these columns:
- Tool or platform name
- Business owner or team using it
- What AI feature is being used
- What output it generates
- Whether it influences any consequential decision
- Whether a vendor is involved
- Whether vendor documentation is available
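If a plain spreadsheet is your starting point, the columns above can be generated as a starter file in a few lines. Here is a minimal sketch in Python; the filename, column wording, and example row are illustrative assumptions, not a required format.

```python
import csv

# Columns from the inventory checklist above; the exact names are
# illustrative, not a mandated format.
COLUMNS = [
    "Tool or platform name",
    "Business owner or team",
    "AI feature in use",
    "Output generated",
    "Influences a consequential decision? (yes/no/maybe)",
    "Vendor involved? (yes/no)",
    "Vendor documentation available? (yes/no/requested)",
]


def write_inventory_template(path="ai_inventory.csv"):
    """Write a starter inventory spreadsheet with one example row."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(COLUMNS)
        # Hypothetical example row showing the expected level of detail.
        writer.writerow([
            "Example HR platform",
            "HR",
            "Resume screening",
            "Ranked candidate list",
            "yes",
            "yes",
            "requested",
        ])


if __name__ == "__main__":
    write_inventory_template()
```

Opening the resulting file in Excel or Google Sheets gives each team a consistent place to record what they find, which makes the later high-risk review in Step 2 much faster.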
This step alone will give you clarity most firms do not have today.
And if you run a law firm in Denver, a financial advisory team, or a consultancy, this matters because AI often enters through standard business software, not a formal “AI project.”
Step 2: Identify high-risk use cases
Once you have the inventory, the next step is to separate the routine from the risky.
The practical question is simple:
Does this AI tool influence a consequential decision?
That includes decisions affecting:
- Hiring or employment
- Housing
- Lending or credit
- Insurance
- Healthcare
- Education
- Legal services
- Government-related access or benefits
If the answer is yes, or even maybe, flag it for review.
This is where many South Denver businesses realize the issue is not their visible AI use. It is the AI built into the systems they already trust.
A meeting summary tool is usually low-risk. A resume ranking tool is not. A drafting assistant may be low-risk. A client scoring model that changes eligibility or access may not be.
Do not overcomplicate this step. You are not trying to finish the legal analysis. You are trying to identify which workflows deserve attention.
Step 3: Review your vendors
If I had to pick one area where businesses are most exposed, it is here.
Many firms assume that if a third-party vendor built the AI, the vendor owns the problem.
That is not how this works.
If your business uses a vendor tool in a high-risk context, you may still have obligations as a deployer. That means you need to know what the vendor can provide.
At minimum, start asking for:
- Documentation on how the AI system is intended to be used
- Any known limitations or risks
- Information about testing, monitoring, or bias mitigation
- Any compliance support materials relevant to Colorado or similar laws
- A point of contact for follow-up questions
You are not looking for perfection on day one. You are looking for visibility.
This matters whether you are providing managed IT services in South Denver, supporting a law office in Greenwood Village, or helping a growing firm in Lone Tree keep pace with security and compliance expectations.
Step 4: Put a basic AI use policy in place
This is the step many firms skip. I think that is a mistake.
An inventory tells you what is happening. A policy tells your people how AI should and should not be used.
Without a policy, each employee is making their own judgment call. Some will be cautious. Some will not. That is not governance. That is drift.
Your first policy does not need to be complicated. It should be usable.
Start with the basics:
- Which AI tools are approved
- Which tools are prohibited
- What kinds of data can and cannot be entered
- When human review is required
- What kinds of decisions need management approval
- Who employees should contact with questions
If you do nothing else in the first month, do this. It creates a baseline. It also shows that you are taking reasonable steps to manage risk.
For most small and mid-size businesses in Denver Tech Center, Centennial, and Littleton, a basic policy goes further than people think. It reduces confusion, limits ad hoc tool adoption, and gives leadership something concrete to enforce.
Step 5: Assign ownership
One of the fastest ways for AI risk to grow is for nobody to own it.
That does not mean you need a Chief AI Officer. Most firms are nowhere near that scale.
But someone should be responsible for coordinating the inventory, gathering vendor information, maintaining the policy, and escalating questions when a use case looks high-risk.
In some firms, that will be operations. In others, compliance, HR, legal, or IT. In smaller firms, it may simply be one senior leader with support from an outside advisor.
The title matters less than the ownership.
If everyone thinks someone else is handling AI, nobody is handling AI.
Step 6: Document what you learn
You do not need a giant compliance binder. But you do need a record.
Keep notes on:
- Which tools you reviewed
- Which ones were flagged as potentially high-risk
- What vendor information you requested and received
- What policy decisions you made
- Who owns next steps
This helps you run the work coherently, and it helps you show what you did and why if you ever need to.
That is a recurring theme in cybersecurity and compliance work in Denver and everywhere else: reasonable care is easier to demonstrate when you have records.
What not to do in the first 30 days
Let me be just as clear about what I would not do.
- Do not wait for the law to become perfectly settled before doing anything
- Do not assume your vendors have everything handled
- Do not treat “internal use” as automatically safe
- Do not write a bloated policy nobody will read
- Do not turn this into a panic project
The goal is not to solve everything in a month. The goal is to get visibility, reduce obvious risk, and create enough structure that you can make smarter decisions going forward.
How we can help
I work with professional services firms across South Denver that want a practical way to get started. Not theory. Not fear. Just a clear assessment of what is in use, where the risk sits, and what to do next.
If your firm needs help building an AI inventory, reviewing vendor tools, or putting a usable AI policy in place, we can help.
We provide AI assessments and policy workshops designed for Colorado businesses that need clarity and momentum, not complexity.
Book an AI Assessment and Policy Workshop →
Frequently Asked Questions About Your First 30 Days of AI Compliance
Disclaimer: This article is provided for general informational purposes only and is not legal advice. Businesses should consult qualified legal counsel regarding their specific compliance obligations under SB24-205 or any other applicable law.