AI Governance Institute

Practical Governance for Enterprise AI


Question 4 of 34

What are our obligations under emerging AI regulations?


Tracking the EU AI Act, U.S. executive orders, SEC guidance, and sector-specific rules to understand what AI compliance actually requires.

If you only do 3 things, do this:

  1. Map your AI inventory against the EU AI Act's Annex III risk categories. If any system falls there, you have concrete obligations with deadlines.
  2. Build to the most stringent standard that applies to you — typically EU AI Act plus your sector's rules. You'll satisfy almost everything else automatically.
  3. Assign one person the job of tracking AI regulatory developments in your key jurisdictions. This cannot be a passive "check when something seems important" role.

The Situation

Who this is for: Compliance officers and legal teams responsible for tracking and meeting AI regulatory requirements

When you need this: When building a compliance program, or when a new regulation takes effect in a jurisdiction where you operate

The Decision

Which regulations apply to which of our AI systems, and what do they actually require us to do?

The Steps

  1. Identify all jurisdictions where your AI systems operate or where your organization is based
  2. Map each AI system against applicable regulatory frameworks in those jurisdictions
  3. For each applicable regulation, document what it requires, your current compliance status, and any gaps
  4. Prioritize by risk: start with systems subject to the EU AI Act's high-risk categories and sector-specific rules
  5. Build a compliance calendar showing effective dates, registration deadlines, and required assessments
  6. Assign someone to monitor regulatory developments and update the mapping on a rolling basis
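The mapping in steps 1–3 can be sketched as a small data structure: each AI system declares its operating jurisdictions, and applicability is the union of regulations in force there. The regulation names and the `REGULATIONS` table below are illustrative placeholders, not a complete register.

```python
from dataclasses import dataclass

# Illustrative jurisdiction -> regulation table (not exhaustive).
REGULATIONS = {
    "EU": ["EU AI Act"],
    "US-CO": ["Colorado AI Act (SB 205)"],
    "US-NYC": ["NYC Local Law 144"],
}

@dataclass
class AISystem:
    name: str
    jurisdictions: list
    high_risk: bool = False

def applicable_regulations(system: AISystem) -> list:
    """Steps 2-3: collect every regulation in force where the system operates."""
    regs = []
    for jurisdiction in system.jurisdictions:
        regs.extend(REGULATIONS.get(jurisdiction, []))
    return regs

# Usage: a hiring tool deployed in the EU and New York City.
resume_screener = AISystem("resume-screener", ["EU", "US-NYC"], high_risk=True)
applicable_regulations(resume_screener)
```

Even this toy version forces the two questions the steps are about: where does each system actually operate, and what is binding there.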

The Artifacts

  • Regulatory applicability matrix (AI systems × jurisdictions × regulations)
  • Compliance calendar template (deadlines by jurisdiction and system)
  • EU AI Act Annex III checklist (high-risk use case categories)
  • Regulatory monitoring subscription list (agency feeds, regulatory bodies)

The Output

A current map of which regulations apply to which AI systems, with a compliance status for each, and an ongoing process to keep it current.

The regulatory landscape is fragmented by design

There is no single global AI regulation. Instead, compliance teams face a patchwork of binding regulations, voluntary frameworks, sector-specific rules, and emerging legislation that varies by jurisdiction, industry, and use case. The challenge is not understanding any single rule. It is maintaining a current map of which rules apply to your organization and what they actually require.

The EU AI Act is the most comprehensive binding framework currently in force. It applies to any organization that places AI systems on the EU market or whose systems' outputs are used in the EU, regardless of where the organization is headquartered. Compliance deadlines are phased: prohibitions on unacceptable-risk practices applied first, and most obligations for high-risk systems take effect in August 2026.

U.S. federal and sector-specific rules

At the federal level, Executive Order 14110 on Safe, Secure, and Trustworthy AI directed agencies to develop sector-specific guidance; although the order itself was rescinded in January 2025, much of the agency-level guidance it prompted remains in effect. The FTC has issued guidance on AI in advertising and consumer-facing products. The SEC has signaled that AI-related disclosures in public filings are subject to existing material disclosure requirements. EEOC guidance addresses AI use in employment decisions under Title VII.

In financial services, banking regulators have issued model risk management guidance that applies to AI models used in credit decisions. Healthcare organizations must navigate HIPAA obligations when AI systems process protected health information. The FCRA applies to AI used in credit, employment, housing, and insurance decisions and requires adverse action notices when AI influences a denial.

State-level regulation is accelerating

Colorado's AI Act (SB 205) imposes obligations on developers and deployers of high-risk AI systems affecting Colorado residents, including impact assessments and transparency requirements. Illinois and New York City have enacted specific rules on AI use in employment decisions. California has several AI-related bills in various stages of passage.

For organizations operating across multiple U.S. states, the practical approach is to identify the most stringent applicable requirements and build to that standard. Complying with Colorado's AI Act and the EU AI Act will put you in a strong position relative to most other jurisdictions, even if it exceeds what is strictly required locally.

Building a compliance tracking process

Regulatory monitoring should be a continuous process, not an annual review. Assign someone responsibility for tracking AI regulatory developments in your key jurisdictions. Subscribe to regulatory agency feeds, use a directory like this one to track changes, and build relationships with outside counsel who specialize in AI law.

Map each regulation to the AI systems in your inventory. For each system, document which regulations apply, what they require, your current compliance status, and the timeline for any gaps to be remediated. This mapping becomes the foundation of your AI compliance program.
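The per-system record described above can be kept as something as simple as a list of (system, regulation) entries with a status and a remediation deadline, from which the open gaps fall out in priority order. All system names, regulations, and dates below are hypothetical examples, not real compliance data.

```python
from datetime import date

# Hypothetical compliance records: one entry per (system, regulation) pair.
records = [
    {"system": "resume-screener", "regulation": "EU AI Act",
     "status": "gap", "deadline": date(2026, 8, 2)},
    {"system": "credit-model", "regulation": "FCRA",
     "status": "compliant", "deadline": None},
    {"system": "resume-screener", "regulation": "NYC Local Law 144",
     "status": "gap", "deadline": date(2025, 12, 31)},
]

def open_gaps(records: list) -> list:
    """Return non-compliant entries, soonest remediation deadline first."""
    gaps = [r for r in records if r["status"] != "compliant"]
    return sorted(gaps, key=lambda r: r["deadline"])
```

Sorting gaps by deadline turns the mapping into the compliance calendar from the steps above: the same data answers both "what applies?" and "what is due next?".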

Governance Controls

Operational controls that implement the guidance in this playbook.