Question 4 of 24
What are our obligations under emerging AI regulations?
Tracking the EU AI Act, U.S. executive orders, SEC guidance, and sector-specific rules to understand what AI compliance actually requires.
The regulatory landscape is fragmented by design
There is no single global AI regulation. Instead, compliance teams face a patchwork of binding regulations, voluntary frameworks, sector-specific rules, and emerging legislation that varies by jurisdiction, industry, and use case. The challenge is not understanding any single rule. It is maintaining a current map of which rules apply to your organization and what they actually require.
The EU AI Act is the most comprehensive binding framework currently in force. It applies to organizations that place AI systems on the EU market or whose systems' outputs are used in the EU, regardless of where the organization is headquartered. Compliance deadlines are phased, with most obligations for high-risk systems taking effect in 2026.
U.S. federal and sector-specific rules
At the federal level, Executive Order 14110 on Safe, Secure, and Trustworthy AI directed agencies to develop sector-specific guidance, much of which is now in effect. The FTC has issued guidance on AI in advertising and consumer-facing products. The SEC has signaled that AI-related disclosures in public filings are subject to existing material disclosure requirements. EEOC guidance addresses AI use in employment decisions under Title VII.
In financial services, banking regulators have issued model risk management guidance that applies to AI models used in credit decisions. Healthcare organizations must navigate HIPAA obligations when AI systems process protected health information. The FCRA applies to AI used in credit, employment, housing, and insurance decisions and requires adverse action notices when AI influences a denial.
State-level regulation is accelerating
Colorado's AI Act (SB 205) imposes obligations on developers and deployers of high-risk AI systems affecting Colorado residents, including impact assessments and transparency requirements. Illinois (the Artificial Intelligence Video Interview Act and 2024 amendments to its Human Rights Act) and New York City (Local Law 144, which requires bias audits of automated employment decision tools) have enacted specific rules on AI use in employment decisions. California has several AI-related bills in various stages of passage.
For organizations operating across multiple U.S. states, the practical approach is to identify the most stringent applicable requirements and build to that standard. Complying with Colorado's AI Act and the EU AI Act will put you in a strong position relative to most other jurisdictions, even if it exceeds what is strictly required locally.
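In practice, "build to the most stringent standard" means satisfying the union of requirements across every jurisdiction that applies to you. A minimal sketch of that idea, where the regulation names and requirement labels are illustrative assumptions rather than any official taxonomy:

```python
def strictest_baseline(per_jurisdiction: dict[str, set[str]]) -> set[str]:
    """Union of requirements across all applicable jurisdictions.

    Building to this combined set satisfies each jurisdiction
    individually, even where a single jurisdiction demands less.
    """
    baseline: set[str] = set()
    for requirements in per_jurisdiction.values():
        baseline |= requirements
    return baseline


# Illustrative requirement labels only -- not official obligations.
rules = {
    "Colorado SB 205": {"impact assessment", "consumer notice"},
    "EU AI Act (high-risk)": {"impact assessment", "human oversight", "logging"},
}

print(sorted(strictest_baseline(rules)))
# A deployment meeting all four items clears both frameworks.
```

The trade-off is cost: the union may exceed what any single jurisdiction requires, but it avoids maintaining separate compliance tracks per state or region.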
Building a compliance tracking process
Regulatory monitoring should be a continuous process, not an annual review. Assign someone responsibility for tracking AI regulatory developments in your key jurisdictions. Subscribe to regulatory agency feeds, use a directory like this one to track changes, and build relationships with outside counsel who specialize in AI law.
Map each regulation to the AI systems in your inventory. For each system, document which regulations apply, what they require, your current compliance status, and the timeline for any gaps to be remediated. This mapping becomes the foundation of your AI compliance program.
