SEC AI Governance Guidance
Issued by
U.S. Securities and Exchange Commission (SEC)
The SEC has issued rules, guidance, and proposed rulemaking addressing AI governance obligations for registered investment advisers, broker-dealers, and public companies, focusing on conflicts of interest in predictive data analytics, AI-related disclosures in securities filings, and examination priorities targeting algorithmic systems.
Applies To
SEC-registered investment advisers, broker-dealers, and public companies that develop, deploy, or rely on AI or predictive data analytics tools.
Overview
The SEC's AI governance posture comprises several distinct but interconnected instruments rather than a single unified framework. The most significant is the August 2023 proposed rule on 'Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers' (Release Nos. 34-97990; IA-6383), which would require registrants to identify and neutralize or eliminate conflicts of interest that arise when AI or algorithmic tools optimize for firm interests at the expense of investor interests. The proposal reflects the Commission's concern that AI systems may be designed, whether intentionally or through emergent behavior, to favor firm revenue over investor outcomes, a structural conflict that existing suitability and fiduciary frameworks may not fully capture.
In parallel, the SEC's Division of Corporation Finance has issued guidance and Staff Bulletins addressing AI-related disclosures in annual reports, prospectuses, and proxy statements. Public companies are expected to disclose material risks arising from AI use, AI-related cybersecurity exposures, and the governance structures in place to oversee AI systems. The Commission has also scrutinized 'AI washing', meaning instances where companies overstate AI capabilities in investor communications, and has brought enforcement actions grounded in the anti-fraud provisions of the Securities Act of 1933 and the Securities Exchange Act of 1934.
The SEC's examination program (formerly the Office of Compliance Inspections and Examinations, or OCIE, since reconstituted as the Division of Examinations) has included AI and algorithmic systems among its examination priorities, directing examiners to assess whether registrants have adequate policies and procedures governing AI model development, validation, deployment, and monitoring. The 2022 and 2023 Examination Priorities publications explicitly flag AI governance as an area of heightened scrutiny.
Separately, the SEC's 2023 cybersecurity disclosure rule, effective December 2023, indirectly captures AI-related cyber risks by requiring timely disclosure of material cybersecurity incidents and annual disclosure of cybersecurity risk management practices, which must encompass AI systems that process material data or support critical operations. Registrants should also monitor the interplay between SEC guidance and FINRA's ongoing work on algorithmic supervision and AI use in member firm operations.
Key Requirements
- Identify, document, and neutralize or eliminate conflicts of interest arising from the use of predictive data analytics or AI tools that optimize for firm interests over investor interests (proposed rule obligation, currently under final rulemaking)
- Disclose material AI-related risks in annual reports (Form 10-K), registration statements, and proxy materials, including risks from AI model failure, bias, and third-party AI dependencies
- Refrain from making materially misleading statements about AI capabilities in investor communications, marketing materials, or SEC filings (anti-fraud provisions)
- Maintain written policies and procedures governing AI model development, validation, change management, and ongoing monitoring as part of the adviser or broker-dealer compliance program under the Investment Advisers Act or Exchange Act
- Include AI systems that process material nonpublic information or support critical operations within the scope of cybersecurity risk management programs disclosed under the 2023 Cybersecurity Disclosure Rule
- Ensure AI-driven trading, order routing, or portfolio construction systems are subject to governance controls that examiners can review during SEC inspections
- Document how AI-generated investment recommendations are supervised and reviewed to satisfy best interest and fiduciary obligations
- Assess third-party AI vendor arrangements for outsourcing risk and ensure vendor due diligence is documented and periodically updated
What Your Organization Must Do
- Conduct a firm-wide inventory of all AI and predictive data analytics tools by Q1 each year, mapping each system to its decision function (e.g., order routing, portfolio construction, client communications) and identifying any configuration that could prioritize firm revenue over investor outcomes, with the Chief Compliance Officer owning this inventory.
- Draft or update written policies and procedures governing AI model development, validation, change management, and ongoing monitoring to satisfy Investment Advisers Act Rule 206(4)-7 or Exchange Act Rule 15c3-5 compliance program requirements, ensuring these policies are reviewed at least annually and before any material model change is deployed.
- Audit all investor-facing communications, marketing materials, Form ADV filings, Form 10-K disclosures, and registration statements to remove or correct any unsupported claims about AI capabilities, assigning Legal and Compliance sign-off as a mandatory pre-publication control to mitigate AI washing enforcement risk under Securities Act Section 17(a) and Exchange Act Section 10(b).
- Incorporate AI systems that process material nonpublic information or support critical operations into the cybersecurity risk management program required under the 2023 Cybersecurity Disclosure Rule (effective December 2023), and verify that Form 8-K incident reporting procedures cover AI-related breaches within the required four-business-day window.
- Prepare an examination-ready conflict-of-interest documentation package for each AI-driven recommendation or trading system, including model objective functions, backtesting results, validation records, and evidence of neutralization controls, so the Division of Examinations can assess compliance during inspections flagged under the 2023 Examination Priorities.
- Establish a third-party AI vendor due diligence process requiring documented initial assessments and annual reviews of all material AI vendors, covering model transparency, data handling, conflict-of-interest exposure, and contractual audit rights, with results reported to the CCO and flagged to senior management where vendor risk is elevated.
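As a minimal sketch of the firm-wide inventory described in the first step above, each tool can be tracked as a structured record and screened for unneutralized conflict exposure. All field names, system names, and the screening logic here are hypothetical illustrations, not an SEC-prescribed format.

```python
from dataclasses import dataclass, field

# Hypothetical record for one entry in the firm-wide AI/PDA inventory.
@dataclass
class AIToolRecord:
    name: str
    decision_function: str        # e.g. "order routing", "portfolio construction"
    owner: str                    # accountable business owner; the CCO owns the inventory
    optimizes_firm_revenue: bool  # could the configuration favor firm revenue?
    neutralization_controls: list = field(default_factory=list)

def conflict_review_queue(inventory):
    """Return tools whose configuration could prioritize firm revenue
    over investor outcomes but lack documented neutralization controls."""
    return [t.name for t in inventory
            if t.optimizes_firm_revenue and not t.neutralization_controls]

inventory = [
    AIToolRecord("smart-router-v2", "order routing", "Trading Desk",
                 True, ["routing-fairness-check"]),
    AIToolRecord("client-chat-summarizer", "client communications", "Ops", False),
    AIToolRecord("rev-weighted-recs", "recommendations", "Advisory", True),
]

print(conflict_review_queue(inventory))  # → ['rev-weighted-recs']
```

A queue like this gives the CCO a concrete worklist: each flagged system needs either documented neutralization controls or remediation before the next annual inventory cycle.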
Playbook Guidance
Step-by-step implementation guidance for compliance teams.
Frequently Asked Questions
- Does the SEC predictive data analytics proposed rule apply to robo-advisers and quantitative hedge funds, or only traditional broker-dealers?
- The proposed rule would apply to all SEC-registered investment advisers and broker-dealers, which brings robo-advisers and registered advisers to quantitative hedge funds squarely within scope. Any registrant using AI or algorithmic tools that could optimize for firm revenue over investor interests would be covered, regardless of business model or strategy.
- What is the current status of the SEC predictive data analytics conflict-of-interest rule and when is a final rule expected?
- The rule remains in proposed form as of mid-2024, published in August 2023 under Release Nos. 34-97990 and IA-6383. No final rule date has been confirmed by the Commission. Firms should nonetheless build compliance infrastructure now, as examination staff are already scrutinizing AI conflict-of-interest controls under current Examination Priorities.
- What counts as AI washing under SEC anti-fraud provisions and what enforcement risk does it create?
- AI washing refers to materially overstating AI capabilities in investor communications, SEC filings, or marketing materials. The SEC has brought enforcement actions under Securities Act Section 17(a) and Exchange Act Section 10(b), treating these misstatements as fraud. Firms should require Legal and Compliance sign-off before any AI capability claim is published externally.
- How does the SEC's 2023 cybersecurity disclosure rule intersect with AI governance obligations for public companies?
- The 2023 Cybersecurity Disclosure Rule, effective December 2023, requires annual disclosure of cybersecurity risk management practices and timely Form 8-K reporting of material incidents within four business days. AI systems that process material data or support critical operations must be included in the cybersecurity program scope, making AI governance a direct input to cybersecurity disclosures.
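The four-business-day window above can be sanity-checked with a small date helper. This sketch counts weekdays only and ignores federal holidays, which a real deadline calculation must also account for; it is an illustration, not a compliance tool.

```python
from datetime import date, timedelta

def form_8k_deadline(materiality_determination: date, business_days: int = 4) -> date:
    """Naive deadline estimate: step forward the given number of weekdays
    from the day materiality is determined (federal holidays ignored)."""
    d = materiality_determination
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday-Friday only
            remaining -= 1
    return d

# Materiality determined on Thursday 2024-03-07 -> deadline Wednesday 2024-03-13.
print(form_8k_deadline(date(2024, 3, 7)))  # → 2024-03-13
```

Note that the clock runs from the materiality determination, not from incident discovery, so incident response procedures should timestamp that determination explicitly.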
- What specific AI-related items does the SEC Division of Examinations look for during inspections of investment advisers and broker-dealers?
- Examiners assess whether firms have written policies governing AI model development, validation, change management, and ongoing monitoring. They also review conflict-of-interest controls for AI-driven recommendation and trading systems, including model objective functions, backtesting records, and validation documentation. The 2022 and 2023 Examination Priorities publications explicitly flag algorithmic systems as a heightened scrutiny area.
- What due diligence obligations apply when a registered investment adviser or broker-dealer uses a third-party AI vendor?
- While no single rule prescribes a specific third-party AI vendor framework, SEC examination expectations and existing outsourcing guidance require documented initial and periodic due diligence covering model transparency, data handling, and conflict-of-interest exposure. Contractual audit rights and annual review cycles are best practice, with results reported to the Chief Compliance Officer.
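One simple way to operationalize the annual review cycle described above is a due-date check over vendor due diligence records. The vendor names, field layout, and 365-day cycle here are illustrative assumptions, not a regulatory requirement.

```python
from datetime import date, timedelta

# Hypothetical vendor due-diligence log: vendor -> last documented review date.
vendor_reviews = {
    "model-provider-a": date(2023, 2, 1),
    "data-cleaner-b": date(2024, 1, 15),
}

def overdue_reviews(records, as_of, cycle_days=365):
    """Vendors whose last documented review is older than the review cycle."""
    return sorted(v for v, last in records.items()
                  if (as_of - last) > timedelta(days=cycle_days))

print(overdue_reviews(vendor_reviews, as_of=date(2024, 6, 1)))  # → ['model-provider-a']
```

Surfacing overdue vendors on a fixed cadence gives the CCO documented evidence of periodic review, which is the point examiners look for.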
