AI Governance Institute

Practical Governance for Enterprise AI

Must Comply · Regulation · EU

EU Digital Services Act – AI and Algorithmic Accountability Provisions

Issued by

European Parliament and Council of the European Union; enforced by Digital Services Coordinators (DSCs) in each Member State and by the European Commission for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs)

Live · Effective 2024-02-17 · DSA AI Provisions · Verified April 2026
Official document →

The Digital Services Act imposes transparency, accountability, and risk-management obligations on online intermediaries with respect to algorithmic recommender systems, targeted advertising, and systemic risks. Obligations scale with platform size, with the most stringent requirements applying to VLOPs and VLOSEs.

Applies To

  • Very Large Online Platforms (VLOPs) designated by the European Commission (threshold: 45 million average monthly active EU recipients), including major social media networks, e-commerce marketplaces, and app stores.
  • Very Large Online Search Engines (VLOSEs) designated by the European Commission.
  • Mid-size and smaller online platforms with EU users that are not VLOPs but are subject to baseline obligations effective 17 February 2024.
  • Online intermediaries including hosting services, cloud infrastructure providers, and content delivery networks, subject to proportionate obligations.
  • Advertisers and ad-tech providers relying on VLOP/VLOSE inventory for EU-targeted campaigns.
  • AI vendors supplying recommender engines, content moderation tools, and ad-targeting systems to in-scope platforms.

Overview

Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act, DSA) entered into force on 16 November 2022. Obligations for VLOPs and VLOSEs became applicable on 25 August 2023; obligations for all other in-scope providers became applicable on 17 February 2024. The DSA establishes a layered regulatory framework for online intermediary services, with AI-related obligations concentrated in three domains: recommender systems, targeted advertising, and systemic risk management.

All providers of online platforms must explain the main parameters of any recommender system they deploy and any options for users to modify or influence those parameters (Article 27). Dark patterns in platform interfaces are prohibited (Article 25), as are advertisements based on profiling that uses special categories of personal data (Article 26(3)) or that is directed at minors (Article 28(2)).

For VLOPs and VLOSEs, which are designated by the European Commission on the basis of a threshold of at least 45 million average monthly active recipients in the EU, the obligations are substantially more demanding. Designated platforms must conduct annual systemic risk assessments covering risks arising from algorithmic amplification of illegal content, fundamental rights impacts, civic discourse, electoral processes, public health, and gender-based violence (Article 34). They must implement reasonable and proportionate mitigation measures (Article 35), submit to annual independent audits (Article 37), offer at least one recommender option not based on profiling (Article 38), grant vetted researchers access to data (Article 40), establish a compliance function (Article 41), and share data with the European Commission and DSCs.

The Commission has sole enforcement authority over VLOPs and VLOSEs, with fines of up to six percent of the provider's global annual turnover and, for repeated infringement, periodic penalty payments and eventual temporary access restriction. Member State DSCs enforce obligations applicable to smaller platforms, with fines likewise capped at six percent of the provider's annual worldwide turnover (Article 52).

Key Requirements

  • All online platforms: Provide clear, accessible information about the main parameters of any recommender system and any options for users to modify or influence those parameters (Article 27). VLOPs and VLOSEs must additionally offer at least one option not based on profiling (Article 38).
  • All online platforms: Disclose, for each advertisement presented, that it is an ad, on whose behalf it is presented, who paid for it, and the main targeting parameters used (Article 26(1)).
  • VLOPs and VLOSEs: Maintain a publicly available advertisement repository disclosing information about each advertisement served, including the main targeting parameters used (Article 39).
  • All online platforms: Prohibit the use of interface design that obscures choices or manipulates user behavior (dark patterns prohibition, Article 25).
  • VLOPs and VLOSEs: Conduct annual systemic risk assessments covering algorithmic amplification of illegal content, fundamental rights, democratic processes, and public health risks (Article 34).
  • VLOPs and VLOSEs: Design and implement proportionate risk mitigation measures and document those measures (Article 35).
  • VLOPs and VLOSEs: Submit to independent audits at least annually and publish audit reports (Article 37).
  • VLOPs and VLOSEs: Provide vetted researchers with access to data necessary for systemic risk research (Article 40).
  • VLOPs and VLOSEs: Appoint a DSA compliance officer at senior management level (Article 41).
  • All online platforms: Do not present advertisements based on profiling using special categories of personal data (Article 26(3)) or profiling-based advertisements directed at minors (Article 28(2)).
  • VLOPs and VLOSEs: Publish annual transparency reports and submit enhanced transparency reports to the Commission (Articles 15 and 42).
  • All in-scope providers: Publish terms of service in plain language describing content moderation policies and any use of automated means in enforcement.
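The recommender-system obligations above can be sketched in code. This is an illustrative sketch only: the item fields, parameter descriptions, and ranking logic are hypothetical and not drawn from the DSA text or any real platform; it simply shows the pattern of disclosing main ranking parameters (Article 27) alongside a user-controllable non-profiling alternative (Article 38).

```python
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    published_ts: int                   # unix timestamp (recency signal)
    personalization_score: float = 0.0  # profiling-derived relevance

@dataclass
class RecommenderConfig:
    # Article 27: "main parameters" disclosed to users in plain language
    main_parameters: tuple = (
        "recency of the item",
        "predicted relevance based on your activity (profiling)",
    )
    # Article 38: user-controllable toggle for the non-profiling option
    profiling_enabled: bool = True

def rank(items: list[Item], cfg: RecommenderConfig) -> list[Item]:
    """Rank with profiling, or fall back to a pure-recency, non-profiling order."""
    if cfg.profiling_enabled:
        return sorted(items, key=lambda i: i.personalization_score, reverse=True)
    # Non-profiling alternative: chronological, uses no per-user signals
    return sorted(items, key=lambda i: i.published_ts, reverse=True)
```

The key design point is that the non-profiling path consumes no per-user signals at all, so the alternative feed is meaningfully distinct rather than a cosmetic reordering.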

What Your Organization Must Do

  • Determine your platform's classification immediately: calculate average monthly active EU recipients, confirm whether you meet the 45 million VLOP/VLOSE designation threshold under Article 33, and document that analysis with your legal and data teams.
  • Audit all deployed recommender systems by assigning your Chief Compliance Officer or a designated DSA Compliance Officer to map every algorithmic parameter used for content ranking and ensure user-facing explanations are live and accessible (Article 27); VLOPs and VLOSEs must also verify that at least one non-profiling-based alternative is offered (Article 38).
  • Commission an annual systemic risk assessment (VLOPs and VLOSEs only) covering algorithmic amplification of illegal content, fundamental rights impacts, electoral processes, and public health risks; complete the first cycle before the annual audit and retain the supporting documents for transmission to the European Commission or your DSC on request (Article 34(3)).
  • Engage an accredited independent auditor to conduct the annual DSA audit required under Article 37, ensuring the engagement letter is signed and audit scope is defined well enough in advance to produce a published report within each annual cycle.
  • Implement advertising controls to block profiling-based targeting of minors and any targeting based on special categories of personal data, and, for VLOPs and VLOSEs, configure the Article 39 ad repository to log and disclose the main targeting parameters used for each advertisement served to EU recipients.
  • Review all platform interface designs against the dark patterns prohibition under Article 25, document remediation of any identified manipulative design patterns, and establish a recurring UX review process owned by product and compliance teams to catch regressions before DSC inquiries arise.
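The classification step above can be sketched as a simple threshold check. This is an informal sketch: the function names and tier labels are our own shorthand, not terms defined in the Regulation; the six-month averaging window reflects how active-recipient counts are typically reported, and your legal team should confirm the methodology.

```python
VLOP_THRESHOLD = 45_000_000  # Article 33 designation threshold (avg monthly active EU recipients)

def avg_monthly_active(recipients_per_month: list[int]) -> int:
    """Average monthly active EU recipients over the most recent six months."""
    window = recipients_per_month[-6:]
    return sum(window) // len(window)

def classify(avg_recipients: int) -> str:
    """Map the average to a DSA tier (labels are informal shorthand, not legal terms)."""
    if avg_recipients >= VLOP_THRESHOLD:
        return "VLOP/VLOSE candidate: expect Commission designation (Article 33)"
    return "baseline online-platform obligations (Chapter III)"
```

Note that crossing the threshold does not itself make a platform a VLOP; designation is a formal Commission decision, so the check flags candidacy for legal review rather than concluding the analysis.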
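The advertising controls above can likewise be sketched as a pre-serve validation gate. The targeting-parameter names here are hypothetical; the special-category list paraphrases Article 9(1) GDPR, which Article 26(3) DSA cross-references, and a production system would need legally reviewed category mappings.

```python
# Special categories of personal data (paraphrasing Article 9(1) GDPR, which
# Article 26(3) DSA cross-references). Category keys are illustrative.
SPECIAL_CATEGORIES = {
    "health", "religion", "sexual_orientation", "political_opinions",
    "ethnic_origin", "trade_union_membership", "biometric", "genetic",
}

def validate_targeting(params: dict) -> list[str]:
    """Return the reasons an ad request must be rejected (empty list = allowed)."""
    violations = []
    # Article 28(2): no profiling-based ads directed at minors
    if params.get("audience_is_minor"):
        violations.append("profiling-based targeting of minors")
    # Article 26(3): no profiling on special categories of personal data
    used = set(params.get("profiling_attributes", [])) & SPECIAL_CATEGORIES
    if used:
        violations.append(f"special-category profiling: {sorted(used)}")
    return violations
```

Running every ad request through a gate like this, and logging the outcome, also produces the per-ad records that VLOPs and VLOSEs need for the Article 39 repository.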

Playbook Guidance

Step-by-step implementation guidance for compliance teams.

Governance Controls

Operational controls that implement requirements from this regulation.

Frequently Asked Questions

Which platforms are classified as VLOPs under the DSA and subject to the strictest algorithmic accountability rules?
The European Commission designates platforms as VLOPs when they reach at least 45 million average monthly active recipients in the EU. Current designees include major social media networks, large e-commerce marketplaces, and app stores. Designation triggers the full set of obligations including systemic risk assessments, independent audits, and researcher data access.
When did DSA obligations become enforceable for platforms that are not VLOPs or VLOSEs?
Baseline obligations for all other in-scope online intermediaries became applicable on 17 February 2024. VLOP and VLOSE obligations were already enforceable from 25 August 2023 following Commission designation decisions issued that year.
What are the maximum fines for violating the DSA algorithmic transparency and risk management provisions?
The European Commission can impose fines of up to 6 percent of a VLOP or VLOSE's global annual turnover for violations. For repeated infringement, periodic penalty payments and temporary access restrictions are also available. Member State DSCs can fine smaller platforms up to the same cap of 6 percent of the provider's annual worldwide turnover (Article 52).
Does the DSA require platforms to offer users a non-algorithmic feed or recommendation option?
Partly. Under Article 27, all online platforms using recommender systems must explain the main parameters of the system and any options for users to modify or influence them. Under Article 38, VLOPs and VLOSEs must additionally offer at least one option not based on profiling, and that alternative must be clearly accessible and meaningfully distinct from personalized ranking.
How do the DSA's algorithmic risk assessment obligations differ from those under the EU AI Act?
The DSA requires VLOPs and VLOSEs to conduct annual systemic risk assessments focused on societal harms such as electoral interference, illegal content amplification, and public health risks. The EU AI Act imposes conformity assessments on high-risk AI systems at the product level. A VLOP deploying a high-risk AI recommender system may need to satisfy both frameworks independently.
Does the DSA prohibit all targeted advertising on VLOPs, or only certain types?
The DSA does not ban targeted advertising broadly. For all online platforms, it prohibits profiling-based ads directed at minors (Article 28(2)) and ads based on profiling using sensitive personal data categories such as health, religion, or sexual orientation (Article 26(3)). VLOPs and VLOSEs must additionally maintain a public ad repository disclosing the main targeting parameters used for each advertisement served (Article 39).