AI Governance Institute

Practical Governance for Enterprise AI

Voluntary Guideline · US

FDA AI/ML Software as Medical Device Guidance

Issued by

U.S. Food and Drug Administration (FDA), Center for Devices and Radiological Health (CDRH)

Status: Live · Effective 2021-01-12 · Verified April 2026

FDA's action plan and associated guidance documents establish a regulatory framework for AI/ML-based Software as a Medical Device (SaMD), introducing a total product lifecycle (TPLC) approach, predetermined change control plans, and transparency and monitoring requirements for adaptive AI/ML algorithms used in clinical settings.

Applies To

Medical device manufacturers, digital health companies, hospital systems and integrated delivery networks developing or acquiring AI/ML-enabled clinical tools, pharmaceutical companies with companion diagnostic AI components, radiology and pathology software vendors, and healthcare IT vendors embedding AI into clinical workflows. Contract research organizations supporting AI/ML SaMD submissions are also directly affected.

Overview

The FDA's AI/ML Software as a Medical Device (SaMD) guidance framework addresses the unique challenge posed by adaptive AI and ML algorithms that can continuously learn and change their behavior after deployment, a characteristic that sits uneasily within traditional premarket submission paradigms. The foundational document is the January 2021 'Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan,' which outlined five strategic areas: a tailored regulatory framework, good machine learning practices (GMLP), a patient-centered approach, transparency, and real-world performance monitoring. This action plan was preceded by a 2019 discussion paper and followed by the April 2023 draft guidance on marketing submissions for AI/ML-enabled devices, as well as the 2023 predetermined change control plan (PCCP) guidance.

The TPLC approach requires manufacturers to demonstrate not only that a device performs safely at the time of clearance or approval, but also that ongoing changes, particularly model updates driven by real-world data, remain within prespecified and FDA-accepted parameters. The PCCP mechanism allows manufacturers to describe planned modifications and the validation methods that will govern them, reducing the need for a new premarket submission for each algorithmic update.

The framework draws heavily on the International Medical Device Regulators Forum (IMDRF) SaMD framework, categorizing software by the significance of the information it provides and the healthcare situation or condition it addresses. Enterprises in digital health, medical imaging, clinical decision support, and diagnostics must navigate this framework when developing or commercializing AI/ML-enabled devices intended for U.S. market access. The guidance also aligns with the 21st Century Cures Act's clinical decision support provisions, which carve out certain lower-risk software from device regulation entirely.

Key Requirements

  • Submit a Predetermined Change Control Plan (PCCP) with premarket submissions describing anticipated algorithm modifications, their rationale, and associated validation protocols
  • Adhere to Good Machine Learning Practices (GMLP) encompassing data management, model training, testing, documentation, and bias assessment
  • Implement a Total Product Lifecycle (TPLC) approach with post-market performance monitoring and deviation reporting
  • Ensure transparency to users and patients regarding AI/ML model capabilities, limitations, and intended use
  • Characterize and disclose training and test dataset composition, including demographic representation, to support bias and generalizability evaluation
  • Conduct analytical and clinical validation studies appropriate to the device's IMDRF SaMD risk category
  • Maintain device history records and design history files that include AI/ML-specific documentation such as model cards and performance metrics
  • Report algorithm-related adverse events or deviations through established FDA Medical Device Reporting (MDR) mechanisms
  • Evaluate whether software meets clinical decision support (CDS) carve-out criteria under the 21st Century Cures Act before initiating device regulatory pathway
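To make the dataset characterization and model card requirements concrete, the sketch below shows one way a team might record these artifacts in code. It is a minimal illustration, not an FDA-mandated schema: the class, field names, device name, and demographic buckets are all assumptions for this example.

```python
from dataclasses import dataclass

# Illustrative sketch only: field names and values are assumptions,
# not a schema prescribed by FDA guidance.
@dataclass
class ModelCard:
    device_name: str
    intended_use: str
    imdrf_risk_category: str       # per the IMDRF SaMD categorization
    known_limitations: list
    training_data_summary: dict    # demographic composition of training data
    test_data_summary: dict        # demographic composition of held-out test data
    performance_metrics: dict      # e.g. sensitivity/specificity

    def demographic_gaps(self, deployment_population: dict) -> list:
        """Return demographic groups present in the deployment population but
        absent from the training data summary (a simple generalizability check)."""
        return sorted(set(deployment_population) - set(self.training_data_summary))

card = ModelCard(
    device_name="ExampleCAD",  # hypothetical device
    intended_use="Adjunctive detection of pulmonary nodules on chest CT",
    imdrf_risk_category="II",
    known_limitations=["Not validated for pediatric patients"],
    training_data_summary={"age_18_40": 0.25, "age_41_65": 0.55, "age_over_65": 0.20},
    test_data_summary={"age_18_40": 0.30, "age_41_65": 0.50, "age_over_65": 0.20},
    performance_metrics={"sensitivity": 0.91, "specificity": 0.88},
)

gaps = card.demographic_gaps(
    {"age_under_18": 0.1, "age_18_40": 0.2, "age_41_65": 0.4, "age_over_65": 0.3}
)
# A non-empty result would flag a generalizability concern for bias assessment.
```

A structured record like this can live in the design history file alongside the performance metric summaries the April 2023 draft guidance describes.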

What Your Organization Must Do

  • Determine whether each AI/ML software product meets the 21st Century Cures Act clinical decision support carve-out criteria before initiating any FDA regulatory pathway; assign this triage responsibility to your regulatory affairs lead and document the analysis in the design history file.
  • Prepare and submit a Predetermined Change Control Plan (PCCP) alongside any premarket submission (510(k), De Novo, or PMA) for adaptive AI/ML devices, detailing anticipated model modifications, retraining triggers, and validation protocols to avoid separate submissions for each future algorithm update.
  • Establish a Good Machine Learning Practices (GMLP) program covering data governance, training and test dataset documentation, demographic representation analysis, and bias assessment; assign ownership to your quality and data science teams and integrate requirements into your existing quality management system.
  • Implement a Total Product Lifecycle monitoring program that tracks real-world algorithm performance against prespecified thresholds, triggers deviation reviews when performance drifts, and feeds findings into your FDA Medical Device Reporting (MDR) process for algorithm-related adverse events.
  • Build AI/ML-specific documentation artifacts, including model cards and performance metric summaries, into your design history file and device history records to satisfy CDRH reviewer expectations established in the April 2023 draft marketing submission guidance.
  • Publish clear transparency disclosures to clinical users covering model intended use, known limitations, training dataset characteristics, and demographic generalizability, and verify these disclosures align with the IMDRF SaMD risk category assigned to each product.
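The lifecycle monitoring step above can be sketched as a simple drift check: compare each monitored metric against its prespecified floor and flag deviations for review. The thresholds, metric names, and observed values here are hypothetical illustrations, not figures from FDA guidance or any real PCCP.

```python
# Minimal sketch of a TPLC-style performance drift check.
# Thresholds and metric names are illustrative assumptions only.
def check_drift(metrics: dict, thresholds: dict) -> list:
    """Return the names of metrics that fell below their prespecified floor;
    in this sketch, any hit would trigger a deviation review."""
    return [name for name, floor in thresholds.items()
            if metrics.get(name, 0.0) < floor]

# Prespecified performance floors (hypothetical values a PCCP might record).
thresholds = {"sensitivity": 0.88, "specificity": 0.85}

# Latest real-world monitoring window (hypothetical observations).
observed = {"sensitivity": 0.84, "specificity": 0.90}

deviations = check_drift(observed, thresholds)
# A non-empty list would open a deviation review and feed the MDR assessment.
```

In practice the thresholds would come from the validated PCCP, and the deviation review would decide whether the event is reportable under MDR.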

Playbook Guidance

Step-by-step implementation guidance for compliance teams.

Frequently Asked Questions

Does the FDA AI/ML SaMD guidance apply to hospital systems that build AI tools internally, not just device manufacturers?
Yes. Hospital systems and integrated delivery networks that develop or acquire AI/ML-enabled clinical tools are considered affected entities under this framework. If the software meets the definition of a medical device, FDA regulatory requirements apply regardless of whether the developer is a traditional manufacturer or a healthcare institution.
What is a Predetermined Change Control Plan and when is it required for an AI/ML device submission?
A PCCP is a document submitted alongside a 510(k), De Novo, or PMA that prespecifies anticipated algorithm modifications, retraining triggers, and associated validation protocols. It allows manufacturers to implement approved future updates without filing a new premarket submission for each change, which is critical for adaptive AI/ML models that learn from real-world data.
How does the 21st Century Cures Act clinical decision support carve-out affect whether my AI software needs FDA clearance?
Certain lower-risk clinical decision support software is excluded from device regulation under the 21st Century Cures Act. Before pursuing any FDA pathway, manufacturers should assess whether their software meets those carve-out criteria. This triage step should be documented in the design history file and led by your regulatory affairs team.
What does the IMDRF SaMD risk categorization framework mean for the level of clinical validation FDA will expect?
The IMDRF framework categorizes SaMD by the significance of information provided and the severity of the healthcare condition it addresses. Higher-risk categories require more rigorous analytical and clinical validation studies. FDA uses this categorization to calibrate its review expectations for premarket submissions involving AI/ML-enabled devices.
What are the post-market monitoring obligations for adaptive AI/ML medical devices under the FDA framework?
Manufacturers must implement a Total Product Lifecycle monitoring program that tracks real-world algorithm performance against prespecified thresholds. When performance drifts beyond accepted parameters, manufacturers must trigger deviation reviews and report algorithm-related adverse events through the FDA Medical Device Reporting process.
What AI/ML-specific documentation does FDA expect in a design history file for a premarket submission?
FDA expects artifacts such as model cards, performance metric summaries, training and test dataset descriptions, demographic representation analyses, and bias assessments. The April 2023 draft marketing submission guidance formalized these expectations, and CDRH reviewers will assess whether this documentation supports the safety and effectiveness claims made in the submission.