The FDA AI/ML SaMD Framework: What Companies Need to Know Now

In a recent post on this blog, we examined how artificial intelligence (AI) and machine learning (ML) are reshaping the informed consent process in clinical trials—and why the regulatory and legal frameworks surrounding AI in that context are still catching up. That article focused on one corner of the clinical trial lifecycle. This one broadens the lens to medical devices, and in particular to software as a medical device (SaMD), a different but related category.
The Food and Drug Administration (FDA) AI-Enabled Medical Device List, which is not comprehensive, included more than 1,400 entries authorized for U.S. marketing as of December 2025. The vast majority are classified as Class II and entered the market through the 510(k) pathway. Some were authorized through the De Novo pathway, and a small number have received premarket approval (PMA) as Class III devices. The agency's regulatory approach to these devices—particularly its newly finalized framework for Predetermined Change Control Plans—warrants a shift from passive monitoring to active planning.
From Action Plan to Final Guidance
The FDA's journey toward a coherent regulatory framework for AI/ML-enabled medical devices has been iterative and deliberate. In January 2021, the agency published its "Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan," committing to a total product lifecycle (TPLC) approach that would accommodate devices capable of learning and adapting after deployment. That plan introduced the concept of an Algorithm Change Protocol—a mechanism for manufacturers to pre-specify the types of modifications their devices might undergo.
The concept matured. In October 2021, the FDA, Health Canada, and the United Kingdom's Medicines and Healthcare products Regulatory Agency (MHRA) jointly published "Good Machine Learning Practice for Medical Device Development: Guiding Principles," which the International Medical Device Regulators Forum (IMDRF) later adopted in final form in January 2025. Then, in January 2025, the FDA issued two draft guidances on consecutive days: "Artificial Intelligence-Enabled Device Software Functions: Lifecycle Management and Marketing Submission Recommendations" and "Artificial Intelligence and Machine Learning in Drug and Biological Product Development" (the latter establishing a risk-based credibility framework drawing on the agency's experience reviewing more than 500 regulatory submissions with AI components since 2016).
The capstone had actually arrived a month earlier, in December 2024, when the FDA published its final guidance, "Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence-Enabled Device Software Functions." The FDA updated that guidance in August 2025 with expanded implementation detail. For some teams, it may be the most operationally significant piece of the framework to date.
Understanding the PCCP
A Predetermined Change Control Plan (PCCP) is a documented plan submitted alongside a device's initial marketing application—whether a 510(k), De Novo request, or PMA—that describes specific modifications the manufacturer intends to make after authorization, the methodology for developing, validating, and implementing those modifications, and an assessment of their impact on safety and effectiveness. The statutory authority comes from Section 515C of the Federal Food, Drug, and Cosmetic Act, added by the Food and Drug Omnibus Reform Act of 2022.
The practical effect is significant. With an authorized PCCP, a manufacturer may implement the specified AI/ML updates—retraining on new data, adjusting algorithmic thresholds, refining model architecture within defined bounds—without filing a new premarket submission for each change. This addresses the fundamental tension that the FDA itself has acknowledged: the agency's traditional device review paradigm, built for static products, was not designed for adaptive AI and machine learning technologies.
But a PCCP is not a blank check. Changes that fall outside the plan’s scope, or that fail to meet its validation criteria, still require conventional submissions. Implementing an unauthorized modification can render a device adulterated or misbranded under the Federal Food, Drug, and Cosmetic Act (FDCA). PCCP implementation must also occur within the manufacturer’s quality management system under FDA’s Quality Management System Regulation (QMSR) (21 C.F.R. Part 820, effective February 2, 2026), which incorporates ISO 13485 by reference. For manufacturers operating in both U.S. and EU markets, that alignment can reduce friction in quality-system compliance. Even so, incorporation by reference does not displace FDA-specific requirements or FDA’s enforcement authority under the FDCA.
There is an additional statutory constraint that carries strategic significance. Under Section 515C(c), a device that has been modified pursuant to an authorized PCCP may not serve as a predicate device in its modified form. Only the version of the device cleared or approved prior to any PCCP-authorized changes may be referenced in a 510(k) comparison. Once a manufacturer independently clears the modified version through a subsequent marketing submission, it becomes eligible as a predicate—but not before. For companies, this restriction has implications beyond regulatory sequencing. It affects competitive positioning, since competitors cannot cite the PCCP-evolved version of a device as a predicate. It affects portfolio planning, since manufacturers must track which device versions are PCCP-modified versus independently cleared. And it affects intellectual property strategy, since the pre-modification version remains the public reference point in the regulatory record.
The CDS Boundary Question
Before the PCCP framework even comes into play, companies face a threshold classification question: is the AI-enabled software function at issue a regulated medical device, or does it qualify as exempt clinical decision support (CDS) software?
Under Section 520(o)(1)(E) of the FDCA, as amended by the 21st Century Cures Act, certain CDS software functions are excluded from the definition of a medical device if they meet four statutory criteria—including that the software is intended to enable a health care professional (HCP) to independently review the basis for the recommendation. The FDA issued revised final guidance on this exemption in January 2026 ("Clinical Decision Support Software: Guidance for Industry and Food and Drug Administration Staff"), clarifying its current thinking on where the boundary falls.
The practical stakes are high. Software that qualifies as non-device CDS is not subject to the FDA's premarket review requirements and has no need for a PCCP. Software that crosses the line into regulated SaMD triggers the full apparatus—510(k) or De Novo submissions, PCCP planning, labeling obligations, post-market surveillance, and cybersecurity compliance. Many AI-enabled clinical tools—particularly those that synthesize patient data and present risk assessments or treatment recommendations to clinicians—sit close to this boundary. As AI models become more autonomous and less transparent in their reasoning, the fourth criterion becomes the fault line. The FDA's recent CDS guidance underscores that the ability of an HCP to independently review the basis for a recommendation is central; more opaque approaches may face challenges meeting that criterion depending on how the basis is communicated.
For legal teams, the classification analysis is a threshold question that logically precedes any PCCP strategy discussion—and one worth revisiting as the company's AI capabilities evolve.
Why This Matters
For life sciences and medical device companies, the PCCP framework touches multiple practice areas simultaneously.
Regulatory strategy. Whether to include a PCCP in a marketing submission is both a regulatory and a business decision. A well-drafted PCCP can accelerate a company's ability to improve its AI-enabled device iteratively, but it also commits the company to specific methodologies and validation protocols that the FDA will expect to see followed. Early legal involvement in defining the scope and boundaries of any PCCP helps ensure the plan is defensible, auditable, and aligned with the company's product roadmap.
Product liability exposure. An emerging body of case law is beginning to shape the liability landscape for AI-enabled medical devices. In Dickson v. Dexcom, Inc. (W.D. La. 2024), a federal court held—for the first time—that the Medical Device Amendments expressly preempt state product-liability claims against a Class II device brought to market through the De Novo pathway. Given that nearly all FDA-authorized AI/ML devices are Class II—and that the De Novo pathway is increasingly popular for novel AI applications—this decision has meaningful implications for manufacturers' preemption defenses.
At the same time, there is still virtually no direct case law on liability when a patient is harmed by a medical AI system. Traditional product liability theories—manufacturing defect, design defect, failure to warn—were built for products that remain static after leaving the manufacturer's facility. AI devices that learn and adapt post-deployment challenge each of these theories in fundamental ways. Courts have yet to resolve whether AI-enabled software is even a "product" for purposes of strict liability, although recent decisions suggest this question is evolving in the manufacturer's disfavor.
Post-market obligations. The PCCP framework does not relieve manufacturers of post-market responsibilities. The FDA's PCCP guidance, as updated in August 2025, calls for device labeling to inform users when a device has an authorized PCCP. As updates are implemented, labeling should explain what changed and how the device should be used safely. Public-facing summaries should include high-level PCCP descriptions, and new unique device identifiers (UDIs) may be required. These labeling and disclosure expectations create both compliance obligations and potential litigation touchpoints.
Vendor and third-party risk. Many AI-enabled medical devices rely on third-party components—training data, cloud infrastructure, algorithm modules. The FDA's framework holds the device manufacturer accountable for the full product, regardless of what was developed in-house versus outsourced. Vendor agreements that do not address PCCP compliance, data governance, and validation requirements can therefore leave the manufacturer exposed.
Cybersecurity: No Longer Advisory
The FDA now treats cybersecurity as a core component of device safety and effectiveness, particularly for software-enabled products. On February 3, 2026, the FDA issued final guidance titled "Cybersecurity in Medical Devices: Quality Management System Considerations and Content of Premarket Submissions," which supersedes the June 27, 2025 version and sets out the agency's current recommendations—including documentation expectations that support compliance with the statutory requirements for certain "cyber devices" under FDCA Section 524B. For devices that meet the cyber device definition, missing required elements can materially affect FDA's review and authorization determinations.
For AI-enabled SaMD, the intersection with PCCPs is practical and immediate: PCCP change management should be designed so that cybersecurity risk management, testing and validation, and documentation are treated as gating steps for each update cycle. Many cloud-connected AI products, remote monitoring tools, and over-the-air update models will meet the statutory "cyber device" criteria (software, internet connectivity, and vulnerability potential), which makes software bill of materials (SBOM) practices, coordinated vulnerability disclosure planning, and lifecycle vulnerability management central not only to postmarket operations but also to what the FDA expects to see reflected in premarket submissions and lifecycle controls.
Still More to Consider
The through-line across PCCPs, CDS classification, post-market labeling, and cybersecurity is that AI devices are regulated as living systems—and companies are expected to manage them as such. The winners won’t be those who treat updates as an engineering afterthought; they’ll be those who build a lifecycle governance model that legal, regulatory, quality, and product teams can defend under audit and in litigation.
In practice, the FDA's PCCP framework establishes the regulatory foundation, but it is only one layer of a broader compliance challenge. In a later article, we will zoom further out to examine the converging, fast-forming patchwork of obligations that extend beyond the FDA and increasingly determine whether an AI-enabled product is not just innovative, but durable.
Authored by Phillip Skaggs, Berkley Life Sciences, Vice President & Chief Legal & Regulatory Affairs Officer
This post is for general informational purposes only and is not intended as legal or other professional advice.