---
title: "Clinical Decision Support Agents: Where FDA Draws the Line in 2026"
description: "FDA's 2026 guidance on AI-based clinical decision support clarifies what is regulated software. What this means for builders and providers."
canonical: https://callsphere.ai/blog/clinical-decision-support-agents-fda-line-2026
category: "Vertical Solutions"
tags: ["Clinical AI", "FDA", "Healthcare AI", "CDS"]
author: "CallSphere Team"
published: 2026-04-25T00:00:00.000Z
updated: 2026-05-07T09:42:08.344Z
---

# Clinical Decision Support Agents: Where FDA Draws the Line in 2026

> FDA's 2026 guidance on AI-based clinical decision support clarifies what is regulated software. What this means for builders and providers.

## What's Regulated and What's Not

The FDA's 2024-2025 guidance on Clinical Decision Support (CDS) software, with 2026 refinements, draws clearer lines between regulated medical devices and unregulated tools. For AI-based CDS, the four-criteria test from the 21st Century Cures Act remains the framework. The 2026 refinement clarified how the test applies to LLM-based CDS specifically.

This piece walks through what the test means in practice and what 2026 deployments are doing on each side of the line.

## The Four-Criteria Test

```mermaid
flowchart TD
    A[1. Not for time-critical decisions] --> Pass1
    B[2. Displays supporting evidence] --> Pass1
    C[3. Independent review possible] --> Pass1
    D[4. Used by HCPs] --> Pass1
    Pass1{All four met?}
    Pass1 -->|Yes| NoFDA[Not regulated as a device]
    Pass1 -->|No| FDA[Regulated as medical device software]
```

If all four criteria are met, the software falls outside FDA's device definition. If any criterion fails, it is regulated as medical device software.

The 2026 refinement clarified that LLMs presenting evidence in summary form may still satisfy criterion 2 if the underlying source material is accessible. Earlier interpretations were stricter.
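
The four-criteria screen above can be sketched as a simple predicate. This is an illustrative model only, not an official FDA tool: the field names (`time_critical`, `shows_evidence`, etc.) are this sketch's shorthand for the criteria, and a real determination needs regulatory counsel.

```python
from dataclasses import dataclass

@dataclass
class CdsProfile:
    """Illustrative answers to the four-criteria test for one CDS product."""
    time_critical: bool              # 1. used for time-critical decisions?
    shows_evidence: bool             # 2. displays supporting evidence?
    independently_reviewable: bool   # 3. clinician can review the basis?
    hcp_user: bool                   # 4. intended user is a healthcare professional?

def is_non_device_cds(p: CdsProfile) -> bool:
    """All four criteria must pass for the software to fall outside the device definition."""
    return (not p.time_critical
            and p.shows_evidence
            and p.independently_reviewable
            and p.hcp_user)

# Example: a literature-summarization assistant with citations, used by an MD
lit_summary = CdsProfile(time_critical=False, shows_evidence=True,
                         independently_reviewable=True, hcp_user=True)
print(is_non_device_cds(lit_summary))  # → True
```

A single `False` anywhere flips the result, which mirrors the all-or-nothing nature of the statutory test.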

## What's Not Regulated (Examples)

- An LLM that summarizes medical literature for an MD with citations
- A tool that compiles a patient's chart for the MD to review
- Reference Q&A on dosing guidelines
- Documentation drafting (visit notes, discharge summaries) for MD review and signoff
- Patient-facing scheduling and triage that does not give clinical advice

These are widely deployed in 2026 across health systems.

## What Is Regulated (Examples)

- Software that generates a diagnostic recommendation without exposing the basis to the clinician
- Software that issues alerts in time-critical scenarios where the clinician cannot independently review
- Software that recommends specific treatments without supporting evidence
- Direct-to-patient diagnostic AI

These require FDA clearance (typically 510(k) or De Novo) and are subject to ongoing post-market obligations.

## The 2026 LLM-Specific Wrinkle

LLMs raise three specific issues that the 2024 guidance addressed and the 2026 refinement sharpened:

- **Confidence calibration**: an LLM giving a confident wrong answer is more dangerous than a low-confidence right one. FDA expects calibration evaluation.
- **Currency of training data**: medical knowledge evolves; clinical AI must stay current. FDA has signaled expectations for update cadence.
- **Performance drift**: LLMs can degrade as inputs shift. Post-market monitoring is expected for cleared devices.
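
One common way to quantify the calibration issue above is expected calibration error (ECE): bin predictions by model confidence and compare average confidence to actual accuracy in each bin. This is a generic sketch of that metric, not a method prescribed by FDA; the bin count and data are made up for the example.

```python
def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: weighted average of |confidence - accuracy| across confidence bins."""
    bins = [[] for _ in range(n_bins)]
    for conf, ok in zip(confidences, correct):
        idx = min(int(conf * n_bins), n_bins - 1)  # clamp conf == 1.0 into last bin
        bins[idx].append((conf, ok))
    n = len(confidences)
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        accuracy = sum(ok for _, ok in b) / len(b)
        ece += (len(b) / n) * abs(avg_conf - accuracy)
    return ece

# Toy data: the model is slightly underconfident on hits, overconfident on misses
conf = [0.9, 0.9, 0.1, 0.1]
hit  = [1, 1, 0, 0]
print(expected_calibration_error(conf, hit))  # → 0.1
```

A well-calibrated model drives this toward zero; tracking ECE over time is one concrete form the drift monitoring mentioned above can take.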

## What Deployments Look Like in 2026

```mermaid
flowchart TB
    NoFDA[Non-device deployments] --> Doc[Documentation assistance]
    NoFDA --> Lit[Literature summaries]
    NoFDA --> Adm[Administrative]
    Cleared[FDA-cleared] --> Diag[Specific diagnostic AI]
    Cleared --> Alert[Critical alerting]
    Cleared --> Tri[Triage automation]
```

Most deployed clinical LLM applications in 2026 fall on the non-device side: documentation help, literature search, administrative workflow. The cleared-device side has specific point solutions (radiology AI, sepsis prediction, etc.) that have gone through formal regulatory review.

## A Health System's Decision Framework

For a health system evaluating a clinical LLM application in 2026:

1. Apply the four-criteria test honestly
2. If non-device, check vendor-supplied evidence of clinical accuracy
3. Verify HIPAA, BAA, and data privacy
4. Confirm physician sign-off workflow before any clinical action
5. Plan for post-deployment monitoring of accuracy and clinical impact
6. Have a rollback path if quality degrades

If the product is device-cleared, ensure the deployment matches the cleared indications for use; deploying cleared software outside those indications creates regulatory exposure.
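
The six steps above amount to a gating checklist. A minimal sketch, with item names invented for illustration:

```python
# Hypothetical pre-deployment gates mirroring the six-step framework above.
CHECKLIST = [
    "four_criteria_test_applied",
    "vendor_accuracy_evidence_reviewed",
    "hipaa_baa_privacy_verified",
    "physician_signoff_workflow_confirmed",
    "post_deployment_monitoring_planned",
    "rollback_path_defined",
]

def readiness_gaps(completed: set) -> list:
    """Return the checklist items not yet satisfied, in framework order."""
    return [item for item in CHECKLIST if item not in completed]

# A deployment that has only cleared two gates still has four to go
gaps = readiness_gaps({"four_criteria_test_applied", "hipaa_baa_privacy_verified"})
print(len(gaps))  # → 4
```

The point of keeping it as an ordered list is that a deployment proceeds only when `readiness_gaps` comes back empty.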

## Liability Considerations

Even non-device CDS carries liability exposure when clinicians rely on wrong recommendations. The 2026 best practices:

- All clinical decisions remain with the clinician of record
- AI outputs are framed as suggestions with explicit "verify before acting" language
- Documentation of AI use in the EHR
- Vendor indemnification for AI errors (negotiated case-by-case)

## Patient-Facing Deployments

Direct-to-patient AI in clinical contexts is a different category. The 2026 deployments that are working:

- Symptom triage with explicit "this is not a diagnosis" framing and clear path to a clinician
- Medication reminders and adherence support
- Health-literacy explanations of clinician-prepared diagnoses
- Appointment scheduling and pre-visit intake

What's not deployed (or not working): autonomous symptom-to-diagnosis flows, medication recommendations without clinician review, and autonomous treatment planning. These cross the non-device/device boundary in ways that create both regulatory and patient-safety risk.

## What's Coming

- Specific FDA pathways for adaptive (continuously updating) AI
- Clearer expectations on post-market performance monitoring
- More CDS-cleared LLM applications
- State-level regulatory variation as some states adopt their own rules

## Sources

- FDA CDS guidance — [https://www.fda.gov/regulatory-information/search-fda-guidance-documents](https://www.fda.gov/regulatory-information/search-fda-guidance-documents)
- 21st Century Cures Act — [https://www.fda.gov](https://www.fda.gov)
- FDA AI/ML Action Plan — [https://www.fda.gov/medical-devices](https://www.fda.gov/medical-devices)
- "Predetermined Change Control Plan" FDA — [https://www.fda.gov](https://www.fda.gov)
- AMA AI policy — [https://www.ama-assn.org](https://www.ama-assn.org)

---

Source: https://callsphere.ai/blog/clinical-decision-support-agents-fda-line-2026
