
SQLAlchemy Sessions: Production Database-Backed Agent Memory

Use SQLAlchemySession with PostgreSQL and asyncpg for production-grade persistent agent memory including connection pooling, auto table creation, and migration strategies.

When You Need a Real Database Behind Your Agent

SQLite is great for prototypes. Redis is great for ephemeral, high-speed access. But many production systems need agent memory stored in a real relational database — one that supports ACID transactions, complex queries against session data, backup and recovery, and integration with your existing data infrastructure.

The OpenAI Agents SDK provides SQLAlchemySession for exactly this case. It works with any database SQLAlchemy supports — PostgreSQL, MySQL, MariaDB, and SQLite — through async drivers.

Installation

Install the SQLAlchemy extension:

pip install "openai-agents[sqlalchemy]"

For PostgreSQL with the recommended async driver:

pip install asyncpg

For MySQL:

pip install aiomysql

SQLAlchemySession.from_url() with PostgreSQL

The fastest way to get started is the from_url() factory method:

from agents.extensions.memory import SQLAlchemySession

# PostgreSQL with the asyncpg driver; the first argument is the session ID
session = SQLAlchemySession.from_url(
    "user-789",
    url="postgresql+asyncpg://user:password@localhost:5432/myapp",
    create_tables=True,
)

The create_tables=True parameter tells the session to create its required tables on first connection. In production, you will want to manage this through migrations instead.

Full Connection Example

import asyncio
from agents import Agent, Runner
from agents.extensions.memory import SQLAlchemySession

agent = Agent(
    name="DatabaseAgent",
    instructions="You are an assistant with persistent memory backed by PostgreSQL.",
)

async def main():
    # The first argument is the session ID; conversation history is scoped to it
    session = SQLAlchemySession.from_url(
        "user-789",
        url="postgresql+asyncpg://postgres:[email protected]:5432/agents",
        create_tables=True,
    )

    result = await Runner.run(
        agent,
        "Remember that my project deadline is March 30th.",
        session=session,
    )
    print(result.final_output)

    result = await Runner.run(
        agent,
        "When is my project deadline?",
        session=session,
    )
    print(result.final_output)  # References March 30th

asyncio.run(main())

Async Database Drivers

SQLAlchemySession uses SQLAlchemy's async engine, which requires an async-compatible database driver. Here are the recommended drivers by database:

Database     Driver             Connection URL Prefix
PostgreSQL   asyncpg            postgresql+asyncpg://
PostgreSQL   psycopg (async)    postgresql+psycopg://
MySQL        aiomysql           mysql+aiomysql://
SQLite       aiosqlite          sqlite+aiosqlite://
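A note on credentials: special characters in the username or password must be percent-encoded before being embedded in any of these URLs. A minimal stdlib sketch (`build_dsn` is a hypothetical helper for illustration, not part of the SDK):

```python
from urllib.parse import quote_plus

def build_dsn(driver: str, user: str, password: str,
              host: str, port: int, db: str) -> str:
    """Assemble an async SQLAlchemy URL, percent-encoding the
    credentials so characters like '@' or ':' cannot break parsing."""
    return f"{driver}://{quote_plus(user)}:{quote_plus(password)}@{host}:{port}/{db}"

print(build_dsn("postgresql+asyncpg", "app", "p@ss:w0rd", "db.internal", 5432, "agents"))
# postgresql+asyncpg://app:p%40ss%3Aw0rd@db.internal:5432/agents
```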

asyncpg is a high-performance async PostgreSQL driver written in Cython. It is significantly faster than psycopg for most workloads:


# asyncpg: the fastest option
session = SQLAlchemySession.from_url(
    "user-789",
    url="postgresql+asyncpg://user:pass@host:5432/db",
    create_tables=True,
)

Using aiosqlite for Development

During development, you can use SQLAlchemy-backed SQLite for easy local testing:

# Local dev with SQLite
session = SQLAlchemySession.from_url(
    "user-789",
    url="sqlite+aiosqlite:///./dev_sessions.db",
    create_tables=True,
)

This lets you develop against the same SQLAlchemySession interface you use in production, with a local database that needs no separate server.
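One way to keep that dev/prod symmetry explicit is to resolve the URL from the environment. A minimal sketch, assuming an AGENT_DB_URL variable of your own choosing (not an SDK convention):

```python
import os

def database_url() -> str:
    """Resolve the session-store URL from the environment, falling
    back to a local SQLite file for development."""
    return os.environ.get(
        "AGENT_DB_URL",
        "sqlite+aiosqlite:///./dev_sessions.db",
    )

# In production, set AGENT_DB_URL=postgresql+asyncpg://... and the
# same code path runs against PostgreSQL.
```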

Auto-Setup with create_tables=True

When create_tables=True is set, SQLAlchemySession automatically creates its required tables if they do not exist. Exact table and column names vary by SDK version, but the schema typically includes:

  • A session_items table with columns for session_id, item_data (JSON), item_order, and created_at
  • An index on session_id for fast retrieval
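To make that shape concrete, here is an illustrative DDL sketch exercised against in-memory SQLite. It mirrors only the description above; the authoritative schema is whatever your installed SDK version emits:

```python
import json
import sqlite3

# Illustrative only: mirrors the schema described above, not the
# SDK's actual DDL, which may differ by version.
DDL = """
CREATE TABLE session_items (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    session_id TEXT NOT NULL,
    item_data TEXT NOT NULL,      -- JSON payload
    item_order INTEGER NOT NULL,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX ix_session_items_session ON session_items (session_id);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute(
    "INSERT INTO session_items (session_id, item_data, item_order) VALUES (?, ?, ?)",
    ("user-789", json.dumps({"role": "user", "content": "hi"}), 0),
)
rows = conn.execute(
    "SELECT item_data FROM session_items WHERE session_id = ? ORDER BY item_order",
    ("user-789",),
).fetchall()
```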

# First run: tables are created automatically
session = SQLAlchemySession.from_url(
    "user-789",
    url="postgresql+asyncpg://user:pass@host/db",
    create_tables=True,
)

This is convenient for prototyping, but in a real production environment you should manage schema through migrations.
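A common guard (a sketch; APP_ENV is a name chosen here, not an SDK setting) is to let only non-production environments auto-create tables:

```python
import os

def should_create_tables() -> bool:
    """Auto-create tables everywhere except production, where the
    schema is owned by migrations instead."""
    return os.environ.get("APP_ENV", "development") != "production"
```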

Connection Pooling

SQLAlchemy's async engine supports connection pooling out of the box. For high-throughput agent workloads, tuning the pool is essential.


Configuring Pool Parameters

from sqlalchemy.ext.asyncio import create_async_engine
from agents.extensions.memory import SQLAlchemySession

engine = create_async_engine(
    "postgresql+asyncpg://user:pass@host:5432/db",
    pool_size=20,           # Base pool connections
    max_overflow=10,        # Additional connections under load
    pool_timeout=30,        # Seconds to wait for a connection
    pool_recycle=3600,      # Recycle connections after 1 hour
    pool_pre_ping=True,     # Verify connections before use
)

# Pass the tuned engine directly to the session constructor
session = SQLAlchemySession("user-789", engine=engine, create_tables=True)

Pool Sizing Guidelines

A good starting formula: pool_size = num_workers * 2. If you have 4 uvicorn workers, start with a pool size of 8. Monitor connection wait times and adjust.

# For a FastAPI app with 4 workers
engine = create_async_engine(
    "postgresql+asyncpg://user:pass@host:5432/db",
    pool_size=8,
    max_overflow=4,
    pool_pre_ping=True,
)
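The sizing rule above is easy to encode as a helper that derives starting parameters from the worker count (a hypothetical convenience, not an SDK utility):

```python
def pool_settings(num_workers: int, overflow_ratio: float = 0.5) -> dict:
    """Starting pool parameters from the rule of thumb
    pool_size = num_workers * 2, with overflow headroom for bursts."""
    pool_size = num_workers * 2
    return {
        "pool_size": pool_size,
        "max_overflow": max(1, int(pool_size * overflow_ratio)),
        "pool_pre_ping": True,
    }

print(pool_settings(4))
# {'pool_size': 8, 'max_overflow': 4, 'pool_pre_ping': True}
```

The result can be splatted into create_async_engine(url, **pool_settings(4)) and adjusted as you monitor connection wait times.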

Migration Strategies

For production deployments, use Alembic to manage the session table schema instead of create_tables=True.

Step 1: Generate the Initial Migration

Create the tables manually in an Alembic migration, checking column names against what your installed SDK version actually creates (run create_tables=True once against a dev database and inspect the result):

"""create agent session tables

Revision ID: 001_agent_sessions
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import JSONB

def upgrade():
    op.create_table(
        "session_items",
        sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
        sa.Column("session_id", sa.String(255), nullable=False, index=True),
        sa.Column("item_data", JSONB, nullable=False),
        sa.Column("item_order", sa.Integer, nullable=False),
        sa.Column("created_at", sa.DateTime, server_default=sa.func.now()),
    )
    op.create_index(
        "ix_session_items_session_order",
        "session_items",
        ["session_id", "item_order"],
    )

def downgrade():
    op.drop_table("session_items")

Step 2: Use the Session Without Auto-Creation

# Production: tables managed by Alembic
session = SQLAlchemySession.from_url(
    "user-789",
    url="postgresql+asyncpg://user:pass@host:5432/db",
    create_tables=False,  # Explicitly disable
)

Step 3: Future Schema Changes

If the SDK updates its schema requirements, create a new Alembic migration to add columns or indexes rather than relying on create_tables=True, which only handles initial creation.

Complete Production Setup

Here is a full FastAPI application with SQLAlchemy-backed agent sessions:

from contextlib import asynccontextmanager
from fastapi import FastAPI
from sqlalchemy.ext.asyncio import create_async_engine
from agents import Agent, Runner
from agents.extensions.memory import SQLAlchemySession

engine = create_async_engine(
    "postgresql+asyncpg://user:pass@db:5432/agents",
    pool_size=10,
    max_overflow=5,
    pool_pre_ping=True,
)

agent = Agent(
    name="ProdAgent",
    instructions="You are a production assistant with database-backed memory.",
)

@asynccontextmanager
async def lifespan(app: FastAPI):
    yield
    # Return pooled connections cleanly on shutdown
    await engine.dispose()

app = FastAPI(lifespan=lifespan)

@app.post("/chat")
async def chat(session_id: str, message: str):
    # Session objects are lightweight: create one per request and let
    # them all share the pooled engine
    session = SQLAlchemySession(session_id, engine=engine, create_tables=False)
    result = await Runner.run(agent, message, session=session)
    return {"response": result.final_output}

This gives you connection pooling, proper lifecycle management, and all the durability guarantees of PostgreSQL.

