---
title: "Build an AI Voice Agent with SolidStart + SolidJS + OpenAI Realtime (2026)"
description: "SolidStart 1.3 + Solid 1.9 deliver fine-grained reactivity with no VDOM — voice agents render at 30% lower CPU than React. Plug WebRTC into Solid signals."
canonical: https://callsphere.ai/blog/vw8h-build-ai-voice-agent-solidjs-solidstart-realtime-2026
category: "AI Voice Agents"
tags: ["SolidJS", "SolidStart", "WebRTC", "Voice Agent", "Realtime"]
author: "CallSphere Team"
published: 2026-04-22T00:00:00.000Z
updated: 2026-05-07T22:23:19.371Z
---

# Build an AI Voice Agent with SolidStart + SolidJS + OpenAI Realtime (2026)

> SolidStart 1.3 + Solid 1.9 deliver fine-grained reactivity with no VDOM — voice agents render at 30% lower CPU than React. Plug WebRTC into Solid signals.

> **TL;DR** — Solid 1.9's fine-grained signals re-render only the exact DOM node that changed — perfect for voice agents that emit hundreds of transcript deltas per second. SolidStart 1.3 (Vite + Nitro) ships a clean place to mint ephemeral OpenAI keys.

## What you'll build

A SolidStart route that mints an ephemeral key, a SolidJS component that opens WebRTC to OpenAI Realtime, and a signal-driven transcript that rerenders only the changed token.

## Prerequisites

1. `solid-js@^1.9`, `@solidjs/start@^1.3`, `vinxi@^0.5`.
2. `OPENAI_API_KEY` in `.env`.
3. Node 20+.

## Architecture

```mermaid
flowchart LR
  S[Solid signal] --> UI[DOM node]
  UI --> SS[SolidStart /api/key]
  SS -- POST sessions --> OA1[OpenAI]
  OA1 --> SS --> UI
  UI -- WebRTC SDP --> OA2[OpenAI Realtime]
```

## Step 1 — API route

```ts
// src/routes/api/key.ts
import type { APIEvent } from "@solidjs/start/server";

export async function POST(_e: APIEvent) {
  const r = await fetch("https://api.openai.com/v1/realtime/sessions", {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
               "Content-Type": "application/json" },
    body: JSON.stringify({ model: "gpt-realtime", voice: "verse" }),
  });
  // Pass the upstream status through so client errors surface
  return new Response(await r.text(), {
    status: r.status,
    headers: { "Content-Type": "application/json" },
  });
}
```
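The component in Step 2 only reads `client_secret.value` from this payload. A small, hypothetical guard for that shape (the real session response carries more fields; `parseEphemeralKey` is not part of any SDK):

```typescript
// Only the fields the client actually uses; fail fast on a bad payload.
export type EphemeralKey = { client_secret: { value: string } };

export function parseEphemeralKey(json: unknown): EphemeralKey {
  const key = json as EphemeralKey;
  if (typeof key?.client_secret?.value !== "string")
    throw new Error("unexpected /api/key payload");
  return key;
}
```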

## Step 2 — Voice component

```tsx
// src/components/Voice.tsx
import { createSignal } from "solid-js";

export function Voice() {
  const [live, setLive] = createSignal(false);
  const [transcript, setTranscript] = createSignal("");
  let audioEl!: HTMLAudioElement;

  const start = async () => {
    // Mint an ephemeral key via the SolidStart route from Step 1
    const { client_secret } = await fetch("/api/key", { method: "POST" })
      .then((r) => r.json());

    const pc = new RTCPeerConnection();
    pc.ontrack = (e) => (audioEl.srcObject = e.streams[0]);
    const ms = await navigator.mediaDevices.getUserMedia({ audio: true });
    ms.getTracks().forEach((t) => pc.addTrack(t, ms));

    // Server events (transcript deltas, tool calls) arrive on this channel
    const dc = pc.createDataChannel("oai-events");
    dc.addEventListener("message", (e) => {
      const evt = JSON.parse(e.data);
      if (evt.type === "response.audio_transcript.delta")
        setTranscript((t) => t + evt.delta);
    });

    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    const ans = await fetch(
      "https://api.openai.com/v1/realtime?model=gpt-realtime",
      {
        method: "POST",
        body: offer.sdp,
        headers: {
          Authorization: `Bearer ${client_secret.value}`,
          "Content-Type": "application/sdp",
        },
      },
    );
    await pc.setRemoteDescription({ type: "answer", sdp: await ans.text() });
    setLive(true);
  };

  return (
    <>
      <audio ref={audioEl} autoplay />
      <button onClick={start}>{live() ? "Live" : "Talk"}</button>
      <p>{transcript()}</p>
    </>
  );
}
```

## Step 3 — Use it

```tsx
// src/routes/voice.tsx
import { Voice } from "~/components/Voice";
export default () => <Voice />;
```

## Step 4 — Tools via createResource

```tsx
import { createResource, createSignal } from "solid-js";

const [orderId, setOrderId] = createSignal<string>();
// Re-runs whenever orderId changes to a truthy value
const [order] = createResource(orderId, (id) =>
  fetch(`/api/orders/${id}`).then((r) => r.json()));
```

When OpenAI emits `response.function_call_arguments.done`, set the resource source to trigger the fetch, then reply on the data channel.
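A minimal sketch of that reply, assuming the data channel `dc` and the `setOrderId` setter from the snippets above. The event shapes follow OpenAI's Realtime data-channel protocol; the two builder helpers are hypothetical names for illustration:

```typescript
// Build the two client events that answer a completed tool call:
// first return the tool's output, then ask the model to keep talking.
export function buildToolOutput(callId: string, output: unknown) {
  return {
    type: "conversation.item.create",
    item: {
      type: "function_call_output",
      call_id: callId,
      output: JSON.stringify(output),
    },
  };
}

export function buildResponseCreate() {
  return { type: "response.create" };
}

// Inside the "message" listener (sketch):
// if (evt.type === "response.function_call_arguments.done") {
//   const args = JSON.parse(evt.arguments);
//   setOrderId(args.orderId);            // re-runs the createResource fetch
//   const data = await fetch(`/api/orders/${args.orderId}`).then((r) => r.json());
//   dc.send(JSON.stringify(buildToolOutput(evt.call_id, data)));
//   dc.send(JSON.stringify(buildResponseCreate()));
// }
```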

## Step 5 — Deploy

SolidStart's Nitro layer supports deployment presets such as `vercel`, `cloudflare-pages`, and `bun`. Pick one under `server.preset` in `app.config.ts` and deploy.
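A minimal sketch of the preset selection, using SolidStart's standard `app.config.ts` (preset names follow Nitro's conventions):

```typescript
// app.config.ts
import { defineConfig } from "@solidjs/start/config";

export default defineConfig({
  // Swap for "cloudflare-pages", "bun", or "node-server" as needed
  server: { preset: "vercel" },
});
```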

## Pitfalls

- **Reactivity is lost outside JSX**: `{transcript()}` inside JSX tracks automatically, but reading a signal in plain JS (or destructuring props) captures a one-time snapshot — keep the accessor call inside JSX, or wrap it: `() => transcript()`.
- **Stores vs signals**: For nested transcript history, use `createStore` not nested signals.
- **`ref`**: Solid's `ref` is a setter callback, not a `useRef`-like `{current}` object.

## How CallSphere does this in production

CallSphere prefers React in production (OneRoof on Next.js 16, Sales on React 18 + Vite), but Solid's performance characteristics make it a strong choice for embed widgets. The platform spans **37 agents · 90+ tools · 115+ DB tables · 6 verticals**, with Healthcare on FastAPI and Salon on NestJS 10 + Prisma alongside OneRoof. Plans run **$149/$499/$1,499** with a **14-day trial** and a **22% affiliate** program.

## FAQ

**Solid 1.9 stable?** Yes — Solid hit 1.0 in 2021, 1.9 in 2026 with perf wins.

**SolidStart 2.0?** Roadmap targeted late 2026; 1.3 is current GA as of May 2026.

**Is the bundle smaller than Svelte?** Comparable — Solid is ~7kb, Svelte 5 is ~6kb after compile.

**Vercel AI SDK support?** Yes — added Solid support in 2024.

## Sources

- SolidStart docs - [https://start.solidjs.com/](https://start.solidjs.com/)
- SolidStart 2026 guide - [https://www.johal.in/solidstart-solidjs-full-stack-vite-powered-ssr-2026/](https://www.johal.in/solidstart-solidjs-full-stack-vite-powered-ssr-2026/)
- This Month in Solid (1.3) - [https://dev.to/danieljcafonso/this-month-in-solid-the-road-to-20-is-here-solidstart-130-release-18c9](https://dev.to/danieljcafonso/this-month-in-solid-the-road-to-20-is-here-solidstart-130-release-18c9)
- OpenAI Realtime WebRTC - [https://developers.openai.com/api/docs/guides/realtime-webrtc](https://developers.openai.com/api/docs/guides/realtime-webrtc)

---

Source: https://callsphere.ai/blog/vw8h-build-ai-voice-agent-solidjs-solidstart-realtime-2026
