Bun vs Node WebSocket Benchmarks for AI Agents in 2026
Bun's WebSocket implementation handles 1.2M concurrent connections vs Node's 680K on identical hardware. Where the gap is real, where it isn't, and the production tradeoffs.
Bun's default server is uWebSockets, a C++ library wired directly into the runtime. Node's usual WebSocket path is the ws package, JavaScript layered on top of the net module. In synthetic WebSocket benchmarks that gap is 2–4x. In your real app, it is closer to 5–15%. Both numbers are correct.
Why is Bun's WebSocket faster?
Because Bun's HTTP and WebSocket server is built directly on uWebSockets, the same C++ library that powers many high-end Node deployments via uWebSockets.js. There is no JavaScript object allocated per connection, no V8 closure overhead per send, and the engine is JavaScriptCore, which tends to show less GC pressure under sustained throughput than V8.
Concretely, Daniel Lemire's well-known benchmark and 2026 follow-ups show:
- HTTP request throughput: Bun ~52k req/s vs Node ~14k req/s on the same box.
- Concurrent WebSocket connections: Bun ~1.2M vs Node ~680k on identical hardware.
- Per-message latency (idle box): Bun ~0.4 ms vs Node ~0.9 ms.
These numbers describe the runtime. They do not describe your app, which usually spends 80% of its time in I/O to a database, an LLM provider, or another service. Real-world differences shrink to single-digit percent.
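The arithmetic behind that claim is worth making explicit. A minimal sketch, taking the per-message latencies above and an assumed 500 ms model round trip (both numbers are illustrative, not measurements):

```javascript
// Per-message runtime overhead from the benchmarks above (milliseconds).
const bunOverheadMs = 0.4;
const nodeOverheadMs = 0.9;

// Assumed model round trip; real values vary from 200 to 800 ms.
const modelLatencyMs = 500;

// End-to-end latency for one message that triggers an LLM call.
const bunTotal = modelLatencyMs + bunOverheadMs;
const nodeTotal = modelLatencyMs + nodeOverheadMs;

// The runtime gap as a share of what the user actually waits for.
const gapPercent = ((nodeTotal - bunTotal) / nodeTotal) * 100;
console.log(gapPercent.toFixed(2) + "%"); // about 0.10%
```

Once an LLM call sits on the hot path, a 2x runtime advantage collapses to a tenth of a percent of user-visible latency.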
Hear it before you finish reading
Talk to a live CallSphere AI voice agent in your browser — 60 seconds, no signup.
When does the difference matter?
It matters in three specific cases:
- Pure fan-out workloads — websocket gateway, presence service, live counters. CPU-bound on the runtime, not on I/O. Bun wins by 2–3x.
- Connection-density-bound deployments — you are trying to fit 200k connections per pod and Node OOMs at 80k. Bun lets you halve the pod count.
- Cold-start sensitive serverless — Bun starts in ~10 ms vs Node's ~80 ms. For Lambda or Cloud Run handling WebSocket upgrades, that matters.
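The connection-density case is simple arithmetic. A sketch using the per-pod ceilings above (80k before Node OOMs, an assumed 200k on Bun) against a hypothetical 1.2M-connection fleet:

```javascript
const targetConnections = 1_200_000; // hypothetical fleet size

// Assumed per-pod ceilings from the scenario above.
const nodePerPod = 80_000;
const bunPerPod = 200_000;

// Pods required to hold the whole fleet at a given per-pod ceiling.
const podsNeeded = (perPod) => Math.ceil(targetConnections / perPod);

console.log("Node pods:", podsNeeded(nodePerPod)); // 15
console.log("Bun pods:", podsNeeded(bunPerPod));   // 6
```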
It does not matter for a typical AI agent backend that calls OpenAI on every message. The model latency is 200–800 ms; the runtime gap is noise.
CallSphere's implementation
CallSphere runs Node.js + Socket.IO on the Sales Calling and After-hours dashboards because the ecosystem (PM2, OpenTelemetry, the entire Socket.IO room semantics) was already built. We evaluated Bun for the dashboard fan-out and found a 12% latency improvement and 30% memory reduction — real, but not enough to justify the migration cost given Socket.IO's adapter ecosystem.
We do run Bun for two specific services:
- Webhook gateway that receives Twilio, Stripe, and CRM callbacks, signs them, and forwards. Pure I/O fan-out, Bun is 2x faster end to end.
- The voice-bridge prototype for new verticals — easier to iterate, faster cold start, single binary deploy.
Both services contribute to the 90+ tools in the platform. The lesson: pick Bun where the workload is HTTP/WebSocket gateway-shaped, stay on Node where the ecosystem matters.
Code: equivalent WebSocket server in Bun
const server = Bun.serve({
  port: 8080,
  fetch(req, server) {
    // Attempt the WebSocket upgrade, attaching a session id to the connection.
    if (server.upgrade(req, { data: { sid: crypto.randomUUID() } })) return;
    return new Response("upgrade required", { status: 426 });
  },
  websocket: {
    // Every connection joins the "calls" topic on open.
    open(ws) { ws.subscribe("calls"); },
    // server.publish fans out to all subscribers, including the sender;
    // use ws.publish instead to exclude the sender.
    message(ws, msg) { server.publish("calls", msg); },
    close(ws) { ws.unsubscribe("calls"); },
  },
});
console.log(`listening on ${server.port}`);
Build steps
- Benchmark your actual workload, not a synthetic one. Use autocannon plus a custom WebSocket script that mirrors your message shape.
- If the workload is gateway-shaped (no DB, no AI), Bun is worth a 2-week trial.
- If the workload calls OpenAI or Postgres on every message, stay on Node — the runtime is not your bottleneck.
- Match Node test coverage before switching. Bun has near-Node compatibility but edge cases bite (some native modules, some test runners).
- Use uWebSockets.js on Node if you want most of the performance without leaving the ecosystem.
- Monitor RSS, event-loop lag, and per-message p99 latency before and after.
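For the p99 step, the percentile is easy to compute from raw per-message samples. A minimal sketch using the nearest-rank method; the latency samples are made up:

```javascript
// Nearest-rank percentile: sort samples, take the value at ceil(p/100 * n) - 1.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}

// Made-up per-message latencies in milliseconds.
const latencies = [0.4, 0.5, 0.4, 0.6, 12.0, 0.5, 0.4, 0.7, 0.5, 0.4];
console.log(percentile(latencies, 99)); // 12: one slow outlier dominates p99
```

This is why p99 matters more than the mean here: a single GC pause or slow send shows up at the tail long before it moves the average.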
FAQ
Does Bun support all Node WebSocket libraries? Most. ws works. Socket.IO works with caveats. uWebSockets.js is unnecessary because Bun ships uWS natively.
Still reading? Stop comparing — try CallSphere live.
CallSphere ships complete AI voice agents per industry — 14 tools for healthcare, 10 agents for real estate, 4 specialists for salons. See how it actually handles a call before you book a demo.
Is Bun's debugging story mature? As of 2026, yes — VS Code, Chrome DevTools, and OpenTelemetry support are stable.
Can I run Bun on AWS Lambda? Yes, via the Bun Lambda layer or container images.
What about long-term stability? Bun 1.x has shipped reliably since 2024. We trust it for stateless gateways; stateful long-running services we still default to Node.
Does Deno fit anywhere? It is competitive with Bun on raw throughput and stricter on permissions. We have not deployed Deno in production but evaluated it for short-running tasks.
CallSphere stitches Bun + Node + Python together across 37 agents and six verticals. Try the 14-day free trial at $149/$499/$1499 or book a demo.
Try CallSphere AI Voice Agents
See how AI voice agents work for your industry. Live demo available -- no signup required.