---
title: "Bun vs Node WebSocket Benchmarks for AI Agents in 2026"
description: "Bun's WebSocket implementation handles 1.2M concurrent connections vs Node's 680K on identical hardware. Where the gap is real, where it isn't, and the production tradeoffs."
canonical: https://callsphere.ai/blog/vw1c-bun-vs-node-websocket-benchmarks-2026
category: "AI Engineering"
tags: ["WebSockets", "Bun", "Node.js", "Scalability", "AI Engineering"]
author: "CallSphere Team"
published: 2026-04-19T00:00:00.000Z
updated: 2026-05-07T09:32:10.889Z
---

# Bun vs Node WebSocket Benchmarks for AI Agents in 2026

> Bun ships uWebSockets in C++ as its default server. Node.js ships JavaScript on top of `net`. In synthetic WebSocket benchmarks, that gap is 2–4x. In your real app, it is 5–15%. Both numbers are correct.

## Why is Bun's WebSocket faster?

Because Bun's HTTP and WebSocket server is built directly on uWebSockets — the same C++ library that powers many high-end Node deployments via `uWebSockets.js`. Connection state lives in native code rather than in per-connection JavaScript objects, there is no V8 closure overhead per send, and the engine is JavaScriptCore, which shows less GC pressure under sustained throughput than V8.

Concretely, Daniel Lemire's well-known benchmark and 2026 follow-ups show:

- HTTP request throughput: Bun ~52k req/s vs Node ~14k req/s on the same box.
- Concurrent WebSocket connections: Bun ~1.2M vs Node ~680k on identical hardware.
- Per-message latency (idle box): Bun ~0.4 ms vs Node ~0.9 ms.

These numbers describe the runtime. They do not describe your app, which usually spends 80% of its time in I/O to a database, an LLM provider, or another service. Real-world differences shrink to single-digit percent.

## When does the difference matter?

It matters in three specific cases:

1. **Pure fan-out workloads** — WebSocket gateway, presence service, live counters. CPU-bound on the runtime, not on I/O. Bun wins by 2–3x.
2. **Connection-density-bound deployments** — you are trying to fit 200k connections per pod and Node OOMs at 80k. Bun lets you halve the pod count.
3. **Cold-start sensitive serverless** — Bun starts in ~10 ms vs Node's ~80 ms. For Lambda or Cloud Run handling WebSocket upgrades, that matters.

It does not matter for a typical AI agent backend that calls OpenAI on every message. The model latency is 200–800 ms; the runtime gap is noise.

## CallSphere's implementation

CallSphere runs **Node.js + Socket.IO** on the Sales Calling and After-hours dashboards because the ecosystem (PM2, OpenTelemetry, the entire Socket.IO room semantics) was already built. We evaluated Bun for the dashboard fan-out and found a 12% latency improvement and 30% memory reduction — real, but not enough to justify the migration cost given Socket.IO's adapter ecosystem.

We do run **Bun** for two specific services:

- **Webhook gateway** that receives Twilio, Stripe, and CRM callbacks, signs them, and forwards. Pure I/O fan-out, Bun is 2x faster end to end.
- **The voice-bridge prototype** for new verticals — easier to iterate, faster cold start, single binary deploy.

Both services contribute to the [90+ tools](/pricing) in the platform. The lesson: pick Bun where the workload is HTTP/WebSocket gateway-shaped, stay on Node where the ecosystem matters.

## Code: equivalent WebSocket server in Bun

```typescript
// Pub/sub broadcast server on Bun's native WebSocket support.
type SocketData = { sid: string };

const server = Bun.serve<SocketData>({
  port: 8080,
  fetch(req, server) {
    // Attach a session id to the connection at upgrade time.
    if (server.upgrade(req, { data: { sid: crypto.randomUUID() } })) return;
    return new Response("upgrade required", { status: 426 });
  },
  websocket: {
    open(ws) { ws.subscribe("calls"); },
    // publish() fans the message out to every subscriber of "calls".
    message(ws, msg) { server.publish("calls", msg); },
    close(ws) { ws.unsubscribe("calls"); },
  },
});
console.log(`listening on ${server.port}`);
```

## Build steps

1. Benchmark your actual workload, not a synthetic one. Use `autocannon` plus a custom WebSocket script that mirrors your message shape.
2. If the workload is gateway-shaped (no DB, no AI), Bun is worth a 2-week trial.
3. If the workload calls OpenAI or Postgres on every message, stay on Node — the runtime is not your bottleneck.
4. Match Node test coverage before switching. Bun has near-Node compatibility but edge cases bite (some native modules, some test runners).
5. Use `uWebSockets.js` on Node if you want most of the performance without leaving the ecosystem.
6. Monitor RSS, event-loop lag, and per-message p99 latency before and after.

## FAQ

**Does Bun support all Node WebSocket libraries?** Most. `ws` works. Socket.IO works with caveats. `uWebSockets.js` is unnecessary because Bun ships uWS natively.

**Is Bun's debugging story mature?** As of 2026, yes — VS Code, Chrome DevTools, and OpenTelemetry support are stable.

**Can I run Bun on AWS Lambda?** Yes, via the Bun Lambda layer or container images.

**What about long-term stability?** Bun 1.x has shipped reliably since 2024. We trust it for stateless gateways; for stateful long-running services we still default to Node.

**Does Deno fit anywhere?** It is competitive with Bun on raw throughput and stricter on permissions. We have not deployed Deno in production but evaluated it for short-running tasks.

CallSphere stitches Bun + Node + Python together across [37 agents and six verticals](/pricing). [Try the 14-day free trial](/trial) at $149/$499/$1499 or [book a demo](/demo).

## Sources

- [Bun vs Node.js in 2026: Benchmarks & Migration Guide](https://strapi.io/blog/bun-vs-nodejs-performance-comparison-guide)
- [A simple WebSocket benchmark in JavaScript: Node.js versus Bun](https://lemire.me/blog/2023/11/25/a-simple-websocket-benchmark-in-javascript-node-js-versus-bun/)
- [Bun vs Node.js vs Deno: Production Performance Benchmarks 2026](https://www.askantech.com/bun-vs-nodejs-vs-deno-performance-benchmarks-2026/)
- [Socket.IO vs ws vs uWebSockets.js 2026](https://www.pkgpulse.com/guides/socketio-vs-ws-vs-uwebsockets-websocket-servers-nodejs-2026)
