ion semantics. The following architecture demonstrates a scalable pattern using the App Router, Web Streams API, and a reusable React hook.
Step 1: Server-Side Stream Handler
The server route must return a ReadableStream with strict SSE-compliant headers. The stream generator should respect client disconnect signals, emit properly formatted events, and include metadata for recovery.
```ts
// app/api/stream/[channel]/route.ts
import { NextRequest, NextResponse } from 'next/server';

export const runtime = 'nodejs';
export const dynamic = 'force-dynamic';

export async function GET(
  request: NextRequest,
  { params }: { params: { channel: string } }
) {
  const channel = params.channel;
  // Resume from the client's last acknowledged event, guarding against
  // a missing or malformed header value.
  const lastEventId = Number.parseInt(
    request.headers.get('last-event-id') ?? '0',
    10
  );
  let eventCounter = Number.isFinite(lastEventId) ? lastEventId : 0;

  const stream = new ReadableStream({
    start(controller) {
      const encoder = new TextEncoder();

      const pushEvent = (type: string, payload: Record<string, unknown>) => {
        eventCounter++;
        const idLine = `id: ${eventCounter}\n`;
        const eventLine = `event: ${type}\n`;
        const dataLine = `data: ${JSON.stringify(payload)}\n\n`;
        controller.enqueue(encoder.encode(idLine + eventLine + dataLine));
      };

      // Simulate an async data source (Redis Pub/Sub, DB listener, message queue)
      const dataInterval = setInterval(() => {
        pushEvent('status_update', {
          channel,
          timestamp: Date.now(),
          progress: Math.min(100, eventCounter * 5),
        });
      }, 2000);

      // Graceful teardown on client disconnect
      request.signal.addEventListener('abort', () => {
        clearInterval(dataInterval);
        controller.close();
      });
    },
  });

  return new NextResponse(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform, must-revalidate',
      'Connection': 'keep-alive',
      'X-Accel-Buffering': 'no',
    },
  });
}
```
Architecture Rationale:
- `ReadableStream` provides backpressure-aware streaming compatible with both the Node.js and Edge runtimes.
- The `id:` field is mandatory for production use. It enables the browser to send `Last-Event-ID` on reconnect, allowing the server to replay missed events.
- `X-Accel-Buffering: no` prevents Nginx and compatible proxies from buffering the response, which would defeat the real-time nature of the stream.
- `request.signal` ties the stream lifecycle to the HTTP request, preventing memory leaks when clients navigate away or lose connectivity.
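Because `Last-Event-ID` arrives as an arbitrary string, a parse guard keeps a malformed value from seeding the counter with `NaN`. A minimal sketch; `parseLastEventId` is an illustrative helper, not a Next.js API:

```ts
// Defensively parse the Last-Event-ID header into a usable counter seed.
// Falls back to 0 for missing, non-numeric, or negative values.
function parseLastEventId(header: string | null): number {
  const parsed = Number.parseInt(header ?? '0', 10);
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : 0;
}
```

In the route handler this replaces the raw `parseInt` call: `let eventCounter = parseLastEventId(request.headers.get('last-event-id'));`.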
Step 2: Client-Side Consumer with Custom Hook
Wrapping EventSource in a custom hook abstracts connection state, error handling, and cleanup while preserving native reconnection behavior.
```ts
// hooks/useServerPush.ts
import { useEffect, useRef, useCallback, useState } from 'react';

interface SSEOptions {
  onMessage?: (event: string, data: unknown) => void;
  onError?: (error: Event) => void;
  onOpen?: () => void;
}

export function useServerPush(url: string, options: SSEOptions = {}) {
  const sourceRef = useRef<EventSource | null>(null);
  // Keep the latest callbacks in a ref so a fresh options object on every
  // render doesn't invalidate `connect` and force a reconnect loop.
  const optionsRef = useRef(options);
  optionsRef.current = options;
  const [isConnected, setIsConnected] = useState(false);

  const connect = useCallback(() => {
    if (sourceRef.current?.readyState === EventSource.OPEN) return;

    const source = new EventSource(url);
    sourceRef.current = source;

    source.onopen = () => {
      setIsConnected(true);
      optionsRef.current.onOpen?.();
    };

    // Fires only for events sent without an explicit `event:` field.
    source.onmessage = (event) => {
      try {
        const parsed = JSON.parse(event.data);
        optionsRef.current.onMessage?.(event.type, parsed);
      } catch {
        console.warn('Invalid SSE payload:', event.data);
      }
    };

    // Named events bypass onmessage and need their own listener.
    source.addEventListener('status_update', (event) => {
      try {
        const parsed = JSON.parse((event as MessageEvent).data);
        optionsRef.current.onMessage?.('status_update', parsed);
      } catch {
        console.warn('Invalid SSE payload:', (event as MessageEvent).data);
      }
    });

    source.onerror = (error) => {
      setIsConnected(false);
      optionsRef.current.onError?.(error);
    };
  }, [url]);

  useEffect(() => {
    connect();
    return () => {
      sourceRef.current?.close();
      sourceRef.current = null;
      setIsConnected(false);
    };
  }, [connect]);

  return { isConnected };
}
```
Architecture Rationale:
- `useRef` prevents stale-closure issues during re-renders while maintaining a stable reference to the `EventSource` instance.
- Custom event listeners (`addEventListener('status_update', ...)`) allow typed routing of different payload types without funneling everything through `onmessage`.
- The hook returns connection state, enabling UI components to render loading indicators or fallback states when the stream drops.
- Cleanup in `useEffect` guarantees the connection terminates when the component unmounts or the URL changes, preventing orphaned network requests.
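The typed-routing idea can be factored out of components entirely. A minimal sketch of a dispatcher that pairs with the hook's `onMessage` callback; `createEventRouter` is an illustrative name, not a library API:

```ts
// Map event names to handlers so components register intent, not parsing.
type Handler = (payload: unknown) => void;

function createEventRouter(handlers: Record<string, Handler>) {
  // The returned function matches the hook's onMessage(event, data) shape.
  // It reports whether the event was recognized, so callers can log misses.
  return (event: string, data: unknown): boolean => {
    const handler = handlers[event];
    if (!handler) return false; // unknown event types are ignored
    handler(data);
    return true;
  };
}
```

A component would then pass the router straight through, e.g. `useServerPush('/api/stream/jobs', { onMessage: createEventRouter({ status_update: updateProgress }) })`, where the URL and `updateProgress` are placeholders.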
Pitfall Guide
1. Proxy Response Buffering
Explanation: Reverse proxies like Nginx, Apache, and cloud CDNs often buffer responses to improve throughput. When buffering is enabled, SSE chunks accumulate until the buffer fills or the connection closes, destroying real-time delivery.
Fix: Always include `X-Accel-Buffering: no` and `Cache-Control: no-cache, no-transform`. Verify the proxy configuration disables `proxy_buffering` for SSE routes.
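Since these headers must be identical on every SSE route, centralizing them avoids one route silently missing a directive. A minimal sketch; `sseHeaders` is an illustrative helper:

```ts
// Single source of truth for the anti-buffering header set used by
// every SSE route handler in the app.
function sseHeaders(): Record<string, string> {
  return {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache, no-transform, must-revalidate',
    Connection: 'keep-alive',
    // Tells Nginx and compatible proxies not to buffer this response.
    'X-Accel-Buffering': 'no',
  };
}
```

A route handler would then return `new NextResponse(stream, { headers: sseHeaders() })`.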
2. Omitting Event IDs
Explanation: Without id: fields, the browser cannot track which events were successfully received. Network interruptions cause data loss, and the client cannot request missed messages upon reconnect.
Fix: Increment a monotonic counter or use UUIDs. Send `id: <value>\n` immediately before the `data:` line for every event.
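Replay on reconnect also requires the server to retain recent events. A minimal in-memory sketch; in production the buffer would live somewhere durable such as Redis Streams, and `EventRecord`/`ReplayBuffer` are illustrative names:

```ts
// A bounded buffer of recently sent events, keyed by monotonic id.
interface EventRecord {
  id: number;
  type: string;
  data: string;
}

class ReplayBuffer {
  private events: EventRecord[] = [];
  constructor(private capacity = 1000) {}

  push(record: EventRecord): void {
    this.events.push(record);
    // Evict the oldest event once capacity is exceeded.
    if (this.events.length > this.capacity) this.events.shift();
  }

  // Everything the client missed, based on its Last-Event-ID value.
  since(lastEventId: number): EventRecord[] {
    return this.events.filter((e) => e.id > lastEventId);
  }
}

// Serialize a record into SSE wire format: id, event, then data.
function toWireFormat(record: EventRecord): string {
  return `id: ${record.id}\nevent: ${record.type}\ndata: ${record.data}\n\n`;
}
```

On reconnect, the route handler would flush `buffer.since(lastEventId)` through `toWireFormat` before resuming live pushes.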
3. Blocking the Event Loop in Stream Generators
Explanation: Synchronous operations inside the ReadableStream start callback or interval handlers block the Node.js event loop, preventing other requests from being processed and causing timeout cascades.
Fix: Use async iterators, message queues, or non-blocking I/O. Offload heavy computation to worker threads or background jobs before pushing to the stream.
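One non-blocking pattern is to drive the stream from an async iterator via the `pull()` callback, which the runtime only invokes when the consumer is ready. A sketch, with `source()` standing in for a real queue consumer:

```ts
// Stand-in for a non-blocking data source (e.g. awaiting queue messages).
async function* source(count: number): AsyncGenerator<string> {
  for (let i = 1; i <= count; i++) {
    await new Promise((resolve) => setTimeout(resolve, 10));
    yield `data: ${JSON.stringify({ seq: i })}\n\n`;
  }
}

function streamFromIterator(iterator: AsyncGenerator<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    // pull() runs once per consumer read, giving backpressure for free
    // and leaving no timers to leak on disconnect.
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) controller.close();
      else controller.enqueue(encoder.encode(value));
    },
    // Cancellation (client gone) finalizes the generator.
    cancel() {
      iterator.return?.(undefined);
    },
  });
}
```

Compared with the `setInterval` approach, nothing is produced while the consumer is slow, so a stalled client cannot pile up unsent chunks in memory.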
4. Ignoring req.signal for Teardown
Explanation: When a client navigates away or loses connectivity, the HTTP request terminates, but the server-side interval or listener continues running. This causes memory leaks and unnecessary resource consumption.
Fix: Always attach an abort listener to `request.signal` that clears intervals, unsubscribes from pub/sub channels, and calls `controller.close()`.
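The abort handler tends to accumulate cleanup duties (intervals, pub/sub subscriptions, the controller itself). A small sketch that registers them all in one place; `makeTeardown` is an illustrative helper:

```ts
// Collect teardown tasks and run them all, once, when the signal aborts.
function makeTeardown(signal: AbortSignal): (task: () => void) => void {
  const tasks: Array<() => void> = [];
  signal.addEventListener(
    'abort',
    () => {
      for (const task of tasks) task();
    },
    { once: true }
  );
  return (task: () => void) => tasks.push(task);
}
```

Inside `start(controller)` this reads as `const register = makeTeardown(request.signal); register(() => clearInterval(dataInterval)); register(() => controller.close());`, so no cleanup step can be forgotten when a new resource is added.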
5. Assuming Bidirectional Communication
Explanation: Developers sometimes attempt to send client-to-server data through the SSE channel, which is technically impossible. SSE is strictly unidirectional.
Fix: Use standard REST, GraphQL, or WebRTC data channels for client-to-server mutations. Keep SSE strictly for server pushes.
6. Incorrect Cache Directives
Explanation: Missing or incorrect cache directives allow intermediate proxies or CDNs to cache the stream response, returning stale data to subsequent clients or blocking new connections.
Fix: Set `Cache-Control: no-cache, no-transform, must-revalidate`. Avoid `public` or `max-age` directives on SSE endpoints.
7. Overcomplicating with Third-Party Libraries
Explanation: Polyfills and wrapper libraries often mask native reconnection behavior, introduce bundle bloat, or fail to handle modern browser security policies correctly.
Fix: Use the native EventSource API. Only consider polyfills if supporting legacy browsers (IE11), which is increasingly rare in modern SaaS stacks.
Production Bundle
Decision Matrix
| Scenario | Recommended Approach | Why | Cost Impact |
|---|---|---|---|
| Dashboard notifications, export progress, live metrics | SSE | Native HTTP streaming, zero protocol overhead, auto-reconnect | Low (standard HTTP infrastructure) |
| Real-time chat, collaborative editing, multiplayer games | WebSockets | Requires full-duplex communication, low-latency bidirectional sync | High (sticky sessions, connection scaling, state management) |
| Low-frequency updates (>30s intervals), legacy proxy environments | HTTP Polling | Simple implementation, works behind restrictive firewalls | Medium (redundant requests, higher latency, server load) |
| High-throughput telemetry, IoT device streams | WebSockets or MQTT | Binary framing, efficient packet routing, QoS guarantees | High (specialized brokers, infrastructure complexity) |
Configuration Template
```ts
// app/api/stream/[channel]/route.ts
import { NextRequest, NextResponse } from 'next/server';

export const runtime = 'nodejs';
export const dynamic = 'force-dynamic';

export async function GET(req: NextRequest, { params }: { params: { channel: string } }) {
  const stream = new ReadableStream({
    start(controller) {
      const encoder = new TextEncoder();
      let id = 0;
      const interval = setInterval(() => {
        id++;
        const payload = `id: ${id}\nevent: update\ndata: ${JSON.stringify({ channel: params.channel, ts: Date.now() })}\n\n`;
        controller.enqueue(encoder.encode(payload));
      }, 1500);
      req.signal.addEventListener('abort', () => {
        clearInterval(interval);
        controller.close();
      });
    },
  });
  return new NextResponse(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform, must-revalidate',
      'Connection': 'keep-alive',
      'X-Accel-Buffering': 'no',
    },
  });
}
```
```ts
// hooks/useSSE.ts
import { useEffect, useRef, useState } from 'react';

export function useSSE(endpoint: string) {
  const ref = useRef<EventSource | null>(null);
  const [ready, setReady] = useState(false);

  useEffect(() => {
    const source = new EventSource(endpoint);
    ref.current = source;
    source.onopen = () => setReady(true);
    source.onerror = () => setReady(false);
    return () => {
      source.close();
      ref.current = null;
      setReady(false);
    };
  }, [endpoint]);

  // Expose the ref rather than ref.current, which would be null on the
  // first render and go stale afterwards.
  return { ready, sourceRef: ref };
}
```
Quick Start Guide
- Create the Route Handler: Add a dynamic route file under `app/api/stream/[channel]/route.ts`. Implement a `ReadableStream` that emits SSE-formatted payloads with `id:` and `data:` fields.
- Apply Strict Headers: Configure `Content-Type: text/event-stream`, disable caching with `Cache-Control: no-cache, no-transform`, and add `X-Accel-Buffering: no` to prevent proxy buffering.
- Build the Client Hook: Wrap `EventSource` in a custom React hook. Attach `onopen`, `onmessage`, and `onerror` listeners. Return connection state and expose the instance for manual event routing.
- Validate Delivery: Test the endpoint with `curl -N http://localhost:3000/api/stream/test`. Confirm chunks arrive in real time without buffering, then integrate the hook into a dashboard component and verify auto-reconnection after a network interruption.