Next.js · 2026-05-07 · 37 min read

I shipped 122 web tools without a backend

By Khoa Nguyen


Current Situation Analysis

Traditional SaaS utility architectures follow a predictable but flawed pattern: upload β†’ network transmission β†’ server processing β†’ temporary storage β†’ download. This model introduces four critical failure modes:

  1. Latency Overhead: Network round-trips add 1.5–3 seconds before processing even begins, creating perceived slowness.
  2. Infrastructure Cost: Compute-heavy tasks (PDF compression, video trimming, format conversion) drive up Lambda/container bills, forcing paywalls ($9.99/month) for features that require minimal client-side compute.
  3. Privacy Exposure: Transient server storage creates attack surfaces and compliance liabilities, even when data is "deleted" post-processing.
  4. Architectural Muscle Memory: The reflex to "spin up a backend" ignores modern browser capabilities. For ~80% of common utilities (converters, formatters, calculators, generators), server-side processing is unnecessary. The browser in 2026 is a full compute environment, not a thin client. Relying on servers for these workloads adds cost, latency, and privacy risk while delivering a worse user experience.

WOW Moment: Key Findings

Shifting processing to the client eliminates network dependencies and transforms UX perception. The following comparison illustrates the operational and experiential deltas between traditional server-based processing and the browser-first architecture deployed across 122 tools:

| Approach | Initial Latency (50MB PDF) | Infra Cost (per 10k requests) | Privacy Exposure | User Trust/Retention | Practical File Limit |
|---|---|---|---|---|---|
| Server/Lambda | 1.8–3.2s (upload + queue + process) | $150–$450 (compute + egress) | High (data traverses network, transient storage) | Low (queue anxiety, "processing" spinners) | Unlimited (streaming) |
| Browser-First (WASM + Workers) | 0.2–2.0s (local processing) | $0 (CDN + static hosting) | None (data never leaves device) | High (instant feedback, no queues) | ~2GB (browser memory limits) |

Key Findings:

  • Client-side compression/processing consistently outperforms network round-trips on mid-range hardware.
  • Zero server infrastructure eliminates operational overhead and privacy compliance friction.
  • Perceived quality scales with main-thread responsiveness, not raw compute speed.

Core Solution

The stable architecture emerged through iterative validation of three client-side compute patterns, orchestrated via static generation and lazy loading:

1. WebAssembly for Heavy File Processing

WASM handles CPU-intensive tasks (PDF manipulation, media conversion, ZIP/EPUB generation) with near-native performance. The pattern replaces backend queues with direct local execution:

// Pseudocode for the pattern that replaced our backend
const buffer = await input.files[0].arrayBuffer(); // read the picked file into memory
const result = await wasmModule.compress(buffer, { quality: 0.7 });
download(result, 'compressed.pdf'); // client-side save; the file never leaves the device


A 50MB PDF compresses in ~2 seconds locally. The same approach applies to image compression, video trimming (via FFmpeg.wasm), audio conversion, and archive generation.

2. Pure JavaScript for Lightweight Utilities

Formatters, regex testers, character counters, and case converters require 50–200 lines of vanilla JS. Framework overhead is unnecessary. The tool logic is a single function; Next.js is used only for routing and SEO, not runtime logic.
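
As a minimal sketch of what "50–200 lines of vanilla JS" looks like in practice, here is a kebab-case converter of the kind these tools ship. The function name and rules are illustrative, not the article's actual implementation:

```javascript
// Representative lightweight utility: a kebab-case converter.
// The entire "tool" is this one pure function; the page around it
// is static Next.js markup.
function toKebabCase(input) {
  return input
    .trim()
    .replace(/([a-z0-9])([A-Z])/g, "$1-$2") // split camelCase boundaries
    .replace(/[\s_]+/g, "-")                // normalize spaces and underscores
    .toLowerCase();
}
```

Because the logic is a pure function with no framework dependency, it can be unit-tested in isolation and mounted lazily only when the user interacts with the tool.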

3. Web Workers for >200ms Tasks

Blocking the main thread destroys trust. Any computation exceeding 200ms is offloaded to a dedicated worker with progress streaming:

const worker = new Worker(new URL('./hash.worker.js', import.meta.url)); // bundler-resolvable worker URL
worker.postMessage({ file, algorithm: 'sha256' }); // File is structured-cloned into the worker
worker.onmessage = (e) => updateUI(e.data); // progress events and the final hash arrive here


Hashing a 1GB file on the main thread freezes the tab. Offloading to a worker with progress events maintains UI responsiveness and increases perceived quality by ~10x.
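
The worker side of this pattern processes the file in chunks and reports progress after each one. The sketch below shows that chunking loop with an injected `reportProgress` callback so the logic is testable outside a Worker; inside `hash.worker.js` it would be `(p) => postMessage({ progress: p })`. The rolling checksum is a toy stand-in for SHA-256, which in production would come from a streaming hash library:

```javascript
// Chunked processing loop, as it would run inside the worker.
// reportProgress receives an integer percentage after each chunk.
function hashInChunks(bytes, chunkSize, reportProgress) {
  let checksum = 0;
  for (let offset = 0; offset < bytes.length; offset += chunkSize) {
    const end = Math.min(offset + chunkSize, bytes.length);
    for (let i = offset; i < end; i++) {
      checksum = (checksum * 31 + bytes[i]) >>> 0; // toy rolling hash, not SHA-256
    }
    reportProgress(Math.round((end / bytes.length) * 100));
  }
  return checksum;
}
```

The key design point is that progress is emitted per chunk, not per byte, so the UI gets a steady stream of updates without drowning in messages.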

4. Static Architecture & Deployment

The final production stack:

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Static Next.js page (SSG)  β”‚  ← SEO, fast first paint
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Lazy-loaded tool component β”‚  ← only loads when user interacts
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  Web Worker (when needed)   β”‚  ← keeps UI responsive
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚  WASM module (when needed)  β”‚  ← for "real" processing
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜


No API, database, queue, auth, or payment gateway. The entire surface is static files served via CDN. Infrastructure cost reduces to domain registration + Cloudflare.
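
A static-export setup of this kind typically reduces to a few lines of `next.config.js`. This is a hedged sketch based on Next.js's documented `output: 'export'` mode, not the article's exact configuration:

```javascript
// next.config.js — static export, no Node server at runtime
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: 'export',                // emit pure static files for CDN hosting
  images: { unoptimized: true },   // the image optimizer needs a server; disable it
  trailingSlash: true,             // folder-per-route output, simpler CDN routing
};

module.exports = nextConfig;
```

With this config, `next build` produces an `out/` directory of static assets that any CDN can serve unchanged.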

Pitfall Guide

  1. Chasing Tool Count Over SEO Content: Shipping 122 thin pages resulted in indexing but an average Google position of 72. Indexed β‰  ranking. Fix: each tool page requires a unique H1, a "How it works" section explaining in-browser processing, targeted FAQs, and 3–5 internal links to related tools. Content + authority drives visibility, not tool volume.
  2. Reinventing the Wheel (Building from Scratch): Writing a custom WASM PDF library took two weeks; pdf-lib already existed and was more robust. Raw WebCodecs video trimming was replaced by FFmpeg.wasm with 10x less code. The browser ecosystem is mature. Leverage established libraries to ship faster and reduce maintenance debt.
  3. Treating Mobile as an Afterthought: 50% of utility traffic is mobile. Drag-and-drop interfaces fail on touch devices. Fix: design the mobile flow first. Replace drag zones with single-tap file pickers and progressive disclosure patterns optimized for thumb reach.
  4. Blocking the Main Thread: Any synchronous computation >200ms triggers UI freezes, causing users to abandon the tool. Fix: enforce a strict worker-offloading rule. Use postMessage for data transfer and stream progress events to maintain perceived responsiveness.
  5. Ignoring Browser Memory Limits: Client-side processing breaks down for files >2GB, long-running batch jobs (>20 min), private API key requirements, or heavy ML inference. Fix: recognize when to ship a backend. Browser-first is a competitive advantage, not a universal replacement.
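
Pitfalls 4 and 5 can be enforced with a small pre-flight guard before any processing starts. The thresholds below are the article's (~2GB browser memory ceiling, 200ms main-thread budget); the function and return values are illustrative:

```javascript
// Pre-flight routing guard: reject, offload, or run inline.
const MAX_LOCAL_BYTES = 2 * 1024 ** 3; // ~2GB practical browser limit

function planExecution(fileBytes, estimatedMs) {
  if (fileBytes > MAX_LOCAL_BYTES) return "reject";  // too big for browser memory
  if (estimatedMs > 200) return "worker";            // offload >200ms work off the main thread
  return "main-thread";                              // cheap enough to run inline
}
```

Centralizing the decision in one function makes the ">200ms goes to a worker" rule auditable rather than a convention each tool may forget.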

Deliverables

  • Browser-First Utility Blueprint: A reference architecture diagram detailing SSG routing, lazy component mounting, worker instantiation boundaries, and WASM module lifecycle management. Includes CDN caching headers, fallback strategies for unsupported browsers, and memory monitoring hooks.
  • Pre-Launch Checklist:
    • Verify file size thresholds align with browser memory limits (~2GB)
    • Confirm all >200ms tasks are offloaded to Web Workers
    • Validate mobile UX flow (tap-to-pick, progressive disclosure, no drag-dependency)
    • Audit SEO structure (unique H1, processing explanation, FAQ, internal linking)
    • Run privacy-by-design review (zero network egress for user files)
    • Test WASM fallbacks and graceful degradation for legacy environments
  • Configuration Templates: Next.js next.config.js for static export optimization, Web Worker bundling setup via import.meta.url, and WASM lazy-loading strategy with progress tracking hooks.
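
The WASM lazy-loading strategy mentioned above can be sketched as a memoized loader: the expensive module fetch happens at most once, and every tool invocation shares the same promise. `loader` stands in for something like `() => import('./pdf-engine.js')`; that path is an assumption, not the article's real module:

```javascript
// Memoized lazy loader: the module is fetched on first use only.
function createLazyModule(loader) {
  let modulePromise = null;
  return function load() {
    if (!modulePromise) modulePromise = loader(); // first call starts the load
    return modulePromise;                         // later calls reuse the same promise
  };
}

// Usage sketch: const loadPdfEngine = createLazyModule(() => import('./pdf-engine.js'));
```

Returning the promise itself (rather than awaiting inside) means concurrent callers during the initial load all attach to the same in-flight request.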