TypeScript · 2026-05-14 · 79 min read

When zippopotam.us 404s on a real US ZIP: building a 4-tier geocoding fallback

By GasPriceCheck

Architecting Resilient Geospatial Lookups: A Multi-Tier Fallback Strategy for Finite Datasets

Current Situation Analysis

Postal code geocoding is frequently treated as a trivial, solved problem in modern application development. The assumption is straightforward: US ZIP codes are a finite, publicly documented dataset. Multiple free APIs expose them. Wire up an endpoint, parse the latitude and longitude, and move on. This mindset works perfectly until a real-world lookup fails silently.

The industry pain point isn't the absence of data; it's the silent corruption of downstream calculations when that data is missing. When a geocoding service returns a 404 or an empty payload, developers often default to a sentinel coordinate like { lat: 0, lng: 0 }. Mathematically, this is catastrophic. The Haversine formula, standard for calculating great-circle distances, treats (0, 0) as a valid geographic point in the Gulf of Guinea. Any distance calculation against this coordinate inflates by thousands of miles, rendering sorting algorithms, proximity filters, and delivery estimates completely unusable.
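To make the corruption concrete, here is a textbook Haversine implementation showing what the (0, 0) sentinel does to a routine distance check. The helper and coordinate values are illustrative, not part of the resolver code later in this article.

```typescript
// Illustrative only: standard Haversine great-circle distance.
interface Point {
  lat: number;
  lng: number;
}

function haversineMiles(a: Point, b: Point): number {
  const R = 3958.8; // mean Earth radius in miles
  const toRad = (deg: number): number => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

const nyc = { lat: 40.71, lng: -74.01 };
const philly = { lat: 39.95, lng: -75.17 };
const sentinel = { lat: 0, lng: 0 }; // the "missing data" fallback

haversineMiles(nyc, philly);   // ~80 miles: plausible
haversineMiles(nyc, sentinel); // 5,000+ miles: silently wrecks any proximity sort
```

One unresolved ZIP is enough to push a genuinely nearby result to the bottom of a "closest first" list, with no error anywhere in the logs.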

This problem is systematically overlooked because teams optimize for initial development velocity rather than runtime resilience. Free APIs are assumed to be complete. Rate limits are ignored until throttling occurs. Timeouts are omitted, causing request threads to hang indefinitely. The reality is that free, community-maintained geocoding datasets contain undocumented gaps. Sampling across metropolitan areas consistently reveals a 3-5% missing rate, even for densely populated regions. When an application serves tens of thousands of pages or user sessions, a 4% gap translates to hundreds of corrupted distance calculations per day. The failure mode isn't loud; it's silent to developers, mathematical in nature, and highly visible to end users.

WOW Moment: Key Findings

The most critical insight isn't about choosing a better API. It's about recognizing that deterministic mappings over finite input spaces should never be resolved at runtime by default. Shifting from a single-source runtime lookup to a tiered resolution strategy fundamentally changes latency, coverage, and failure behavior.

| Strategy | Avg Latency | Coverage Rate | Runtime Cost | Failure Behavior |
| --- | --- | --- | --- | --- |
| Single Free API | ~150ms | ~96% | $0 | Silent corruption (0,0) |
| Runtime Fallback Chain | ~200-400ms | ~99.5% | $0 | Graceful degradation |
| Build-Time Pre-Resolution + Cache | <5ms | 100% | $0 | Zero network calls |

This finding matters because it decouples data assurance from network reliability. Pre-resolving the entire input space at build time collapses runtime complexity from "1 API call per request" to "1 memory lookup per request." The fallback chain becomes a safety net for edge cases (newly issued codes, PO boxes, dataset drift) rather than the primary execution path. The result is predictable latency, complete coverage, and a system that fails open instead of failing mathematically.

Core Solution

Building a resilient geospatial lookup requires a pipeline architecture in which the tiers are ordered by cost, speed, and reliability. The pipeline advances to the next tier only if the current one cannot produce a valid coordinate pair.

Architecture Rationale

  1. Static Lookup (Tier 1): A pre-baked JSON file containing every known postal code mapped to coordinates. Loaded into memory at startup. Zero network I/O. Handles 99%+ of requests.
  2. Distributed Cache (Tier 2): Redis or equivalent. Catches runtime-resolved codes that weren't in the static file. Shared across horizontally scaled instances. Prevents duplicate external API calls.
  3. Primary External API (Tier 3): Fast, free, but incomplete. Used as a bridge for codes missing from Tier 1.
  4. Secondary External API (Tier 4): Slower, rate-limited, but comprehensive. Covers gaps left by Tier 3. Requires strict etiquette compliance.
  5. Degradation Handler (Tier 5): Returns a structured failure object instead of a raw sentinel. Downstream consumers check for failure state and adjust behavior (e.g., skip distance sorting, show unsorted lists).

Implementation

The following TypeScript implementation uses a class-based pipeline with explicit tier methods, dependency injection for caching, and structured failure handling.

import { Redis } from '@upstash/redis';

export interface CoordinatePair {
  latitude: number;
  longitude: number;
}

export interface LookupResult {
  success: boolean;
  coordinates?: CoordinatePair;
  source: 'static' | 'cache' | 'primary_api' | 'secondary_api' | 'degraded';
}

export class GeospatialResolver {
  private staticLookup: Record<string, CoordinatePair>;
  private cache: Redis;
  private readonly CACHE_TTL_SECONDS = 60 * 60 * 24 * 30; // 30 days

  constructor(staticData: Record<string, CoordinatePair>, redisClient: Redis) {
    this.staticLookup = staticData;
    this.cache = redisClient;
  }

  public async resolve(postalCode: string): Promise<LookupResult> {
    // Normalize: trim whitespace, drop any ZIP+4 suffix, restore stripped leading zeros
    const normalizedCode = postalCode.trim().slice(0, 5).padStart(5, '0');

    // Tier 1: Static file lookup
    const staticCoord = this.staticLookup[normalizedCode];
    if (staticCoord) {
      return { success: true, coordinates: staticCoord, source: 'static' };
    }

    // Tier 2: Distributed cache
    const cacheKey = `geo:${normalizedCode}`;
    const cached = await this.cache.get<CoordinatePair>(cacheKey);
    if (cached) {
      return { success: true, coordinates: cached, source: 'cache' };
    }

    // Tier 3: Primary external service
    const primaryResult = await this.queryPrimaryAPI(normalizedCode);
    if (primaryResult) {
      await this.cache.set(cacheKey, primaryResult, { ex: this.CACHE_TTL_SECONDS });
      return { success: true, coordinates: primaryResult, source: 'primary_api' };
    }

    // Tier 4: Secondary external service
    const secondaryResult = await this.querySecondaryAPI(normalizedCode);
    if (secondaryResult) {
      await this.cache.set(cacheKey, secondaryResult, { ex: this.CACHE_TTL_SECONDS });
      return { success: true, coordinates: secondaryResult, source: 'secondary_api' };
    }

    // Tier 5: Graceful degradation
    return { success: false, source: 'degraded' };
  }

  private async queryPrimaryAPI(code: string): Promise<CoordinatePair | null> {
    try {
      const controller = new AbortController();
      const timeoutId = setTimeout(() => controller.abort(), 2000);

      const response = await fetch(`https://api.zippopotam.us/us/${code}`, {
        signal: controller.signal,
      });
      clearTimeout(timeoutId);

      if (!response.ok) return null;

      const payload = await response.json();
      const place = payload.places?.[0];
      if (!place?.latitude || !place?.longitude) return null;

      return {
        latitude: parseFloat(place.latitude),
        longitude: parseFloat(place.longitude),
      };
    } catch {
      return null;
    }
  }

  private async querySecondaryAPI(code: string): Promise<CoordinatePair | null> {
    try {
      const controller = new AbortController();
      const timeoutId = setTimeout(() => controller.abort(), 3000);

      const response = await fetch(
        `https://nominatim.openstreetmap.org/search?postalcode=${code}&country=us&format=json&limit=1`,
        {
          signal: controller.signal,
          headers: {
            'User-Agent': 'GeoResolver/2.1 (ops@yourdomain.com)',
            'Accept': 'application/json',
          },
        }
      );
      clearTimeout(timeoutId);

      if (!response.ok) return null;

      const results = await response.json();
      if (!Array.isArray(results) || results.length === 0) return null;

      return {
        latitude: parseFloat(results[0].lat),
        longitude: parseFloat(results[0].lon),
      };
    } catch {
      return null;
    }
  }
}

Why This Structure Works

  • Explicit Failure State: Returning { success: false } instead of { lat: 0, lng: 0 } forces downstream consumers to handle missing data intentionally. This prevents mathematical corruption.
  • AbortController Timeouts: Network requests are bounded. Without explicit timeouts, a stalled API call can exhaust connection pools and cause cascading latency.
  • Cache Key Normalization: Postal codes are normalized and padded before lookup. This prevents cache fragmentation from inconsistent input formats.
  • Separation of Concerns: Each tier is isolated. If the primary API changes its response schema, only queryPrimaryAPI needs updating. The pipeline remains intact.
  • TTL Strategy: 30-day cache expiration balances freshness with cost. Geospatial data for established postal codes rarely changes. Shorter TTLs increase external API load; longer TTLs risk serving stale data for newly issued codes.
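The normalization point is easy to see in isolation. This tiny sketch mirrors the first line of resolve(); the standalone `normalize` helper is illustrative:

```typescript
// Spreadsheet exports and numeric parsing routinely strip the leading
// zeros from Northeast ZIP codes; padStart restores the canonical form.
const normalize = (code: string): string => code.trim().padStart(5, '0');

normalize('501');     // '00501' (Holtsville, NY)
normalize(' 90210 '); // '90210'
```

Without this step, '501' and '00501' would occupy separate cache keys and the first form would always miss the static table.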

Pitfall Guide

1. The Zero Coordinate Trap

Explanation: Returning { lat: 0, lng: 0 } on failure corrupts distance calculations. The Haversine formula treats this as a valid point in the Gulf of Guinea, inflating all downstream distances. Fix: Return a structured failure object. Downstream consumers must check success before running spatial math. If it is false, skip distance sorting or display a fallback message.
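A minimal sketch of that downstream guard; the `partitionForSorting` helper and item shape are illustrative, not part of the resolver class:

```typescript
interface Resolved {
  success: boolean;
  coordinates?: { latitude: number; longitude: number };
}

// Entries without a successful lookup never enter the distance sort; they
// are rendered unsorted instead of being measured against a bogus (0, 0).
function partitionForSorting<T extends { geo: Resolved }>(
  items: T[]
): { sortable: T[]; unsortable: T[] } {
  const sortable: T[] = [];
  const unsortable: T[] = [];
  for (const item of items) {
    if (item.geo.success && item.geo.coordinates) sortable.push(item);
    else unsortable.push(item);
  }
  return { sortable, unsortable };
}
```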

2. Ignoring External API Etiquette

Explanation: Free geocoding services like Nominatim enforce strict rate limits (1 request/second) and require valid User-Agent headers. Default fetch clients often use generic agents that get blocked. Fix: Always set a descriptive User-Agent with contact information. Implement exponential backoff and request throttling. Never burst-call free APIs without a queue.
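One way to sketch that throttling is a promise chain that spaces consecutive calls by a minimum interval. The `Throttle` class name and structure are mine, not from the resolver above; for Nominatim the interval would be 1000ms:

```typescript
// Illustrative throttle: serializes calls and enforces a minimum gap
// between consecutive requests.
class Throttle {
  private last = 0;
  private chain: Promise<unknown> = Promise.resolve();

  constructor(private intervalMs: number) {}

  schedule<T>(fn: () => Promise<T>): Promise<T> {
    const run = this.chain.then(async () => {
      const wait = this.last + this.intervalMs - Date.now();
      if (wait > 0) await new Promise((r) => setTimeout(r, wait));
      this.last = Date.now();
      return fn();
    });
    // Swallow errors on the chain so one failed call doesn't block the queue.
    this.chain = run.catch(() => undefined);
    return run;
  }
}
```

In practice querySecondaryAPI would be wrapped as `throttle.schedule(() => this.querySecondaryAPI(code))`, and the throttle instance shared per process.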

3. Missing Request Timeouts

Explanation: Unbounded fetch calls can hang indefinitely on DNS resolution or TCP handshake failures. This exhausts server threads and causes timeout cascades. Fix: Use AbortController with explicit millisecond limits. Wrap external calls in try/catch blocks that return null on timeout, allowing the pipeline to advance.

4. Runtime-Only Resolution

Explanation: Resolving every lookup at runtime treats a finite dataset as infinite. It wastes bandwidth, increases latency, and exposes the application to external API outages. Fix: Pre-resolve the entire input space at build time. Store results in a static file. Use runtime resolution only for edge cases or newly issued codes.
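A sketch of such a build-time script with checkpointing. The `preResolve` function and injected `lookup` parameter are illustrative; in practice `lookup` would wrap the Tier 3/4 API calls behind a throttle:

```typescript
import { existsSync, readFileSync, writeFileSync } from 'fs';

type Coord = { latitude: number; longitude: number };

// Illustrative build step: resolve every code once, writing a checkpoint
// after each lookup so an interrupted run can resume where it left off.
async function preResolve(
  codes: string[],
  lookup: (code: string) => Promise<Coord | null>,
  checkpointPath: string
): Promise<Record<string, Coord>> {
  // Resume from a previous partial run if a checkpoint file exists.
  const resolved: Record<string, Coord> = existsSync(checkpointPath)
    ? JSON.parse(readFileSync(checkpointPath, 'utf8'))
    : {};

  for (const code of codes) {
    if (code in resolved) continue; // already resolved on an earlier run
    const coord = await lookup(code);
    if (coord) resolved[code] = coord; // skip misses; never write sentinels
    // Checkpoint after each code: cheap for a one-off build step, and an
    // interrupted run loses at most one lookup.
    writeFileSync(checkpointPath, JSON.stringify(resolved));
  }
  return resolved;
}
```

The resulting JSON becomes the Tier 1 static file loaded at startup.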

5. Caching Failures

Explanation: If an API returns a 404 or empty payload, caching that result prevents future successful lookups once the data is updated upstream. Fix: Only cache successful responses. Explicitly skip caching on 404, 5xx, or empty payloads. Consider a separate "negative cache" with a short TTL if you want to avoid repeated lookups for genuinely missing codes.
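An in-memory sketch of the negative-cache idea; the `NegativeCache` class is illustrative, and in the Redis tier the same effect comes from a separate key prefix with a short `ex`:

```typescript
// Illustrative negative cache: remember genuine misses for a short window
// so repeated lookups don't hammer the external APIs, while never blocking
// a future successful resolution.
class NegativeCache {
  private misses = new Map<string, number>(); // code -> expiry epoch ms

  constructor(private ttlMs: number) {}

  isKnownMiss(code: string): boolean {
    const expiry = this.misses.get(code);
    if (expiry === undefined) return false;
    if (Date.now() >= expiry) {
      this.misses.delete(code); // TTL elapsed: allow a fresh lookup
      return false;
    }
    return true;
  }

  recordMiss(code: string): void {
    this.misses.set(code, Date.now() + this.ttlMs);
  }
}
```

resolve() would consult isKnownMiss before Tiers 3-4 and call recordMiss only after both external APIs come back empty, with a TTL of minutes to hours rather than the 30 days used for successes.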

6. Assuming Free APIs Have SLAs

Explanation: Community-maintained datasets have no uptime guarantees. Data gaps, schema changes, and IP blocks are normal operating conditions. Fix: Treat free APIs as best-effort dependencies. Always pair them with a fallback tier. Monitor hit rates and failure patterns to detect upstream degradation early.

7. Cache Stampedes on Cold Starts

Explanation: When a distributed cache expires or restarts, thousands of concurrent requests can hit the external API simultaneously, triggering rate limits or bans. Fix: Implement request coalescing or a local in-memory cache layer in front of Redis. Use probabilistic early expiration (e.g., refresh at 80% of TTL) to stagger cache rebuilds.
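Request coalescing can be sketched as a map of in-flight promises; the `Coalescer` class is illustrative:

```typescript
// Illustrative coalescer: concurrent callers for the same key share one
// in-flight promise, so a cold cache produces a single upstream request
// instead of a stampede.
class Coalescer<T> {
  private inFlight = new Map<string, Promise<T>>();

  run(key: string, fn: () => Promise<T>): Promise<T> {
    const existing = this.inFlight.get(key);
    if (existing) return existing; // piggyback on the in-flight call

    const promise = fn().finally(() => this.inFlight.delete(key));
    this.inFlight.set(key, promise);
    return promise;
  }
}
```

Wrapping the Tier 3/4 calls as `coalescer.run(cacheKey, () => this.queryPrimaryAPI(code))` means a burst of identical lookups costs one external request.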

Production Bundle

Action Checklist

  • Pre-resolve your entire postal code dataset at build time and embed it as a static lookup table
  • Implement a 5-tier resolution pipeline ordered by speed and cost
  • Add explicit AbortController timeouts to all external API calls
  • Normalize and pad postal codes before lookup to prevent cache fragmentation
  • Return structured failure objects instead of raw sentinel coordinates
  • Configure Redis with a 30-day TTL and implement cache stampede prevention
  • Set a valid User-Agent header and throttle requests to secondary APIs
  • Add downstream guards that skip spatial math when resolution fails

Decision Matrix

| Scenario | Recommended Approach | Why | Cost Impact |
| --- | --- | --- | --- |
| < 10k daily lookups, single region | Static file + Redis cache | Zero external API calls, minimal infrastructure | $0 |
| 10k-100k daily lookups, multiple regions | Static file + Redis + Primary API fallback | Handles edge cases without overloading external services | $0 |
| > 100k daily lookups, global coverage | Static file + Redis + Self-hosted Nominatim | Avoids rate limits, full control over data freshness | Hosting + maintenance |
| Real-time logistics routing | Paid geocoding API (Mapbox/Google) + Fallback chain | SLA-backed accuracy, higher precision, guaranteed uptime | $5-$10 per 1k requests |

Configuration Template

// geospatial.config.ts
import { Redis } from '@upstash/redis';
import { GeospatialResolver } from './GeospatialResolver';
import staticData from './data/postal_coordinates.json';

export function initializeGeospatialService(): GeospatialResolver {
  const redisClient = new Redis({
    url: process.env.REDIS_URL!,
    token: process.env.REDIS_TOKEN!,
  });

  return new GeospatialResolver(staticData, redisClient);
}

// Usage in a route handler. The resolver lives at module scope so the Redis
// client and static table are reused across requests, not rebuilt per call.
const resolver = initializeGeospatialService();

export async function GET(request: Request) {
  const url = new URL(request.url);
  const postalCode = url.searchParams.get('postal_code');

  if (!postalCode) {
    return Response.json({ error: 'Missing postal_code' }, { status: 400 });
  }

  const result = await resolver.resolve(postalCode);

  if (!result.success) {
    return Response.json({
      message: 'Location could not be resolved; distance features unavailable',
      coordinates: null,
      source: result.source
    }, { status: 200 });
  }

  return Response.json({
    coordinates: result.coordinates,
    source: result.source,
    timestamp: new Date().toISOString()
  });
}

Quick Start Guide

  1. Generate Static Dataset: Run a build-time script that iterates through all known postal codes, queries your primary and secondary APIs, and writes successful results to a JSON file. Implement checkpointing to resume interrupted runs.
  2. Initialize Resolver: Import the static JSON into your application. Instantiate the GeospatialResolver class with the static data and a Redis client.
  3. Wire Pipeline: Replace direct API calls in your route handlers with resolver.resolve(postalCode). Handle the success flag explicitly in downstream logic.
  4. Add Monitoring: Track source distribution in your metrics pipeline. Alert if degraded or secondary_api hit rates exceed 5% over a 24-hour window.
  5. Deploy & Validate: Run load tests to verify cache hit rates. Confirm that external API calls only occur for missing codes. Verify downstream distance calculations skip gracefully on failure.

This architecture transforms a fragile, single-point dependency into a production-grade lookup service. By front-loading data assurance and back-loading graceful degradation, you eliminate silent mathematical corruption while maintaining sub-5ms latency for the vast majority of requests. The pattern extends cleanly to any finite, deterministic mapping: currency conversion, timezone resolution, language code normalization, and regional compliance data. Build the chain before you need it.