TypeScript · 2026-05-09 · 63 min read

TypeScript at Scale: Why Your tsc Takes 90 Seconds and How to Fix It

By Alex Cloudstar

TypeScript Build Latency: Diagnostic Workflows and Pattern-Based Optimization

Current Situation Analysis

Engineering teams frequently attribute sluggish TypeScript compilation to the language itself, assuming that as codebases grow, build times must inevitably degrade. This assumption leads to premature architectural shifts, such as migrating to alternative runtimes or fragmenting repositories, without addressing the underlying type-checking inefficiencies.

The reality is that TypeScript performance degradation is rarely linear; it is typically quadratic or exponential, driven by specific type-checking patterns that force the compiler to perform redundant work. In a documented case study, a codebase exhibited a clean build time of 94 seconds and an incremental build time of 12 seconds. Editor responsiveness was also severely impacted, with the language server freezing for 2–3 seconds during simple hover actions on validation schemas.

By identifying and eliminating three specific patterns responsible for millions of redundant type instantiations, the team reduced the clean build time to 11 seconds and achieved sub-second incremental builds. Crucially, this optimization required no migration to Project Corsa, no switch to Bun, and no repository splitting. The intervention focused entirely on diagnostic profiling and pattern remediation, demonstrating that build latency is often a solvable engineering problem rather than an inherent language limitation.

WOW Moment: Key Findings

The correlation between type instantiation volume and build duration is the primary indicator of optimization potential. Profiling reveals that a small subset of files often accounts for the majority of compilation cost due to complex type expansions.

Metric              | Unoptimized Baseline | Pattern-Optimized | Improvement
--------------------|----------------------|-------------------|--------------------
Clean Build Time    | 94 seconds           | 11 seconds        | 88% reduction
Incremental Build   | 12 seconds           | <1 second         | 92% reduction
Type Instantiations | ~4.5 million         | ~120,000          | 97% reduction
Editor Latency      | 2–3s freeze          | Responsive        | Immediate feedback

This data indicates that reducing the compiler's instantiation workload by nearly two orders of magnitude yields proportional gains in build speed. The optimization enables developers to maintain type safety without sacrificing development velocity, allowing for rapid iteration cycles even in large-scale applications.

Core Solution

Resolving TypeScript build latency requires a systematic approach: instrument the compiler to identify bottlenecks, remediate high-cost patterns, and configure the toolchain for efficiency.

1. Instrumentation and Profiling

Before modifying code, establish a baseline using TypeScript's built-in diagnostic flags.

  • Extended Diagnostics: Run npx tsc --extendedDiagnostics to obtain a breakdown of compilation phases. Focus on Check time; if this dominates, the issue lies within the type system. If I/O Read time or Parse time is high, the problem relates to file volume or size.
  • Trace Generation: Execute npx tsc --generateTrace ./trace to produce a Chrome trace file. Open this in chrome://tracing or https://ui.perfetto.dev to visualize a flame graph of type-checking activity. Healthy codebases show most files completing in under 100ms; files exceeding 500ms indicate problematic type constructs.
  • Trace Analysis: Use npx @typescript/analyze-trace ./trace to automatically surface the most expensive files, deepest type instantiations, and costliest type aliases. This tool directs attention to the specific locations requiring remediation.

2. Pattern Remediation

Profiling typically reveals a handful of recurring patterns that drive exponential complexity. Address these patterns with the following strategies.

A. Breaking Generic Inference Chains

Wrappers that infer types from heavily generic libraries (e.g., ORMs, RPC frameworks) force the compiler to re-expand complex types at every call site.

Problem:

// Forces re-evaluation of T's complex return type at every usage
function withMetrics<T extends (...args: any[]) => Promise<any>>(
  fn: T
): (...args: Parameters<T>) => Promise<Awaited<ReturnType<T>>> {
  return async (...args) => { /* ... */ return fn(...args); };
}

Solution: Decouple the wrapper from the specific generic signature. Accept a simpler function type and constrain the return type explicitly.

type AsyncFn<R> = (...args: any[]) => Promise<R>;

function withMetrics<R>(fn: AsyncFn<R>): AsyncFn<R> {
  return async (...args) => {
    // Metrics logic
    return fn(...args);
  };
}

This approach prevents the compiler from traversing the full generic tree of the wrapped function, reducing instantiation overhead.
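A minimal end-to-end sketch, assuming a hypothetical `fetchUser` data-access function standing in for an ORM query (the names are illustrative, not from the original codebase). Only the result type `R` is inferred; the wrapped function's parameter signature is never traversed:

```typescript
type AsyncFn<R> = (...args: any[]) => Promise<R>;

// Decoupled wrapper: the compiler infers only R, not the full generic
// signature of the function being wrapped.
function withMetrics<R>(fn: AsyncFn<R>): AsyncFn<R> {
  return async (...args) => {
    const start = Date.now();
    const result = await fn(...args);
    console.log(`call took ${Date.now() - start}ms`);
    return result;
  };
}

// Hypothetical data-access function.
async function fetchUser(id: string): Promise<{ id: string; name: string }> {
  return { id, name: "Ada" };
}

// R is inferred as { id: string; name: string }; no re-expansion per call site.
const fetchUserWithMetrics = withMetrics(fetchUser);
```

The trade-off is that parameter types degrade to `any[]`; call sites that need precise parameter checking can keep calling the unwrapped function's type via an explicit annotation.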

B. Capping Recursive Utility Types

Recursive utility types like DeepReadonly or DeepPartial applied to large, nested structures cause type explosions. The compiler must recursively process every level of the type hierarchy.

Problem:

type DeepReadonly<T> = {
  readonly [K in keyof T]: T[K] extends object ? DeepReadonly<T[K]> : T[K];
};
// Applied to a large state object, this forces full recursive expansion of
// every nested level and risks hitting the compiler's recursion depth limit.

Solution: Implement depth-capped recursion using a numeric counter. This guarantees termination and limits the compiler's workload.

type Prev = [never, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10];

type DepthCappedReadonly<T, Depth extends number = 4> =
  Depth extends 0
    ? T
    : T extends object
      ? { readonly [K in keyof T]: DepthCappedReadonly<T[K], Prev[Depth]> }
      : T;

By defaulting to a finite depth, you retain type safety for common use cases while preventing exponential expansion.
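As a minimal sketch of the technique in practice (the `AppState` shape is illustrative, not from the original codebase), a cap of 2 freezes the first two levels and leaves deeper levels untouched; the `T extends object` guard keeps the mapped type from iterating over primitives:

```typescript
type Prev = [never, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10];

// Depth-capped recursion: Prev[Depth] counts down toward 0, guaranteeing termination.
type DepthCappedReadonly<T, Depth extends number = 4> =
  Depth extends 0
    ? T
    : T extends object
      ? { readonly [K in keyof T]: DepthCappedReadonly<T[K], Prev[Depth]> }
      : T;

// Hypothetical nested state shape for illustration.
type AppState = {
  user: { profile: { name: string; tags: string[] } };
};

type FrozenState = DepthCappedReadonly<AppState, 2>;

const state: FrozenState = { user: { profile: { name: "Ada", tags: [] } } };
// state.user = { ... };           // compile error: readonly at depth 1
// state.user.profile = { ... };   // compile error: readonly at depth 2
// state.user.profile.name = "x";  // allowed: beyond the cap, the type is left as-is
```

The cap trades a small amount of deep-level safety for a bounded amount of compiler work; pick the smallest depth your call sites actually rely on.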

C. Replacing Massive Discriminated Unions

Unions with hundreds of variants, often generated from schemas, slow down type narrowing. Each switch statement requires the compiler to eliminate impossible variants, a process that scales with union size.

Problem:

type Command = 
  | { type: 'create_user'; payload: CreateUser }
  | { type: 'delete_user'; payload: DeleteUser }
  // ... 200+ variants
  ;

function process(cmd: Command) {
  switch (cmd.type) {
    // Compiler checks all variants at each case
  }
}

Solution: Convert the union to a record type keyed by the discriminator. Use satisfies to maintain exhaustiveness checking without the runtime narrowing cost.

const handlers = {
  create_user: handleCreateUser,
  delete_user: handleDeleteUser,
  // ...
} satisfies Record<Command['type'], (payload: any) => void>;

function process(cmd: Command) {
  // Cast required: the compiler cannot correlate cmd.type with cmd.payload
  // across the union, so the handler union is not directly callable.
  const handler = handlers[cmd.type] as (payload: Command['payload']) => void;
  return handler(cmd.payload);
}

This lookup is constant-time, eliminating the linear scan associated with union narrowing.
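A self-contained sketch with two illustrative variants standing in for a schema-generated union (the payload shapes and handler bodies are hypothetical):

```typescript
type CreateUser = { name: string };
type DeleteUser = { id: string };

type Command =
  | { type: "create_user"; payload: CreateUser }
  | { type: "delete_user"; payload: DeleteUser };

const handlers = {
  create_user: (payload: CreateUser) => `created ${payload.name}`,
  delete_user: (payload: DeleteUser) => `deleted ${payload.id}`,
  // Omitting any key here fails the `satisfies` check, preserving exhaustiveness.
} satisfies Record<Command["type"], (payload: any) => string>;

function process(cmd: Command): string {
  // Cast: the compiler cannot correlate cmd.type with cmd.payload here.
  const handler = handlers[cmd.type] as (payload: Command["payload"]) => string;
  return handler(cmd.payload);
}
```

Adding a new variant now requires only a new union member and a matching record entry; the compiler flags a missing handler at the `satisfies` clause rather than re-narrowing a 200-variant union at every `switch`.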

D. Avoiding Cartesian Template Literals

Template literal types that compute the cartesian product of object keys can generate massive string unions, forcing the compiler to materialize every combination.

Problem:

const config = { /* large nested object */ } as const;
type AllKeys = `${keyof typeof config}.${keyof typeof config[keyof typeof config]}`;
// Generates thousands of string literals

Solution: Generate key combinations at runtime or narrow the type scope to specific sections. If compile-time autocompletion is required, restrict the template literal to a smaller subset of keys.
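A small sketch of the scoped variant, assuming a hypothetical two-section config (names and values are illustrative). Restricting the template literal to one section yields two string literals instead of the full cartesian product:

```typescript
// `as const` preserves literal key and value types.
const config = {
  api: { url: "https://api.example.test", timeout: 5000 },
  cache: { ttl: 60 },
} as const;

// Scoped to the "api" section: "api.url" | "api.timeout" (2 literals),
// not every section crossed with every nested key.
type ApiKey = `api.${keyof (typeof config)["api"] & string}`;

function getApiSetting(key: ApiKey): string | number {
  const field = key.slice("api.".length) as keyof (typeof config)["api"];
  return config.api[field];
}
```

Autocompletion still works for the section that needs it, while the compiler materializes only a handful of literals.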

3. Architecture and Configuration

Project References

Project references enable incremental builds by splitting the codebase into independent projects. However, they introduce orchestration overhead and require composite mode, which mandates declaration file emission.

  • When to Use: Implement project references when the codebase exceeds 50,000 lines of TypeScript or contains three or more logical domains that change independently.
  • Setup: Configure each package with composite: true and declaration: true. The root tsconfig.json should reference these projects.
  • Caution: For smaller applications, project references may degrade performance due to build overhead. Evaluate based on profiling data.

Compiler Settings

  • skipLibCheck: true: This setting instructs the compiler to skip type checking of declaration files in node_modules. For most projects, this provides the highest performance gain with minimal risk, as library types are typically stable and pre-validated.

Pitfall Guide

Pitfall               | Explanation                                                                      | Fix
----------------------|----------------------------------------------------------------------------------|------------------------------------------------------------------------------
Blind Optimization    | Modifying code without profiling leads to wasted effort on non-critical paths.   | Always run --extendedDiagnostics and --generateTrace before making changes.
Unbounded Recursion   | Using DeepReadonly or similar utilities on large types causes exponential type expansion. | Cap recursion depth using numeric counters or avoid deep utilities entirely.
Inference Cascades    | Generic wrappers that re-infer complex library types multiply compilation cost.  | Break inference chains by using simpler function types or explicit return constraints.
Union Bloat           | Massive discriminated unions slow down narrowing and exhaustiveness checks.      | Split unions across modules or use record lookups with satisfies.
Premature References  | Applying project references to small codebases adds overhead without benefit.    | Use project references only for large, multi-domain codebases (>50k LOC).
Cartesian Explosion   | Template literals computing key products generate huge string unions.            | Generate keys at runtime or narrow template literal scope.
Ignoring Library Types | Failing to skip node_modules type checking wastes resources on stable code.     | Enable skipLibCheck: true in tsconfig.json.

Production Bundle

Action Checklist

  • Run npx tsc --extendedDiagnostics to identify dominant compilation phases.
  • Generate a trace file using npx tsc --generateTrace ./trace.
  • Analyze the trace with npx @typescript/analyze-trace ./trace to locate hot files.
  • Refactor generic wrappers to break inference chains in identified files.
  • Replace recursive utility types with depth-capped alternatives.
  • Convert massive discriminated unions to record lookups where appropriate.
  • Enable skipLibCheck: true in the root tsconfig.json.
  • Evaluate project references only if the codebase exceeds 50,000 lines.

Decision Matrix

Scenario                  | Recommended Approach                  | Why                                                    | Cost Impact
--------------------------|---------------------------------------|--------------------------------------------------------|--------------------------------------------------
< 50k LOC, Single Domain  | Single tsconfig with skipLibCheck     | Simpler setup; sufficient performance for smaller bases. | Low overhead; fast builds.
> 50k LOC, Multiple Domains | Project References with composite mode | Enables incremental builds; isolates domain changes.   | Higher setup complexity; faster incremental builds.
Heavy Library Usage       | skipLibCheck: true + Thin Wrappers    | Skips stable library types; prevents type leakage.     | Significant speedup; minimal risk.
Complex State/Config Types | Depth-Capped Utilities               | Prevents recursive type explosions.                    | Maintains safety; reduces instantiation cost.
Auto-Generated Schemas    | Record Lookups + satisfies            | Avoids union narrowing overhead.                       | Constant-time dispatch; preserves exhaustiveness.

Configuration Template

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "strict": true,
    "skipLibCheck": true,
    "noEmit": true,
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*.ts"],
  "exclude": ["node_modules", "dist"]
}

For project references, ensure each package includes:

{
  "extends": "../../tsconfig.base.json",
  "compilerOptions": {
    "composite": true,
    "declaration": true,
    "outDir": "./dist"
  },
  "include": ["src/**/*.ts"]
}
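The root tsconfig.json then lists the packages it orchestrates. A minimal sketch, assuming hypothetical package paths (adjust to your layout); the empty "files" array keeps the root from compiling anything itself:

```json
{
  "files": [],
  "references": [
    { "path": "./packages/core" },
    { "path": "./packages/api" }
  ]
}
```

Build with npx tsc --build from the root; the compiler walks the reference graph and rebuilds only projects whose inputs changed.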

Quick Start Guide

  1. Profile: Run npx tsc --extendedDiagnostics and npx tsc --generateTrace ./trace in your project root.
  2. Analyze: Execute npx @typescript/analyze-trace ./trace to identify the top 3 most expensive files.
  3. Remediate: Apply pattern fixes (generic breaking, depth capping, union conversion) to the identified files.
  4. Configure: Add "skipLibCheck": true to your tsconfig.json.
  5. Validate: Re-run diagnostics to confirm build time reduction and check for regressions.