s lower-priority callbacks (setTimeout, I/O, UI events). Only one macrotask executes per cycle.
4. Render Cycle: If the stack is empty and microtasks are drained, the browser may perform style calculation, layout, and paint.
Implementation Pattern: The Chunked Processor
A common production requirement is processing large datasets without freezing the UI. The solution involves breaking work into chunks and yielding to the event loop using macrotasks, ensuring the render cycle can occur between chunks.
interface ProcessingConfig {
  chunkSize: number;
  maxDurationMs: number;
  onProgress: (processed: number, total: number) => void;
}
class DataProcessor {
  private isRunning = false;

  async processDataset<T>(
    items: T[],
    transform: (item: T) => T,
    config: ProcessingConfig
  ): Promise<T[]> {
    if (this.isRunning) {
      throw new Error('Processor is already active.');
    }
    this.isRunning = true;
    try {
      const results: T[] = [];
      let index = 0;
      const total = items.length;
      // Track time since the last *macrotask* yield. Resetting the clock on
      // every chunk would mean a fast chunk never trips the budget, so the
      // loop would never yield to the render cycle.
      let lastYield = performance.now();
      while (index < total) {
        const chunkEnd = Math.min(index + config.chunkSize, total);
        // Process chunk synchronously
        for (let i = index; i < chunkEnd; i++) {
          results.push(transform(items[i]));
        }
        index = chunkEnd;
        config.onProgress(index, total);
        // Yield to event loop if more work remains
        if (index < total) {
          if (performance.now() - lastYield < config.maxDurationMs) {
            // Within the time budget: continue via a microtask for throughput
            await Promise.resolve();
          } else {
            // Budget exceeded: a macrotask yield lets render and input run
            await new Promise<void>(resolve => setTimeout(resolve, 0));
            lastYield = performance.now();
          }
        }
      }
      return results;
    } finally {
      // Reset even if transform throws, so the processor stays reusable
      this.isRunning = false;
    }
  }
}
// Usage Example
const processor = new DataProcessor();
const largeArray = Array.from({ length: 10000 }, (_, i) => i);

processor.processDataset(largeArray, (n) => n * 2, {
  chunkSize: 100,
  maxDurationMs: 40,
  onProgress: (done, total) => console.log(`Progress: ${done}/${total}`),
}).then((result) => console.log('Processing complete', result.length));
Rationale:
- Adaptive Yielding: The code tracks elapsed time. While inside the time budget it continues via a microtask to maintain throughput; once the budget is exceeded it switches to setTimeout, yielding to the render cycle and preventing jank.
- Type Safety: TypeScript interfaces enforce the configuration contract, reducing runtime errors.
- State Management: The isRunning flag prevents overlapping invocations from corrupting shared state.
Priority Demonstration
Understanding queue priority is essential for debugging. The following example illustrates the strict ordering: synchronous code first, then all microtasks, then one macrotask.
const demonstratePriority = () => {
  console.log('1. Synchronous Start');

  // Macrotask
  setTimeout(() => console.log('4. Macrotask (setTimeout)'), 0);

  // Microtask via Promise
  Promise.resolve().then(() => console.log('2. Microtask (Promise)'));

  // Microtask via queueMicrotask
  queueMicrotask(() => console.log('3. Microtask (queueMicrotask)'));

  console.log('5. Synchronous End');
};

demonstratePriority();
// Execution Order:
// 1. Synchronous Start
// 5. Synchronous End
// 2. Microtask (Promise)
// 3. Microtask (queueMicrotask)
// 4. Macrotask (setTimeout)
Key Insight: queueMicrotask and Promise.then share the same queue. They execute in insertion order, but always before any macrotask, regardless of when the macrotask was scheduled.
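The "regardless of when" part is worth seeing concretely: even a microtask enqueued from *inside* another microtask still runs before an already-scheduled macrotask, because the queue is drained to empty before the event loop moves on.

```typescript
const order: string[] = [];

setTimeout(() => order.push('macrotask'), 0); // scheduled first

queueMicrotask(() => {
  order.push('microtask 1');
  // Enqueued while the microtask queue is draining; still beats the timer.
  queueMicrotask(() => order.push('microtask 2'));
});

setTimeout(() => console.log(order.join(' -> ')), 10);
// microtask 1 -> microtask 2 -> macrotask
```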
Pitfall Guide
1. Microtask Starvation
Explanation: Scheduling microtasks recursively (e.g., a microtask that schedules another microtask) prevents the event loop from reaching the render cycle. The UI freezes, and input events are ignored.
Fix: Limit microtask depth. Use setTimeout or requestAnimationFrame to yield control periodically.
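A minimal sketch of the fix, assuming a simple job-queue shape (drainQueue and yieldEvery are illustrative names, not platform APIs):

```typescript
// Drain a work queue without starving the event loop: hop to a
// macrotask every `yieldEvery` jobs so rendering and input can run.
async function drainQueue(jobs: Array<() => void>, yieldEvery = 100): Promise<void> {
  for (let i = 0; i < jobs.length; i++) {
    jobs[i]();
    if ((i + 1) % yieldEvery === 0) {
      // setTimeout (a macrotask) bounds microtask depth; recursively
      // scheduling via queueMicrotask here would never let the browser paint.
      await new Promise<void>(resolve => setTimeout(resolve, 0));
    }
  }
}
```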
2. The Zero-Delay Fallacy
Explanation: setTimeout(fn, 0) does not execute immediately. It schedules a macrotask to run after the current stack and all microtasks complete. In a busy loop, this can result in significant delays.
Fix: Use queueMicrotask for immediate follow-up logic. Use setTimeout only when you intend to yield to the browser.
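The delay is directly observable: a zero-delay timer still loses to every pending microtask, even ones scheduled after it.

```typescript
const seen: string[] = [];

setTimeout(() => seen.push('timeout 0'), 0); // scheduled before the microtasks

for (let i = 0; i < 3; i++) {
  queueMicrotask(() => seen.push(`microtask ${i}`));
}

setTimeout(() => console.log(seen.join(', ')), 10);
// microtask 0, microtask 1, microtask 2, timeout 0
```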
3. Render Blocking in Microtasks
Explanation: Performing heavy DOM manipulation or layout reads inside microtasks blocks the render cycle. This causes visual stuttering.
Fix: Batch DOM updates. Use requestAnimationFrame for visual changes to sync with the browser's paint cycle.
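The batching half of the fix can be sketched without a DOM. WriteBatcher is a hypothetical helper, not a browser API; in a real page the flush would be scheduled with requestAnimationFrame, but queueMicrotask is used here so the sketch runs anywhere:

```typescript
// Coalesce many update requests into a single flush pass.
class WriteBatcher {
  private pendingWrites: Array<() => void> = [];
  private scheduled = false;
  flushCount = 0;

  queueWrite(write: () => void): void {
    this.pendingWrites.push(write);
    if (!this.scheduled) {
      this.scheduled = true;
      // Browser version: requestAnimationFrame(() => this.flush());
      queueMicrotask(() => this.flush());
    }
  }

  private flush(): void {
    const writes = this.pendingWrites;
    this.pendingWrites = [];
    this.scheduled = false;
    this.flushCount += 1;
    for (const write of writes) write(); // one coalesced mutation pass
  }
}

// Three queued writes collapse into a single flush.
const batcher = new WriteBatcher();
let width = 0;
batcher.queueWrite(() => { width += 100; });
batcher.queueWrite(() => { width += 20; });
batcher.queueWrite(() => { width += 3; });
```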
4. Async/Await Blocking Misconception
Explanation: Developers sometimes assume await blocks the thread. It only pauses the async function; the thread remains free to process other tasks. The code after await becomes a microtask.
Fix: Trust the non-blocking nature of await. Focus on ensuring the awaited operation is truly asynchronous and not a synchronous wrapper.
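This is easy to verify: logging around an await shows the calling code continuing while the async function is suspended.

```typescript
const events: string[] = [];

async function task(): Promise<void> {
  events.push('before await');
  await Promise.resolve();      // suspends only this function
  events.push('after await');   // resumes later, as a microtask
}

const pending = task();
events.push('sync after call'); // runs while task() is suspended

pending.then(() => console.log(events.join(' | ')));
// before await | sync after call | after await
```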
5. Closure Variable Capture
Explanation: Using var in loops with asynchronous callbacks captures the variable by reference, leading to all callbacks seeing the final loop value.
Fix: Use let for block-scoped binding or wrap the callback in an IIFE.
// ❌ Incorrect
for (var i = 0; i < 3; i++) {
  setTimeout(() => console.log(i), 10); // Logs 3, 3, 3
}

// ✅ Correct
for (let i = 0; i < 3; i++) {
  setTimeout(() => console.log(i), 10); // Logs 0, 1, 2
}
6. MutationObserver Timing Confusion
Explanation: MutationObserver callbacks are microtasks but are scheduled specifically after DOM mutations. They may fire before or after Promise microtasks depending on the mutation timing, causing subtle race conditions.
Fix: Avoid relying on the relative order of MutationObserver and Promises. Use explicit state flags or queueMicrotask to sequence dependent logic.
7. RequestAnimationFrame vs SetInterval
Explanation: setInterval fires regardless of the render cycle, potentially causing dropped frames or excessive CPU usage when the tab is backgrounded.
Fix: Use requestAnimationFrame for animations. It syncs with the display refresh rate and pauses automatically when the tab is inactive.
Production Bundle
Decision Matrix
| Scenario | Recommended Approach | Why | Cost Impact |
|---|---|---|---|
| Update DOM immediately after state change | Microtask (queueMicrotask) | Ensures DOM updates before render, maintaining consistency. | Low latency, high throughput. |
| Yield to browser for input handling | Macrotask (setTimeout) | Allows event loop to process user input and render. | Slight delay, improved responsiveness. |
| Sync animation frame updates | requestAnimationFrame | Aligns with browser paint cycle, prevents jank. | Optimized for visual smoothness. |
| Process large dataset without freeze | Chunked Macrotask | Breaks work into manageable pieces, yields frequently. | Higher CPU overhead, preserves UI. |
| Observe DOM changes | MutationObserver | Efficiently batches DOM mutations for processing. | Low overhead, microtask priority. |
Configuration Template
A robust scheduler utility for managing asynchronous workloads in production.
export class TaskScheduler {
  private queue: Array<() => Promise<void>> = [];
  private isProcessing = false;

  enqueue(task: () => Promise<void>): void {
    this.queue.push(task);
    void this.scheduleNext();
  }

  private async scheduleNext(): Promise<void> {
    if (this.isProcessing || this.queue.length === 0) return;
    this.isProcessing = true;
    const task = this.queue.shift()!;
    try {
      await task();
    } catch (error) {
      console.error('Task failed:', error);
    } finally {
      this.isProcessing = false;
      void this.scheduleNext();
    }
  }

  clear(): void {
    // Drop pending tasks only. An in-flight task resets isProcessing in its
    // own finally block; resetting it here would let a newly enqueued task
    // run concurrently with the one still in flight.
    this.queue = [];
  }
}
// Usage
const scheduler = new TaskScheduler();

scheduler.enqueue(async () => {
  await fetch('/api/data');
  console.log('Task 1 complete');
});

scheduler.enqueue(async () => {
  await fetch('/api/other');
  console.log('Task 2 complete');
});
Quick Start Guide
- Identify Blocking Code: Use performance profiling to locate functions that run longer than 50ms on the main thread.
- Insert Yield Points: Add await new Promise(resolve => setTimeout(resolve, 0)) at logical breakpoints in long loops.
- Measure Impact: Re-profile the application to verify that frame rates improve and input latency decreases.
- Optimize Chunk Size: Adjust the work per chunk based on device performance. Smaller chunks improve responsiveness but increase overhead.
- Validate Microtasks: Ensure critical state updates use queueMicrotask to avoid race conditions with rendering.
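The yield point from step 2 can be wrapped as a small helper so long loops stay readable. yieldToEventLoop and the every-1,000-iterations interval below are illustrative choices, not fixed rules; tune the interval per step 5:

```typescript
const yieldToEventLoop = (): Promise<void> =>
  new Promise(resolve => setTimeout(resolve, 0));

async function sumWithYields(items: number[]): Promise<number> {
  let sum = 0;
  for (let i = 0; i < items.length; i++) {
    sum += items[i];
    // Hop to a macrotask periodically so render and input can run.
    if ((i + 1) % 1_000 === 0) await yieldToEventLoop();
  }
  return sum;
}
```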