MP-301c · Module 1
Async Patterns & Event Loop Health
3 min read
Node.js MCP servers run on a single event loop. Every synchronous operation in a tool handler blocks every other operation — including other tool calls, health checks, and protocol messages. A handler that synchronously parses a 10MB JSON file blocks the entire server for the duration of the parse. The fix is not "make everything async" — it is understanding which operations are blocking and choosing the right async pattern for each: streaming for large data, worker threads for CPU-intensive computation, and standard async/await for I/O.
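The blocking-vs-non-blocking distinction is easy to see with file reads. A minimal sketch (the function names `loadConfigBlocking`/`loadConfig` are illustrative, not from any library):

```typescript
import { readFileSync } from "node:fs";
import { readFile } from "node:fs/promises";

// Blocking: parks the event loop until the entire file is read.
// Every other tool call, health check, and protocol message waits.
function loadConfigBlocking(path: string): unknown {
  return JSON.parse(readFileSync(path, "utf8"));
}

// Non-blocking: the read happens off the event loop, and other work
// proceeds while this handler awaits. Note that JSON.parse itself is
// still synchronous; for multi-megabyte payloads, move the parse to
// a worker thread as shown later in this module.
async function loadConfig(path: string): Promise<unknown> {
  return JSON.parse(await readFile(path, "utf8"));
}
```

Both return the same value; the difference is what the rest of the server can do in the meantime.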
Event loop lag is the canary for blocking operations. Measure it by scheduling a recurring timer at a known interval and tracking how late each callback actually fires; the excess delay is the lag. Healthy servers show < 5ms lag. Lag above 50ms means something is blocking the event loop, and tool call latency is degrading for all concurrent clients. Add event loop monitoring to your server and alert when lag exceeds your threshold. This catches blocking operations before they become user-visible performance problems.
// Event loop lag monitor
function startEventLoopMonitor(thresholdMs: number = 50) {
  let lastCheck = performance.now();
  function check() {
    const now = performance.now();
    const lag = now - lastCheck - 1000; // expected 1s interval
    if (lag > thresholdMs) {
      console.error(JSON.stringify({
        event: "event_loop_lag",
        lagMs: Math.round(lag),
        threshold: thresholdMs,
        severity: lag > thresholdMs * 3 ? "critical" : "warning",
      }));
    }
    lastCheck = now;
    setTimeout(check, 1000);
  }
  setTimeout(check, 1000);
}
// Use worker threads for CPU-heavy operations
import { Worker } from "worker_threads";

async function cpuIntensiveInWorker<T>(scriptPath: string, data: unknown): Promise<T> {
  return new Promise((resolve, reject) => {
    const worker = new Worker(scriptPath, { workerData: data });
    worker.on("message", resolve);
    worker.on("error", reject);
    worker.on("exit", (code) => {
      if (code !== 0) reject(new Error(`Worker exited with code ${code}`));
    });
  });
}
// Handler using worker thread for heavy computation
async function analyzeHandler(args: { data: string }) {
  // Offload CPU work to a worker thread
  const result = await cpuIntensiveInWorker<AnalysisResult>(
    "./workers/analyze.js",
    { data: args.data },
  );
  return { content: [{ type: "text" as const, text: JSON.stringify(result) }] };
}
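The worker side of this pattern (the ./workers/analyze.js script referenced above) receives workerData, does the CPU-heavy work off the main thread, and posts the result back. A minimal sketch, where the word-count body of analyze is a stand-in for whatever computation your tool actually performs:

```typescript
import { parentPort, workerData } from "worker_threads";

// Stand-in for the CPU-heavy analysis; replace with real work.
export function analyze(input: { data: string }): { words: number; chars: number } {
  const words = input.data.split(/\s+/).filter(Boolean).length;
  return { words, chars: input.data.length };
}

// parentPort is null when this file is imported outside a worker
// (e.g. in tests), so guard before posting back to the main thread.
if (parentPort) {
  parentPort.postMessage(analyze(workerData as { data: string }));
}
```

Because the result travels through postMessage, it must be structured-cloneable: plain objects, arrays, and primitives, not class instances or functions.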
Do This
- Use async/await for all I/O operations — database, network, filesystem
- Offload CPU-intensive work (parsing, analysis, compression) to worker threads
- Monitor event loop lag continuously and alert on threshold violations
- Stream large responses instead of buffering them in memory
Avoid This
- Using fs.readFileSync or other sync APIs in tool handlers — they block all concurrent operations
- Assuming JSON.parse/stringify are "fast enough" for multi-megabyte payloads
- Ignoring event loop lag because "it only affects other clients" — it affects your tool too
- Spawning a worker thread for every tool call — thread creation overhead defeats the purpose for fast operations
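The streaming advice above can be sketched with Node's readable streams, which support async iteration. This hypothetical countLines helper processes a file of any size with bounded memory, yielding back to the event loop between chunks:

```typescript
import { createReadStream } from "node:fs";

// Process a large file in fixed-size chunks instead of buffering it
// whole: memory stays bounded by highWaterMark, and the event loop
// runs other work between chunks.
async function countLines(path: string): Promise<number> {
  const stream = createReadStream(path, { highWaterMark: 64 * 1024 });
  let lines = 0;
  for await (const chunk of stream) {
    for (const byte of chunk as Buffer) {
      if (byte === 0x0a) lines++; // "\n"
    }
  }
  return lines;
}
```

The same for-await shape applies to any Readable, so a handler can transform or forward data chunk by chunk rather than holding the full payload in memory.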