Async Patterns
Status: Complete
Category: Performance
Default enforcement: Advisory
Author: PushBackLog team
Tags
- Topic: performance, architecture
- Skillset: backend, frontend
- Technology: generic
- Stage: execution, review
Summary
Asynchronous programming patterns — including promises, async/await, queues, and event-driven architectures — allow work to proceed without blocking on slow operations (I/O, network, computation). Choosing the right pattern for the right problem prevents bottlenecks and keeps systems responsive under load.
Rationale
I/O is slow; blocking it is expensive
In most web applications, response time is dominated by I/O: database queries, external API calls, file reads. These operations take orders of magnitude longer than CPU operations. A thread or event loop iteration that waits for each I/O operation to complete before starting the next wastes time that could be spent on other work.
Asynchronous patterns allow the runtime to initiate an I/O operation, register a callback or continuation, and move on to process other requests while waiting. On Node.js, the single-threaded event loop handles thousands of concurrent connections this way. In other runtimes (Java's threads, Python's asyncio), the same principle applies: never block a thread, or the event loop, on I/O.
The spectrum of async
| Scope | Pattern | Use case |
|---|---|---|
| Single operation | async/await | A single DB query or HTTP call |
| Multiple independent operations | Promise.all / concurrent futures | Fetching user + settings + permissions in parallel |
| Fan-out to variable targets | Promise concurrency control | Processing 1,000 items with bounded parallelism |
| Long-running background work | Job queue (BullMQ, Sidekiq, Celery) | Email sending, report generation, webhooks |
| Event stream processing | Message queue (Kafka, SQS, RabbitMQ) | Cross-service communication, audit logs |
Guidance
Parallelise independent I/O operations
The most common and impactful async improvement: replace sequential awaits with parallel execution where operations are independent.
```javascript
// Sequential: 300ms (100 + 100 + 100)
const user = await getUser(userId);
const permissions = await getPermissions(userId);
const preferences = await getPreferences(userId);

// Parallel: ~100ms (all three concurrent)
const [user, permissions, preferences] = await Promise.all([
  getUser(userId),
  getPermissions(userId),
  getPreferences(userId),
]);
```
Rule: any two await statements that don’t depend on each other’s result should be Promise.all’d.
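One caveat: Promise.all fails fast, so a single rejection rejects the whole batch. When partial results are acceptable, Promise.allSettled resolves with a status per operation instead. A minimal sketch, with hypothetical fetchers standing in for real I/O calls:

```javascript
// Hypothetical fetchers standing in for real DB/HTTP calls.
const getUser = async (id) => ({ id, name: 'Ada' });
const getPermissions = async () => { throw new Error('permissions service down'); };
const getPreferences = async () => ({ theme: 'dark' });

async function loadProfile(userId) {
  // allSettled never rejects; each entry is { status, value } or { status, reason }
  const [user, permissions, preferences] = await Promise.allSettled([
    getUser(userId),
    getPermissions(userId),
    getPreferences(userId),
  ]);
  return {
    user: user.status === 'fulfilled' ? user.value : null,
    permissions: permissions.status === 'fulfilled' ? permissions.value : [],
    preferences: preferences.status === 'fulfilled' ? preferences.value : {},
  };
}
```

Use Promise.all when any failure should abort the request; use allSettled when the page can degrade gracefully with defaults.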
Bounded concurrency for bulk operations
```javascript
import pLimit from 'p-limit';

const limit = pLimit(5); // Max 5 concurrent requests
const results = await Promise.all(
  items.map(item => limit(() => processItem(item)))
);
```
Unbounded Promise.all on N items exhausts connection pools, trips rate limits on external APIs, and can cause out-of-memory errors. Always set a concurrency limit for bulk operations.
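Where adding a dependency isn't an option, the same bounded-concurrency behaviour can be sketched in a few lines: spawn `limit` workers that each pull the next index until the array is drained (the helper name here is illustrative):

```javascript
// Minimal dependency-free concurrency limiter: runs at most `limit`
// invocations of `fn` at a time, preserving result order.
async function mapWithConcurrency(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const i = next++; // claim the next index (single-threaded, so no race)
      results[i] = await fn(items[i]);
    }
  }
  // Spawn up to `limit` workers that drain the shared index together.
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}
```

This is a sketch, not a library replacement: p-limit and friends also handle per-task error propagation and cancellation.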
Job queues for work that can be deferred
Not all work needs to happen within the HTTP response cycle. Work that:
- Takes more than ~500ms
- Can fail and needs retry logic
- Should not block the user’s response
…belongs in a job queue:
```javascript
// HTTP handler: fast response, enqueue work
app.post('/reports', async (req, res) => {
  const job = await reportQueue.add('generate', { userId: req.user.id, params: req.body });
  res.json({ jobId: job.id, status: 'queued' });
});

// Worker: runs separately, retries on failure
reportQueue.process('generate', async (job) => {
  const report = await generateReport(job.data);
  await notifyUserReportReady(job.data.userId, report.url);
});
```
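Most queue libraries let retry policy be declared at enqueue time. As a hedged sketch using BullMQ-style options (option names follow BullMQ's API; check your library's documentation for its equivalents):

```javascript
// Sketch: BullMQ-style retry options for the enqueue call above.
const job = await reportQueue.add(
  'generate',
  { userId: req.user.id, params: req.body },
  {
    attempts: 3, // retry up to 3 times on failure
    backoff: { type: 'exponential', delay: 1000 }, // 1s, 2s, 4s between attempts
    removeOnComplete: true, // don't accumulate finished jobs in Redis
  }
);
```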
Examples
Await in a loop (sequential anti-pattern)
```javascript
// Bad: each email sent one at a time — N seconds for N users
for (const user of users) {
  await emailService.send(user.email, template); // Sequential
}

// Better: bounded parallel (pLimit imported from the p-limit package, as above)
const limit = pLimit(10);
await Promise.all(
  users.map(user => limit(() => emailService.send(user.email, template)))
);

// Best for large volumes: enqueue one job per user; a worker handles each individually
```
Unhandled promise rejections
```javascript
// Bad: fire-and-forget with no error handling
sendWelcomeEmail(user.email); // If this rejects, the error is silently swallowed

// Good: always handle rejections
sendWelcomeEmail(user.email).catch(err =>
  logger.error('Welcome email failed', { err, userId: user.id })
);

// Better for non-critical async work: enqueue it
await emailQueue.add('welcome', { email: user.email });
```
Anti-patterns
1. Blocking I/O on the event loop / main thread
Synchronous file reads (fs.readFileSync), CPU-intensive loops, or sleep() calls in Node.js block the event loop for all requests. Use async variants; offload CPU work to worker threads or a queue.
2. Unhandled promise rejections
A rejected promise with no .catch() or try/catch fails silently. Since Node.js 15, an unhandled rejection crashes the process by default. Always handle async errors.
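As a last-resort safety net (a complement to, not a substitute for, per-call handling), a process-level listener can log anything that slips through:

```javascript
// Last-resort safety net: log any rejection that escaped local handlers.
// Registering a listener also prevents Node's default crash-on-rejection.
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason);
  // In production you might flush logs and exit, since state may be corrupt:
  // process.exit(1);
});
```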
3. Sequential awaits for independent operations
The most common performance mistake in async/await code. Two awaits that don’t need each other’s result should run in parallel.
4. Unbounded Promise.all on large arrays
Promise.all(thousandItems.map(processItem)) creates 1,000 concurrent DB connections or HTTP calls. Use p-limit or equivalent to bound concurrency.
5. Using queues for synchronous operations
The inverse mistake: routing simple CRUD operations through a message queue “for scalability” when a direct database call would be instant and sufficient. Queues add latency and operational complexity; they’re justified when the work is genuinely background, deferrable, or cross-service.
Related practices
Part of the PushBackLog Best Practices Library.