Throughput vs. Latency: The Highway Analogy

Understanding the fundamental trade-offs in system design.

In the world of concurrency and distributed systems, two metrics reign supreme: Throughput and Latency. While often discussed together, they represent distinct aspects of system performance, and optimizing for one often comes at the cost of the other.

Definitions

Throughput is the amount of work a system completes per unit of time, e.g. requests per second. Latency is the time it takes to complete a single unit of work, e.g. milliseconds per request. Throughput describes the system as a whole; latency describes the experience of one request.
The Highway Analogy

Imagine a highway. Throughput is the number of cars that pass a given point per hour; latency is the time one car takes to travel from entrance to exit. Adding lanes raises the highway's capacity (its maximum throughput), but no amount of lanes makes a single car go faster than the speed limit (its minimum latency).

Notably, heavy traffic (an attempt at high throughput) often leads to congestion, which drastically increases the time it takes any single car to arrive (high latency).

Interactive Visualization

Below is a simulation of a highway. You can adjust the number of lanes (capacity) and the arrival rate of cars (load). Observe how increasing the load impacts the speed of individual cars once the capacity is reached.
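The behavior the visualization demonstrates can also be reproduced numerically. The following is a minimal sketch (not from the original article) of a single-lane "toll booth" queue: cars arrive at a given rate and are served one at a time; the names, rates, and seed are all illustrative.

```python
import random

def simulate(arrival_rate, service_rate, n_cars=10000, seed=42):
    """Single-lane queue: cars arrive at `arrival_rate` per second and
    are served at `service_rate` per second. Returns the average time
    a car spends in the system (its latency)."""
    rng = random.Random(seed)
    clock = 0.0      # current simulation time
    free_at = 0.0    # when the lane next becomes free
    total_time = 0.0
    for _ in range(n_cars):
        clock += rng.expovariate(arrival_rate)  # next car arrives
        start = max(clock, free_at)             # wait if the lane is busy
        free_at = start + rng.expovariate(service_rate)
        total_time += free_at - clock           # time spent in the system
    return total_time / n_cars

# As load approaches capacity (service_rate = 1.0), latency climbs sharply
# even though throughput only inches up toward the arrival rate.
for load in (0.5, 0.8, 0.95):
    print(f"load {load:.2f}: avg latency {simulate(load, 1.0):.2f}s")
```

Running this shows latency growing nonlinearly as the arrival rate nears capacity, which is exactly the congestion effect in the highway analogy.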

Little's Law

These concepts are mathematically related by Little's Law:

L = λ * W

Where:

- L is the average number of items in the system (cars on the highway),
- λ (lambda) is the average arrival rate, i.e. throughput, and
- W is the average time an item spends in the system, i.e. latency.
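As a quick worked example (the numbers below are hypothetical, not from the article), Little's Law lets you compute how many requests are in flight given a throughput and an average latency:

```python
# Little's Law: L = lambda * W
arrival_rate = 100.0    # lambda: requests per second (throughput)
avg_latency = 0.250     # W: seconds per request (latency)

in_flight = arrival_rate * avg_latency  # L: avg requests in the system
print(in_flight)  # 25.0
```

A server handling 100 requests per second at 250 ms each therefore holds 25 requests in flight on average, a useful sanity check when sizing thread pools or connection limits.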
Summary

When designing systems, decide what is more critical for your user experience. A real-time trading system needs low latency. A batch processing system for payroll needs high throughput.
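One common lever behind this trade-off is batching. As a rough sketch under assumed costs (a fixed per-request overhead plus per-item work; both numbers are illustrative), larger batches raise throughput while also raising the latency of each item:

```python
# Hypothetical cost model: every request pays a fixed overhead,
# plus a per-item processing cost.
overhead = 0.010  # 10 ms of setup per request/batch
work = 0.001      # 1 ms of processing per item

def stats(batch_size):
    """Return (latency, throughput) for a given batch size."""
    latency = overhead + batch_size * work  # time until the batch finishes
    throughput = batch_size / latency       # items completed per second
    return latency, throughput

for n in (1, 10, 100):
    latency, tput = stats(n)
    print(f"batch={n:4d}  latency={latency*1000:6.1f} ms  "
          f"throughput={tput:7.1f} items/s")
```

Batching amortizes the fixed overhead across many items, which is why batch systems favor it and latency-sensitive systems avoid it.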