Suppose a small bank has only one teller. Customers take an average of 10 minutes to serve and they arrive at the rate of 5.8 per hour. What will the expected waiting time be?
Assuming customers arrive at random times and take a random amount of time to serve (averaging 10 minutes), as in the classic M/M/1 queueing model, the mean wait will be nearly 5 hours.
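As a back-of-the-envelope check, here is that number computed with the standard expected-wait formula for an M/M/1 queue (random arrivals, one server, random service times), plugging in the rates from the example:

```python
# Expected time spent waiting in line in an M/M/1 queue:
#   Wq = λ / (μ · (μ - λ))
# where λ is the arrival rate and μ is the service rate.
arrival_rate = 5.8 / 60  # λ: customers per minute
service_rate = 1 / 10    # μ: customers per minute (10-minute average service)

wait = arrival_rate / (service_rate * (service_rate - arrival_rate))
print(f"{wait:.0f} minutes ≈ {wait / 60:.1f} hours")  # 290 minutes ≈ 4.8 hours
```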
But what will the wait time be when you add a second bank teller?
You might be tempted to assume that the wait time simply gets cut in half. In fact, a second teller drops the average wait to about 3 minutes. That's an astounding 99% reduction. 🤯
Why is this true? Because this example is modeled on the real world, where arrival and service times are truly random. A single teller serving 5.8 customers per hour is running at 97% of their capacity of 6 per hour, so any burst of arrivals piles up into a queue that takes ages to drain. A second teller cuts utilization to 48%, leaving plenty of slack to absorb the bursts. If everyone arrived at regular intervals and took exactly 10 minutes to serve, there would be no waiting at all.
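You can see the effect with a quick Monte Carlo sketch. The `simulate_wait` helper below is my own illustration, not anything from the original example: it simulates random (exponential) interarrival and service times and hands each customer to whichever teller frees up first.

```python
import random

def simulate_wait(arrival_rate, service_rate, servers, n=300_000, seed=1):
    """Estimate the average wait (in minutes) in a queue with the
    given number of servers, via a simple event simulation."""
    rng = random.Random(seed)
    free_at = [0.0] * servers  # when each teller next becomes free
    t = 0.0
    total_wait = 0.0
    for _ in range(n):
        t += rng.expovariate(arrival_rate)          # next customer arrives
        i = min(range(servers), key=lambda j: free_at[j])
        start = max(t, free_at[i])                  # wait for earliest-free teller
        total_wait += start - t
        free_at[i] = start + rng.expovariate(service_rate)
    return total_wait / n

# Theory predicts roughly 290 minutes for one teller, ~3 minutes for two.
print(simulate_wait(5.8 / 60, 1 / 10, servers=1))
print(simulate_wait(5.8 / 60, 1 / 10, servers=2))
```

Because the single-teller system runs so close to capacity, its simulated average converges slowly; expect the first number to wander a bit around the theoretical value from run to run.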
When we're designing a system, it's easy to fall into the trap of assuming optimal conditions.
But the world is messy, and by building in slack, we acknowledge that best-laid plans often go awry.
Hat tip to Kottke, who provides additional examples and anecdotes of this phenomenon.