Cache Complexity Tradeoffs
"Let's add caching to make it faster."
6 hours later: "Why is the cached response wrong?"
Caching trades simplicity for speed. You now have to think about:
- When to invalidate
- Cache stampedes (see the sketch after this list)
- Stale data serving
- Cache warming
- Distributed cache consistency
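To make the stampede item concrete, here's a minimal sketch of a single-flight guard, assuming an in-process dict cache and a caller-supplied loader (names are illustrative, not from the incident): on a miss, only one thread recomputes the value while the others wait for it.

```python
import threading
import time

_cache = {}          # key -> (value, expires_at)
_locks = {}          # key -> lock that serializes recomputation
_locks_guard = threading.Lock()

def get_or_load(key, loader, ttl=60):
    """Return a cached value, recomputing it at most once per expiry."""
    now = time.monotonic()
    entry = _cache.get(key)
    if entry and entry[1] > now:
        return entry[0]                      # fresh hit

    # Miss or stale: make sure only one thread calls the loader.
    with _locks_guard:
        lock = _locks.setdefault(key, threading.Lock())
    with lock:
        # Re-check: another thread may have refilled while we waited.
        entry = _cache.get(key)
        if entry and entry[1] > time.monotonic():
            return entry[0]
        value = loader()                     # the expensive call (DB, API, ...)
        _cache[key] = (value, time.monotonic() + ttl)
        return value
```

You'd call it as something like `get_or_load("top_posts", load_top_posts, ttl=30)` with your own loader. A shared cache such as Redis needs the same idea, usually via a short-lived lock key or request coalescing.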
What happened:
We cached an API response that included user-specific data. The cache key didn't include the user ID. Every user saw the first user's data for 6 hours until someone noticed.
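A rough sketch of the failure mode, with hypothetical names (`CACHE`, `fetch_profile`, and `call_upstream` are illustrative): the buggy key is built from the endpoint alone, so whichever user's response is cached first gets served to everyone until the entry expires. Putting the user ID in the key is the fix.

```python
CACHE = {}

def call_upstream(endpoint, user_id):
    # Placeholder for the real API call.
    return {"endpoint": endpoint, "user": user_id}

def fetch_profile(user_id, endpoint="/v1/profile"):
    # Buggy key: every user shares one entry, so everyone gets
    # whichever user's response was cached first.
    # key = f"api:{endpoint}"

    # Fixed key: user-specific data needs a user-specific key.
    key = f"api:{endpoint}:user:{user_id}"

    if key not in CACHE:
        CACHE[key] = call_upstream(endpoint, user_id)
    return CACHE[key]
```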
Questions to ask before caching:
- Is the data user-specific?
- How often does it change?
- What's the cost of serving stale data?
- Can you easily invalidate it? (a sketch follows this list)
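For that last question, one workable answer is to drop the cache entry in the same code path that writes the data. A minimal sketch, with illustrative names (`DB` and `CACHE` stand in for the real datastore and cache):

```python
DB = {}      # stand-in for the real datastore
CACHE = {}   # stand-in for the real cache

def cache_key(user_id):
    return f"profile:{user_id}"

def update_profile(user_id, fields):
    DB[user_id] = fields                   # write the source of truth first
    CACHE.pop(cache_key(user_id), None)    # then drop the now-stale entry
```

Keeping the invalidation next to the write means there's no separate cleanup job to forget about.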
Lesson: The fastest cache is the one you don't need. Optimize the query first, and add caching only after you've measured the actual impact.