Caching Strategies

Master the different caching patterns and when to use each one.


Caching is one of the most powerful tools for improving system performance. Understanding when and how to cache data is essential for system design interviews.

Cache-Aside (Lazy Loading)

The application manages the cache directly. On a cache miss, data is loaded from the database and stored in cache.

Pros: Only requested data is cached
Cons: Cache misses are slow (three round trips: cache check, database read, cache write); data can become stale
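A minimal sketch of the cache-aside flow, using plain dicts as stand-ins for a real cache (e.g. Redis) and database (the `db`, `cache`, and `get_user` names are illustrative, not from any particular library):

```python
db = {"user:1": {"name": "Ada"}}  # stand-in for the database
cache = {}                        # stand-in for the cache

def get_user(key):
    if key in cache:          # cache hit: serve directly from cache
        return cache[key]
    value = db.get(key)       # cache miss: read from the database
    if value is not None:
        cache[key] = value    # populate the cache for future reads
    return value
```

Note that the application owns all three steps; the cache itself is passive, which is why a miss costs the extra round trips.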

Write-Through

Data is written to cache and database simultaneously.

Pros: Cache is always fresh
Cons: Write latency increases (every write hits two systems); unused data may be cached
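A write-through sketch under the same dict stand-ins as above (`put_user` is an illustrative name): every write updates the cache and the database as part of the same synchronous operation, so a subsequent read never sees stale cache data.

```python
db = {}     # stand-in for the database
cache = {}  # stand-in for the cache

def put_user(key, value):
    cache[key] = value  # write to the cache...
    db[key] = value     # ...and to the database before returning
```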

Write-Behind (Write-Back)

Data is written to cache immediately, then asynchronously to the database.

Pros: Very low write latency; writes to the database can be batched
Cons: Risk of data loss if the cache fails before the flush
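A write-behind sketch, again with dict stand-ins: the synchronous path touches only the cache, and a queue of pending writes is drained later (here by an explicit `flush()`; a real system would use a background worker or timer). Everything still sitting in `pending` is exactly the data at risk if the cache node dies first.

```python
from collections import deque

db = {}            # stand-in for the database
cache = {}         # stand-in for the cache
pending = deque()  # writes accepted but not yet persisted

def put_user(key, value):
    cache[key] = value           # fast path: only the cache is updated
    pending.append((key, value)) # persistence is deferred

def flush():
    # Normally driven by a background worker; if the cache is lost
    # before this runs, the queued writes are lost with it.
    while pending:
        key, value = pending.popleft()
        db[key] = value
```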

Read-Through

Cache sits in front of the database. Application always reads from cache; cache fetches from DB on miss.

Pros: Application code is simpler (one read path, no miss-handling logic)
Cons: The first request for each key is slow, since the cache must fetch it from the database
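A read-through sketch: unlike cache-aside, the miss logic lives inside the cache itself, which is configured with a loader function for the backing store. The `ReadThroughCache` class is an illustrative shape, not a real library API.

```python
class ReadThroughCache:
    """The application only ever calls get(); the cache fetches
    from the backing store on a miss."""

    def __init__(self, loader):
        self._data = {}
        self._loader = loader  # function that reads from the database

    def get(self, key):
        if key not in self._data:
            self._data[key] = self._loader(key)  # miss: cache loads it
        return self._data[key]

db = {"user:1": {"name": "Ada"}}      # stand-in for the database
users = ReadThroughCache(db.get)      # cache wired to the database
```

Application code never touches `db` directly, which is what keeps it simple.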

Cache Eviction Policies

  • LRU (Least Recently Used) — Evict the item not used for the longest time
  • LFU (Least Frequently Used) — Evict the item used least often
  • FIFO — Evict in insertion order
  • TTL — Evict based on time-to-live
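As an illustration of the most commonly asked policy, here is a small LRU cache built on Python's `OrderedDict`, which keeps keys in insertion order and lets us move a key to the end on each access (the `LRUCache` class is a sketch, not a production implementation):

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()  # ordered oldest -> most recently used

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used
```

Both `get` and `put` are O(1); an LFU cache needs extra bookkeeping (per-key counters) to achieve the same bounds.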