How CDNs Work: Edge Delivery, Caching & Performance
Learn the interview-ready mental model, practical trade-offs, and production patterns for this web fundamentals topic.
A CDN is a distributed network of edge servers that cache and deliver content closer to users. It improves speed (shorter distance), reduces origin load (fewer requests), boosts reliability (distributed capacity), and adds security (edge protections). The real power lies in smart cache policies.
Instead of every customer traveling to the main factory (origin server) for every item, local warehouses (edge PoPs) keep popular items in stock. When a customer orders, they get it from the nearest warehouse. If the item isn't there, the warehouse quickly restocks from the factory and keeps it for the next customer.
1. How a CDN Works
Client → DNS/Anycast routes to nearest PoP → Edge checks cache key → Hit: serve immediately. Miss: fetch from origin, cache response, return to client.
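The hit/miss flow above can be sketched in a few lines. This is an illustrative model, not any real CDN's API: the names `EdgeCache`, `cache_key`, and `fetch_from_origin` are made up for the example.

```python
import time

class EdgeCache:
    """Toy model of one edge PoP's cache (illustrative names, not a real CDN API)."""

    def __init__(self, default_ttl=60):
        self.store = {}           # cache key -> (response, expiry timestamp)
        self.default_ttl = default_ttl

    def cache_key(self, method, url):
        # Simplest possible key: method + URL. Real edges may also vary the key
        # on headers (e.g. Accept-Encoding) or selected query parameters.
        return f"{method}:{url}"

    def get(self, method, url, fetch_from_origin):
        key = self.cache_key(method, url)
        entry = self.store.get(key)
        if entry and entry[1] > time.time():
            return entry[0], "HIT"        # fresh copy at the edge: serve immediately
        # Miss (or stale): fetch from origin, cache the response, return it.
        response = fetch_from_origin(url)
        self.store[key] = (response, time.time() + self.default_ttl)
        return response, "MISS"

cache = EdgeCache(default_ttl=60)
_, status = cache.get("GET", "/logo.png", lambda url: b"<image bytes>")
print(status)   # first request: MISS, filled from origin
_, status = cache.get("GET", "/logo.png", lambda url: b"<image bytes>")
print(status)   # repeat request within the TTL: HIT
```

The second request never touches the origin, which is where both the latency win and the origin-load reduction come from.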
2. Key Benefits
Faster load times, reduced bandwidth/egress costs, higher availability through distribution, and edge-layer security (DDoS protection, WAF).
3. Cache Policy Fundamentals
Cache key design (what counts as "the same" request), TTL / Cache-Control headers (how long a copy stays fresh), and purge strategy (how you evict stale content early) together determine hit ratio, freshness, and cost.
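In practice the edge TTL is usually derived from the origin's `Cache-Control` response header. A minimal sketch of that derivation, assuming a helper named `parse_max_age` (my name, not a library function): shared caches such as CDN edges honor `s-maxage` over `max-age`, and `no-store` / `no-cache` disable plain caching.

```python
import re

def parse_max_age(cache_control, default_ttl=0):
    """Return the edge TTL in seconds implied by a Cache-Control header value."""
    if cache_control is None:
        return default_ttl
    directives = [d.strip().lower() for d in cache_control.split(",")]
    if "no-store" in directives or "no-cache" in directives:
        return 0                      # don't serve from cache without revalidation
    for d in directives:
        # s-maxage applies specifically to shared caches (CDN edges, proxies)
        # and takes precedence over max-age for them.
        m = re.match(r"s-maxage=(\d+)", d)
        if m:
            return int(m.group(1))
    for d in directives:
        m = re.match(r"max-age=(\d+)", d)
        if m:
            return int(m.group(1))
    return default_ttl

print(parse_max_age("public, max-age=300"))               # 300
print(parse_max_age("public, s-maxage=600, max-age=60"))  # 600
print(parse_max_age("no-store"))                          # 0
```

Setting `max-age` low for browsers but `s-maxage` high for the CDN is a common pattern: you keep purges effective at the edge while avoiding long-lived stale copies on end-user devices.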
- ✓ CDN = distributed edge cache layer between users and origin
- ✓ Cache hits serve from a nearby PoP for speed and lower origin load
- ✓ Cache key + TTL + purge strategy control effectiveness
- ✓ Major benefits: speed, cost savings, reliability, security
- ✓ Not a replacement for good origin architecture
- ✓ Measure real impact on TTFB, LCP, and origin traffic
- ✓ Policy design matters more than just having a CDN