Caching is a great way to offload servers and networks by reducing the number of requests flowing into and out of a network. The concept behind caching is simple: store the most frequently accessed content as close to the end user as possible to improve performance. Caching can be applied in many areas: server-side caching, client-side caching, edge caching, and transparent caching. When accessing a single web page, a user may encounter each of these caches.

Server-side Caching

One of the primary purposes of caching within the data center is to reduce the load on web servers and build a more scalable infrastructure. Web servers can quickly become overwhelmed with requests for static content; it is better to let the servers handle high-value transactions than to tie them up serving static assets. Caching content closer to the edge of the data center improves application response time and reduces server load.
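A minimal sketch of the idea: an in-memory cache with a time-to-live (TTL) sits in front of an expensive origin fetch, so repeated requests for the same static object never reach the backend while the cached copy is fresh. The class and `fetch_from_origin` function are illustrative, not part of any particular server.

```python
import time

class TTLCache:
    """Sketch of a server-side in-memory cache with per-entry expiry."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expires_at, value)

    def get(self, key, fetch):
        entry = self.store.get(key)
        now = time.monotonic()
        if entry and entry[0] > now:
            return entry[1]            # cache hit: serve the stored copy
        value = fetch(key)             # cache miss: go back to the origin
        self.store[key] = (now + self.ttl, value)
        return value

def fetch_from_origin(path):
    # Hypothetical stand-in for an expensive backend call
    # (disk read, database query, upstream request).
    return f"<html>content for {path}</html>"

cache = TTLCache(ttl_seconds=30)
first = cache.get("/index.html", fetch_from_origin)   # miss: fetched from origin
second = cache.get("/index.html", fetch_from_origin)  # hit: served from memory
```

Real deployments typically use a dedicated layer such as a reverse proxy or an object cache rather than application-local dictionaries, but the hit/miss/TTL logic is the same.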

Edge Caching

A logical extension of server-side caching is edge caching. If the application performs better when content is served from the edge of the data center, it will perform better still when content is served from the edge of the network. Content Delivery Networks (CDNs) are the most common edge caching solution.

Transparent Caching

Transparent caches are deployed by ISPs to cache frequently accessed items at the edge of the operator’s network. Deploying a transparent cache saves the operator backhaul bandwidth and improves delivery to the end user. Neither end users nor content owners are aware that content was delivered from the operator’s network, hence the classification as a transparent cache.

Forward Proxy Cache

Forward proxy caching addresses outbound HTTP requests (traffic leaving a network) rather than inbound requests to an internet-facing property. Forward proxies are typically deployed to reduce upstream bandwidth utilization and costs.

Client-side Caching

Once a user has retrieved a web page, resources can be cached on the local device to improve performance on subsequent visits. Browsers determine what content to cache based on a variety of response headers, including Cache-Control, ETag, and Last-Modified. Client-side caching provides the greatest benefit for repeat visits because the request never leaves the end user’s device.
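The headers above drive two decisions a browser makes: whether a cached copy is still fresh (Cache-Control max-age), and, if it is stale, which conditional headers to send so the server can answer 304 Not Modified instead of resending the body. The sketch below assumes a simplified cached-response record; it is an illustration of the logic, not a browser implementation.

```python
import time

def is_fresh(cached, now):
    """True if the cached response can be reused without any request."""
    cache_control = cached["headers"].get("Cache-Control", "")
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
            return (now - cached["stored_at"]) < max_age
    return False

def revalidation_headers(cached):
    """Conditional request headers to send when the cached copy is stale."""
    headers = {}
    if "ETag" in cached["headers"]:
        headers["If-None-Match"] = cached["headers"]["ETag"]
    if "Last-Modified" in cached["headers"]:
        headers["If-Modified-Since"] = cached["headers"]["Last-Modified"]
    return headers

# Stored 10 seconds ago with a one-hour lifetime: still fresh.
cached = {
    "stored_at": time.time() - 10,
    "headers": {"Cache-Control": "max-age=3600", "ETag": '"abc123"'},
}
fresh = is_fresh(cached, time.time())  # reuse without touching the network
```

If the copy were stale, the browser would send the If-None-Match header; a matching ETag on the server yields a 304 response with no body, which is far cheaper than a full transfer.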

The diagram below illustrates how an end user may encounter each of these caches when requesting a web page. When the request reaches a cache along the chain, a cache hit results in that cache serving the object; a cache miss passes the request along the chain until the next cache is encountered.
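The hit/miss chain can be sketched as a walk through an ordered list of caches, falling through to the origin only when every cache misses. The cache names here mirror the types described above and are purely illustrative.

```python
def resolve(url, chain, origin):
    """Walk the cache chain; serve a hit, or fall through to the origin."""
    for name, cache in chain:
        if url in cache:
            return name, cache[url]   # cache hit: this hop serves the object
    body = origin(url)                # every cache missed: fetch from origin
    for _, cache in chain:
        cache[url] = body             # populate each cache on the way back
    return "origin", body

# Ordered from closest to the user to closest to the origin.
chain = [
    ("browser", {}),
    ("transparent", {}),
    ("cdn", {}),
    ("server-side", {}),
]
origin = lambda url: f"content:{url}"

hop1, _ = resolve("/logo.png", chain, origin)  # first request reaches the origin
hop2, _ = resolve("/logo.png", chain, origin)  # repeat is a browser cache hit
```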

CachingFlowDiagram

Future blog posts in this series will offer an in-depth look at server-side and client-side caching.