
The 6 Core Concepts for HTTP Caching

by Jerry, November 14th, 2022

Too Long; Didn't Read

Don’t spend hours and hours reading RFCs and documentation; just read this guide! We’ll go through the 6 core concepts you should know when working with HTTP caching. From there, it is easy to fill in the details as you go.



Content

  • Introduction

  • Cache-Control Headers

  • Validation

    • ETag
    • Validation Request
    • Validation Request outcomes
  • Vary

  • Request Collapsing

  • Response Staleness

    1. Serve stale while revalidating
    2. Revalidate before reuse
  • Deleting From The Cache

  • Conclusion

Introduction

HTTP caching is a standard that many software tools follow and are built on. These tools include browsers, content delivery networks (CDNs), proxy caches, gateways, and many more.


Because the standard is so widely used, these tools either implement it directly or are at least compatible with it.


That is exactly why it is worth understanding: learn the standard once, and you know how caching behaves in every tool built on top of it.


Illustration of common software tools built on the HTTP caching standard


After going through this guide, you should have a better understanding of how HTTP caching works!


It’s going to save you hours of reading and researching!


There are many details you could learn about HTTP caching, but I think these 6 concepts will give you enough to build a solid foundation.


It would be like learning the 80/20 of HTTP caching!


Here are the 6 concepts:


  1. Cache-Control headers
  2. Validation
  3. Vary
  4. Request Collapsing
  5. Response Staleness
  6. Deleting from the cache


Let’s dive right in!

1. Cache-Control headers

The Cache-Control header gives the server a way to instruct the client or a shared cache about the caching behavior of a particular resource.


Here is an example:


Cache-Control: max-age=3600 // 60 minutes


max-age is one of the most common directives. It specifies, in seconds, how long a cached response can be reused.


This is not the only directive; there are several others you can use.


Here are some of the more common types:


Common Cache-Control directives


For a full list of directives, check out MDN - Cache Control.


💡 Tip: You can also add several directives by separating them with a comma.

Here is an example: Cache-Control: public, max-age=3600
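
To make this concrete, here is a minimal Node.js sketch (in TypeScript) of how a server might attach these directives to its responses. The route path and max-age values are assumptions for illustration, not part of the standard.

// Hypothetical example: attaching Cache-Control directives to responses
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.url === "/logo.png") {
    // Static asset: may be stored by browsers and shared caches for one hour
    res.setHeader("Cache-Control", "public, max-age=3600");
  } else {
    // Personalized page: only the user's browser may cache it, and only briefly
    res.setHeader("Cache-Control", "private, max-age=60");
  }
  res.end("ok");
});

server.listen(3000);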

2. Validation

Of all the concepts, this is probably the most important one: validation.


When a resource in a cache becomes stale, the cache does not immediately remove it.


Rather, it “validates” the resource with the server by making a request that includes the If-None-Match header set to the ETag value.


In a later section, we’ll also discuss the different strategies for handling stale responses.

ETag

An ETag is typically a checksum (it can also be a hash or a version number) generated by the server and used to check whether or not a resource has changed.
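
As an illustration, here is a hedged sketch of one way a server could derive an ETag, by hashing the response body (the function name and the choice of hash are assumptions, not requirements of the standard):

// Hypothetical example: deriving an ETag as a checksum of the response body
import { createHash } from "node:crypto";

function computeEtag(body: string): string {
  const hash = createHash("sha1").update(body).digest("hex");
  return `"${hash}"`; // ETag values are sent as quoted strings
}

console.log(computeEtag("hello")); // e.g. "aaf4c61d..."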


Let’s see how this fits in with the validation request.

Validation request

To validate a cached response, the client sends a GET request to the server that includes the If-None-Match header with the ETag as its value.


Illustration of a validation request: client validating a response with server


The response to this request has two outcomes.

The validation request outcomes

There are only two outcomes from the validation request.


  1. Resource is not modified - The server responds with an HTTP 304 Not Modified, and the cached response remains valid and can be reused

  2. New resource is available - The server responds with an HTTP 200 OK and returns the latest version of the response


Illustration of the possible outcomes
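
Here is a hedged sketch of how a server might produce these two outcomes. The resource body and port are assumptions for illustration only:

// Hypothetical example: answering a validation request with 304 or 200
import { createServer } from "node:http";
import { createHash } from "node:crypto";

const body = "the current version of the resource"; // assumed resource
const etag = `"${createHash("sha1").update(body).digest("hex")}"`;

const server = createServer((req, res) => {
  if (req.headers["if-none-match"] === etag) {
    // Outcome 1: not modified - the cached response can keep being reused
    res.writeHead(304, { ETag: etag });
    res.end();
  } else {
    // Outcome 2: changed (or never cached) - return the latest version
    res.writeHead(200, { ETag: etag, "Cache-Control": "max-age=3600" });
    res.end(body);
  }
});

server.listen(3000);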


Now, what happens if we have variations of the resource but the same URL?


How do we go about caching that? That’s where the Vary header comes in.

3. Vary

When a response comes back from the server, it is typically cached based on its URL.


However, there are times when the same URL can produce several variations (e.g. different languages or compression formats).


Some examples include:


  • Accept-Language - The language or localization the client prefers
  • Accept-Encoding - The compression formats the client can accept for transferring the response between the client & origin server


So, what we can do in this scenario is provide a Vary header in the server’s response to alter the cache key.


Example: Caching when using Accept-Language for the Vary header


With the Vary header, you are not limited to just one header.


You can provide one or many of them in a comma-separated format (e.g. Vary: Accept-Language, Accept-Encoding); the values of those request headers will then be used as part of the cache key for storing the cached response.
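
To see how this affects storage, here is a hedged sketch of how a cache might build its key from the URL plus the request headers named in Vary (the key format is an assumption; real caches each use their own scheme):

// Hypothetical example: building a cache key from the URL plus the Vary'd request headers
import type { IncomingMessage } from "node:http";

function cacheKey(req: IncomingMessage, varyHeaders: string[]): string {
  const parts = [req.url ?? "/"];
  for (const name of varyHeaders) {
    // Each header listed in Vary contributes its request value to the key
    parts.push(`${name.toLowerCase()}=${req.headers[name.toLowerCase()] ?? ""}`);
  }
  return parts.join("|");
}

// e.g. "/home|accept-language=fr" and "/home|accept-language=en" are stored as separate entries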

4. Request collapsing

Another concept to understand in regard to HTTP caching is the idea of request collapsing.


When multiple clients request the same resource through a shared cache, the cache can reduce the number of requests forwarded to the origin server by collapsing them into one.


Let’s illustrate this.


When a request comes in for a resource that is not yet in the cache, the shared cache forwards the request to the origin server.


During this time, let’s say more requests arrive for the same resource.


In this case, the shared cache does not forward any more requests to the server; instead, it waits for the first request to complete.


Illustration of the request collapsing from multiple clients


Then, when the response comes back, the cache can use that same response to serve all the clients who requested the resource.


Illustration of shared response to serve multiple clients



⚠️ Important: This only applies to responses that can be shared across clients.
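
For intuition, here is a hedged sketch of the idea behind request collapsing (the in-memory map and function names are assumptions, not how any particular cache implements it):

// Hypothetical example: collapsing concurrent requests for the same cache key
const inFlight = new Map<string, Promise<string>>();

async function fetchWithCollapsing(
  key: string,
  fetchFromOrigin: () => Promise<string>
): Promise<string> {
  const pending = inFlight.get(key);
  if (pending) {
    // A request for this resource is already on its way to the origin - wait for it
    return pending;
  }
  // First request for this key: forward it, and let later callers share the result
  const promise = fetchFromOrigin().finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return promise;
}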


5. Response Staleness

When working with cached responses, it is also important to consider how to handle them once they become stale.


There are two common directives available that can be used to manage responses that have expired.


The strategy you choose to use will depend on the type of resource you are serving, and the experience you wish to provide.


The two options are:

  1. Serve stale while revalidating

  2. Revalidate before reuse


Illustration of the different options for managing stale responses


Let’s go through these options.

1. Serve stale while revalidating

When the response expires, this option (the stale-while-revalidate directive) serves the stale response, then revalidates with the server in the background.
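
As a concrete example (the durations here are assumptions), a response could be sent with:

Cache-Control: max-age=600, stale-while-revalidate=86400

This says the response is fresh for 10 minutes, and for up to a day after that the cache may keep serving the stale copy while it revalidates in the background.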


This means that some clients will temporarily get a stale response while the cache revalidates in the background. Just keep this in mind.


In some cases, you may want to always serve the latest response. So, this may or may not work depending on your use case.

2. Revalidate before reuse

When the response expires, this option (the must-revalidate directive) requires the cache to revalidate with the server before reusing a response.


If the resource has changed, the server returns the new response, which is then served to the client.


This means that the cache will revalidate immediately before serving a response to check if the current response is still up-to-date.


After doing so, it will either serve the existing response (if it hasn’t changed) or the new response to the client.
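
As a concrete example (again with assumed durations):

Cache-Control: max-age=3600, must-revalidate

Here the response is fresh for an hour; once that hour is up, the cache must successfully revalidate with the origin before serving it again.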


There is also Cache-Control: proxy-revalidate, which behaves like must-revalidate but only applies to shared caches.

What happens if the validation request fails?

This is not directly related to stale responses but since we are discussing the validation request, it may be worth mentioning.


Upon receiving an error response from the origin server, the default behavior of the shared cache varies.


Most shared caches will try to serve the stale response (if one is available), but keep in mind that this is not always the case.


If you want to support this behavior, you can use the Cache-Control: stale-if-error=[time-in-seconds] directive.


This translates to: if the shared cache receives an error response from the origin, it may serve the stale response for up to the defined duration.
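
For example (with assumed durations):

Cache-Control: max-age=600, stale-if-error=86400

If revalidation after the 10 minutes fails with an error from the origin, the cache may keep serving the stale response for up to a day.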


Illustration of using stale-if-error directive


Ok, we talked about managing stale responses but what about the cached responses that are no longer valid?

6. Deleting from the cache

Sometimes you may want to remove a stored response from the cache.


When working with a client cache or a shared cache, there are differences in how this is handled.

Client caches (Browser Cache)

When working with a client cache (browser cache), there isn’t a direct way to delete responses after they have been cached.


That means you would have to wait until the cached response expires before it can be changed.

There is a proposed HTTP header, Clear-Site-Data: "cache", which can be set on a response to clear the browser’s cache.


However, this may not be supported on all browsers, so just keep that in mind!

Shared caches

When working with a shared cache, you typically have more control over the items in the cache (if it is a self-managed service).


Most shared caches (e.g. CDNs, proxy caches, gateways) will provide an API to delete (or invalidate) items in the cache.


Ultimately, this means that when responses are stored in the client (browser) cache, you have less control over them.


When devising a caching strategy, you should take that into consideration.

Conclusion

As I mentioned in the introduction, this guide is not meant to cover everything.


Rather, it covers the major elements (the 6 core concepts) of HTTP caching that will help you better understand how it works.


Once you understand these elements, it should be just a matter of looking up the other details to fill in the gaps.


Let’s do a recap.


The Takeaways:


  • Cache-Control headers - This is the HTTP header that controls caching behavior, which you configure via directives


  • Validation - When a cached response goes stale, validation is when the cache reaches out to the server to confirm whether or not this response is up-to-date ⭐️


  • Vary - When working with variations of the response from the same URL, you can adjust the cache key based on other properties by using the Vary header


  • Request Collapsing - When multiple clients request the same resource, the shared cache will collapse the forwarded requests into one, then return the same server response to all the clients (assuming the response can be shared)


  • Response staleness - There are two common strategies for managing stale responses

    1. stale-while-revalidate - Serve the stale response while the cache revalidates in the background

    2. must-revalidate (or proxy-revalidate for shared caches) - The cache must revalidate with the server before serving the response


  • Deleting from the cache - Keep in mind that if you are caching responses using browser cache, there isn’t an easy way to invalidate or delete the cached responses


That’s it! I hope this guide was helpful.




If you found this helpful or learned something new, please share this article with a friend or co-worker 🙏🧡 (Thanks!)