
The Differences between Shared and Private Caching

by Jerry, November 14th, 2022

Too Long; Didn't Read

Do you know the difference between private and shared caches? It’s quite important to know the difference! You may be unintentionally using the wrong one for your data!



Content

  • Introduction
  • Benefits of caching
    • Performance and UX
    • Reliability
    • Availability
  • Private caches
  • Shared caches
  • How it all fits together
  • Conclusion

Introduction

Caches are an essential part of every infrastructure, whether on the client side, at the origin server, or anywhere in between.


Illustration of the types of caches from the client to the origin server


The caches shown above are HTTP caching compliant, meaning they are built on top of the HTTP Caching standard.


Some modern caching tools are not limited to this standard: they support HTTP caching, but they also provide additional features for you to better manage the lifecycle of your resources 🤘.


These caches can be divided into two categories:

  • Private caches

  • Shared caches


An Analogy


A good analogy for understanding the difference is ordering a meal at a restaurant.


Private caches would be similar to a meal you ordered for yourself whereas shared caches would be the things you ordered to share with the group.


It’s not a perfect analogy, because you can still share a meal you ordered for yourself, but you get the idea!


Before we look at the two caches in detail, let’s better understand why we would want to cache in the first place!

Benefits of caching

There are benefits to adding a cache between the client and server.


Most of the time when we talk about adding a cache, it’s typically about performance. However, the benefits don’t stop there!


Let’s review the benefits.

Performance and UX

Ultimately, the faster the requests, the smoother the user experience is going to be.


Serving responses from a cache saves the round trip from the client to the server, which reduces response latency.


Sometimes the performance boost is minor, but sometimes it makes a big difference!


This is especially true when you are distributing your content via a content delivery network (CDN), which sits closer to the end users.

Reliability

When you have a cache in front of your origin server, it absorbs part of the traffic and takes some of the stress off the server.


Rather than going to the server, the cache will handle the request by responding with a result that it has already received from the server.


This leads to better reliability: less traffic reaches the origin, so it is less likely to be overloaded and degraded.

Availability

A cache also adds to the availability of your service when the origin experiences downtime.


In this scenario, the cache can continue to serve stale resources (if it is applicable and the services are decoupled).


It’s not perfect but it does still help!
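

For instance, the stale-if-error extension from RFC 5861 lets you opt into this behavior explicitly. Here is a minimal sketch using Node’s built-in http module (the endpoint and durations are made up, and support for these extensions varies by cache):

```typescript
import http from "node:http";

// Hypothetical endpoint: downstream caches may serve this response for up
// to 10 minutes, and may keep serving the stale copy for up to a day if
// the origin becomes unreachable (RFC 5861; support varies by cache).
http
  .createServer((req, res) => {
    res.setHeader("Cache-Control", "max-age=600, stale-if-error=86400");
    res.end("ok");
  })
  .listen(3000);
```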


All in all, the cache that sits in front of the server acts like a buffer.


Most of the time, caching deals with shared resources, but what do we do when we are dealing with personalized resources?


That’s where private caches come in!

Private caches

Illustration of the private cache (i.e., the browser cache)


For resources that you don’t want to share with anyone except one client, you should use a private cache.


The way to hint to caches that a resource should not be stored in a shared cache is to use the private directive (Cache-Control: private).


This tells caches that the resource may only be stored in a private cache (i.e., the browser cache).


Any other resources (with applicable Cache-Control headers) can be stored in a shared cache, meaning the resources are shared between multiple clients.
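

As a rough sketch, here is how a server might set both directives using Node’s built-in http module (the routes, payload, and max-age values are made up for illustration):

```typescript
import http from "node:http";

const server = http.createServer((req, res) => {
  if (req.url === "/api/account") {
    // Personalized response: only the user's own browser cache may store it.
    res.setHeader("Cache-Control", "private, max-age=60");
    res.end(JSON.stringify({ name: "Jane", plan: "pro" }));
  } else {
    // Shared resource: any cache (browser, reverse proxy, CDN) may store it.
    res.setHeader("Cache-Control", "public, max-age=3600");
    res.end("<html>...</html>");
  }
});

server.listen(3000);
```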

Shared caches

Illustration of shared caches (i.e., reverse proxies, CDNs, service workers)


Shared caches are typically used to store shared resources. These could be images, videos, HTML or JS files.


It works as follows:


  1. First request - the cache fetches the resource from the origin server, then stores a copy

  2. Subsequent requests - the cache serves the stored copy directly


The performance gains come from reusing the cached resource on subsequent requests.


In a shared cache, these resources can be used by multiple clients, and not just one.
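

Here is a toy model of that flow in TypeScript (the in-memory Map and fixed TTL are simplifications; real shared caches also honor Cache-Control, Vary, and validation headers):

```typescript
// One store shared by every client that goes through this cache.
const store = new Map<string, { body: string; expires: number }>();

async function cachedFetch(url: string, ttlMs = 60_000): Promise<string> {
  const hit = store.get(url);
  if (hit && hit.expires > Date.now()) {
    return hit.body; // subsequent requests: served straight from the cache
  }
  const response = await fetch(url); // first request: go to the origin
  const body = await response.text();
  store.set(url, { body, expires: Date.now() + ttlMs });
  return body;
}
```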


Shared caches come in various forms. They can be:


  • a reverse proxy like Apache or Nginx
  • a Content Delivery Network (CDN) like AWS CloudFront or Cloudflare
  • a service worker (a proxy-like script that runs in the browser)


They sit between the client and the origin server, which typically means the setup is compatible with HTTP caching.


The only exception is a service worker - it works a little differently.


The service worker sits between the client and the browser cache, intercepting requests before they go out.


Even though it doesn’t directly support HTTP caching, it is still closely related to it.
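

For instance, a bare-bones cache-first service worker might look like this (the cache name "v1" and the cache-first policy are illustrative choices, not requirements):

```typescript
// sw.ts - runs in the browser and intercepts fetches before they go out.
// (Compile with TypeScript's "webworker" lib.)
declare const self: ServiceWorkerGlobalScope;

self.addEventListener("fetch", (event: FetchEvent) => {
  event.respondWith(
    caches.open("v1").then(async (cache) => {
      const hit = await cache.match(event.request);
      if (hit) return hit; // answered without ever touching the network
      const response = await fetch(event.request); // fall through to network
      // A production worker should also check request.method and response.ok.
      cache.put(event.request, response.clone());
      return response;
    })
  );
});
```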

How it all fits together

When a request goes from the client to the origin server, there are a few points along the way for you to cache your resource.


The points include:

  • The client (browser)
  • Content Delivery Network (CDN)
  • Reverse proxy
  • Service worker


Where you decide to cache and the overall caching strategies that you choose will depend on your use case.


Illustration of the types of caches a request may go through on its journey from the client to the server



One important thing to consider is how you will perform invalidation, because with some caches (i.e., the browser cache) you have no direct control over invalidation.


When there is no way to explicitly invalidate a cache on the client side, a resource stays cached for the duration specified by its Cache-Control header.


That said, there are headers like Clear-Site-Data that address this, but they are not supported by all browsers yet.
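

If you do want to experiment with it, sending the header looks roughly like this (the logout route is hypothetical, and browsers without support simply ignore the header):

```typescript
import http from "node:http";

http
  .createServer((req, res) => {
    if (req.url === "/logout") {
      // Ask the browser to drop cached responses and cookies for this
      // origin. Note that the directive values must be quoted strings.
      res.setHeader("Clear-Site-Data", '"cache", "cookies"');
    }
    res.end("logged out");
  })
  .listen(3000);
```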


With a CDN, a reverse proxy, or a service worker, you have full control over the caching behavior of that service.


This is just a small gotcha to keep in mind when coming up with your caching strategy.

Conclusion


To recap, here are some takeaways:


  • The benefits of caching include better performance, UX, reliability, and availability.


  • Caches can be divided into two categories:

    1. Private - used for caching private resources for a particular client
      • On the client (browser cache)
    2. Shared - used for caching resource(s) for multiple clients
      • Reverse proxy

      • Content delivery network

      • Service worker


  • You have limited control over invalidation when leveraging a browser cache (gotcha)



Note: In this overview, I didn’t talk much about server-side caches (e.g., Redis or Memcached) because they are not governed by the HTTP caching standard.


And that’s it! I hope you learned something new!


If you found this helpful or learned something new, please share this article with a friend or co-worker 🙏❤️! (Thanks!)


