HTTP vs. Server-side Cache in Remix

When you build a web application, you may reach a point where some performance problems could be solved by adding a cache layer.

One of the most common ways to solve that server-side is to add a server-side cache like Redis or Memcached. Still, there's another way: you could leverage the HTTP Cache-Control header to cache in the HTTP layer. The main benefit is that you don't need to add another tool to your stack. You already have an HTTP layer, and most likely your app is behind a CDN, so you could use the Cache-Control header to cache in the CDN directly.

But, when should you choose the HTTP Cache-Control header over a server-side cache?

Server-side Cache

A server-side cache usually needs an extra tool like Redis, but it gives you way more control over your cache. The way it works is that you have some object (let's call it cache) with a few methods to get, set, or delete a value from the cache. You can usually also add a TTL (time to live), so an entry will expire after a specific time.

interface Cache {
  set<Value>(key: string, value: Value, ttl?: number): Promise<void>;
  get<Value>(key: string): Promise<Value>;
  del(key: string): Promise<void>;
  has(key: string): Promise<boolean>;
}
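To make that shape concrete, here is a minimal in-memory sketch of the interface (a real app would more likely back it with Redis or Memcached; `MemoryCache` is a made-up name, and expired entries are only dropped lazily when checked):

```typescript
// Minimal in-memory sketch of the Cache interface above.
// TTL is in seconds; entries are evicted lazily, on `has` checks.
class MemoryCache {
  private store = new Map<string, { value: unknown; expiresAt: number }>();

  async set<Value>(key: string, value: Value, ttl = Infinity): Promise<void> {
    this.store.set(key, { value, expiresAt: Date.now() + ttl * 1000 });
  }

  async get<Value>(key: string): Promise<Value> {
    return this.store.get(key)?.value as Value;
  }

  async del(key: string): Promise<void> {
    this.store.delete(key);
  }

  async has(key: string): Promise<boolean> {
    let entry = this.store.get(key);
    if (!entry) return false;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // expired, drop it
      return false;
    }
    return true;
  }
}
```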

Then in your loader you can do something like this:

let loader: LoaderFunction = async ({ request }) => {
  let userId = await authenticate.isAuthenticated(request);
  // check if it's cached
  if (await cache.has(`${userId}:routes/my/route`)) {
    // return it
    return await cache.get(`${userId}:routes/my/route`);
  }
  // if it's not cached, get the data somehow
  let data = await getData();
  // and cache it
  await cache.set(`${userId}:routes/my/route`, data);
  // and finally return the data
  return data;
};
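The check-then-fill dance in that loader can be extracted into a small helper so each loader only supplies a key, a TTL, and a function that fetches fresh data. A sketch (`getOrSet` is a hypothetical name, not part of any library):

```typescript
// The Cache interface from earlier in the post:
interface Cache {
  set<Value>(key: string, value: Value, ttl?: number): Promise<void>;
  get<Value>(key: string): Promise<Value>;
  del(key: string): Promise<void>;
  has(key: string): Promise<boolean>;
}

// Hypothetical helper wrapping the check-then-fill pattern: return the
// cached value if present, otherwise fetch, store, and return it.
async function getOrSet<Value>(
  cache: Cache,
  key: string,
  ttl: number,
  getFreshValue: () => Promise<Value>
): Promise<Value> {
  if (await cache.has(key)) {
    return await cache.get<Value>(key);
  }
  let value = await getFreshValue();
  await cache.set(key, value, ttl);
  return value;
}
```

With that, the loader body shrinks to a single getOrSet(cache, `${userId}:routes/my/route`, 60, getData) call.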

And in your action, you can invalidate the cache.

let action: ActionFunction = async ({ request }) => {
  let userId = await authenticate.isAuthenticated(request);
  // perform the action
  let form = await request.formData();
  await doSomething(form);
  // delete the cached data
  await cache.del(`${userId}:routes/my/route`);
  return redirect("/some/path");
};

A few things to note about the code above.

The cache key can include the user token or anything that helps you identify the user. This will give you a unique cache key per user on your server, so other users will not receive the same cached data. If you use the userId, the data will even stay cached after the user logs out or switches to another device, because the cache lives on the server.

The control you have over the cache key is so powerful that you could also build it without the userId and share the cache between users, which is valid for public data.

You may also note that I included the route path inside the app folder in the key. You don't need to do that, but I did it to get a unique key for each loader. If you use request.url instead, you can have multiple routes matching the same URL, so only one will remain cached, and when the user triggers one of those loaders again, it may get the cached data of another route.
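As a tiny sketch, that key can be built from the user and the route module id rather than the URL (`cacheKey` is a made-up helper):

```typescript
// Hypothetical key builder: one cache entry per user per route module,
// so two routes that resolve to the same URL never share an entry.
function cacheKey(userId: string, routeId: string): string {
  return `${userId}:${routeId}`;
}

cacheKey("user-1", "routes/my/route"); // "user-1:routes/my/route"
```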

HTTP Cache

The HTTP cache can work in two ways: as a public cache or a private cache.

A public cache lives in something you put between the server and the user, like a proxy, but usually a CDN. The private cache is the user's own browser, since each browser can cache any response.

The way to use the HTTP cache is by adding a Cache-Control header to a response. In a loader, you could do this:

let loader: LoaderFunction = async ({ request }) => {
  let userId = await authenticate.isAuthenticated(request);
  let data = await getData();
  return json(data, {
    headers: { "Cache-Control": "public, s-maxage=60" },
  });
};

And for a document request in Remix you could do this:

export let headers: HeadersFunction = () => {
  return {
    "Cache-Control": "public, s-maxage=60",
  };
};

That header says "cache this in the CDN, not in the browser, and keep it for 60 seconds" (s-maxage only applies to shared caches). The first user receiving that response makes the CDN cache it for 60 seconds, and for the next minute any request to the same URL will receive the cached response from the CDN.

This is really powerful because now the CDN can send responses without your server receiving any requests at all. Hence, you only need to process one request per minute.

Suppose your app hits the Hacker News front page and receives hundreds or thousands of requests per minute. In that case, you can use this to cache the response in the CDN and reduce your server load to one request per minute. With a server-side cache, in contrast, you would still receive every request and have to send a response.

A server-side cache may reduce the time your loader takes to process each request, since the data is cached. Still, it doesn't reduce the number of requests you process, and if you use something like Serverless, that means you still pay for a new lambda instance running.

There are also more things you can do with Cache-Control. You could implement a stale-while-revalidate pattern.
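With the stale-while-revalidate directive, the CDN can keep serving a stale copy for a while after s-maxage expires, refetching a fresh one in the background. A sketch with illustrative values (in Remix this would be the route's exported headers function):

```typescript
// Fresh in the CDN for 60 seconds; for 5 more minutes, keep serving the
// stale copy immediately while the CDN revalidates in the background.
let headers = () => {
  return {
    "Cache-Control": "public, s-maxage=60, stale-while-revalidate=300",
  };
};
```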

You could also cache in the browser and in the CDN simultaneously, with different max ages, so the browser caches for a few seconds while the CDN caches for longer.
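Combining max-age and s-maxage does exactly that (values are illustrative):

```typescript
// max-age applies to the browser, s-maxage to shared caches like a CDN:
// here the browser caches for 10 seconds and the CDN for 60.
let headers = () => {
  return {
    "Cache-Control": "public, max-age=10, s-maxage=60",
  };
};
```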

Another header called Vary lets you tell the CDN to key the cached response on certain request headers. If you vary on a header like Accept-Language, the response is cached per language, so if the user changes the language, you will send a different response. This is also useful to vary the cached response on the Cookie header: this way, an authenticated user will get a different response than other users.

Unauthenticated users, in contrast, would get a shared response (if they have the same Cookie header), or you could vary on several headers, like Cookie and Accept-Language, so public data in Spanish would receive a different response than public data in English.
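A sketch of that setup, varying on both headers (values illustrative; in Remix this would again be the route's headers export):

```typescript
// The CDN keys the cached response on Cookie and Accept-Language, so
// different sessions and different languages get different cached copies.
let headers = () => {
  return {
    "Cache-Control": "public, s-maxage=60",
    "Vary": "Cookie, Accept-Language",
  };
};
```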

The main issue with the Vary header is that not all CDNs support it. You need to ensure your CDN supports it before relying on it in your app; otherwise, you may send a cached response with private data to another user.

When to use each one

Going back to the first question: when should you choose each one?

Use a server-side cache when the performance problem you want to solve is a database query or API fetch taking a lot of time. This gives you way more control, and in case of a mistake, it's easier to clear the whole cache.

Use an HTTP cache when you need to reduce the number of requests reaching your server, or when your data is public and you don't have authentication, so all your users can receive the same cached response. Be sure not to cache for a long time, especially in users' browsers.