Loader vs Route Cache Headers in Remix

Document Request

So, let's say you have a route at /my-super-route, and there you have a loader function, right? Maybe something like this:

// app/routes/my-super-route.tsx
export let loader: LoaderFunction = async () => {
  let data = await getData()
  // "some cache" stands in for a real value, e.g. "public, max-age=60"
  let headers = { "Cache-Control": "some cache" }
  return json(data, { headers })
}

If the user opens a new tab and goes to example.app/my-super-route directly, Remix will call that loader function server-side, get the data, and render the HTML. It will ignore the Cache-Control header set by the loader.

What happens if you want to cache the document request? You need to export a headers function:

// app/routes/my-super-route.tsx
export let headers: HeadersFunction = () => {
  // Again, "some cache" stands in for a real Cache-Control value
  return { "Cache-Control": "some cache" }
}

With this, when the user opens your route directly (this is called a document request), the response will use that Cache-Control header instead of the one set by the loader.

Client-Side Navigation

What happens if the user is on another route and navigates to /my-super-route? Remix will make a fetch to /my-super-route?_data=something-here-you-dont-care-about. That URL runs the loader and returns a response, and on that response you will see the headers from the loader.

Share Headers between Loader and Route

So the document request and the data request of a route have different caches. What if you want to use the same headers for both? Remix passes the loader's headers to the headers function as a parameter, so you can access the Headers object of the loader response.

export let headers: HeadersFunction = ({ loaderHeaders }) => {
  // Fall back to no caching if the loader didn't set the header
  return { "Cache-Control": loaderHeaders.get("Cache-Control") ?? "no-cache" }
}

This way you can reuse the headers from the loader. However, the benefit of using different headers is that you can cache the data for a longer time than the document, or vice versa, giving you more control over how data and documents are cached.
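As a sketch of that idea, the headers function below reuses the loader's Cache-Control but caps its max-age for the document, so the HTML is never cached longer than the data. The capMaxAge helper and the 60-second cap are illustrative, not part of Remix:

```typescript
// Hypothetical helper: reuse the loader's Cache-Control for the
// document, but cap max-age so the HTML shell expires sooner.
export function capMaxAge(value: string | null, cap: number): string {
  // No header from the loader? Don't cache the document at all.
  if (!value) return "no-cache"
  return value.replace(
    /max-age=(\d+)/,
    (_match, seconds) => `max-age=${Math.min(Number(seconds), cap)}`
  )
}

// Written as a plain function so it runs outside Remix; in a real
// route it would be typed as HeadersFunction.
export let headers = ({ loaderHeaders }: { loaderHeaders: Headers }) => {
  return { "Cache-Control": capMaxAge(loaderHeaders.get("Cache-Control"), 60) }
}
```

With this, a loader that sets "public, max-age=300" produces a document cached as "public, max-age=60", while the data requests keep the longer lifetime.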


A personal recommendation: never cache at the HTTP layer unless you are 100% sure of what you are doing. A bad Cache-Control value could leave a user unable to get fresh data from a loader, or stuck receiving the same stale HTML.

For example, if you accidentally cache the document request and then deploy a new version of your app, the user will keep getting the old document, which tries to load old scripts that may no longer exist on your server.

If you are a little more sure of what you are doing, I recommend starting by caching the loader response, and if you ever cache documents, do it for only a few seconds. That way, after a deploy the old document will not stay cached for long.
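One way to keep those two lifetimes consistent is a small builder for Cache-Control strings. cacheControl is a hypothetical helper, not a Remix API, and the durations here are just examples:

```typescript
// Hypothetical helper to build Cache-Control values, so the short
// document cache and the longer loader cache stay consistent.
export function cacheControl(opts: {
  maxAge: number
  sMaxAge?: number
  isPublic?: boolean
}): string {
  const parts: string[] = []
  if (opts.isPublic) parts.push("public")
  parts.push(`max-age=${opts.maxAge}`)
  if (opts.sMaxAge !== undefined) parts.push(`s-maxage=${opts.sMaxAge}`)
  return parts.join(", ")
}

// Loader data: safe to cache for a few minutes.
export const loaderCache = cacheControl({ isPublic: true, maxAge: 300 })
// Document: only a few seconds, so a new deploy propagates quickly.
export const documentCache = cacheControl({ isPublic: true, maxAge: 5 })
```

You would return loaderCache from the loader's headers and documentCache from the route's headers function.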