How to Measure performance with the Server-Timing header in React Router

The Server-Timing header lets you add performance measurements to your response headers so you can later inspect them from the client (e.g. the browser).
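
For example, a response carrying this header might include something like the following (the names and durations here are just illustrative):

Server-Timing: db;dur=53.2, cache;desc="Cache Read";dur=23.1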

If you see that some of your routes have slow response times, you may want to find out what's happening. While adding a few console.time calls may work great locally, it's not that useful in production, especially when many users are consuming your app at the same time and your logs get mixed with other users' logs.

While you could implement this manually by tracking times and adding headers, there's a better approach using the Server Timing middleware from Remix Utils.
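
For reference, the manual version could look something like the sketch below, using performance.now and the data helper from React Router. Here getProducts is a hypothetical data helper, and on document requests you may also need a headers export to forward loader headers:

import { data } from "react-router";

export async function loader() {
  let start = performance.now();
  let products = await getProducts(); // hypothetical data helper
  let duration = performance.now() - start;

  // Attach the measurement to the response headers yourself
  return data(
    { products },
    {
      headers: {
        "Server-Timing": `fetch-products;dur=${duration.toFixed(1)}`,
      },
    },
  );
}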

Install Remix Utils

First, install Remix Utils and its optional dependency for server timing:

npm install remix-utils @edgefirst-dev/server-timing

Set Up the Server Timing Middleware

Create a server timing middleware file where you'll configure the timing collector:

app/middleware/server-timing.server.ts
import { createServerTimingMiddleware } from "remix-utils/middleware/server-timing";

export const [serverTimingMiddleware, getTimingCollector] =
  createServerTimingMiddleware();

The createServerTimingMiddleware function returns a tuple with the middleware function and a collector getter function.

Add the Middleware to Your Routes

Add the middleware to your root route to enable server timing globally:

app/root.tsx
import type { Route } from "./+types/root";

import { serverTimingMiddleware } from "~/middleware/server-timing.server";

export const middleware: Route.MiddlewareFunction[] = [serverTimingMiddleware];

// ... rest of your root component

This middleware will automatically add the Server-Timing header to all responses from routes that use the timing collector.
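
You don't have to apply it globally, though. If you only care about timing part of your app, you can export the same middleware from a specific route or layout instead of the root, and it will only run for that route and its children. A sketch, with a made-up route file name:

app/routes/dashboard.tsx
import type { Route } from "./+types/dashboard";

import { serverTimingMiddleware } from "~/middleware/server-timing.server";

export const middleware: Route.MiddlewareFunction[] = [serverTimingMiddleware];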

Measure Performance in Your Loaders

Now you can use the timing collector in your loaders and actions to measure specific operations:

app/routes/products.tsx
import type { Route } from "./+types/products";

import { getTimingCollector } from "~/middleware/server-timing.server";

export async function loader({ request, context }: Route.LoaderArgs) {
  let collector = getTimingCollector(context);

  let products = await collector.measure(
    "fetch-products",
    "Fetch products from database",
    () => getProducts(),
  );

  let categories = await collector.measure("fetch-categories", () => {
    return getCategories();
  });

  return { products, categories };
}

The getTimingCollector function requires the context parameter from your loader args to access the timing collector instance that was set up by the middleware.

The measure function takes a name, an optional description, and the async function to measure. It will track the execution time and add it to the Server-Timing header.

Measure Multiple Operations

You can measure different parts of your loader to get granular timing information:

app/routes/user.profile.tsx
import type { Route } from "./+types/user.profile";

import { getTimingCollector } from "~/middleware/server-timing.server";

export async function loader({ request, params, context }: Route.LoaderArgs) {
  let collector = getTimingCollector(context);
  let userId = params.userId;

  let user = await collector.measure("fetch-user", () => {
    return getUserById(userId);
  });

  let posts = await collector.measure("fetch-posts", () => {
    return getUserPosts(userId);
  });

  let analytics = await collector.measure("fetch-analytics", () => {
    return getUserAnalytics(userId);
  });

  return { user, posts, analytics };
}

Each measurement will appear as a separate entry in the Server-Timing header, allowing you to see which operations are taking the most time.
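
For the loader above, the resulting header could look something like this (durations are made up for illustration):

Server-Timing: fetch-user;dur=12.4, fetch-posts;dur=48.9, fetch-analytics;dur=105.3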

View Timing Data in Browser DevTools

Once you have the middleware set up and measurements in place, open your browser's DevTools and navigate to the Network tab. When you reload the page, you'll see the Server-Timing information in the response headers.

Most modern browsers also display server timing data in a visual timeline in the Network tab, making it easy to spot performance bottlenecks.
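
If you want to access the same data programmatically, for example to forward it to an analytics tool, browsers also expose it on resource and navigation timing entries through the serverTiming property. Here's a small sketch you could run on the client; note that cross-origin responses only expose these entries when they include a Timing-Allow-Origin header:

// Read the Server-Timing entries for the current document
let [navigation] = performance.getEntriesByType(
  "navigation",
) as PerformanceNavigationTiming[];

for (let entry of navigation.serverTiming) {
  console.log(entry.name, entry.description, `${entry.duration}ms`);
}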