How to Dedupe Server Calls with Remix Utils Batcher
Imagine you have nested routes that need the same data, like `app/routes/profile` and `app/routes/profile.settings`. Both routes need to fetch user information from an external API or database, which leads to duplicate requests being made on the same page load.
The traditional approach results in multiple identical API calls when both loaders run in parallel, wasting bandwidth and increasing response times.
app/lib/user.server.ts

```ts
import { z } from "zod";

export const UserProfileSchema = z.object({
  id: z.string(),
  email: z.string().email(),
  displayName: z.string(),
  avatar: z.string().url().optional(),
  plan: z.enum(["free", "pro", "enterprise"]),
  lastActiveAt: z.string().datetime(),
});

export async function fetchUserProfile(userId: string) {
  const response = await fetch(`https://api.example.com/users/${userId}`, {
    headers: { Authorization: `Bearer ${process.env.API_TOKEN}` },
  });

  if (!response.ok) {
    throw new Error(`Failed to fetch user: ${response.status}`);
  }

  return UserProfileSchema.parse(await response.json());
}
```
Without deduplication, if both routes call `fetchUserProfile` with the same `userId`, you'll make two identical API requests. This becomes especially problematic with multiple nested routes or when routes share common data dependencies.
Enter Remix Utils Batcher.
The Batcher middleware from Remix Utils automatically deduplicates function calls with the same key during a single request lifecycle. If multiple loaders call the same function with the same parameters, only one execution happens and all callers receive the same result.
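As a rough sketch of that behavior, assuming the `batcher.batch(key, fn)` API used later in this post and the `fetchUserProfile` helper defined above (the demo function and user ID here are hypothetical), two calls that share a key resolve to a single execution:

```ts
import type { Batcher } from "remix-utils/middleware/batcher";

import { fetchUserProfile } from "~/lib/user.server";

// Hypothetical demo: both calls use the key "user-profile:42", so
// fetchUserProfile runs only once and both callers await the same result.
async function demoDedupe(batcher: Batcher) {
  let [first, second] = await Promise.all([
    batcher.batch("user-profile:42", () => fetchUserProfile("42")),
    batcher.batch("user-profile:42", () => fetchUserProfile("42")),
  ]);
  return { first, second }; // identical data from a single API request
}
```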
Create the Batcher Middleware
First, create a batcher middleware instance that you can use across your application.
app/middleware/batcher.server.ts

```ts
import { createBatcherMiddleware } from "remix-utils/middleware/batcher";

export const [batcherMiddleware, getBatcher] = createBatcherMiddleware();
```
This creates both the middleware function and a getter function to access the batcher instance from your route context.
Add Batcher to Your Root Route
Add the batcher middleware to your root route so it's available to all child routes throughout your application.
app/root.tsx

```tsx
import type { Route } from "react-router";
import { Outlet } from "react-router";

import { batcherMiddleware } from "~/middleware/batcher.server";

export const middleware: Route.MiddlewareFunction[] = [batcherMiddleware];

// The rest of your root route component
```
Create Batched Data Functions
Wrap your data fetching functions to use the batcher. This approach lets you reuse the same batched function across different routes.
app/lib/user-batched.server.ts

```ts
import type { Batcher } from "remix-utils/middleware/batcher";

import { fetchUserProfile } from "./user.server";

export function getUserProfile(batcher: Batcher, userId: string) {
  return batcher.batch(`user-profile:${userId}`, () => fetchUserProfile(userId));
}
```
The key `user-profile:${userId}` ensures that requests for different users are not deduplicated, while multiple calls for the same user ID within the same request will be batched.
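To make the key behavior concrete, here's a small hypothetical illustration (the demo function and the user IDs are made up for this example):

```ts
import type { Batcher } from "remix-utils/middleware/batcher";

import { getUserProfile } from "~/lib/user-batched.server";

// Hypothetical example: distinct keys execute separately, repeated keys do not.
async function demoKeys(batcher: Batcher) {
  let [alice, bob, aliceAgain] = await Promise.all([
    getUserProfile(batcher, "alice"), // key "user-profile:alice" → one fetch
    getUserProfile(batcher, "bob"), // key "user-profile:bob" → a second fetch
    getUserProfile(batcher, "alice"), // same key as the first call → deduplicated
  ]);
  return { alice, bob, aliceAgain };
}
```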
Use Batched Functions in Parent Route
Now implement the parent route that displays basic user information using the batched function.
app/routes/profile.tsx

```tsx
import type { Route } from "react-router";
import { Outlet, useLoaderData } from "react-router";

import { getBatcher } from "~/middleware/batcher.server";
import { getUserProfile } from "~/lib/user-batched.server";

export async function loader({ context, request }: Route.LoaderArgs) {
  let batcher = getBatcher(context);
  let user = getUser(request); // Somehow get the user from the request
  let userProfile = await getUserProfile(batcher, user.id);
  return { userProfile };
}

export default function Profile() {
  let { userProfile } = useLoaderData<typeof loader>();
  // Your UI here - display user profile with userProfile data
  return <Outlet />;
}
```
Use the Same Data in Child Route
The child route also needs user information for displaying settings, but thanks to the batcher, it won't trigger a duplicate API call.
app/routes/profile.settings.tsx

```tsx
import type { Route } from "react-router";
import { useLoaderData } from "react-router";

import { getBatcher } from "~/middleware/batcher.server";
import { getUserProfile } from "~/lib/user-batched.server";

export async function loader({ context, request }: Route.LoaderArgs) {
  let batcher = getBatcher(context);
  let user = getUser(request); // Somehow get the user from the request
  let userProfile = await getUserProfile(batcher, user.id);
  return { userProfile };
}

export default function ProfileSettings() {
  let { userProfile } = useLoaderData<typeof loader>();
  // Your UI here - settings form using userProfile data
  return null;
}
```
When a user visits `/profile/settings`, both the parent `profile.tsx` and child `profile.settings.tsx` loaders run in parallel. Since they both call `getUserProfile` with the same user ID from the request, the batcher ensures only one API call is made to fetch the user data, and both routes receive the same result.
Batch Database Queries Too
The batcher works just as well for database queries, preventing duplicate reads when multiple routes need the same data.
app/lib/database-batched.server.ts

```ts
import type { Batcher } from "remix-utils/middleware/batcher";

import { db } from "./database.server";

export function getProjectWithMembers(batcher: Batcher, projectId: string) {
  return batcher.batch(`project-with-members:${projectId}`, () => {
    return db.project.findUnique({
      where: { id: projectId },
      include: {
        members: true,
        owner: true,
      },
    });
  });
}
```
This pattern prevents duplicate database queries when multiple routes need the same project data, reducing database load and improving response times.
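As a sketch of how a route might consume it, here's one possible loader; the route path and parameter handling below are illustrative assumptions, not part of the post:

```tsx
// Hypothetical route, e.g. app/routes/projects.$projectId.tsx
import type { Route } from "react-router";

import { getBatcher } from "~/middleware/batcher.server";
import { getProjectWithMembers } from "~/lib/database-batched.server";

export async function loader({ context, params }: Route.LoaderArgs) {
  let batcher = getBatcher(context);
  // Any other loader in this same request asking for the same projectId
  // shares this single database query instead of running its own.
  let project = await getProjectWithMembers(batcher, String(params.projectId));
  return { project };
}
```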
Final Thoughts
The Batcher middleware is particularly valuable in applications with complex nested routing where multiple loaders often need the same data. Unlike client-side caching solutions, this approach works entirely on the server and requires no changes to your existing data fetching functions beyond wrapping them with the batcher.
The key benefit is automatic deduplication without manual cache management or complex coordination between loaders. Each request gets its own batcher instance, so there's no risk of serving stale data between different user requests.