React Parallel Data Fetching Patterns: From Promise.all to Suspense and Server Components
Master React parallel data fetching with Promise.all, Suspense, React Query, SWR, Next.js, and Router loaders. Avoid waterfalls and ship faster UIs.
Overview
Parallel data fetching is the art of starting multiple requests at the same time and rendering as soon as each piece of data is ready. Done well, it trims seconds off time‑to‑interactive, avoids “waterfalls,” and simplifies error handling. In React, you can implement parallelism at different layers: plain fetch with Promise.all, data‑fetching libraries (TanStack Query/React Query, SWR), route loaders (React Router), and framework primitives (Next.js Server Components and Suspense).
This guide explains when and how to use each pattern, trade‑offs, and practical code you can paste into your app today.
Why parallel fetching matters
- Faster first paint: independent requests don’t wait on each other.
- Smoother UX: sections can stream in independently with Suspense boundaries.
- Lower complexity: fewer “if (dataA) then fetchB” chains.
- Better network utilization: browsers multiplex requests over HTTP/2/3 effectively when you fire them together.
The waterfall problem (and how to spot it)
A waterfall happens when request B starts only after request A completes. Common causes:
- Sequential awaits inside a single effect or loader.
- Fetches triggered by rendering state that depends on earlier results.
- Coalescing unrelated data into a single “mega endpoint.”
You’ll see this in DevTools Network as strictly staggered start times. The fix: kick off unrelated requests at the same time and render them behind independent boundaries.
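To make the difference concrete, here is a minimal sketch using fake setTimeout-based requests (fakeFetch and the event log are illustrative, not a real API). In the waterfall version, B cannot start until A resolves; with Promise.all, both timers start before either finishes:

```javascript
// Event log proves when each fake request starts and finishes.
const events = [];

function fakeFetch(name, ms) {
  events.push(`${name}:start`);
  return new Promise(resolve =>
    setTimeout(() => {
      events.push(`${name}:done`);
      resolve(name);
    }, ms)
  );
}

// Waterfall: B's timer is only created after A resolves.
async function waterfall() {
  const a = await fakeFetch('A', 20);
  const b = await fakeFetch('B', 20);
  return [a, b];
}

// Parallel: both promises (and their timers) are created up front.
async function parallel() {
  return Promise.all([fakeFetch('A', 20), fakeFetch('B', 20)]);
}
```

Running parallel() produces `['A:start', 'B:start', …]` in the log, exactly the overlapping start times you want to see in the Network panel; waterfall() produces `['A:start', 'A:done', 'B:start', …]`.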
Strategy 1: Promise.all in a custom hook (client only)
Great for simple apps or when you want minimal dependencies.
import { useEffect, useState } from 'react';

async function getJson(url, signal) {
  const res = await fetch(url, { signal });
  // fetch only rejects on network failure; surface HTTP errors explicitly.
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return res.json();
}

function useUserAndPosts(userId) {
  const [state, setState] = useState({ data: null, error: null, loading: true });

  useEffect(() => {
    const ac = new AbortController();
    setState({ data: null, error: null, loading: true });
    (async () => {
      try {
        // Both requests start immediately; neither waits on the other.
        const [user, posts] = await Promise.all([
          getJson(`/api/users/${userId}`, ac.signal),
          getJson(`/api/users/${userId}/posts`, ac.signal),
        ]);
        setState({ data: { user, posts }, error: null, loading: false });
      } catch (e) {
        if (e.name !== 'AbortError') setState({ data: null, error: e, loading: false });
      }
    })();
    // Cancel in-flight requests on unmount or when userId changes.
    return () => ac.abort();
  }, [userId]);

  return state;
}
Pros:
- No extra library.
- Full control over abort, retries, and error mapping.
Cons:
- You must build caching, dedupe, retries, revalidation, and pagination yourself.
Tips:
- Always pass AbortController signals so unmounted components don’t race to set state.
- If some queries are optional or dependent, split them into separate effects or use conditional Promise.all.
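One caveat with Promise.all is that it fails fast: a single rejection rejects the whole batch, even if the other data arrived fine. When part of the data is optional, Promise.allSettled is often the better fit. A sketch under that assumption (loadProfile and the fetcher parameters are illustrative names, not from any library):

```javascript
// Load a profile where posts are optional: a failed posts request should
// not take down the whole view. Promise.allSettled never rejects; each
// entry reports { status: 'fulfilled', value } or { status: 'rejected', reason }.
async function loadProfile(fetchUser, fetchPosts) {
  const [userR, postsR] = await Promise.allSettled([fetchUser(), fetchPosts()]);
  if (userR.status === 'rejected') throw userR.reason; // user is required
  return {
    user: userR.value,
    posts: postsR.status === 'fulfilled' ? postsR.value : [], // optional, with fallback
    postsError: postsR.status === 'rejected' ? postsR.reason : null,
  };
}
```

The caller can render the user card immediately and show a per-panel error for posts, which matches the "partial success" guidance later in this guide.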
Strategy 2: React Query (TanStack Query) parallel queries
React Query offers batteries‑included caching, deduplication, retries, background revalidation, and DevTools.
import { useQueries } from '@tanstack/react-query';

function Profile({ userId }) {
  const results = useQueries({
    queries: [
      { queryKey: ['user', userId], queryFn: () => getUser(userId) },
      { queryKey: ['posts', userId], queryFn: () => getPosts(userId) },
    ],
  });
  const [userQ, postsQ] = results;

  if (userQ.isLoading || postsQ.isLoading) return <Spinner />;
  if (userQ.error || postsQ.error) return <Error />;

  return (
    <>
      <UserCard user={userQ.data} />
      <PostsList posts={postsQ.data} />
    </>
  );
}
Notes:
- useQueries fires all queries at once and renders as each settles.
- For dependent queries, use the enabled flag to gate start conditions.
- Prefetch ahead of navigation:
queryClient.prefetchQuery({ queryKey: ['user', id], queryFn: () => getUser(id) });
When to choose: you need robust caching, easy retries, mutations with optimistic updates, and a mature ecosystem.
Strategy 3: SWR co‑located hooks (parallel by default)
SWR keys are independent; multiple useSWR calls in the same render kick off in parallel. It dedupes requests and supports focus/reconnect revalidation.
import useSWR from 'swr';

const fetcher = url => fetch(url).then(r => r.json());

function Profile({ userId }) {
  const { data: user, error: uErr } = useSWR(`/api/users/${userId}`, fetcher);
  const { data: posts, error: pErr } = useSWR(`/api/users/${userId}/posts`, fetcher);

  // Check errors before loading: on failure, data stays undefined,
  // so a loading-first check would spin forever.
  if (uErr || pErr) return <Error />;
  if (!user || !posts) return <Spinner />;

  return (
    <>
      <UserCard user={user} />
      <PostsList posts={posts} />
    </>
  );
}
Tips:
- Add suspense: true to integrate with Suspense boundaries (see next section).
- For immutable data, use useSWRImmutable for fewer revalidations.
Strategy 4: Suspense boundaries for streaming UIs
Suspense lets you declare “this part can wait,” improving perceived performance. You can combine Suspense with React Query or SWR by enabling suspense mode, or use framework‑level Suspense with route loaders and Server Components.
import { Suspense } from 'react';

function Page() {
  return (
    <>
      <Suspense fallback={<SkeletonUser />}>
        <UserPanel />
      </Suspense>
      <Suspense fallback={<SkeletonPosts />}>
        <PostsPanel />
      </Suspense>
    </>
  );
}
Guidelines:
- Use multiple small boundaries instead of a single giant one.
- Pair with an ErrorBoundary to isolate failures per section.
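It helps to know what a boundary actually catches. Suspense-enabled data sources follow an informal protocol: a component reads a "resource" during render that either returns data, throws an error, or throws a pending promise, and the nearest boundary catches the thrown promise. A simplified sketch of that wrapper (this mirrors the convention used by Suspense-aware libraries; it is not a public React API, and createResource is an illustrative name):

```javascript
// Wrap a promise so a component can call resource.read() during render:
// - pending  -> throws the promise itself (Suspense shows the fallback)
// - rejected -> throws the error (an ErrorBoundary catches it)
// - resolved -> returns the value
function createResource(promise) {
  let status = 'pending';
  let result;
  const suspender = promise.then(
    value => { status = 'success'; result = value; },
    error => { status = 'error'; result = error; }
  );
  return {
    read() {
      if (status === 'pending') throw suspender;
      if (status === 'error') throw result;
      return result;
    },
  };
}
```

Libraries like React Query (in suspense mode) and SWR manage this bookkeeping for you; the sketch is only meant to demystify why fallbacks and error boundaries compose so cleanly per section.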
Strategy 5: React Router loaders with defer and Await
React Router (v6.4+) moves data fetching to the router. defer allows parts of the route data to stream in parallel.
// router.ts
import { defer } from 'react-router-dom';

export async function loader({ params }) {
  return defer({
    user: getUser(params.id),   // starts immediately
    posts: getPosts(params.id), // starts immediately
  });
}

// route component
import { Suspense } from 'react';
import { useLoaderData, Await } from 'react-router-dom';

function ProfileRoute() {
  const data = useLoaderData();
  return (
    <>
      <Suspense fallback={<SkeletonUser />}>
        <Await resolve={data.user}>{u => <UserCard user={u} />}</Await>
      </Suspense>
      <Suspense fallback={<SkeletonPosts />}>
        <Await resolve={data.posts}>{p => <PostsList posts={p} />}</Await>
      </Suspense>
    </>
  );
}
Benefits:
- Parallel start of all loader promises.
- Natural integration with Suspense for streaming, and with an ErrorBoundary per Await section.
Strategy 6: Next.js Server Components (App Router)
In Server Components, you can fetch in the component body and await Promise.all without client waterfalls. Because the fetches run on the server, the browser never re-requests the same data; you control freshness per request (for example with cache: 'no-store').
// app/users/[id]/page.tsx (Server Component)
import { Suspense } from 'react';

export default async function Page({ params }) {
  const [user, posts] = await Promise.all([
    fetch(`${process.env.API}/users/${params.id}`, { next: { revalidate: 300 } }).then(r => r.json()),
    fetch(`${process.env.API}/users/${params.id}/posts`, { cache: 'no-store' }).then(r => r.json()),
  ]);

  return (
    <>
      <UserCard user={user} />
      {/* Render posts behind a boundary to stream sooner */}
      <Suspense fallback={<SkeletonPosts />}>
        <PostsListServer data={posts} />
      </Suspense>
    </>
  );
}
Notes:
- fetch responses are cached by default in the App Router through Next.js 14 (Next.js 15 defaults to uncached); set caching intentionally, e.g. next: { revalidate } for ISR.
- Promise.all at the server eliminates client waterfalls and ships less JavaScript.
- Split long‑tail data behind separate Suspense boundaries to stream the shell early.
Choosing a pattern
- Minimal setup, no caching needs: Promise.all in a custom hook.
- Rich caching/mutations, client rendering: React Query.
- Lightweight caching, great DX: SWR.
- Route‑centric apps with streaming: React Router loaders + defer.
- Next.js app using the App Router: server‑side Promise.all in Server Components, Suspense for streaming, and selective cache policies.
Error handling in parallel flows
- Isolate with boundaries: pair each Suspense boundary with an ErrorBoundary so one failure doesn’t blank the page.
- Map errors to UI: allow each panel to show its own retry button.
- Retries: libraries provide exponential backoff and stale‑while‑revalidate; avoid home‑rolled loops unless you need custom policies.
- Partial success: render what’s ready; don’t block the whole page for a single slow widget.
Cancellation, timeouts, and race safety
- Always use AbortController with fetch in custom hooks.
- Consider per‑request timeouts (AbortController + setTimeout) for worst‑case servers.
- Guard setState calls after unmount; libraries already handle this.
function fetchWithTimeout(url, ms, options = {}) {
  const ac = new AbortController();
  const id = setTimeout(() => ac.abort(), ms); // abort if the server is too slow
  return fetch(url, { ...options, signal: ac.signal })
    .finally(() => clearTimeout(id)); // clean up the timer either way
}
Avoiding duplication and stale data
- Deduplicate: React Query and SWR automatically dedupe concurrent identical requests.
- Memoize query keys: ensure stable keys across renders.
- Normalize TTLs: coordinate cache times so panels don’t thrash the network.
- In Next.js Server Components, fetch is memoized per request; use cache: 'no-store' only when you truly need fresh data.
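The deduplication these libraries perform can be sketched in a few lines: an in-flight map keyed by request identity, so concurrent callers with the same key share one promise (dedupedFetch is an illustrative name, not a library API):

```javascript
const inflight = new Map();

// Concurrent calls with the same key reuse the same in-flight promise;
// once it settles, the key is cleared so a later call fetches fresh.
function dedupedFetch(key, doFetch) {
  if (inflight.has(key)) return inflight.get(key);
  const p = Promise.resolve()
    .then(doFetch)
    .finally(() => inflight.delete(key));
  inflight.set(key, p);
  return p;
}
```

Real libraries add a dedupe window, cache entries, and revalidation on top, but the core idea is exactly this shared-promise map.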
Dependent vs. independent queries
Parallelize what’s independent; gate what depends on another response.
// React Query dependent fetch example
import { useQuery } from '@tanstack/react-query';

const user = useQuery({ queryKey: ['user', id], queryFn: () => getUser(id) });
const posts = useQuery({
  queryKey: ['posts', id],
  enabled: !!user.data, // waits until user is ready
  queryFn: () => getPosts(id, user.data.role),
});
Prefetching and speculative parallelism
- On hover/viewport: prefetch likely data before a click to hide latency.
- On route transition: start loaders early with router prefetch APIs.
- Background refresh: revalidate on focus or interval; users return to fresh data.
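A hover- or viewport-triggered prefetch reduces to a small speculative cache: start the request on intent, then hand the same promise to whoever navigates. A hedged sketch (prefetch/consume are illustrative names; in practice you would wire prefetch to mouseenter/focus or an IntersectionObserver, or use your library's prefetch API):

```javascript
const speculative = new Map();

// Start a request speculatively (e.g. on link hover); idempotent per key.
function prefetch(key, fetcher) {
  if (!speculative.has(key)) speculative.set(key, fetcher(key));
  return speculative.get(key);
}

// On actual navigation, reuse the in-flight (or finished) prefetch
// and release the cache entry to the real consumer.
function consume(key, fetcher) {
  const p = prefetch(key, fetcher);
  speculative.delete(key);
  return p;
}
```

React Query's queryClient.prefetchQuery and React Router's link prefetching offer the same behavior with cache lifetimes and invalidation handled for you.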
Anti‑patterns to avoid
- Sequential awaits for unrelated data:
// Bad
const user = await getUser(id);
const posts = await getPosts(id); // waits needlessly
// Good
const [user, posts] = await Promise.all([getUser(id), getPosts(id)]);
- Fetching in render of client components without a Suspense‑aware library (can cause waterfalls and double fetches).
- Global state stores for remote data without caching semantics—prefer dedicated data tools.
- One mega endpoint for everything; it ties unrelated data lifecycles together and blocks streaming.
Measuring impact
- Chrome DevTools Network: verify start times overlap and total load shrinks.
- React Profiler: ensure Suspense boundaries improve commit timing.
- Lighthouse/Web Vitals: watch LCP and TTI after parallelization.
- Server logs: track upstream concurrency and error rates; tune timeouts accordingly.
Advanced: batch, aggregate, and reduce chattiness
- API Gateway fan‑out: keep the client simple by letting the server call multiple backends in parallel.
- GraphQL batching: use persisted queries or DataLoader to minimize over‑the‑wire requests while still resolving fields in parallel server‑side.
- HTTP/2/3 and keep‑alive: reuse connections; avoid CORS preflight costs by consolidating origins.
- Caching headers: strong ETags and Cache‑Control: stale-while-revalidate pair well with client libraries.
A practical decision checklist
- Can the requests start at the same time? If yes, use Promise.all or multiple library hooks.
- Do you need cache, retries, mutations? Pick React Query or SWR.
- Is your app route‑driven with streaming needs? Use React Router loaders + defer.
- Building with Next.js App Router? Fetch in Server Components, cache intentionally, add Suspense boundaries.
- Are some queries dependent? Use enabled flags or split loaders.
- Do you need prefetching? Wire up hover/viewport triggers.
- Are errors isolated? Wrap each panel with ErrorBoundary + retry.
Conclusion
Parallel data fetching is less about a single API and more about composing layers: start requests together, cache wisely, and render progressively. Whether you choose Promise.all, a data library, route loaders, or Server Components, the winning recipe is consistent: fire early, isolate with Suspense and ErrorBoundaries, and measure the outcome. Adopt the simplest option that meets your caching and UX needs today, and evolve toward framework‑level streaming where it makes sense.