Streaming
Streaming lets you send parts of a page to the browser as they become ready, instead of waiting for everything to load before showing anything. In Next.js, this is built on React Suspense and works automatically with Server Components.
How Streaming Works
Without streaming, the server renders the entire page, then sends it:
Server: fetch data A (200ms) -> fetch data B (800ms) -> render -> send HTML
Client: [blank screen for 1000ms+] -> full page appears
With streaming, content arrives progressively:
Server: start render -> send shell + loading states
-> data A ready (200ms) -> stream component A
-> data B ready (800ms) -> stream component B
Client: [instant shell] -> A appears -> B appears
Route-Level Loading with loading.tsx
The simplest form of streaming is route-level: drop a loading.tsx file into any route folder:
// app/dashboard/loading.tsx
export default function Loading() {
  return (
    <div className="animate-pulse space-y-4">
      <div className="h-8 bg-gray-200 rounded w-1/3" />
      <div className="grid grid-cols-3 gap-4">
        {[1, 2, 3].map((i) => (
          <div key={i} className="h-32 bg-gray-200 rounded" />
        ))}
      </div>
    </div>
  );
}
// app/dashboard/page.tsx
export default async function Dashboard() {
  const stats = await getStats(); // slow query
  return <DashboardContent stats={stats} />;
}
Next.js wraps the page in a <Suspense> boundary with loading.tsx as the fallback. The skeleton shows instantly, then the real content streams in when getStats() resolves.
Tip: Design your loading.tsx to match the actual page layout. Users should see content “fill in” rather than a jarring layout shift.
Component-Level Streaming with Suspense
For more granular control, use <Suspense> boundaries directly:
// app/dashboard/page.tsx
import { Suspense } from "react";

export default function Dashboard() {
  return (
    <div>
      <h1 className="text-2xl font-bold">Dashboard</h1>

      {/* This shows immediately */}
      <Suspense fallback={<StatsSkeleton />}>
        <StatsCards />
      </Suspense>

      {/* This loads independently */}
      <Suspense fallback={<ChartSkeleton />}>
        <RevenueChart />
      </Suspense>

      {/* This loads independently too */}
      <Suspense fallback={<TableSkeleton />}>
        <RecentOrders />
      </Suspense>
    </div>
  );
}
Each <Suspense> boundary streams independently. Fast queries show first, slow queries stream in later.
// These are async Server Components - each fetches its own data
async function StatsCards() {
  const stats = await getStats(); // 100ms
  return (
    <div className="grid grid-cols-3 gap-4">
      <Card title="Users" value={stats.users} />
      <Card title="Revenue" value={stats.revenue} />
      <Card title="Orders" value={stats.orders} />
    </div>
  );
}

async function RevenueChart() {
  const data = await getRevenueData(); // 500ms
  return <Chart data={data} />;
}

async function RecentOrders() {
  const orders = await getRecentOrders(); // 300ms
  return <OrderTable orders={orders} />;
}
Nested Suspense
Suspense boundaries can nest. Inner boundaries stream independently from outer ones:
<Suspense fallback={<PageSkeleton />}>
  <Header />
  <Suspense fallback={<SidebarSkeleton />}>
    <Sidebar />
  </Suspense>
  <Suspense fallback={<ContentSkeleton />}>
    <MainContent />
    <Suspense fallback={<CommentsSkeleton />}>
      <Comments />
    </Suspense>
  </Suspense>
</Suspense>
The page shell loads first. Sidebar and main content stream independently. Comments (nested inside content) stream after the main content boundary resolves.
Gotcha: Don’t wrap every component in Suspense. Too many loading states create a “popcorn” effect where content pops in at random times. Group related content within the same boundary.
Streaming with Parallel Data Fetching
Start multiple fetches in parallel, but stream them as they resolve:
import { Suspense } from "react";

export default function ProductPage({ params }: { params: Promise<{ id: string }> }) {
  return (
    <div>
      <Suspense fallback={<ProductSkeleton />}>
        <ProductDetails params={params} />
      </Suspense>
      <Suspense fallback={<ReviewsSkeleton />}>
        <ProductReviews params={params} />
      </Suspense>
      <Suspense fallback={<RecommendationsSkeleton />}>
        <Recommendations params={params} />
      </Suspense>
    </div>
  );
}

async function ProductDetails({ params }: { params: Promise<{ id: string }> }) {
  const { id } = await params;
  const product = await getProduct(id); // 100ms
  return <div>{product.name} - ${product.price}</div>;
}

async function ProductReviews({ params }: { params: Promise<{ id: string }> }) {
  const { id } = await params;
  const reviews = await getReviews(id); // 800ms - slow external API
  return <ReviewList reviews={reviews} />;
}
Product details show quickly. Reviews stream in later. Neither blocks the other.
UX Patterns for Loading States
Skeleton screens (preferred)
function CardSkeleton() {
  return (
    <div className="animate-pulse rounded-lg border p-4">
      <div className="h-4 bg-gray-200 rounded w-3/4 mb-2" />
      <div className="h-4 bg-gray-200 rounded w-1/2" />
    </div>
  );
}
Spinner (use sparingly)
function Spinner() {
  return (
    <div className="flex justify-center p-8">
      <div className="h-8 w-8 animate-spin rounded-full border-4 border-gray-200 border-t-blue-600" />
    </div>
  );
}
Stale-while-revalidate (for navigations)
"use client";
import { useRouter } from "next/navigation";
import { useTransition } from "react";
export function NavLink({ href, children }: { href: string; children: React.ReactNode }) {
const router = useRouter();
const [isPending, startTransition] = useTransition();
return (
<a
href={href}
onClick={(e) => {
e.preventDefault();
startTransition(() => router.push(href));
}}
className={isPending ? "opacity-50" : ""}
>
{children}
</a>
);
}
Edge vs Node.js Runtime
Choose the runtime per route for optimal streaming performance:
// app/api/stream/route.ts
export const runtime = "edge"; // or "nodejs" (default)
| | Node.js (default) | Edge |
|---|---|---|
| Cold start | ~250ms | ~1ms |
| APIs available | Full Node.js | Web APIs (limited) |
| Streaming | Yes | Yes |
| Database access | Direct connections | HTTP-based only (Prisma Edge, Neon, PlanetScale) |
| Max execution | Platform-dependent (no hard limit self-hosted) | 30s (Vercel) |
| Region | Single region | Every edge location |
Tip: Use Edge runtime for routes that don’t need Node.js APIs. Auth pages, content pages, and simple API routes are good candidates. Use Node.js for anything that needs direct database connections or Node-specific libraries.
// app/blog/[slug]/page.tsx
export const runtime = "edge";

export default async function BlogPost({ params }: { params: Promise<{ slug: string }> }) {
  const { slug } = await params;
  // Use edge-compatible data fetching (HTTP-based)
  const post = await fetch(`${process.env.API_URL}/posts/${slug}`).then((r) => r.json());
  return <article>{post.content}</article>;
}
Debugging Streaming
Chrome DevTools Network tab shows streamed chunks. Look for the initial HTML response that arrives quickly, followed by additional chunks that update Suspense boundaries.
// Add artificial delay to see streaming in action during development
async function SlowComponent() {
  await new Promise((resolve) => setTimeout(resolve, 2000));
  const data = await getData();
  return <div>{data}</div>;
}