Complete Guide: Next.js Cache Layers and GraphQL Optimization
Next.js implements four different cache layers. Understanding how these work - and why GraphQL queries and ORM calls behave differently than fetch() requests - is essential for building fast websites.
The Four Cache Layers
This article describes the caching architecture as it works in Next.js 14.
Next.js 15 changes:
As of Next.js 15, `fetch()` is no longer cached by default. You must explicitly use { cache: 'force-cache' } to enable caching. The four cache layers and concepts in this article remain valid, but the default settings have changed.
Next.js 16 changes:
Next.js 16 introduces a completely new cache model with "Cache Components" and the "use cache" directive. In Next.js 16, nothing is cached automatically - you must explicitly mark what should be cached. While unstable_cache is still supported, "use cache" is the new recommended approach. This requires enabling cacheComponents: true in your next.config.
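As a rough, hedged sketch of that model (assuming cacheComponents is enabled and reusing the graphqlClient and GET_POSTS_QUERY names that appear later in this article), a cached data function in Next.js 16 might look like this:
// next.config.ts - opt in to Cache Components (Next.js 16)
const nextConfig = {
  cacheComponents: true,
};
export default nextConfig;
// In a separate data module: mark the function as cacheable with "use cache"
export async function getPosts() {
  'use cache';
  return graphqlClient.query(GET_POSTS_QUERY);
}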
Next.js uses multiple cache mechanisms, each serving a specific purpose:
- Request Memoization - Automatic deduplication during a render
- Data Cache - Persistent cache of fetch requests
- Full Route Cache - Static page cache
- Router Cache - Client-side navigation cache
Let's examine each of these layers in detail.
Request Memoization
Request Memoization is React's built-in mechanism to prevent duplicate requests during a single render.
A practical example: suppose you use the same data fetching function for both metadata and your page content:
export async function generateMetadata({ params }) {
  const data = await fetchData(params.id);
  return {
    title: data.title,
    description: data.description
  };
}
export default async function Page({ params }) {
  const data = await fetchData(params.id);
  return <Article data={data} />;
}
When fetchData is called with the same parameters, the underlying request runs only once during a single render, even though you call it twice.
Data Cache
The Data Cache is Next.js's persistent cache for data fetching. The cache persists between requests and even between deployments. Responses are automatically stored and reused here.
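A minimal illustration with the Next.js 14 defaults (the full pattern with tags and invalidation follows later in this article):
// The response is stored in the Data Cache and reused across requests for up to an hour
const res = await fetch('https://api.example.com/posts', {
  next: { revalidate: 3600 },
});
const posts = await res.json();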
Full Route Cache
The Full Route Cache stores complete rendered routes - both the HTML and the RSC (React Server Component) payload. This cache is generated during build time for static routes, or on-demand for dynamic routes with revalidation.
// Static route - cached during build
export default async function Page() {
const data = await fetchData();
return <Content data={data} />;
}
// Dynamic route with revalidation
export const revalidate = 3600; // 1 hour
export default async function Page() {
const data = await fetchData();
return <Content data={data} />;
}
This cache is independent of how you fetch data. The final rendered page can end up in the Full Route Cache, regardless of which data fetching method you use.
Router Cache
The Router Cache is a client-side cache that Next.js uses for navigation between pages. When a user navigates through your application, Next.js temporarily stores the RSC payload in the browser's memory.
This cache provides instant navigation when going back and forward, but has a relatively short lifespan (default 30 seconds for dynamic routes, 5 minutes for static routes).
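These lifetimes can be tuned via the experimental staleTimes option in newer Next.js 14 releases - a sketch, assuming the option is available in your version:
// next.config.ts - adjust Router Cache lifetimes (experimental, Next.js 14.2+)
const nextConfig = {
  experimental: {
    staleTimes: {
      dynamic: 30, // seconds for dynamic routes
      static: 300, // seconds for static routes
    },
  },
};
export default nextConfig;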
Why GraphQL and ORMs Work Differently
Request Memoization and Data Cache don't work automatically for GraphQL queries and ORM calls. They only work out-of-the-box for standard fetch() requests with GET. This difference lies in how Next.js determines what should be cached.
The GET vs POST Distinction
HTTP GET requests are, by convention, safe and idempotent - they fetch data without causing side effects. Next.js uses this as a signal that the response can be safely cached.
POST requests, on the other hand, are seen as mutations - operations that change data. Next.js doesn't cache these by default, because you always want to see the most recent state after a mutation.
// GET - gets cached
fetch('https://api.example.com/posts', { method: 'GET' });
// POST - does NOT get cached
fetch('https://api.example.com/posts', {
method: 'POST',
body: JSON.stringify(data)
});
Same Result, Different Approach
Let's look at a practical example: fetching a list of blog posts. With fetch() and GET you get automatic cache:
// REST API with fetch - Request Memoization and Data Cache automatic
export async function getPosts() {
const response = await fetch('https://api.example.com/posts', {
next: {
revalidate: 3600,
tags: ['posts']
}
});
return response.json();
}
// Use in generateMetadata
const posts = await getPosts(); // Hits the API
// Use in Page component
const posts = await getPosts(); // Uses Request Memoization (same render)
// Later, new request
const posts = await getPosts(); // Uses Data Cache
// Cache invalidation
revalidateTag('posts'); // Invalidates the Data Cache
Next.js handles both cache layers automatically - Request Memoization within one render, Data Cache between requests.
The same functionality with GraphQL requires explicit configuration:
import { cache } from 'react';
import { unstable_cache } from 'next/cache';
// GraphQL - manually add both layers
export const getPosts = cache(async () => { // Request Memoization
return await unstable_cache( // Data Cache
async () => graphqlClient.query(GET_POSTS_QUERY),
['posts'],
{
revalidate: 3600,
tags: ['posts']
}
)();
});
// Use in generateMetadata
const posts = await getPosts(); // Hits the API
// Use in Page component
const posts = await getPosts(); // Uses Request Memoization
// Later, new request
const posts = await getPosts(); // Uses Data Cache
// Cache invalidation
revalidateTag('posts'); // Invalidates the Data Cache
The end result is the same - revalidate and tags work identically - but you must explicitly wrap both layers.
The same pattern applies to ORMs:
// Prisma - manually add both layers
export const getPosts = cache(async () => {
return await unstable_cache(
async () => prisma.post.findMany(),
['posts'],
{
revalidate: 3600,
tags: ['posts']
}
)();
});
Why GraphQL Uses POST
Most GraphQL clients send every request as a POST, even queries that conceptually just fetch data. This is a practical choice - query documents can grow complex and would quickly exceed URL length limits if sent as GET query strings.
But Next.js only sees a POST request and treats it as a potential mutation. The framework cannot distinguish between a GraphQL query and a GraphQL mutation.
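To make this concrete, here is roughly what a GraphQL query looks like when sent with fetch() - an illustrative sketch against a hypothetical endpoint:
// A read-only GraphQL query still travels as a POST with the query in the body
const response = await fetch('https://api.example.com/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: 'query GetPosts { posts { id title } }',
  }),
});
const { data } = await response.json();
From the caching layer's perspective, this is indistinguishable from any other POST, which is why it is skipped by default.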
Why ORMs Bypass the Cache
ORMs like Prisma, Drizzle, or TypeORM use their own APIs that don't go through fetch():
// Prisma makes a direct database connection
const posts = await prisma.post.findMany();
// Drizzle too
const posts = await db.select().from(postsTable);
These calls go directly to the database without using the fetch() API, so Next.js's automatic cache is never activated.
Implementing Cache Yourself
When your data fetching doesn't go through fetch() with GET, you lose Next.js's automatic caching. This doesn't mean your application has to be slower - you just need to take responsibility for the caching that Next.js normally handles for you.
React's cache() for Request Memoization
React's cache() provides deduplication within one render:
import { cache } from 'react';
export const getPost = cache(async (id: string) => {
return await graphqlClient.query(GET_POST, { id });
});
Next.js's unstable_cache for Data Cache
unstable_cache provides persistent cache between requests:
import { unstable_cache } from 'next/cache';
export const getCachedPost = unstable_cache(
async (id: string) => {
return await graphqlClient.query(GET_POST, { id });
},
['post'],
{ revalidate: 3600, tags: ['posts'] }
);
Combining Both Layers
In practice, you often combine both mechanisms for optimal caching:
import { cache } from 'react';
import { unstable_cache } from 'next/cache';
export const getPost = cache(async (id: string) => {
return await unstable_cache(
async () => graphqlClient.query(GET_POST, { id }),
[`post-${id}`],
{ revalidate: 3600, tags: ['posts'] }
)();
});
Now you get:
- Request deduplication within one render (via cache())
- Persistent cache between requests (via unstable_cache())
- Controllable revalidation and cache invalidation
Cache Key Strategies
The cache key array in unstable_cache is crucial - it must be unique for each variant of your data:
// Basic cache key
unstable_cache(fetchPosts, ['posts'], options);
// With a parameter - wrap the call so the key can include the variable
export async function getPostsByStatus(status: string) {
  return unstable_cache(
    async () => fetchPosts(status),
    ['posts', status],
    options
  )();
}
// With multiple parameters
export async function getUserPosts(userId: string, limit: number) {
  return unstable_cache(
    async () => fetchUserPosts(userId, limit),
    ['user-posts', userId, limit.toString()],
    options
  )();
}
Each unique combination of parameters must result in a unique cache key, otherwise you'll get wrong data back from the cache.
Cache Invalidation
A major advantage of unstable_cache is the ability to invalidate caches in a targeted way with tags:
import { revalidateTag } from 'next/cache';
// In your API route or Server Action after a mutation
export async function updatePost(id: string, data: PostData) {
await graphqlClient.mutate(UPDATE_POST, { id, data });
// Invalidates all caches with the 'posts' tag
revalidateTag('posts');
}
This is similar to how Next.js can automatically revalidate fetch caches, but now with full control over when and which caches are invalidated.
When Not to Cache
Not every situation requires caching. Consider skipping unstable_cache for:
Real-time or rapidly changing data:
// No Data Cache - always fresh data
export const getLiveScore = cache(async (matchId: string) => {
return await graphqlClient.query(GET_LIVE_SCORE, { matchId });
});
User-specific content:
// Only Request Memoization, no persistent cache
export const getUserCart = cache(async (userId: string) => {
return await prisma.cart.findUnique({ where: { userId } });
});
The trade-off is always: added complexity versus actual performance gain. Measure first, optimize later.
Practical Implications
Understanding these cache layers helps you make better decisions about data fetching and performance optimization.
When Each Layer is Active
Request Memoization is automatic for fetch() with GET; for GraphQL and ORMs you must add it explicitly with the cache() wrapper.
Data Cache only works for fetch() with GET, or explicitly via unstable_cache for GraphQL and ORMs.
Full Route Cache works independently of your data fetching method. A page with GraphQL queries can still be fully cached as a static route or with revalidate.
Router Cache is always active on the client, but has little impact on server-side data fetching decisions.
Deliberately Bypassing Cache
Sometimes you specifically don't want data to be cached:
// Explicit no-cache for fetch
fetch(url, { cache: 'no-store' });
// Or per route
export const dynamic = 'force-dynamic';
This is relevant for user-specific data, real-time content, or when you want to be certain of fresh data.
Performance Considerations
The Data Cache can provide an enormous performance boost because responses don't need to be fetched again. For GraphQL and ORM users this means:
- Request Memoization (cache()) for deduplication within one render
- unstable_cache for persistent caching between requests
- Full Route Cache with revalidate for static or semi-static pages
- External cache (Redis, CDN) as an additional layer (see the sketch after this list)
- Database query optimization and connection pooling
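As an illustration of that external cache layer, here is a minimal sketch that puts Redis in front of a GraphQL query (assuming an ioredis client; graphqlClient and GET_POSTS_QUERY are the names used earlier, the Redis setup is hypothetical):
import Redis from 'ioredis';
import { cache } from 'react';
const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');
// Check Redis first, fall back to the GraphQL API, then store the result
export const getPosts = cache(async () => {
  const cached = await redis.get('posts');
  if (cached) return JSON.parse(cached);
  const posts = await graphqlClient.query(GET_POSTS_QUERY);
  await redis.set('posts', JSON.stringify(posts), 'EX', 3600); // expire after one hour
  return posts;
});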
The most important thing is to understand which cache layers are active for your specific setup, so you can make conscious choices about where and when to optimize.
Conclusion
Next.js's cache architecture consists of four separate layers, each serving a specific purpose. For developers using fetch() with GET requests, much of this cache works automatically. But as soon as you introduce GraphQL or an ORM, you need to take control of caching strategies yourself.
The good news: with cache() and unstable_cache you have the same capabilities as automatic cache, with the bonus of full control over what gets cached and for how long. The trade-off is added complexity - you must consciously think about cache keys, revalidation, and invalidation.
Understand which layers are active in your application, measure where performance problems lie, and then optimize in a targeted way. Not every query needs persistent cache, but for those cases where it does make an impact, Next.js's tools provide all the flexibility you need.