The Setup
A Next.js corporate website. Server-side rendering enabled. Static site generation configured. The build system reports every page as "prerendered as static content." Lighthouse scores are green. WCAG compliance is solid. Technical SEO is clean.
By every traditional metric, the site is healthy.
Then an AI agent tries to read it and gets back this:
<html>
  <head><!-- scripts, stylesheets, meta tags --></head>
  <body>
    <div id="__next"></div>
  </body>
</html>
An empty div. No navigation. No headings. No content. No calls to action. Nothing.
The site is invisible.
The Discovery
The problem surfaced during a routine check of automated access: a search agent was returning empty results for the site. The initial investigation focused on the first obvious suspect.
Cloudflare's managed robots.txt was prepending directives that blocked every major AI crawler:
User-agent: ClaudeBot
Disallow: /

User-agent: GPTBot
Disallow: /
That was fixed in the Cloudflare dashboard. But the real problem was deeper.
Even after the robots.txt block was removed, agents that successfully fetched the page received a 200 response with a valid HTML document — containing zero rendered content. The <div id="__next"> was empty. Every page on the site was an empty shell.
The Next.js build output confirmed it. Despite labeling every page as ○ (Static), "prerendered as static content," the actual HTML files contained no pre-rendered markup.
The Root Cause
The entire site was wrapped in a dark mode context provider:
// _app.tsx
import type { AppProps } from 'next/app';
// import path illustrative; adjust to wherever the provider lives
import { DarkModeProvider } from '../components/DarkModeContext';

function MyApp({ Component, pageProps }: AppProps) {
  return (
    <DarkModeProvider>
      <Component {...pageProps} />
    </DarkModeProvider>
  );
}

export default MyApp;
Inside that provider:
// DarkModeContext.tsx
import { createContext, useEffect, useState, type ReactNode } from 'react';

const DarkModeContext = createContext({
  isDarkMode: false,
  toggleDarkMode: () => {},
});

export const DarkModeProvider = ({ children }: { children: ReactNode }) => {
  const [isDarkMode, setIsDarkMode] = useState(false);
  const [mounted, setMounted] = useState(false);

  useEffect(() => {
    setMounted(true);
    // ... read localStorage, set theme
  }, []);

  const toggleDarkMode = () => setIsDarkMode((prev) => !prev);

  // Don't render children until mounted to prevent hydration mismatches
  if (!mounted) {
    return null;
  }

  return (
    <DarkModeContext.Provider value={{ isDarkMode, toggleDarkMode }}>
      {children}
    </DarkModeContext.Provider>
  );
};
Line 52 of that file: if (!mounted) { return null; }
This is the line that made the entire website invisible.
Why This Pattern Exists
This is a well-known React pattern for avoiding hydration mismatches. Here's the problem it solves:
- The server renders HTML with a default theme (light mode)
- The browser loads the page and reads localStorage to determine the user's preference
- If the user prefers dark mode, React re-renders with different classes
- For a brief moment, the server-rendered HTML (light) doesn't match the client state (dark)
- React logs a hydration mismatch warning, and in some cases, the UI flickers
The mounted guard prevents this by rendering nothing on the server and nothing on the client until useEffect fires (which only happens in the browser). The page appears fully formed on the client, with the correct theme, and no hydration mismatch occurs.
For human users with JavaScript enabled, this works perfectly. The page loads in milliseconds, the theme is correct, and there's no flash of wrong content.
Why This Pattern Kills Content Survivability
The mounted state starts as false. On the server, useEffect never runs. So during server-side rendering:
- React calls DarkModeProvider
- mounted is false
- The provider returns null
- React renders nothing
- The HTML file is written with an empty <div id="__next"></div>
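You can reproduce the failure in isolation with react-dom/server, which is essentially what the build does. A minimal sketch (the probe file and import path are assumptions):

// probe.tsx: server-render the provider and inspect what comes out
import { renderToString } from 'react-dom/server';
import { DarkModeProvider } from './DarkModeContext';

const html = renderToString(
  <DarkModeProvider>
    <h1>Hello</h1>
  </DarkModeProvider>
);

console.log(JSON.stringify(html)); // prints "" because the guard returned null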
Every page. Every route. Every piece of content. All of it gated behind a state variable that can only become true in a browser.
The build system doesn't warn you. Next.js still labels the pages as "prerendered." The HTML files exist. They just contain nothing.
The Blast Radius
Because the provider wraps the entire application in _app.tsx, the impact is total:
- Homepage: empty
- Service pages: empty
- Product pages: empty
- Blog posts: empty
- Documentation: empty
- Landing pages with getStaticProps: data is fetched and serialized into __NEXT_DATA__, but the HTML is still empty
The data is in the page. The content is in the JavaScript bundle. But the HTML — the thing that AI agents, search engine crawlers, screen readers in degraded mode, and any non-JS consumer actually reads — is an empty shell.
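Abridged, a page in that state serves something like this (the structure follows Next.js's standard output; the payload shown is illustrative):

<html>
  <head><!-- meta tags, links to the JS and CSS bundles --></head>
  <body>
    <div id="__next"></div>
    <script id="__NEXT_DATA__" type="application/json">
      {"props":{"pageProps":{/* getStaticProps data, fully serialized */}},"page":"/example"}
    </script>
  </body>
</html>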
The Fix
Remove the mounted guard. Render children unconditionally:
// DarkModeContext.tsx (after the fix; imports and context creation unchanged)
export const DarkModeProvider = ({ children }: { children: ReactNode }) => {
  const [isDarkMode, setIsDarkMode] = useState(false);

  useEffect(() => {
    // ... read localStorage, set theme (unchanged)
  }, []);

  const toggleDarkMode = () => setIsDarkMode((prev) => !prev);

  // No mounted guard: children render on the server and on the client
  return (
    <DarkModeContext.Provider value={{ isDarkMode, toggleDarkMode }}>
      {children}
    </DarkModeContext.Provider>
  );
};
The hydration mismatch concern is already handled by a separate mechanism — an inline script in _document.tsx that reads localStorage and applies the dark class before React hydrates:
// _document.tsx: DarkModeScript
<script dangerouslySetInnerHTML={{ __html: `
  (function() {
    var savedTheme = localStorage.getItem('darkMode');
    if (savedTheme === 'true') {
      document.documentElement.classList.add('dark');
    }
  })();
`}} />
This script runs synchronously before paint. The CSS class is applied before React even starts. The server renders with light mode defaults; the inline script corrects the class before the browser paints; React hydrates with the correct state. No flash. No mismatch. No need to suppress rendering.
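For reference, a minimal sketch of a _document.tsx carrying such a script (everything except the inline script is the stock Next.js Document; the try/catch is an added guard for environments where localStorage throws):

// _document.tsx (minimal sketch)
import { Html, Head, Main, NextScript } from 'next/document';

export default function Document() {
  return (
    <Html>
      <Head>
        {/* parsed and executed before the body, so the class is set before paint */}
        <script
          dangerouslySetInnerHTML={{
            __html: `(function () {
              try {
                if (localStorage.getItem('darkMode') === 'true') {
                  document.documentElement.classList.add('dark');
                }
              } catch (e) {}
            })();`,
          }}
        />
      </Head>
      <body>
        <Main />
        <NextScript />
      </body>
    </Html>
  );
}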
The Result
| Page | HTML before (empty shell) | HTML after (pre-rendered) |
|---|---|---|
| Homepage | 2.2 KB | 22.7 KB |
| AXIOM Landing | 18.2 KB* | 59.5 KB |
| Services | 2.2 KB | 22.1 KB |
*The AXIOM page had getStaticProps data serialized in __NEXT_DATA__, inflating the response size despite the empty DOM.
After the fix, every page serves complete, semantic HTML to any consumer — browsers, search engines, screen readers, and AI agents. The content exists without JavaScript.
The Broader Lesson
This isn't a bug. It's a design trade-off that made perfect sense in a JavaScript-only world.
The mounted guard pattern appears in official React documentation, popular UI libraries, and thousands of production applications. It solves a real problem (hydration mismatches) with a clean solution (don't render until you're in the browser). For the past decade, the only consumer of your HTML that mattered was a browser with JavaScript. The pattern works. Nobody complains.
But the web now has a new class of consumer. AI agents don't execute JavaScript. They read your HTML the way a search engine crawler did in 2005 — raw, static, as-served. When they hit a page that returns <div id="__next"></div>, they don't wait for React to hydrate. They see an empty page and move on.
Three Observations
1. Your build tools won't tell you.
Next.js labeled every page as "prerendered as static content" despite producing empty HTML files. The build succeeded. The dev server worked. Lighthouse scored it well (Lighthouse executes JavaScript). The only way to catch this is to look at the raw HTML response — the thing your users never see but your non-browser consumers depend on.
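One way to make that look automatic is a post-build smoke test. A minimal sketch, assuming the default Pages Router output location (the script and its path are hypothetical):

// scripts/check-prerender.ts: fail the build if the shell is empty
import { readFileSync } from 'node:fs';

const html = readFileSync('.next/server/pages/index.html', 'utf8');
const match = html.match(/<div id="__next">([\s\S]*?)<\/div>/);

if (!match || match[1].trim() === '') {
  console.error('FAIL: index.html prerendered an empty #__next shell');
  process.exit(1);
}
console.log('OK: index.html contains pre-rendered markup');

Wired into CI after next build, an empty shell becomes a failed pipeline instead of a silent regression.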
2. The pattern is everywhere.
Any React application that conditionally renders based on a mounted, isClient, or hasMounted state variable at the top of its component tree has this problem. It's especially common in:
- Dark mode / theme providers
- Authentication wrappers that check localStorage before rendering
- Feature flag providers that read client-side configuration
- Any "client-only" wrapper component
If these components wrap your layout or _app, your entire site is client-rendered regardless of what your framework configuration says.
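The generic shape to search for looks like this (a sketch; the component and state names vary by codebase):

// ClientOnly.tsx: the generic shape of the problem
import { useEffect, useState, type ReactNode } from 'react';

export function ClientOnly({ children }: { children: ReactNode }) {
  const [hasMounted, setHasMounted] = useState(false);

  // never runs during server rendering, so hasMounted stays false at build time
  useEffect(() => {
    setHasMounted(true);
  }, []);

  // harmless around a small widget; fatal around a layout or _app
  if (!hasMounted) return null;

  return <>{children}</>;
}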
3. The fix is usually simple.
In most cases, the server can render with sensible defaults (light mode, logged-out state, default feature flags) and the client can correct after hydration. The brief flash of default state — if it's even visible — is a far better outcome than serving an empty page to every non-browser consumer on the internet.
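As a sketch of that shape, here is a feature-flag provider that renders immediately with defaults and corrects itself on the client (the context name and storage key are assumptions):

// FeatureFlagContext.tsx: render with defaults, correct after hydration
import { createContext, useContext, useEffect, useState, type ReactNode } from 'react';

type Flags = Record<string, boolean>;

const FlagsContext = createContext<Flags>({});
export const useFlags = () => useContext(FlagsContext);

export function FlagsProvider({ children }: { children: ReactNode }) {
  // the server renders children immediately, with safe defaults
  const [flags, setFlags] = useState<Flags>({});

  useEffect(() => {
    // client-only correction; 'featureFlags' is a hypothetical storage key
    const stored = window.localStorage.getItem('featureFlags');
    if (stored) setFlags(JSON.parse(stored) as Flags);
  }, []);

  return <FlagsContext.Provider value={flags}>{children}</FlagsContext.Provider>;
}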
For dark mode specifically, the inline-script-in-document pattern solves the flash problem without suppressing server rendering. The script runs before paint; the user never sees the wrong theme; and the HTML contains real content.
How to Check Your Own Site
Open a terminal and run:
curl -s https://yoursite.com | grep '<div id="__next">'
If the output is <div id="__next"></div>, an empty element with nothing inside it, your site has a Content Survivability score of zero. Every page is invisible to AI agents, and your server-side rendering configuration is being silently bypassed.
The fix might be one line of code.
This case study documents a real finding from an AXIOM (Agent eXecution, Information & Orchestration Markup) audit performed by Clocktower and Associates. The site in question was our own.