The SPA Indexation Problem
Single-page applications render content in the browser using JavaScript. When Googlebot or any other crawler requests a page, it receives a mostly-empty HTML shell:
```html
<html>
<body>
<div id="root"></div>
<script src="/bundle.js"></script>
</body>
</html>
```
The actual content — headings, paragraphs, images, links — only appears after JavaScript executes and populates the DOM. While Googlebot can execute JavaScript, the process is:
- Delayed — Google queues JavaScript rendering, sometimes waiting days
- Incomplete — Complex SPAs with lazy loading, auth gates, or API dependencies may not render fully
- Invisible to AI crawlers — GPTBot, ClaudeBot, and PerplexityBot do not execute JavaScript at all
The result: your SPA content exists for users but is partially or fully invisible to search engines and AI platforms.
Why Traditional SPA SEO Fails
Client-side meta tags don't work
```javascript
// This runs in the browser AFTER the page loads
document.title = "My Page Title"
document.querySelector('meta[name="description"]').content = "..."
```
By the time this code executes, Googlebot has already parsed the HTML and moved on. The meta tags it sees are whatever's in the initial HTML response — often a generic default.
Hash-based routing is invisible
```
https://example.com/#/products/widget-a
https://example.com/#/about
```
Everything after # is a client-side fragment. Search engines treat all these URLs as the same page: https://example.com/. Use history-based routing (/products/widget-a) instead.
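The difference is easy to demonstrate with the URL class built into Node.js and browsers: the fragment never reaches the server, so every hash route resolves to the same path.

```javascript
// For a hash-based route, the server-visible path is just "/".
const hashRoute = new URL("https://example.com/#/products/widget-a");
console.log(hashRoute.pathname); // "/"
console.log(hashRoute.hash);     // "#/products/widget-a" (never sent to the server)

// With history-based routing, the route is part of the real path
// that crawlers can request and index as a distinct page.
const historyRoute = new URL("https://example.com/products/widget-a");
console.log(historyRoute.pathname); // "/products/widget-a"
```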
API-dependent content may not render
If your SPA fetches data from APIs during rendering, Googlebot may:
- Hit API rate limits and get empty responses
- Encounter authentication requirements
- Time out on slow API calls
- Miss content behind pagination or infinite scroll
Rendering Strategies That Work
Static Site Generation (SSG)
Pre-render every page to static HTML at build time:
| Aspect | Details |
|---|---|
| Frameworks | Next.js, Gatsby, Nuxt, Astro |
| When to use | Content that changes less than daily |
| SEO benefit | Full HTML available instantly, zero JS required for indexing |
| Limitation | Build time increases with page count |
SSG is the most reliable approach for content pages. The HTML is complete and ready for any crawler — no JavaScript execution needed.
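A minimal sketch of what an SSG build step does, using a hypothetical hand-written route table (real frameworks derive routes from your filesystem, CMS, or data layer):

```javascript
// Hypothetical route table; in a real framework this comes from your
// filesystem routes or CMS, not a hand-written array.
const routes = [
  {
    path: "/products/widget-a",
    title: "Widget A - Product Details | YourSite",
    description: "Widget A specifications, pricing, and reviews...",
    body: "<h1>Widget A</h1>",
  },
  {
    path: "/about",
    title: "About | YourSite",
    description: "Who we are and what we build.",
    body: "<h1>About</h1>",
  },
];

// Render one route to a complete HTML document at build time.
// Crawlers receive this file directly, no JavaScript execution required.
function renderPage({ path, title, description, body }) {
  return `<!DOCTYPE html>
<html>
<head>
<title>${title}</title>
<meta name="description" content="${description}" />
<link rel="canonical" href="https://example.com${path}" />
</head>
<body>${body}</body>
</html>`;
}

// Build step: one static page per route, kept in memory for illustration.
const site = new Map(routes.map((r) => [r.path, renderPage(r)]));
```

Each entry in the resulting map corresponds to one static file a real build would write to disk, for example dist/products/widget-a/index.html.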
Server-Side Rendering (SSR)
Render HTML on each request:
| Aspect | Details |
|---|---|
| Frameworks | Next.js, Nuxt, Remix |
| When to use | Dynamic content, personalized pages |
| SEO benefit | Full HTML per request, always current |
| Limitation | Server cost, response time overhead |
Incremental Static Regeneration (ISR)
Combine SSG with on-demand updates:
| Aspect | Details |
|---|---|
| Frameworks | Next.js (native), Nuxt (with plugins) |
| When to use | Large sites where full rebuilds are too slow |
| SEO benefit | Static performance + content freshness |
| Limitation | Stale content possible during revalidation window |
Dynamic Rendering (Last Resort)
Serve pre-rendered HTML to bots and the SPA to users:
| Aspect | Details |
|---|---|
| Tools | Rendertron, Prerender.io, Puppeteer |
| When to use | Legacy SPAs that can't adopt SSR/SSG |
| SEO benefit | Bots see complete HTML |
| Limitation | Google considers this acceptable but not ideal, maintenance overhead |
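The dispatch logic can be sketched as an Express-style middleware. The user-agent patterns and the getPrerenderedHtml lookup are illustrative, not an exhaustive bot list or a real service client:

```javascript
// Common crawler user-agent substrings (illustrative, not exhaustive).
const BOT_PATTERNS = /googlebot|bingbot|gptbot|claudebot|perplexitybot|prerender/i;

function isBot(userAgent = "") {
  return BOT_PATTERNS.test(userAgent);
}

// Express-style middleware sketch: bots get pre-rendered HTML from a
// hypothetical snapshot service; everyone else falls through to the SPA.
function dynamicRender(getPrerenderedHtml) {
  return async (req, res, next) => {
    if (!isBot(req.headers["user-agent"])) return next();
    const html = await getPrerenderedHtml(req.url); // e.g. a Prerender.io or Rendertron lookup
    if (html) return res.send(html);
    next(); // fall back to the SPA shell if no snapshot exists
  };
}
```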
Critical SEO Elements for SPAs
Unique meta tags per route
Every SPA route needs its own title tag, meta description, and canonical URL in the initial HTML:
```html
<!-- Each route must have unique metadata in the HTML response -->
<head>
  <title>Widget A - Product Details | YourSite</title>
  <meta
    name="description"
    content="Widget A specifications, pricing, and reviews..."
  />
  <link rel="canonical" href="https://example.com/products/widget-a" />
</head>
```
With SSG/SSR, this happens automatically. With client-side rendering, you need pre-rendering or dynamic rendering.
XML sitemap covering all routes
SPAs with dynamic routes need a generated sitemap:
```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/products/widget-a</loc>
    <lastmod>2026-04-13</lastmod>
  </url>
  <!-- One entry per route -->
</urlset>
```
Submit the sitemap to Google Search Console and reference it in robots.txt.
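A sitemap for dynamic routes is straightforward to generate from the same route data your app already has. This sketch uses a hypothetical hand-written entry list:

```javascript
// Build a sitemap from your route list. The entries here are hand-written
// for illustration; generate them from your router, CMS, or database.
function buildSitemap(entries) {
  const urls = entries
    .map(
      ({ loc, lastmod }) =>
        `  <url>\n    <loc>${loc}</loc>\n    <lastmod>${lastmod}</lastmod>\n  </url>`
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`;
}

const xml = buildSitemap([
  { loc: "https://example.com/products/widget-a", lastmod: "2026-04-13" },
]);
```

A real build would write this string to public/sitemap.xml so it is served at the site root.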
Internal links as standard <a> tags
React Router's <Link> and Vue Router's <router-link> render as standard <a> tags with href attributes. Verify this in your HTML output. Search engines need crawlable <a href="..."> links to discover pages.
```jsx
// Good: renders as <a href="/products/widget-a">
<Link to="/products/widget-a">Widget A</Link>

// Bad: no href for crawlers
<div onClick={() => navigate("/products/widget-a")}>Widget A</div>
```
Frequently Asked Questions
Can Googlebot render my React app?
Googlebot uses a headless Chromium browser that can execute JavaScript. However, rendering is queued (not instant), may time out on heavy apps, and cannot handle authenticated content. SSG/SSR is more reliable.
Do AI search engines execute JavaScript?
No. GPTBot, ClaudeBot, and PerplexityBot request HTML and do not execute JavaScript. If your content only exists after JS runs, it's invisible to AI search. You need server-rendered or pre-rendered HTML for AI crawler accessibility.
Should I migrate my entire SPA to SSR?
Not necessarily. A hybrid approach works well: SSG for content pages (blog, docs, landing pages), client-side rendering for authenticated app sections. Most frameworks support this pattern natively.
How do I test if my SPA is crawlable?
- View page source (not DevTools Elements) — this shows what crawlers see
- Use Google's URL Inspection tool in Search Console
- Use curl to fetch the URL — the response should contain your content
- Disable JavaScript in your browser and reload the page; core content should still appear
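The curl check can be automated with a small helper that inspects the raw HTML response. The looksPrerendered name and the marker strings are illustrative:

```javascript
// Check whether the RAW HTML response (what a non-JS crawler sees) already
// contains your content. Feed it the output of `curl -s <url>`.
function looksPrerendered(html, mustContain) {
  return mustContain.every((text) => html.includes(text));
}

// An SPA shell fails the check; server-rendered HTML passes it.
const spaShell = `<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>`;
const ssrPage = `<html><head><title>Widget A</title></head><body><h1>Widget A</h1></body></html>`;

console.log(looksPrerendered(spaShell, ["Widget A"])); // false
console.log(looksPrerendered(ssrPage, ["Widget A"])); // true
```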
Does SPA SEO affect page speed?
SSG improves page speed (pre-built HTML is served from CDN). SSR adds server processing time but eliminates the "blank screen" period of client-side rendering. Both approaches improve Core Web Vitals compared to pure client-side rendering.
Related Resources
- JavaScript SEO — How search engines process JavaScript
- JavaScript SEO Best Practices — Technical checklist
- Next.js SEO Setup Template — Ready-to-use Next.js SEO configuration
- JavaScript Rendering for SEO — How browsers and bots render JS