Industry · 5 min read

Rankwise for Single-Page Application SEO

Solve JavaScript rendering challenges for SPAs built with React, Vue, and Angular. Ensure search engines and AI crawlers can index your dynamic content.

  • 100% Pages Indexed
  • < 1.5s Median FCP
  • 3x Organic Traffic Growth
Common Challenges
  • Googlebot not indexing client-rendered content
  • Empty HTML source visible in search result cache
  • React/Vue SPAs invisible to AI search crawlers
  • Dynamic routes not appearing in search index
Goals
  • Full indexation of all SPA routes by Google
  • Correct meta tags and structured data for each page
  • AI crawler accessibility without SSR infrastructure
  • Fast first contentful paint despite JavaScript rendering

How Rankwise Helps

Pre-Rendered Content Pages

Generate static HTML for each content page that search engines can index without executing JavaScript.

100% of content pages indexable regardless of rendering strategy

Dynamic Meta Tag Management

Title tags, meta descriptions, and Open Graph tags set correctly for each SPA route.

Every page has unique, optimized metadata in search results

Structured Data Generation

JSON-LD schema markup generated per page and embedded in the HTML response.

Rich results eligibility without client-side schema injection

Sitemap Generation

Automatic XML sitemap covering all dynamic SPA routes with proper change frequencies.

Search engines discover every route without manual sitemap maintenance

AI Crawler Compatibility

Content accessible to GPTBot, ClaudeBot, and PerplexityBot without requiring JavaScript execution.

AI search engines can cite your SPA content

Our React SPA had 30% of pages indexed. After moving content to Rankwise with SSG output, we hit 100% indexation in two weeks.
James Liu
Engineering Manager, AppGrid

The SPA Indexation Problem

Single-page applications render content in the browser using JavaScript. When Googlebot or any other crawler requests a page, it receives a mostly-empty HTML shell:

<html>
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>

The actual content — headings, paragraphs, images, links — only appears after JavaScript executes and populates the DOM. While Googlebot can execute JavaScript, the process is:

  1. Delayed — Google queues JavaScript rendering, sometimes waiting days
  2. Incomplete — Complex SPAs with lazy loading, auth gates, or API dependencies may not render fully
  3. Invisible to AI crawlers — GPTBot, ClaudeBot, and PerplexityBot do not execute JavaScript at all

The result: your SPA content exists for users but is partially or fully invisible to search engines and AI platforms.

Why Traditional SPA SEO Fails

Client-side meta tags don't work

// This runs in the browser AFTER the page loads
document.title = "My Page Title"
document.querySelector('meta[name="description"]').content = "..."

By the time this code executes, Googlebot has already parsed the HTML and moved on. The meta tags it sees are whatever's in the initial HTML response — often a generic default.

Hash-based routing is invisible

https://example.com/#/products/widget-a
https://example.com/#/about

Everything after # is a client-side fragment. Search engines treat all these URLs as the same page: https://example.com/. Use history-based routing (/products/widget-a) instead.
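
When migrating off hash routing, old #/ URLs may still live in bookmarks and external links. A minimal sketch of a redirect helper (hashToPath is a hypothetical name, not a router API) that maps a legacy hash URL to its history-mode equivalent:

```javascript
// Hypothetical helper: map a legacy hash URL to its
// history-mode equivalent so old links can be redirected.
function hashToPath(url) {
  const u = new URL(url);
  if (u.hash.startsWith("#/")) {
    // "#/products/widget-a" becomes "/products/widget-a"
    return u.origin + u.hash.slice(1);
  }
  return url;
}

// In the browser, a snippet like this could run before the SPA boots:
// if (location.hash.startsWith("#/")) location.replace(hashToPath(location.href));
```

Pair this with server-side 301 redirects where possible so crawlers consolidate signals onto the path-based URLs.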

API-dependent content may not render

If your SPA fetches data from APIs during rendering, Googlebot may:

  • Hit API rate limits and get empty responses
  • Encounter authentication requirements
  • Time out on slow API calls
  • Miss content behind pagination or infinite scroll

Rendering Strategies That Work

Static Site Generation (SSG)

Pre-render every page to static HTML at build time:

  • Frameworks: Next.js, Gatsby, Nuxt, Astro
  • When to use: Content that changes less than daily
  • SEO benefit: Full HTML available instantly, zero JS required for indexing
  • Limitation: Build time increases with page count

SSG is the most reliable approach for content pages. The HTML is complete and ready for any crawler — no JavaScript execution needed.

Server-Side Rendering (SSR)

Render HTML on each request:

  • Frameworks: Next.js, Nuxt, Remix
  • When to use: Dynamic content, personalized pages
  • SEO benefit: Full HTML per request, always current
  • Limitation: Server cost, response time overhead

Incremental Static Regeneration (ISR)

Combine SSG with on-demand updates:

  • Frameworks: Next.js (native), Nuxt (with plugins)
  • When to use: Large sites where full rebuilds are too slow
  • SEO benefit: Static performance + content freshness
  • Limitation: Stale content possible during revalidation window

Dynamic Rendering (Last Resort)

Serve pre-rendered HTML to bots and the SPA to users:

  • Tools: Rendertron, Prerender.io, Puppeteer
  • When to use: Legacy SPAs that can't adopt SSR/SSG
  • SEO benefit: Bots see complete HTML
  • Limitation: Google treats this as a workaround rather than a long-term solution, and it adds maintenance overhead
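
The routing decision in a dynamic-rendering setup usually comes down to a user-agent check. A sketch of that check (the bot list is illustrative; tools like Prerender.io ship their own, far longer lists):

```javascript
// Illustrative crawler check for a dynamic-rendering middleware.
// Matched requests would be routed to a pre-rendered snapshot;
// everything else gets the normal SPA shell.
const BOT_PATTERN =
  /googlebot|bingbot|duckduckbot|gptbot|claudebot|perplexitybot/i;

function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}
```

Note that both variants must serve the same content: dynamic rendering differs from cloaking precisely because only the delivery mechanism changes, not the content.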

Critical SEO Elements for SPAs

Unique meta tags per route

Every SPA route needs its own title tag, meta description, and canonical URL in the initial HTML:

<!-- Each route must have unique metadata in the HTML response -->
<head>
  <title>Widget A - Product Details | YourSite</title>
  <meta
    name="description"
    content="Widget A specifications, pricing, and reviews..."
  />
  <link rel="canonical" href="https://example.com/products/widget-a" />
</head>

With SSG/SSR, this happens automatically. With client-side rendering, you need pre-rendering or dynamic rendering.
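
Server-side, per-route metadata is ultimately just string templating. A minimal sketch (renderHead is a hypothetical helper; real frameworks expose this through their own head or metadata APIs):

```javascript
// Hypothetical server-side helper: emit unique head markup per route.
function renderHead({ title, description, canonical }) {
  return [
    `<title>${title}</title>`,
    `<meta name="description" content="${description}" />`,
    `<link rel="canonical" href="${canonical}" />`,
  ].join("\n");
}
```

The output would be embedded in the HTML response for each route, so crawlers see the final tags without executing any client-side code.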

XML sitemap covering all routes

SPAs with dynamic routes need a generated sitemap:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/products/widget-a</loc>
    <lastmod>2026-04-13</lastmod>
  </url>
  <!-- One entry per route -->
</urlset>

Submit the sitemap to Google Search Console and reference it in robots.txt.
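
When routes come from a database or CMS, the sitemap above can be generated at build or request time. A minimal sketch (buildSitemap is a hypothetical name, not a library function):

```javascript
// Hypothetical sitemap generator: one <url> entry per dynamic route.
function buildSitemap(routes) {
  const entries = routes
    .map(
      (r) =>
        `  <url>\n    <loc>${r.loc}</loc>\n    <lastmod>${r.lastmod}</lastmod>\n  </url>`
    )
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</urlset>`
  );
}
```

Regenerating this whenever routes change keeps discovery automatic instead of relying on manual sitemap edits.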

Crawlable internal links

React Router's <Link> and Vue Router's <router-link> render as standard <a> tags with href attributes. Verify this in your HTML output: search engines need crawlable <a href="..."> links to discover pages.

// Good: renders as <a href="/products/widget-a">
<Link to="/products/widget-a">Widget A</Link>

// Bad: no href for crawlers
<div onClick={() => navigate("/products/widget-a")}>Widget A</div>

Frequently Asked Questions

Can Googlebot render my React app?

Googlebot uses a headless Chromium browser that can execute JavaScript. However, rendering is queued (not instant), may time out on heavy apps, and cannot handle authenticated content. SSG/SSR is more reliable.

Do AI search engines execute JavaScript?

No. GPTBot, ClaudeBot, and PerplexityBot request HTML and do not execute JavaScript. If your content only exists after JS runs, it's invisible to AI search. You need server-rendered or pre-rendered HTML for AI crawler accessibility.

Should I migrate my entire SPA to SSR?

Not necessarily. A hybrid approach works well: SSG for content pages (blog, docs, landing pages), client-side rendering for authenticated app sections. Most frameworks support this pattern natively.

How do I test if my SPA is crawlable?

  1. View page source (not DevTools Elements) — this shows what crawlers see
  2. Use Google's URL Inspection tool in Search Console
  3. Use curl to fetch the URL — the response should contain your content
  4. Disable JavaScript in your browser and reload the page; content that still appears does not depend on rendering

Does SPA SEO affect page speed?

SSG improves page speed (pre-built HTML is served from CDN). SSR adds server processing time but eliminates the "blank screen" period of client-side rendering. Both approaches improve Core Web Vitals compared to pure client-side rendering.

Ready to optimize for AI search?

Start generating AI-optimized content that gets cited by ChatGPT, Perplexity, and other AI assistants.

Start Free Trial