
JavaScript Rendering for SEO: How Browsers and Search Engines Process JS

Learn how browsers render JavaScript, how Googlebot processes JS pages differently from users, and why rendering strategy matters for search visibility.

Rankwise Team·Updated Apr 13, 2026·6 min read

Every web page goes through a rendering process before it becomes visible. For static HTML pages, this process is straightforward. For JavaScript-powered pages, it's a multi-stage pipeline that browsers, search engines, and AI crawlers handle very differently. Understanding these differences is the foundation of JavaScript SEO.


How browsers render pages

When a user visits a page, the browser follows this sequence:

  1. Fetch HTML — Download the initial HTML document
  2. Parse HTML — Build the DOM (Document Object Model) tree
  3. Fetch CSS — Download and parse stylesheets, build the CSSOM
  4. Fetch JavaScript — Download script files
  5. Execute JavaScript — Run scripts that may modify the DOM
  6. Render tree — Combine DOM and CSSOM into the render tree
  7. Layout — Calculate element positions and sizes
  8. Paint — Draw pixels to the screen

For a server-rendered page, the DOM is complete after step 2. The browser can start rendering immediately.

For a client-rendered SPA, the DOM after step 2 is nearly empty (just a <div id="root">). The actual content doesn't appear until step 5, when JavaScript executes and populates the DOM.
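The gap between the two is easiest to see by comparing the HTML a crawler actually receives in each case. A minimal sketch (the product data and the regex-based "crawler" are illustrative, not how real crawlers parse HTML):

```javascript
// Server-rendered response: the full content is already in the HTML.
function renderServerSide(product) {
  return `<html><body><div id="root"><h1>${product.name}</h1><p>${product.description}</p></div></body></html>`;
}

// Client-rendered response: an empty shell plus a script reference.
function renderClientSide() {
  return `<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>`;
}

// A naive crawler that reads only the initial HTML (no JavaScript execution).
function extractHeading(html) {
  const match = html.match(/<h1>(.*?)<\/h1>/);
  return match ? match[1] : null;
}

const product = { name: "Trail Shoe", description: "Lightweight hiking shoe." };
console.log(extractHeading(renderServerSide(product))); // "Trail Shoe"
console.log(extractHeading(renderClientSide()));        // null — content only exists after JS runs
```

The server-rendered page exposes its heading in step 2 of the pipeline; the client-rendered page exposes nothing until step 5 has run in a real browser.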


How Googlebot renders JavaScript

Google uses a two-phase process:

Phase 1: Crawling (immediate)

Googlebot fetches the initial HTML and extracts:

  • Links (for discovering other pages)
  • Meta tags (title, description, robots)
  • Canonical URLs
  • Structured data

If the HTML already contains content (SSR/SSG), Google can index the page immediately.

Phase 2: Rendering (deferred)

For JavaScript-dependent pages, Google queues the page for rendering:

  1. Google launches a headless Chromium instance
  2. Loads the page and executes JavaScript
  3. Waits for the DOM to stabilize
  4. Extracts the rendered content
  5. Updates the index with the rendered version

Key limitation: Rendering is not instant. Pages sit in a queue that can take hours or days to process. During this gap, the page is either not indexed or indexed with incomplete content.

| Aspect | Phase 1 (Crawl) | Phase 2 (Render) |
| --- | --- | --- |
| Timing | Immediate | Hours to days later |
| What's processed | Initial HTML only | JavaScript-rendered DOM |
| Content available | Server-rendered content | All content, including JS-rendered |
| Resource cost | Low | High (Chromium instance per page) |

How AI crawlers handle JavaScript

AI search crawlers — GPTBot (ChatGPT), ClaudeBot (Claude), PerplexityBot (Perplexity) — do not execute JavaScript. They fetch the HTML response and process it as-is.

This means:

  • Client-rendered content is invisible to AI search
  • Meta tags injected by JavaScript are not seen
  • Dynamic routes that rely on JS routing are not discovered
  • AI search citations require server-rendered HTML

If your content only exists after JavaScript runs, it cannot be cited by AI answer engines.
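You can approximate what a non-executing crawler "reads" by dropping script tags and keeping only the visible text. A rough sketch (the HTML snippets are illustrative, and real crawlers use proper HTML parsers rather than regexes):

```javascript
// Simulate a crawler that fetches HTML but never runs JavaScript.
function visibleTextWithoutJs(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // scripts are fetched but never executed
    .replace(/<[^>]+>/g, " ")                    // strip remaining tags
    .replace(/\s+/g, " ")
    .trim();
}

const ssrHtml = `<main><h1>Pricing</h1><p>Plans start at $9/month.</p></main>`;
const csrHtml = `<div id="root"></div><script>document.getElementById("root").innerHTML = "<h1>Pricing</h1>";</script>`;

console.log(visibleTextWithoutJs(ssrHtml).includes("Pricing")); // true
console.log(visibleTextWithoutJs(csrHtml).includes("Pricing")); // false
```

The CSR page contains the word "Pricing" only inside a script string, so a crawler that never executes that script extracts no visible text at all.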


Rendering strategies compared

Client-Side Rendering (CSR)

JavaScript runs in the browser and builds the page from scratch.

Server sends: <div id="root"></div> + bundle.js
Browser: downloads JS → executes → renders content
Crawler sees: empty HTML (initially)

SEO impact: Content invisible until rendering queue processes the page. AI crawlers never see content.

Server-Side Rendering (SSR)

The server executes JavaScript and sends complete HTML.

Server: executes React/Vue → generates HTML → sends to browser
Browser: displays HTML immediately → hydrates with JS
Crawler sees: full content in initial HTML

SEO impact: Content immediately indexable. All crawlers can process it.

Static Site Generation (SSG)

Pages are pre-rendered at build time and served as static HTML.

Build step: executes React/Vue → generates HTML files
CDN: serves pre-built HTML
Browser: displays HTML → hydrates with JS
Crawler sees: full content in initial HTML

SEO impact: Best performance (CDN-served), full content available. Suitable for content that doesn't change on every request.

Incremental Static Regeneration (ISR)

Pages are statically generated but regenerate on demand.

First request: serves cached static HTML
Background: regenerates page after revalidation period
Next request: serves updated HTML

SEO impact: Combines SSG performance with content freshness. Good for large sites where full rebuilds are impractical.
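The serve-stale-then-regenerate behavior can be sketched as a small cache. This is a simplified model of the idea, not any framework's implementation; the render function, slugs, and 60-second revalidation window are illustrative:

```javascript
// ISR-style cache: serve cached HTML instantly, regenerate once the
// revalidation window has passed.
function createIsrCache(renderFn, revalidateMs) {
  const cache = new Map(); // slug -> { html, generatedAt }
  return function serve(slug, now = Date.now()) {
    const entry = cache.get(slug);
    if (!entry) {
      // First request: render once and cache (a real system may pre-build this).
      const fresh = { html: renderFn(slug), generatedAt: now };
      cache.set(slug, fresh);
      return fresh.html;
    }
    if (now - entry.generatedAt > revalidateMs) {
      // Stale: regenerate for the next visitor...
      cache.set(slug, { html: renderFn(slug), generatedAt: now });
    }
    return entry.html; // ...but this request never waits on regeneration.
  };
}

let version = 1;
const serve = createIsrCache((slug) => `<h1>${slug} v${version}</h1>`, 60_000);
console.log(serve("pricing", 0));       // <h1>pricing v1</h1> (generated now)
version = 2;
console.log(serve("pricing", 30_000));  // <h1>pricing v1</h1> (still fresh)
console.log(serve("pricing", 90_000));  // <h1>pricing v1</h1> (stale: serve old, regenerate)
console.log(serve("pricing", 100_000)); // <h1>pricing v2</h1> (updated copy)
```

Crawlers always get complete static HTML; the only trade-off is that one request per revalidation window receives slightly outdated content.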


What Googlebot can and cannot render

Can render

  • React, Vue, Angular applications
  • Standard DOM manipulation
  • Most CSS features
  • Fetch/XHR API calls to public endpoints
  • Web Components (with polyfills)

Cannot render reliably

  • Content behind authentication (login walls)
  • Content that requires user interaction (click to load)
  • Infinite scroll without pagination
  • Content loaded by IntersectionObserver (sometimes)
  • Content dependent on service workers

Never processes

  • localStorage/sessionStorage data
  • IndexedDB
  • WebSocket connections
  • Browser notifications
  • Geolocation API

The rendering gap in practice

Consider a React e-commerce site with 10,000 product pages:

| Phase | Server-Rendered | Client-Rendered |
| --- | --- | --- |
| Day 1 | Google indexes all 10K pages | Google crawls all 10K pages (HTML is empty) |
| Day 2-5 | Pages appear in search results | Pages queued for rendering |
| Day 5-14 | Full index, rich results eligible | ~7K pages rendered and indexed |
| Day 30 | Stable indexation | ~9K pages rendered, some still missing |

The client-rendered site loses 2-4 weeks of indexation time and may never achieve 100% coverage.


How to verify your rendering

Browser: View Source vs. Inspect

  • View Source (Ctrl+U) shows the initial HTML that crawlers receive
  • Inspect Element (DevTools) shows the live DOM after JavaScript execution

If content appears in Inspect but not View Source, crawlers may not see it.

Google URL Inspection Tool

In Google Search Console, enter a URL and click "Test Live URL." Google shows:

  • The rendered HTML as Googlebot sees it
  • A screenshot of the rendered page
  • Any JavaScript errors encountered

curl test

# See what crawlers receive (no JavaScript)
curl -s https://yoursite.com/page | grep -i "<h1"

If an <h1> tag appears in the output, the content is server-rendered. If it doesn't, the page relies on client-side JavaScript. Matching "<h1" rather than "<h1>" also catches headings with attributes, such as <h1 class="title">.


Choosing the right strategy

| Scenario | Recommended Strategy |
| --- | --- |
| Blog, docs, marketing pages | SSG |
| E-commerce product pages | SSG or ISR |
| User dashboards (authenticated) | CSR (doesn't need indexing) |
| Search results pages | SSR |
| Real-time data pages | SSR with caching |
| Large content sites (50K+ pages) | ISR |

The general rule: if a page should appear in search results, serve its content in the initial HTML. Use CSR only for authenticated sections that don't need search visibility.


Frequently Asked Questions

Does Googlebot use the latest version of Chrome?

Google updates the rendering engine to the latest stable Chromium version. However, the rendering environment has no GPU acceleration, limited memory, and cannot handle all browser APIs (localStorage, notifications, etc.).

How much JavaScript can Googlebot handle?

There's no official limit, but Googlebot has a rendering budget. Complex SPAs with large bundles, many API calls, and heavy DOM manipulation are more likely to time out. Keep JavaScript execution lean for pages that need indexing.

Do I need to worry about Bingbot rendering?

Bingbot has limited JavaScript rendering capabilities compared to Googlebot. If Bing traffic matters to you, server-side rendering is even more important.

Can I use Suspense and lazy loading with SSR?

Yes. React 18+ supports streaming SSR with Suspense. Lazy-loaded components can have server-rendered fallbacks that provide content for crawlers while the interactive version loads for users.

Is dynamic rendering (serving different content to bots) considered cloaking?

Google explicitly says dynamic rendering is not cloaking, as long as the rendered content matches what users see. However, they recommend SSR/SSG as the preferred long-term solution over dynamic rendering.
