What is Indexability?
Indexability refers to whether a search engine can add a page to its index. A page must be crawlable and indexable to appear in search results.
Factors Affecting Indexability
Makes pages non-indexable:
- noindex meta tag
- X-Robots-Tag: noindex header
- Blocked by robots.txt
- Canonicalized to another URL
- Login-required content
- Redirect chains
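The first two blocking signals above can be detected programmatically. A minimal sketch in Python, assuming you have already fetched the page's HTML and response headers (the function and class names here are illustrative, not a standard API):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in a.get("content", "").split(",")
            )

def is_noindexed(html: str, headers: dict) -> bool:
    """True if either the X-Robots-Tag header or a meta robots
    tag contains a noindex directive."""
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

For example, `is_noindexed('<meta name="robots" content="noindex, nofollow">', {})` returns `True`, while a page with neither signal returns `False`.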
Makes pages indexable:
- No blocking directives
- Self-referencing canonical
- Included in sitemap
- Linked from other indexed pages
Checking Indexability
Google Search Console:
- URL Inspection tool
- Coverage report
- Index status
On-page checks:
- Meta robots tag
- Canonical tag
- HTTP headers (X-Robots-Tag)
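The canonical-tag check above can also be automated: extract the canonical URL from the HTML and compare it against the page's own URL. A sketch, assuming the HTML is already fetched (names here are illustrative):

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Captures the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = a.get("href")

def has_self_canonical(html: str, page_url: str) -> bool:
    """True if the canonical tag points at the page itself.
    A missing canonical tag is also treated as indexable."""
    parser = CanonicalParser()
    parser.feed(html)
    if parser.canonical is None:
        return True
    return parser.canonical.rstrip("/") == page_url.rstrip("/")
```

A `False` result means the page is canonicalized to another URL, so this URL will usually not be indexed on its own.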
Common Indexability Issues
- Accidental noindex - Left over from staging
- Robots.txt blocking - Too restrictive rules
- Canonical confusion - Pointing to wrong URL
- Orphan pages - No internal links
- Thin content - May be excluded by quality filters
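To catch overly restrictive robots.txt rules like those above, Python's standard library includes `urllib.robotparser`. A minimal sketch that parses a robots.txt body (fetched separately) and asks whether Googlebot may crawl a given URL:

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt: str, url: str) -> bool:
    """Parse a robots.txt body and check whether Googlebot
    is allowed to crawl the given URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)
```

For example, with a robots.txt of `User-agent: *` / `Disallow: /private/`, any URL under `/private/` is reported as blocked. Note that a robots.txt block prevents crawling, not indexing directly: a blocked URL can still be indexed from external links, just without its content.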
Indexability vs. Ranking
Being indexed doesn't guarantee rankings. A page must be:
- Crawlable
- Indexable
- High enough quality to rank
- Relevant to search queries
FAQs
What is the difference between crawlability and indexability?
Crawlability is whether a search engine can access and read a page. Indexability is whether it can add the page to its index after crawling. A page can be crawlable but not indexable — for example, if it has a noindex tag, Googlebot can crawl it but won't add it to the index.
How do I check if a specific page is indexable?
Use Google Search Console's URL Inspection tool. Enter the URL and check the "Indexing" section. It will tell you if the page is indexed, and if not, why — whether it's blocked by robots.txt, has a noindex directive, or is canonicalized to another URL.
Can a page lose its indexability over time?
Yes. Common causes include accidental noindex tags deployed during a release, robots.txt changes that block crawling, canonical tag updates pointing to a different URL, or content quality drops that trigger Google's quality filters.
How long does it take for indexability fixes to take effect?
After fixing a noindex tag or robots.txt block, request re-indexing via Search Console's URL Inspection tool. Most pages get re-crawled within days, but indexing can take 1-4 weeks depending on your site's crawl frequency and the page's perceived importance.
Related resources
- Guide: /resources/guides/robots-txt-for-ai-crawlers
- Template: /templates/definitive-guide
- Use case: /use-cases/saas-companies
- Glossary:
- /glossary/robots-txt
- /glossary/canonical-url