Noindex, Nofollow - Understanding and Using Meta-Robots-Tags Correctly

Meta-Robots-Tags are HTML meta elements that let website operators give search engine crawlers page-level instructions: whether a page may be indexed, and how the links on it should be treated.

What are Meta-Robots-Tags?

Basic Functionality

Meta-Robots-Tags act as a direct communication channel between a website and search engines. They are placed in the <head> section of an HTML page and give crawlers specific instructions for that page.

The Most Important Meta-Robots Directives

Noindex - Prevent Indexing

The noindex directive prevents a page from being included in the search engine index.

<meta name="robots" content="noindex">

Use Cases for Noindex:

  • Test and development pages
  • Private areas (login, admin)
  • Duplicate content (e.g., print versions)
  • Temporary pages
  • Pages with low value for users

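A server-side rule can apply these use cases automatically instead of editing templates by hand. A minimal Python sketch, assuming hypothetical path conventions (/admin, /login, /test, /dev prefixes and a /print suffix - adapt these to your own site structure):

```python
# Hypothetical path rules mirroring the use cases above; adapt to your site.
NOINDEX_PREFIXES = ("/admin", "/login", "/test", "/dev")
NOINDEX_SUFFIXES = ("/print",)

def robots_content(path: str) -> str:
    """Return the value for <meta name="robots"> for a given URL path."""
    if path.startswith(NOINDEX_PREFIXES) or path.endswith(NOINDEX_SUFFIXES):
        return "noindex"
    return "index, follow"

print(robots_content("/admin/users"))   # noindex
print(robots_content("/blog/article"))  # index, follow
```
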
Nofollow - Don't Pass Link Juice

The nofollow directive instructs crawlers not to follow any links on the page, so no PageRank (link juice) is passed through them.

<meta name="robots" content="nofollow">

Use Cases for Nofollow:

  • User-generated content
  • Sponsored links
  • External links to untrustworthy pages
  • Internal links to unimportant pages

Combined Directives

Often multiple directives are combined:

<meta name="robots" content="noindex, nofollow">
<meta name="robots" content="index, nofollow">
<meta name="robots" content="noindex, follow">
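
When auditing pages, it helps to normalize the content attribute into a set of directives before checking it, since order, spacing, and casing vary. A stdlib-only Python sketch (parse_robots is a hypothetical helper name):

```python
def parse_robots(content: str) -> set:
    """Split a robots content attribute into lowercase directives."""
    return {d.strip().lower() for d in content.split(",") if d.strip()}

directives = parse_robots("NoIndex, Follow")
print(directives == {"noindex", "follow"})  # True
print("nofollow" in directives)             # False
```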

Meta-Robots-Tags in Detail

Indexing Directives

Directive    Function                    Behavior
index        Page may be indexed         Default (when not specified)
noindex      Page may NOT be indexed     Page is removed from the index

Crawling Directives

Directive    Function                           Impact
follow       Follow links on the page           Link juice is passed
nofollow     Do NOT follow links on the page    No link juice transfer

Additional Directives

Directive      Function                         Application
noarchive      Don't save a cached version      For time-critical content
nosnippet      No snippets in SERPs             For sensitive content
noodp          Ignore Open Directory Project    Deprecated directive
notranslate    Don't offer translation          For language-specific content
noimageindex   Don't index images               For copyrighted images

Practical Use Cases

E-Commerce: Product Variants

For product pages with many variants (color, size), noindex can be useful for variant pages:

<!-- Main product page -->
<meta name="robots" content="index, follow">

<!-- Product variants -->
<meta name="robots" content="noindex, follow">
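
Whether a URL is the main page or a variant can often be decided from the URL itself. A sketch using Python's stdlib, assuming variants are addressed via color/size query parameters (a site-specific assumption):

```python
from urllib.parse import urlparse, parse_qs

VARIANT_PARAMS = {"color", "size"}  # assumed variant parameters; site-specific

def variant_robots(url: str) -> str:
    """Return 'noindex, follow' for variant URLs, 'index, follow' otherwise."""
    params = set(parse_qs(urlparse(url).query))
    return "noindex, follow" if params & VARIANT_PARAMS else "index, follow"

print(variant_robots("https://shop.example/product?color=red"))  # noindex, follow
print(variant_robots("https://shop.example/product"))            # index, follow
```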

Blog: Category and Tag Pages

Often category pages have little unique content:

<!-- Main article -->
<meta name="robots" content="index, follow">

<!-- Category overview -->
<meta name="robots" content="noindex, follow">

Corporate Websites: Legal Pages

Imprint, privacy policy and terms of service usually have no SEO value:

<meta name="robots" content="noindex, nofollow">

Common Mistakes and Pitfalls

1. Incorrect Implementation

❌ Wrong:

<meta name="robot" content="noindex">

✅ Correct:

<meta name="robots" content="noindex">

2. Contradictory Directives

❌ Problematic:

<meta name="robots" content="noindex, index">

When directives conflict, search engines apply the most restrictive one - here noindex wins, so the tag is misleading rather than harmless.

3. Forgotten Canonical Tags

A canonical tag on a noindex page sends conflicting signals (index the canonical target vs. don't index this page), so it should be removed:

❌ Wrong:

<meta name="robots" content="noindex">
<link rel="canonical" href="https://example.com/page">

✅ Correct:

<meta name="robots" content="noindex">

X-Robots-Tag as Alternative

For dynamic control, the X-Robots-Tag can be set as an HTTP response header:

X-Robots-Tag: noindex, nofollow

Advantages of X-Robots-Tag:

  • Works with non-HTML files too
  • Can be set dynamically server-side
  • Multiple directives can be combined in one header
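
Server-side, the header can be attached based on file type, for example to keep PDFs and Office documents out of the index. A minimal sketch; the extension list is an assumption:

```python
# Assumed file types that should not be indexed; extend as needed.
NON_HTML_NOINDEX = (".pdf", ".doc", ".xls")

def extra_headers(path: str):
    """Return extra HTTP response headers for a request path."""
    if path.lower().endswith(NON_HTML_NOINDEX):
        return [("X-Robots-Tag", "noindex, nofollow")]
    return []

print(extra_headers("/files/report.PDF"))  # [('X-Robots-Tag', 'noindex, nofollow')]
print(extra_headers("/index.html"))        # []
```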

Testing and Validation

Google Search Console

Check the indexing status in GSC (the former "Coverage" report is now called "Pages"):

  1. "Indexing" → "Pages"
  2. Check "Excluded by 'noindex' tag"

Browser Tools

Use browser developer tools:

  1. F12 → Elements
  2. Search for <meta name="robots">
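
The same check can be scripted for many pages with Python's stdlib html.parser (RobotsMetaFinder is a hypothetical helper class):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content values of all <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots.append(a.get("content", ""))

finder = RobotsMetaFinder()
finder.feed('<head><meta name="robots" content="noindex, follow"></head>')
print(finder.robots)  # ['noindex, follow']
```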

Online Tools

  • Google Rich Results Test
  • Screaming Frog SEO Spider
  • SEO Spider Tools

Best Practices for 2025

1. Strategic Application

  • Use noindex only for justified cases
  • Document all noindex decisions
  • Regular review of indexing

2. Performance Optimization

  • Combine with other technical SEO measures
  • Use the X-Robots-Tag for non-HTML resources
  • Implement server-side when possible

3. Monitoring

  • Monitor indexing changes
  • Check crawl budget efficiency
  • Analyze impact on rankings

Checklist: Using Meta-Robots-Tags Correctly

Before Implementation:

  • ☐ Page goal defined
  • ☐ SEO value assessed
  • ☐ Alternative solutions checked
  • ☐ Impact on link juice considered

During Implementation:

  • ☐ Correct syntax used
  • ☐ Contradictions avoided
  • ☐ Canonical tags adjusted
  • ☐ Sitemap updated

After Implementation:

  • ☐ GSC status checked
  • ☐ Crawling behavior observed
  • ☐ Performance metrics analyzed
  • ☐ Regular reviews scheduled

Related Topics