Free Robots.txt Tester

Fetch and analyze your robots.txt file. Test specific URLs to see if they are allowed or blocked for different search engines.

What is a Robots.txt File?

A robots.txt file tells search engine crawlers which URLs on your site they can or can't request. It's used mainly to manage crawl traffic and avoid overloading your server with requests.
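As a sketch, a minimal robots.txt might look like the following (the paths and sitemap URL are illustrative, not recommendations for any particular site):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, `Disallow` blocks a path prefix, and `Allow` explicitly permits everything else.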

Robots.txt Best Practices

  • Place the file in the root directory (e.g., example.com/robots.txt).
  • Include a link to your XML sitemap.
  • Don't use robots.txt to hide pages from Google Search results; a blocked page can still be indexed if other sites link to it. Use a noindex meta tag (or X-Robots-Tag header) instead.
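You can test allow/block decisions yourself with Python's standard-library robots.txt parser. This is a minimal sketch, not how this tool is implemented; the rules and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Parse an in-memory robots.txt (hypothetical rules for illustration).
# In practice you would call rp.set_url("https://example.com/robots.txt")
# followed by rp.read() to fetch the live file.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
])

# Check whether a given user agent may fetch a given URL.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post.html"))     # True
```

Note that Python's parser applies rules in file order, while Google uses longest-match precedence, so results can differ on robots.txt files with overlapping `Allow`/`Disallow` rules.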