Fetch and analyze your robots.txt file. Test specific URLs to see if they are allowed or blocked for different search engines.
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It's used mainly to avoid overloading your site with requests.
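To illustrate what "testing a URL" means here, below is a minimal sketch of such a check using Python's standard urllib.robotparser module. The domain, sample URLs, and user-agent names are placeholders, not part of this tool.

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt location; replace with your own site.
ROBOTS_URL = "https://example.com/robots.txt"

# Fetch and parse the robots.txt file.
parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()

# Test specific URLs against the rules for different crawlers.
checks = [
    ("Googlebot", "https://example.com/private/report.html"),
    ("Bingbot", "https://example.com/blog/post-1"),
    ("*", "https://example.com/"),
]

for user_agent, url in checks:
    allowed = parser.can_fetch(user_agent, url)
    status = "allowed" if allowed else "blocked"
    print(f"{user_agent}: {url} -> {status}")
```

Each line of output reports whether the given crawler is allowed or blocked for that URL, mirroring what the tool reports per search engine.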