If you want to test a specific robots.txt rule against a file that isn't on the web yet, or test a new rule, you can use a third-party robots.txt tester.
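For example, Python's standard library can parse a draft robots.txt held entirely in memory, so rules can be tried out before the file is published. A minimal sketch, with hypothetical rules and URLs:

    from urllib.robotparser import RobotFileParser

    # Hypothetical draft rules for a robots.txt that isn't published yet.
    # Note: urllib.robotparser applies rules in file order (first match
    # wins), so the more specific Allow line is listed before the broader
    # Disallow; Google's own parser uses longest-match semantics instead.
    draft = """\
    User-agent: *
    Allow: /private/public.html
    Disallow: /private/
    """

    parser = RobotFileParser()
    parser.parse(draft.splitlines())  # parse from memory; nothing is fetched

    for url in ("https://example.com/private/public.html",
                "https://example.com/private/secret.html",
                "https://example.com/index.html"):
        print(url, "->", "allowed" if parser.can_fetch("*", url) else "blocked")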
Test and validate your robots.txt. Check whether a URL is blocked, and by which rule. You can also check whether the page's resources are disallowed.
Use Search Console to monitor Google Search results data for your properties.
Quickly check your pages' crawlability status. Validate your robots.txt by checking whether your URLs are properly allowed or blocked.
Checks a list of URLs against the live robots.txt file, or a custom one, to see if they are allowed or blocked, and if so, by which rule.
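Reporting which rule matched takes a bit more than the standard library exposes. The sketch below implements a simplified version of the longest-match semantics from Google's robots.txt specification: plain prefix matching only, no * or $ wildcards, and a single rule group. All names and URLs are illustrative:

    from urllib.parse import urlparse

    def match_url(path, rules):
        """Return (verdict, matching_rule) for a URL path.

        rules is a list of (directive, pattern) pairs for one user-agent
        group. Simplified: plain prefix matching; real parsers also handle
        the * and $ wildcards and pick the group by user agent.
        """
        best_allow, best_block = "", ""
        for directive, pattern in rules:
            if pattern and path.startswith(pattern):
                if directive == "Allow" and len(pattern) > len(best_allow):
                    best_allow = pattern
                elif directive == "Disallow" and len(pattern) > len(best_block):
                    best_block = pattern
        if not best_allow and not best_block:
            return "allowed", None  # no rule matched: crawling defaults to allowed
        if len(best_allow) >= len(best_block):  # exact ties go to Allow
            return "allowed", f"Allow: {best_allow}"
        return "blocked", f"Disallow: {best_block}"

    rules = [("Allow", "/private/public.html"), ("Disallow", "/private/")]
    for url in ("https://example.com/private/a.html",
                "https://example.com/private/public.html"):
        verdict, rule = match_url(urlparse(url).path, rules)
        print(url, "->", verdict, rule or "(no matching rule)")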
What is it? Check whether your website is using a robots.txt file. When search engine robots crawl a website, they typically request the site's robots.txt file first.
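Because the file always lives at a fixed, well-known path, checking whether a site has one is a single request. A rough sketch (example.com stands in for your own host):

    from urllib.parse import urlsplit, urlunsplit
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError

    def robots_url(page_url):
        # robots.txt always lives at the root of the host, whatever the page path
        parts = urlsplit(page_url)
        return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

    def has_robots_txt(page_url):
        try:
            with urlopen(Request(robots_url(page_url), method="HEAD")) as resp:
                return resp.status == 200
        except HTTPError:
            return False  # typically a 404: the site serves no robots.txt

    print(robots_url("https://example.com/blog/post?id=1"))  # https://example.com/robots.txt
    print(has_robots_txt("https://example.com/"))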
Jul 16, 2014: You can find the updated testing tool in Webmaster Tools, under the Crawl section. There you'll see the current robots.txt file and can test new URLs.
A robots.txt file lives at the root of your site. Learn how to create a robots.txt file, see examples, and explore robots.txt rules.
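For instance, a minimal robots.txt served at https://example.com/robots.txt could look like this (the paths and sitemap URL are placeholders):

    User-agent: *
    Disallow: /admin/

    User-agent: Googlebot
    Disallow: /drafts/

    Sitemap: https://example.com/sitemap.xml

Each User-agent line opens a group of rules for that crawler; Disallow and Allow patterns are matched against the URL path, and a Sitemap line points crawlers at the sitemap.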
This robots.txt tester shows you whether your robots.txt file is blocking Google crawlers from accessing specific URLs on your website.
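The same check can be scripted against a live site with Python's standard library: point the parser at the published file and ask about specific URLs on behalf of Googlebot (example.com is a placeholder):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the live file

    for url in ("https://example.com/", "https://example.com/search"):
        print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")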
Check your robots.txt the way Google does. This tool uses Google's official open-source robots.txt parser library. It's a replacement for the discontinued tester in Search Console. Enjoy :).