Robots.txt Analyzer
Enter any domain name to fetch and parse its robots.txt file. The tool displays
every user-agent group with its allow and disallow rules, any crawl-delay directives, and all linked sitemaps.
Optionally enter a path (e.g., /blog/) to test whether Googlebot, Bingbot, or
other crawlers are permitted to access it.
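The path check the tool performs can be sketched with Python's standard-library urllib.robotparser; the domain, rules, and bot names below are hypothetical examples, not the tool's actual implementation:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt body, parsed directly instead of fetched
# over the network.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/
Allow: /blog/

User-agent: *
Disallow: /admin/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Test whether a given user-agent may access a path.
print(parser.can_fetch("Googlebot", "/blog/"))   # allowed by the Googlebot group
print(parser.can_fetch("SomeBot", "/admin/"))    # disallowed by the * group

# Crawl-delay and sitemap directives are also exposed.
print(parser.crawl_delay("SomeBot"))
print(parser.site_maps())
```

Note that a bot matching a specific group (here, Googlebot) is governed only by that group; the wildcard `*` group applies to all other user-agents.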
This tool was created by Ben Crittenden, an IT professional with experience in web development, systems administration, and project management.