AI Crawler Checker
Check if ChatGPT, Claude, Perplexity, Google AI and 12 other AI crawlers can access your site — with one-click robots.txt fixes.
What are AI crawlers and why do they matter?
AI crawlers are bots operated by AI companies — OpenAI's GPTBot, Anthropic's ClaudeBot, Google-Extended, Apple's Applebot-Extended, Perplexity-User, ByteDance's Bytespider, and roughly a dozen others — that fetch web pages for two purposes: building model training datasets and powering real-time AI search. They obey robots.txt by user-agent string, just like Googlebot, but each operator typically runs separate bots for training versus search, so a single Disallow line rarely covers everything you intend.

Blocking training bots opts you out of model training; blocking search bots removes you from AI Overviews, ChatGPT citations, and Perplexity answers — usually the opposite of what publishers want. This checker fetches your robots.txt and tells you which of the 16 known AI crawlers can reach your site, then gives one-click snippets to fix any gaps. While you're auditing, run the SEO Checker for full-page issues or check rendering with the Mobile-Friendly Test.
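For example, a policy that opts out of model training while staying visible to AI search might look like the sketch below. GPTBot and Google-Extended are training-focused user agents named above; the exact list of bots you block should come from the checker's results for your site, not from this illustration:

```txt
# Block model-training crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everything else (including AI search bots) stays allowed
User-agent: *
Allow: /
```

Because matching is per user-agent string, each bot needs its own `User-agent` group; the final `*` group is the fallback for every crawler not named earlier.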
How to use this tool
1. Paste your URL. Enter any public URL — we fetch the /robots.txt at the origin server-side, so it works on any site you can reach.
2. Read the per-bot grid. Scan the 16-bot grid grouped by operator. Each row shows whether the bot is allowed, partially allowed, or blocked, plus what the bot is for.
3. Copy a recommended robots.txt. Pick the policy that matches your goals (block training, allow search, or block all AI) and paste the snippet into your /robots.txt.
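The per-bot check behind step 2 can be sketched with Python's standard-library robots.txt parser. The robots.txt content and the bot names below are illustrative assumptions (GPTBot is OpenAI's training crawler; OAI-SearchBot is used here as an example of a bot that falls through to the `*` group), not this tool's actual implementation:

```python
from urllib import robotparser

# Hypothetical robots.txt: block a training bot, allow everything else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

def bot_allowed(robots_txt: str, user_agent: str, path: str = "/") -> bool:
    """Return True if `user_agent` may fetch `path` under this robots.txt."""
    parser = robotparser.RobotFileParser()
    # parse() takes an iterable of lines, so no network fetch is needed.
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

print(bot_allowed(ROBOTS_TXT, "GPTBot"))         # matched by its own group
print(bot_allowed(ROBOTS_TXT, "OAI-SearchBot"))  # falls back to the * group
```

A real checker would fetch `https://example.com/robots.txt` server-side and run this check once per known AI user agent to fill the grid.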
