TL;DR — Run the audit under Technical > Technical Analysis to get your AI Readiness Score and fix any crawl barriers blocking AI bots. Cross-reference with Analytics > Crawler Analytics to confirm GPTBot, ClaudeBot, and others are actually reaching your pages with 200 OK responses. Check Analytics > Page Performance, sorted ascending by AI Score, to find your weakest pages. Pro tip: if a bot shows zero visits despite being allowed in robots.txt, your CDN or WAF may be blocking it at the network layer.

The Question

“Is my website properly optimized for AI crawlers?”
AI models do not browse the web in real time — they learn from crawled content. If your site has crawl barriers, missing structured data, or misconfigured directives, your pages may never enter the training and retrieval pipelines that determine what AI models say about your industry. Technical optimization for AI crawlers is a prerequisite for any GEO strategy. You might also be wondering:
  • “Which AI bots are actually visiting my site, and are they being blocked?”
  • “Do I have the right robots.txt and llms.txt configuration for AI crawlers?”
  • “Which of my pages score highest for AI readiness?”

Where to Go in Qwairy

1. Start here: Technical > Technical Analysis

Navigate to Technical > Technical Analysis — your primary audit view. Focus on the Overall AI Readiness Score at the top of the page: it aggregates structured data coverage, crawl accessibility, schema markup, and content formatting into a single score out of 100. Scan the Issue Breakdown panel below it — issues are categorized as Critical, Warning, and Info, and each line item links to the affected pages.
2. Go deeper: Analytics > Crawler Analytics

Cross-reference with Analytics > Crawler Analytics. This view shows you which AI bots are actually reaching your server and whether the responses they receive are 200 OK, 403 Forbidden, or blocked by robots directives. Use the Bot filter to isolate GPTBot or ClaudeBot specifically and confirm they have unrestricted access to your key pages.
3. Complete the picture: Analytics > Page Performance

Navigate to Analytics > Page Performance to get a page-level AI readiness breakdown. Each row shows an individual URL with its crawl status, structured data presence, schema types detected, and an AI Score. Sort ascending by AI Score to find the pages most in need of remediation.

What to Look For

Technical Analysis — Audit Score and Issue List

The audit runs against your connected domain and checks for the signals AI crawlers and retrieval systems rely on most: semantic HTML structure, JSON-LD schema markup, page speed, canonical tags, and the presence of an llms.txt file.
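As an illustration, here is a minimal JSON-LD block of the kind the audit checks for — a hypothetical Product schema with placeholder values, embedded in a page via `<script type="application/ld+json">`:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A placeholder product used to illustrate JSON-LD markup.",
  "brand": { "@type": "Brand", "name": "ExampleCo" },
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

Any page with at least one valid block like this counts toward the Schema Coverage metric described below.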
| Element | What it tells you |
| --- | --- |
| Overall AI Readiness Score | Composite health score — below 60 means significant crawl or comprehension barriers exist |
| Critical Issues count | Blockers that directly prevent crawling or indexing — fix these first |
| Schema Coverage % | Share of pages with at least one JSON-LD schema type present |
| llms.txt Status | Whether a valid llms.txt file exists and is properly formatted |
| robots.txt Status | Whether AI crawler user-agents are explicitly allowed or inadvertently blocked |
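For reference, a minimal llms.txt might look like the sketch below, following the llmstxt.org proposal: an H1 title, a blockquote summary, then H2 sections of markdown links. The domain and pages are placeholders:

```markdown
# Example Marketplace

> Product buying guides and category pages, summarized for LLM consumption.

## Buying Guides

- [Laptop Buying Guide](https://example.com/collections/laptops): Choosing a laptop by use case and budget
- [Monitor Buying Guide](https://example.com/collections/monitors): Panel types, refresh rates, and sizing

## Optional

- [Blog](https://example.com/blog): Product news and comparisons
```

The file lives at the site root (/llms.txt), alongside robots.txt.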

Crawler Analytics — Bot Access Verification

The audit tells you what should be happening; Crawler Analytics tells you what is actually happening. Even a perfectly configured robots.txt can be bypassed or misread. Seeing real bot hits — or their absence — confirms whether your configuration changes have taken effect.
Pro Tip: If GPTBot visits are zero across a 30-day window despite your robots.txt explicitly allowing it, your CDN or WAF may be blocking the user-agent at the network layer before requests reach your server.
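To cross-check Crawler Analytics against your own server logs, a short script like this sketch can tally AI bot hits by status code, so a 403 or zero-hit pattern stands out. The user-agent tokens listed are the ones each provider publishes (verify against current documentation), and the regex assumes combined log format:

```python
import re
from collections import Counter

# User-agent substrings for the major AI crawlers (published tokens;
# verify against each provider's current docs before relying on them).
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

# Minimal combined-log-format pattern: request path, status code, user-agent.
LOG_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def ai_bot_hits(log_lines):
    """Count (bot, status) pairs across access-log lines."""
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        ua = m.group("ua")
        for bot in AI_BOTS:
            if bot in ua:
                counts[(bot, m.group("status"))] += 1
    return counts
```

If the tally shows a bot hitting your origin with 200s while Crawler Analytics shows zero visits (or vice versa), the discrepancy points to a CDN or WAF layer intercepting requests before they reach one side or the other.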

Filters That Help

| Filter | How to use it for this question |
| --- | --- |
| Provider | Map each crawler (GPTBot, ClaudeBot, PerplexityBot) to the AI product it feeds — prioritize the providers where your brand visibility is weakest |
| Period | Set to 90 days after making robots.txt or llms.txt changes to confirm the new configuration is being picked up |
| Topic / Tag | Filter Page Performance to a specific content cluster (e.g., “pricing”, “integrations”) to audit readiness within a strategic section |

How to Interpret the Results

Good result

Overall AI Readiness Score above 75, zero Critical Issues, all major AI crawler user-agents returning 200 OK in Crawler Analytics, schema markup on at least 70% of indexed pages, and an llms.txt file present and valid. At this level, technical barriers are unlikely to be the primary reason for any visibility gaps.

Needs attention

AI Readiness Score below 50, or any Critical Issue flagging a blocked user-agent, missing canonical tags across key pages, or a robots.txt directive that inadvertently disallows GPTBot/ClaudeBot. A 403 or 0-visit pattern in Crawler Analytics for a major bot is an immediate fix priority — no amount of content optimization will help if the crawler cannot reach the page.
A high AI Readiness Score does not guarantee citation. Technical optimization removes barriers; it does not create authority. Pages still need to be cited by trusted external sources to appear in AI responses. Use Technical Analysis alongside Citations data — not as a standalone success metric.

Example

Scenario: Your e-commerce marketplace has thousands of product category pages with rich buying guides, but AI assistants never recommend your platform when shoppers ask “where to buy [product]”. You want to determine if the problem is technical or content-related.
  1. Open Technical > Technical Analysis and check the Overall AI Readiness Score. It reads 41 — well below the threshold. Drill into Critical Issues: your entire /collections/ directory (320 category pages) is blocked by a wildcard Disallow: /collections/ rule in robots.txt, a leftover from when those URLs were staging previews during the platform migration.
  2. Navigate to Analytics > Crawler Analytics and filter to GPTBot. Confirm: zero visits to any /collections/ URL over the past 90 days, while your homepage and blog receive regular crawl activity. PerplexityBot shows the same pattern.
  3. Remove the legacy Disallow rule and add explicit Allow: /collections/ directives for GPTBot, ClaudeBot, and PerplexityBot. Return to Crawler Analytics after 4 weeks to verify bot traffic resumes on your category pages. Then check Analytics > Page Performance for those URLs to confirm their AI Scores improve as crawl data refreshes and your buying guides become indexable.
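The robots.txt change in step 3 might look like the sketch below. The user-agent tokens are the ones each provider publishes (double-check current documentation), and note that under the robots exclusion standard a crawler with its own user-agent group ignores the `User-agent: *` group entirely:

```text
# Before (legacy staging rule, now removed):
#   User-agent: *
#   Disallow: /collections/

# After: explicitly allow the AI crawlers on category pages
User-agent: GPTBot
Allow: /collections/

User-agent: ClaudeBot
Allow: /collections/

User-agent: PerplexityBot
Allow: /collections/
```

An empty `Disallow:` line in each group would be equivalent; the explicit `Allow` makes the intent unambiguous for future maintainers.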

Go Further