We watched a site's AI search traffic crater 80× in two months. Here's what we saw.
A 90-day case study of one site's crawl-to-click ratio — the share of AI bot fetches that turn into a real human click — and what its collapse from 14.9% to 0.18% means for anyone betting on AI as a distribution channel.
One site. 90 days. Same content, same crawlers — and AI-driven traffic dropped by nearly two orders of magnitude.
This is what we saw monitoring one customer's AI traffic from January to May 2026. The site is content-heavy: ~450 indexed blog posts in the careers/resume space, mature, regularly updated. We've anonymized the domain.
The headline: AI bots kept fetching. The humans stopped showing up.
The metric: crawl-to-click ratio
Most analytics tools collapse "AI traffic" into one number. We separate two sides:
- AI training crawls — bots like GPTBot, ClaudeBot, Bytespider, PerplexityBot fetching pages to ingest into models or answer indices.
- AI referrals — a real human clicking through to the site from an AI surface (chat.openai.com, claude.ai, perplexity.ai, DuckDuckGo AI mode, etc.).
The crawl-to-click ratio is the share of crawls that result in a human click in the same time window:
crawl-to-click % = (AI referrals / AI training crawls) × 100
For every 100 pages an AI bot fetches, how many produced a human visitor? That's the only metric that matters if you think AI is a distribution channel rather than just a training corpus.
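Using this site's January numbers, the ratio works out like this (a minimal sketch; the helper function is ours, not part of any tool):

```python
def crawl_to_click_pct(referrals: int, crawls: int) -> float:
    """Share of AI bot fetches that produced a human click, in percent."""
    if crawls == 0:
        return 0.0
    return referrals / crawls * 100

# January 2026: 13,255 crawls, 1,978 referrals -> roughly 14.9%
print(round(crawl_to_click_pct(1_978, 13_255), 1))
```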
What we saw on this site
| Month | AI crawls | AI referrals | Crawl-to-click % |
|---|---|---|---|
| 2026-01 | 13,255 | 1,978 | 14.93% |
| 2026-02 | 2,284 | 127 | 5.56% |
| 2026-03 | 5,014 | 9 | 0.18% |
| 2026-04 | 5,494 | 65 | 1.18% |
| 2026-05 (partial) | 1,527 | 0 | 0% |
In January, nearly 15 in every 100 AI bot fetches produced a human click. By March that number was fewer than 2 in 1,000. The crawl volume held — bots fetched 5,000+ pages per month from March on. The referral side fell off a cliff.
What likely happened
We don't know yet. Three plausible drivers, none confirmed:
- Referer-header stripping changed. AI products on mobile, in-app, and inside Arc/Brave already strip referer headers. If ChatGPT or Claude tightened referer policies between January and March, the clicks didn't disappear — they became invisible to attribution.
- The site dropped out of common answer sets. Citations are not stable. A page that's quoted heavily in January can be displaced by newer content within weeks. The crawls keep coming — bots re-fetch known URLs — but the page stops appearing in actual answers.
- The AI surfaces themselves changed. ChatGPT and Claude both shipped updates to how they generate citations vs. paraphrase in early 2026. A model that paraphrases instead of linking will keep crawling and stop sending traffic.
We're working on telling these apart. The point of the case study is the gap, not the cause.
What this means
Two things, and one of them is uncomfortable.
1. Crawl volume and AI traffic are decoupled — and the gap is widening.
If your AI strategy is "get crawled, traffic will follow," you're optimizing for an input that no longer correlates with the output. This site's crawl volume in March was 38% of its January peak. Its referral volume was less than 1%.
2. Most analytics tools cannot show you this.
Google Analytics rolls AI bot fetches into "direct" or filters them out entirely. Server logs show the crawls but not the referrals on the same axis. Cloudflare bot management shows the bots but not the conversion side. To track crawl-to-click you need both signals joined per page, per AI source, per day — and that's the gap we built SeeLLM to fill.
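What "joined per page, per AI source, per day" means in practice can be sketched as a keyed aggregation over two event streams. Everything below is illustrative: the field layout, the referrer-to-bot mapping, and the sample events are our assumptions, not SeeLLM's actual schema.

```python
from collections import defaultdict

# Hypothetical event records: (date, page, source)
crawls = [
    ("2026-03-01", "/blog/resume-tips", "GPTBot"),
    ("2026-03-01", "/blog/resume-tips", "ClaudeBot"),
    ("2026-03-01", "/blog/cover-letters", "GPTBot"),
]
referrals = [
    ("2026-03-01", "/blog/resume-tips", "chat.openai.com"),
]

# Assumed mapping from AI surface domain to the crawler family behind it
SOURCE_OF_REFERRER = {"chat.openai.com": "GPTBot", "claude.ai": "ClaudeBot"}

def joined_ratio(crawls, referrals):
    """Join both signals on (date, page, ai_source) and compute the ratio."""
    c, r = defaultdict(int), defaultdict(int)
    for date, page, bot in crawls:
        c[(date, page, bot)] += 1
    for date, page, domain in referrals:
        bot = SOURCE_OF_REFERRER.get(domain)
        if bot:
            r[(date, page, bot)] += 1
    # (clicks, fetches, crawl-to-click %) per key, keyed by crawl activity
    return {k: (r[k], n, r[k] / n * 100) for k, n in c.items()}

for key, (clicks, fetches, pct) in joined_ratio(crawls, referrals).items():
    print(key, f"{clicks}/{fetches} = {pct:.1f}%")
```

The design point is the join key: neither stream is useful alone, and any granularity coarser than per-source hides exactly the kind of collapse this case study shows.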
If you've been watching "AI bot traffic is growing!" charts and assuming traffic from AI is growing too, this is the data point that breaks the assumption.
How we measured this
- Source: server-side observation via SeeLLM edge worker, joined to AI source classification
- Window: 2026-01-01 to 2026-05-05
- Domain: one customer, content site, ~450 blog posts
- AI training crawls: UA matches against the published bot signature list (GPTBot, ClaudeBot, Bytespider, OAI-SearchBot, Amazonbot, Applebot-Extended, CCBot, PerplexityBot, and others)
- AI referrals: request with referer header pointing at known AI surface domains
- Caveat: referer stripping is real. Mobile ChatGPT, Arc, Brave, and increasingly the AI products themselves drop referers. The 2026-01 number is likely closest to "ground truth" because referer policies were looser then. The 2026-03+ numbers are the new normal — partly real drop, partly attribution loss.
Full classifier rules: see our methodology page.
What to do with this
If you publish content and care about AI distribution:
- Track crawl-to-click as a ratio, not crawl volume in isolation. Crawl volume is a vanity metric. The ratio tells you whether the bots are sending anything back.
- Watch the trend, not the snapshot. A single month tells you nothing. The collapse in this case study is only visible because we were watching the same site for 90+ days.
- Don't trust your default analytics for this. GA4, server logs alone, and Cloudflare bot dashboards each see one side of the equation. You need both joined.
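Watching the trend rather than the snapshot can be sketched as comparing each month's ratio to the previous one. This is a minimal illustration using the figures from the table above; the 5-point month-over-month threshold is an arbitrary example, not a recommendation:

```python
def ratio_trend(monthly):
    """monthly: list of (month, crawls, referrals). Returns (month, pct, delta)."""
    out, prev = [], None
    for month, crawls, refs in monthly:
        pct = refs / crawls * 100 if crawls else 0.0
        delta = None if prev is None else pct - prev
        out.append((month, pct, delta))
        prev = pct
    return out

data = [("2026-01", 13255, 1978), ("2026-02", 2284, 127), ("2026-03", 5014, 9)]
for month, pct, delta in ratio_trend(data):
    flag = " (collapsing)" if delta is not None and delta < -5 else ""
    print(f"{month}: {pct:.2f}%{flag}")
```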
We monitor this ratio for customers via Cloudflare logs or a 5-minute edge worker install. If you want the same view of your own site, run a free Score on any URL, or set up the worker to start collecting your own crawl-to-click data.