TL;DR — Each AI platform sends different volumes and quality of traffic. Open Analytics > Referrer Analytics to see the per-platform breakdown of sessions, bounce rate, and pages/session, then filter Analytics > Page Performance by provider to discover which URLs each platform’s users land on. Pro tip: connect GA4 before drawing conclusions — ChatGPT traffic is undercounted by 3-5x without it, which can completely distort your platform comparison.
The Question
“How much traffic am I getting from each AI platform?”

Not all AI platforms send traffic the same way. Perplexity surfaces clickable citations prominently in every answer. ChatGPT now includes links in browse mode but historically drove dark traffic. Claude tends to cite sources less aggressively. Understanding which platform sends the most visitors — and what those visitors do when they arrive — tells you where to focus your content distribution and source-building efforts. This breakdown is also essential for demonstrating which AI channels are maturing fastest as acquisition sources. You might also be wondering:
- “Is Perplexity sending more traffic than ChatGPT to my site?”
- “How engaged are visitors from each AI platform — do they bounce immediately?”
- “Which AI platforms cite my content most frequently versus just mentioning my brand?”
Where to Go in Qwairy
Start here: Analytics > Referrer Analytics
Navigate to Analytics > Referrer Analytics — this is your primary view.
The AI Referrer breakdown table is the core of this page: it lists each platform (ChatGPT, Perplexity, Claude, Copilot, Gemini, and others detected) with sessions, bounce rate, and pages per session for the selected period.
The platform share chart at the top visualizes the distribution at a glance — which platforms dominate your AI traffic mix and which are marginal contributors.
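Platform attribution of this kind generally works by mapping the HTTP referrer hostname to a known AI platform. A minimal sketch of the idea — the hostname table here is illustrative, not Qwairy's actual detection list:

```python
from urllib.parse import urlparse

# Illustrative hostname -> platform map (NOT Qwairy's actual detection table).
AI_REFERRER_HOSTS = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "claude.ai": "Claude",
    "copilot.microsoft.com": "Copilot",
    "gemini.google.com": "Gemini",
}

def classify_referrer(referrer_url: str):
    """Return the AI platform for a referrer URL, or None for non-AI traffic."""
    host = (urlparse(referrer_url).hostname or "").removeprefix("www.")
    return AI_REFERRER_HOSTS.get(host)
```

Note that hostname matching alone misses in-app traffic that sends no referrer at all — which is exactly why the GA4 integration below matters for ChatGPT.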
Go deeper: Analytics > Page Performance
Cross-reference with Analytics > Page Performance filtered by a specific provider.
This tells you which pages are receiving traffic from each AI platform — and whether different platforms drive visitors to different parts of your site. ChatGPT might drive traffic primarily to your blog while Perplexity drives it to your product pages, which has direct implications for page-level optimization.
Complete the picture: GA4 Integration
Connect your GA4 property under workspace settings for:
- Complete ChatGPT session data (ChatGPT apps do not pass HTTP referrers; GA4 captures these via session parameters)
- Conversion event attribution by AI platform — which platform sends visitors who actually sign up or purchase
- Audience segmentation — build a GA4 audience segment per AI platform to track LTV and downstream behavior
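To see why this changes the platform comparison, here is a toy sketch of recomputing shares once GA4 fills in ChatGPT app sessions that carry no HTTP referrer. All numbers are made up for illustration:

```python
# Referrer-only counts undercount ChatGPT (app traffic sends no referrer).
referrer_only = {"Perplexity": 900, "ChatGPT": 80, "Claude": 60}
ga4_chatgpt_sessions = 320  # sessions GA4 attributes to ChatGPT (illustrative)

corrected = dict(referrer_only)
corrected["ChatGPT"] = max(corrected["ChatGPT"], ga4_chatgpt_sessions)

total = sum(corrected.values())
shares = {p: round(100 * s / total, 1) for p, s in corrected.items()}
```

With these toy numbers, ChatGPT jumps from a rounding error to a quarter of AI traffic — the kind of shift that flips a budget decision.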
What to Look For
AI Platform Breakdown Table — Analytics > Referrer Analytics
The breakdown table is the core of this view. Each row represents a distinct AI platform detected as the referring source for at least one session during the selected period.

| Element | What it tells you |
|---|---|
| Sessions | Raw visit volume originating from that platform in the selected period |
| % of AI traffic | That platform’s share of your total AI-driven sessions — useful for spotting concentration risk |
| Bounce rate | Percentage of single-page sessions from that platform — a proxy for landing page relevance |
| Pages / session | Average depth of engagement — higher values suggest visitors explore beyond the entry page |
| Session trend | Week-over-week or month-over-month direction — identifies which platforms are growing as traffic sources |
| Avg. session duration | Available with GA4 connected — time on site from each AI platform |
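These table metrics roll up from raw session records in a straightforward way. A hedged sketch — the record field names are assumptions for illustration, not Qwairy's data model:

```python
from collections import defaultdict

# Toy session records; field names are assumptions, not Qwairy's schema.
sessions = [
    {"platform": "Perplexity", "pages_viewed": 3},
    {"platform": "Perplexity", "pages_viewed": 1},
    {"platform": "ChatGPT", "pages_viewed": 2},
]

stats = defaultdict(lambda: {"sessions": 0, "bounces": 0, "pages": 0})
for s in sessions:
    row = stats[s["platform"]]
    row["sessions"] += 1
    row["bounces"] += s["pages_viewed"] == 1  # single-page session = bounce
    row["pages"] += s["pages_viewed"]

total = sum(r["sessions"] for r in stats.values())
report = {
    p: {
        "sessions": r["sessions"],
        "share_pct": round(100 * r["sessions"] / total, 1),
        "bounce_rate_pct": round(100 * r["bounces"] / r["sessions"], 1),
        "pages_per_session": round(r["pages"] / r["sessions"], 2),
    }
    for p, r in stats.items()
}
```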
Page Performance — By-Provider Filter
Filtering Page Performance to a single AI provider reveals which URLs that platform’s users land on most. This matters because different AI platforms have different citation patterns: Perplexity tends to cite specific deep-content pages, while Copilot often links to homepages and product pages.

| Element | What it tells you |
|---|---|
| Top landing pages per provider | Content that earns citations with links on that specific platform |
| AI sessions vs cited sessions gap | Pages where citation count and traffic diverge — signals unlinked mentions |
Pro Tip: If a specific provider drives significantly higher bounce rates than others, the issue is usually a content format mismatch. AI platforms that cite long-form guides tend to send visitors who expect in-depth content — landing them on a short product page produces high bounce. Fix the landing page, not the citation strategy.
Filters That Help
| Filter | How to use it for this question |
|---|---|
| Provider | Isolate a single AI platform across both Referrer Analytics and Page Performance for a focused deep-dive |
| Period | Compare 30d vs 90d — some platforms spike during product launches or news cycles, others show steady growth |
| Topic / Tag | Cross-reference which topic clusters drive traffic from each provider — useful for content investment decisions |
How to Interpret the Results
Good result
Perplexity and one other platform together account for at least 60% of your AI traffic, with bounce rates below 65% and pages/session above 1.8. Traffic from at least two platforms is growing week-over-week. GA4 data confirms that at least one AI platform is producing conversions at a cost-per-acquisition comparable to or better than paid search.

Needs attention
One platform dominates your AI traffic completely (above 85% share) while others are near zero — this is concentration risk. If that platform changes its citation behavior or product, your AI traffic collapses. Additionally, if all platforms show bounce rates above 80%, the problem is not citation frequency but landing page quality: visitors are arriving, finding content that doesn’t match their intent, and leaving.

Example
Scenario: You run an online education platform offering professional certification courses. Enrollment season is approaching and you want to understand which AI platforms are sending prospective students to your site, so you can allocate your content marketing budget across the right channels.
- Open Analytics > Referrer Analytics and set the period to the last 60 days. The table shows: Perplexity (1,120 sessions, 49% bounce, 2.4 pages/session), ChatGPT (310 sessions — GA4 connected, capturing app traffic), Copilot (185 sessions, 72% bounce, 1.3 pages/session), Claude (64 sessions, 41% bounce, 3.1 pages/session).
- Filter Page Performance to Provider: Perplexity. The top landing pages are /blog/best-data-analytics-certifications, /courses/project-management-professional, and /blog/career-switch-to-ux-design. Filter to Provider: ChatGPT. The top landing pages are /courses/ (the course catalog) and /pricing — visitors arriving with direct enrollment intent.
- Conclude: Perplexity drives discovery traffic through long-form comparison guides, sending learners who browse multiple course pages. ChatGPT sends navigational traffic from students who already know the platform name. Content investment for Perplexity should focus on detailed “best certifications for X” guides; ChatGPT presence depends more on being recognized as a recommendation in career-advice prompts.
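The "needs attention" thresholds from the interpretation guidance (above 85% share on one platform, all bounce rates above 80%) can be checked mechanically. A sketch using those cutoffs, with made-up inputs:

```python
def diagnose(platform_sessions, bounce_rates):
    """Flag concentration risk (one platform > 85% of AI sessions) and
    landing-page problems (every platform bouncing above 80%)."""
    total = sum(platform_sessions.values())
    top_share = max(platform_sessions.values()) / total
    flags = []
    if top_share > 0.85:
        flags.append("concentration_risk")
    if bounce_rates and all(b > 80 for b in bounce_rates.values()):
        flags.append("landing_page_quality")
    return flags

# Illustrative numbers: one platform holds ~94% of AI sessions.
flags = diagnose(
    {"Perplexity": 1500, "ChatGPT": 60, "Claude": 40},
    {"Perplexity": 49, "ChatGPT": 72, "Claude": 41},
)
```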
Go Further
Per-platform traffic breakdown
Build a per-platform AI traffic breakdown dashboard in Looker Studio using the performance-overview data source
How AI referrers are detected
Read the Referrer Analytics documentation to understand how Qwairy identifies traffic from each AI platform
Query by platform via API
Use the performance endpoint to filter and query AI traffic data by specific platform programmatically
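For orientation only, a request-construction sketch — the base URL, endpoint path, and parameter names below are placeholders, not Qwairy's documented API; check the API reference for the real shape:

```python
from urllib.parse import urlencode

# Placeholder host and path — NOT Qwairy's actual API. Consult the API docs.
BASE = "https://api.qwairy.example/v1/performance"

def build_query(provider: str, period_days: int = 30) -> str:
    """Build a filtered performance query URL (hypothetical parameters)."""
    params = {"provider": provider, "period": f"{period_days}d"}
    return f"{BASE}?{urlencode(params)}"

url = build_query("perplexity", 90)
```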
Related Questions
Is AI visibility actually driving traffic to my website?
Connect your overall AI mention rate to measurable session volume
How visible is my brand in AI-powered search engines?
Understand your Brand Mention Visibility before drilling into traffic impact
Which AI providers mention my brand the most?
Compare mention frequency by provider — the supply side of AI traffic

