TL;DR — Set a custom date range on Overview > Performance spanning before and after your campaign to spot a step change in the evolution chart. Go to Compare > By Topic to verify the lift is concentrated on campaign-relevant topics, not seasonal noise. Check Analytics > Referrer Analytics for the same window to see if AI-referred traffic spiked alongside visibility. Pro tip: export the Compare evolution data showing your brand separating from the competitor pack during the campaign window — it is your most compelling proof-of-impact slide.
The Question
“Did my latest marketing campaign improve my AI visibility?”

Campaigns drive content, press, backlinks, and social proof — all inputs that AI models weigh when deciding what to recommend. But the lag between a campaign going live and AI models updating their outputs can range from days to weeks depending on the provider. This page shows you how to isolate a campaign window in Qwairy and read the before/after signal with precision. You might also be wondering:
- “How long does it take for AI models to reflect my campaign’s content?”
- “Did my competitors also gain visibility during my campaign period, or did I pull ahead?”
- “Did sentiment improve alongside visibility, or just raw mention count?”
Where to Go in Qwairy
Start here: Overview > Performance
Navigate to Overview > Performance — your primary before/after view.
Set the Period filter to a custom range: start at least one full period before the campaign launch (e.g., 30 days pre-launch) and end today. The evolution chart will show the trend lines for Brand Mention Visibility, Source Citation Visibility, and Average Sentiment; look for an inflection point at the campaign launch date.
Focus on the evolution sparkline per metric card — a genuine campaign lift shows a visible step change in the curve, not just noise fluctuation.
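The "step change vs. noise" judgment can also be made numerically once you export the evolution series. A minimal sketch, assuming the daily visibility values are available as a plain list — the column layout of Qwairy's export is not shown here, so the data shape and the 3-point default threshold are assumptions:

```python
from statistics import mean, stdev

def detect_step_change(series, launch_idx, min_lift_pp=3.0):
    """Return (is_step, lift): True when the post-launch mean exceeds
    the pre-launch mean by at least min_lift_pp percentage points
    AND by at least twice the pre-launch noise (sample std dev)."""
    before, after = series[:launch_idx], series[launch_idx:]
    lift = mean(after) - mean(before)
    noise = stdev(before) if len(before) > 1 else 0.0
    return lift >= max(min_lift_pp, 2 * noise), lift

# Daily Brand Mention Visibility (%); campaign launched at index 5
visibility = [18, 17, 18, 19, 18, 22, 26, 27, 27, 28]
is_step, lift = detect_step_change(visibility, launch_idx=5)
# is_step -> True, lift -> 8.0 percentage points
```

Requiring the lift to beat both a fixed floor and a multiple of pre-launch volatility is what separates a genuine step from ordinary day-to-day fluctuation.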
Go deeper: Overview > Compare
Cross-reference with Overview > Compare (Competitor Compare).
Use the By Provider and By Topic breakdowns to verify whether your visibility gain was broad-based or concentrated. A campaign about a specific product feature should show a lift specifically on topic tags related to that feature — not a uniform lift across all topics, which would suggest the change is seasonal rather than campaign-driven.
Use the Evolution tab in Compare to overlay your brand’s trend against competitors: if competitors also moved up during the same window, external factors (seasonality, industry news) are likely driving the change. If only you moved, your campaign earned it.
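That attribution logic — did only you move, or did everyone? — can be written down explicitly. A hypothetical helper using the percentage-point deltas you read off the Compare Evolution tab; the 2-point margin is an illustrative threshold, not a Qwairy default:

```python
def attribute_lift(your_delta_pp, competitor_deltas_pp, margin_pp=2.0):
    """Classify a visibility change: 'campaign-driven' when your brand
    moved while competitors stayed roughly flat, 'market-wide' when
    the whole category moved together."""
    comp_avg = sum(competitor_deltas_pp) / len(competitor_deltas_pp)
    if your_delta_pp - comp_avg >= margin_pp:
        return "campaign-driven"
    if your_delta_pp >= margin_pp and comp_avg >= margin_pp:
        return "market-wide"
    return "no clear lift"

attribute_lift(9.0, [0.5, -1.0])  # -> "campaign-driven"
attribute_lift(6.0, [5.0, 7.0])   # -> "market-wide"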
Complete the picture: GA4 Traffic Correlation
Connect Google Analytics 4 referrer data by opening Analytics > Referrer Analytics for the same campaign window.
Look for a concurrent spike in AI-referred traffic (sessions from chatgpt.com, perplexity.ai, claude.ai, gemini.google.com) that coincides with the visibility lift in Qwairy. A visibility gain without a traffic signal means AI models are mentioning you but users are not clicking through — valuable brand-building, but a different type of win. A visibility gain with a matching traffic spike confirms the campaign is driving bottom-of-funnel AI impact.
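If you also pull the raw referrer rows out of GA4 (for example via an Explorations export), the AI-referred subset and its spike can be computed directly. A sketch — the `(referrer_host, sessions)` row shape is an assumption about your export, not a GA4 or Qwairy schema:

```python
# Referrer hosts named above as AI platforms
AI_REFERRERS = {"chatgpt.com", "perplexity.ai", "claude.ai", "gemini.google.com"}

def ai_sessions(rows):
    """Sum sessions for rows whose referrer host is a known AI platform."""
    return sum(sessions for host, sessions in rows if host in AI_REFERRERS)

before = [("google.com", 900), ("perplexity.ai", 40), ("chatgpt.com", 25)]
after  = [("google.com", 950), ("perplexity.ai", 152), ("chatgpt.com", 36)]
spike_pct = (ai_sessions(after) - ai_sessions(before)) / ai_sessions(before) * 100
# AI-referred sessions: 65 -> 188, roughly a 189% increase
```

Comparing the AI-referred subset in isolation keeps a large but flat organic-search baseline from masking the campaign signal.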
What to Look For
Evolution Chart — Performance Dashboard
The evolution chart is your primary instrument for reading campaign impact. It plots each tracked metric as a time series across the selected period. Set a custom date range that spans at least two weeks before the campaign launch and the full duration after, so the baseline is clearly established before the step change.

| Element | What it tells you |
|---|---|
| Brand Mention Visibility trend | Whether AI models are citing your brand more often after the campaign |
| Source Citation Visibility trend | Whether new content or press from the campaign earned incoming AI citations |
| Average Sentiment trend | Whether campaign messaging improved how AI describes your brand |
| Provider-level sparklines | Which AI platforms responded fastest to your campaign content |
Competitor Compare — Evolution Tab
The Compare evolution view adds competitive context. Without it, you cannot distinguish a true campaign win from a rising tide that lifted all boats in your category.

Pro Tip: Export the Compare evolution data via Workspace > Exports and paste it into a slide. A chart showing your brand’s curve separating from the competitor pack during the exact campaign window is your most compelling proof-of-impact asset for a marketing review.
Filters That Help
| Filter | How to use it for this question |
|---|---|
| Period (custom range) | Isolate exactly the campaign window — 4 weeks pre-launch through 4 weeks post-launch is a reliable baseline-to-impact window |
| Provider | Check which AI platforms responded to the campaign — campaign PR and content often move Perplexity and Google AI first, then ChatGPT with a lag |
| Topic / Tag | Verify that visibility lifted on the topics your campaign targeted, not just overall — confirms the right signal is moving |
How to Interpret the Results
Good result
Brand Mention Visibility increased by 5 percentage points or more within three weeks of campaign launch, with the lift sustained (not a single spike that reverted). Source Citation Visibility also moved — at least a 2-point increase — indicating that new content or press coverage from the campaign is being indexed by AI models as a trusted source. Sentiment remained stable or improved. The Compare evolution tab shows your brand outpacing competitors over the same window.

Needs attention

Brand Mention Visibility spiked briefly (one data point) and then returned to baseline, suggesting a temporary crawl rather than lasting incorporation into AI training signals. Or: visibility lifted on providers that send no measurable referrer traffic, so the brand uplift has no revenue correlation. Or: competitors moved by the same magnitude in the same window, indicating seasonal lift rather than campaign impact.

Example
Scenario: A D2C skincare brand launches a new anti-aging serum in October with a 3-week product launch campaign — publishing clinical trial results on their blog, earning 15 beauty editor reviews, and running influencer partnerships across Instagram and TikTok.
- Open Overview > Performance and set the period to September 1 – November 15. The evolution chart shows Brand Mention Visibility at a stable 18% through September, then a visible step to 27% starting in week 2 of October — aligning with the beauty editor reviews going live and being crawled by AI models.
- Navigate to Overview > Compare > By Topic and filter to the topic tag “anti-aging skincare.” Your visibility on that tag rose from 12% to 33%, while the two main competitors stayed flat at 26% and 29% respectively. This confirms the lift is campaign-specific and tied to the product launch, not a seasonal trend.
- Open Analytics > Referrer Analytics for the same window and filter to AI referrers. Sessions from Perplexity increased 280% month-over-month — beauty publications that reviewed the serum are being cited as sources. ChatGPT referrers show a 45% increase with a 2-week lag, consistent with its slower crawl cycle for new product content.
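The interpretation rubric above ("Good result" vs. "Needs attention") condenses into a checklist function you can run during a post-campaign review. The thresholds are taken from the rubric; the function itself is an illustrative sketch, not a Qwairy feature:

```python
def classify_campaign_impact(brand_lift_pp, citation_lift_pp,
                             sustained, competitor_lift_pp):
    """Apply the rubric: a >=5pp sustained brand lift plus a >=2pp
    citation lift that competitors did not match reads as a win."""
    if competitor_lift_pp >= brand_lift_pp:
        return "needs attention: market-wide movement"
    if not sustained:
        return "needs attention: temporary spike"
    if brand_lift_pp >= 5 and citation_lift_pp >= 2:
        return "good: campaign-driven lift"
    return "inconclusive"

# e.g. a +9pp sustained brand lift with competitors flat
classify_campaign_impact(9, 2, True, 0)  # -> "good: campaign-driven lift"
```

Checking the competitor condition first mirrors the rubric's logic: even a large, sustained lift proves nothing if the whole category moved by the same amount.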
Go Further
Export before/after campaign metrics
Export pre-campaign and post-campaign visibility metrics as XLSX for campaign impact analysis
Campaign impact dashboard
Build a campaign impact dashboard in Looker Studio using the performance-overview data source with date comparison
Set up monitoring for campaigns
Read the Monitoring documentation to configure automated monitoring that captures AI visibility changes during campaigns
Related Questions
How has my AI visibility changed over time?
Understand long-term visibility trends and what caused each inflection point
How quickly do AI models pick up new content I publish?
Measure the lag between content publication and AI model incorporation
What is my share of voice vs competitors?
Put your campaign gains in competitive context
How do I set up automated GEO reporting?
Automate campaign impact reporting for your team

