
Most SEO teams are making decisions based on numbers that are quietly, consistently wrong. Not dramatically wrong. Just wrong enough that the trend lines look fine, the reports go out, and nobody notices until a client asks why conversions dropped 18% while "traffic is up."
This playbook covers the full organic traffic check process: verifying your own site's numbers with GA4, Google Search Console, server logs, and BigQuery; estimating competitor traffic with a CTR model updated for the AI Overview era; diagnosing drops with an agency-grade forensic checklist; and translating all of it into revenue figures that hold up in a boardroom.
By the end, you'll have a repeatable audit framework you can run quarterly, not just a list of tools someone copy-pasted from a 2021 blog post.
What an Organic Traffic Check Actually Is
An organic traffic check is two things at once: a point-in-time snapshot (how many unpaid search visits did this site receive?) and a diagnostic process (is that number trustworthy, and what's actually driving it?). Most people do the first part. Almost nobody does the second.
That's the gap this guide fills.
There are three scenarios that typically trigger an organic traffic check. The first is routine reporting — you need a number for a weekly dashboard, a client deck, or a quarterly review. The second is drop diagnosis — something changed, sessions are down, and you need to figure out whether it's a real traffic loss or a measurement artifact. The third is competitive intelligence — you want to estimate how much organic traffic a competitor is getting, with no access to their analytics.
Each scenario requires a different emphasis, but the same underlying methodology. Owner-first diagnostics using GA4 Traffic Acquisition reports, Google Search Console Performance data, and server logs or BigQuery exports give you ground truth for your own site. Competitor estimation uses keyword-rank-CTR models from third-party tools, adjusted for the zero-click reality of 2024–2026 SERPs. Reconciliation ties the two together and surfaces discrepancies. Revenue translation converts sessions into a number that non-SEOs actually care about.
Run this audit once and you'll understand your traffic better than 90% of the industry.
Run it quarterly and you'll never be blindsided by a drop again.
Checking Your Own Site's Organic Traffic

First-party data is always the starting point. GA4 and Google Search Console together give you more signal than any third-party tool, but they measure different things in different ways, and treating them as interchangeable is where most teams go wrong.
GA4 + GSC Side by Side
In GA4, go to Reports → Acquisition → Traffic Acquisition. Set your primary dimension to "Session default channel group" and filter for "Organic Search." That's your baseline organic session count. Simple enough. The trap is the "Direct" channel bucket, which absorbs a significant share of dark traffic: sessions where GA4 can't determine the source because the referrer was stripped (think HTTPS-to-HTTP transitions, mobile apps, email clients, Slack previews). Depending on your site, Direct can be inflated by 10–20% while quietly stealing credit from organic.
If your Direct share is above 15–20%, investigate before trusting your Organic number.
In Google Search Console, open the Performance report, set Search type to "Web," and look at clicks and impressions over the same date range. GSC counts a click as one user clicking one URL for one query on one day. GA4 counts sessions, which can span multiple page views and reset at midnight or after 30 minutes of inactivity. These will never match.
That's not a bug. That's two different measurement units.
Here's the reconciliation formula that actually explains the gap:
GSC Clicks × (1 − bounce-before-ping rate) × (1 − bot share) ≈ GA4 Organic Sessions
Let's walk through a real example. Say GSC reports 10,000 clicks over 30 days.
- Bounce-before-ping rate (users who land and leave before GA4's tracking fires, typically 8–12%): subtract 10% → 9,000 remaining
- Bot share (crawlers and scrapers that trigger server-side requests but aren't filtered by GA4's client-side tag, typically 5–8%): subtract 7% → 8,370 remaining
- Cross-device and session-reset inflation (one user, multiple sessions): add roughly 1–3% back → ~8,450–8,620 expected GA4 organic sessions
A gap of 15–20% between GSC clicks and GA4 organic sessions is normal. A gap above 30% is a red flag worth investigating. Below 10% and you should double-check your bot filtering, because something is probably overcounting on the GA4 side.
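The walk-through above condenses into a small sanity-check helper. Here's a minimal Python sketch, using the example's mid-range rates as illustrative defaults; measure your own site's rates before relying on the output:

```python
def reconcile_gsc_to_ga4(gsc_clicks: int,
                         bounce_before_ping: float = 0.10,  # leave before GA4 fires
                         bot_share: float = 0.07,           # unfiltered crawlers
                         session_inflation: float = 0.02):  # multi-session users
    """Estimate expected GA4 organic sessions from GSC clicks.

    Returns (expected_sessions, gap_percent_vs_clicks).
    """
    est = gsc_clicks * (1 - bounce_before_ping)  # drop bounce-before-ping clicks
    est *= (1 - bot_share)                       # drop bot-inflated clicks
    est *= (1 + session_inflation)               # add session resets back
    gap_pct = (gsc_clicks - est) / gsc_clicks * 100
    return round(est), round(gap_pct, 1)

# 10,000 GSC clicks from the example: expect roughly 8,500 GA4 sessions,
# a gap inside the normal 15–20% band.
expected, gap = reconcile_gsc_to_ga4(10_000)
```

A gap outside the normal band is the signal to dig into bot filtering or tag health, not a reason to pick one number over the other.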
Server Logs and BigQuery
Client-side analytics miss things. Server logs don't. If you have GA4 data exported to BigQuery (set this up if you haven't; the export itself is free, and BigQuery's free tier covers 10 GB of storage and 1 TB of queries per month), you can cross-reference GA4 events with server-side HTTP logs to surface phantom sessions: requests where GA4 fired but the user-agent in the server log is a known crawler.
Here's a BigQuery SQL query that does exactly that. Table and column names are placeholders for your own export and log schema, and the log timestamp is assumed to be Apache-style (e.g. 10/Oct/2024:13:55:36):

-- Flag GA4 session_start events whose matching server-log request
-- (same client ID, within a one-minute window) came from a known crawler.
SELECT
  g.user_pseudo_id,
  g.event_timestamp,
  h.request_path,
  h.user_agent
FROM
  `your_project.analytics_XXXXXXX.events_*` AS g
JOIN
  `your_project.http_logs.requests` AS h
ON
  g.user_pseudo_id = h.client_id
  AND TIMESTAMP_MICROS(g.event_timestamp) BETWEEN
    TIMESTAMP_SUB(PARSE_TIMESTAMP('%d/%b/%Y:%H:%M:%S', h.log_time), INTERVAL 1 MINUTE)
    AND TIMESTAMP_ADD(PARSE_TIMESTAMP('%d/%b/%Y:%H:%M:%S', h.log_time), INTERVAL 1 MINUTE)
WHERE
  REGEXP_CONTAINS(LOWER(h.user_agent), r'(googlebot|bingbot|ahrefsbot|semrushbot|dotbot)')
  AND g.event_name = 'session_start'
Any rows returned here are sessions that GA4 counted as real users but the server identified as bots. These inflate your organic session count and, if they're concentrated on specific pages, can distort your conversion rate calculations downstream.
Filtering Bot and Crawler Noise
GA4 has built-in bot filtering, but it's not exhaustive. For server-side filtering, flag any IP address generating more than 200 requests per day, any session with a duration of 0 milliseconds, and any user-agent matching known crawler signatures.
For GA4's internal filters, use this regex in your data stream's internal traffic rules or in a custom filter:
(?i)(bot|crawl|spider|slurp|mediapartners|adsbot|facebookexternalhit|twitterbot)
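Applied server-side, the same signature list can pre-filter log rows before they reach your reconciliation math. A minimal Python sketch (the user-agent strings are illustrative):

```python
import re

# Same case-insensitive signature list as the GA4 internal-traffic rule above.
BOT_UA = re.compile(
    r"(bot|crawl|spider|slurp|mediapartners|adsbot|"
    r"facebookexternalhit|twitterbot)",
    re.IGNORECASE,
)

def is_bot(user_agent: str) -> bool:
    """True if the user-agent matches a known crawler signature."""
    return bool(BOT_UA.search(user_agent))

is_bot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")  # True
is_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36")              # False
```

Combine this with the request-rate and zero-duration flags above; a signature list alone misses crawlers that spoof browser user-agents.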
One more thing that gets missed constantly: UTM collision. A paid campaign tagged with utm_medium=organic (yes, this happens, more often than anyone admits) silently reclassifies paid sessions as organic. Run this BigQuery query to surface the problem:
-- Surfaces sessions in the organic bucket that carry a campaign name,
-- i.e. UTM-tagged traffic misclassified as organic.
SELECT
  traffic_source.medium,
  traffic_source.source,
  traffic_source.name,
  COUNT(*) AS session_count
FROM
  `your_project.analytics_XXXXXXX.events_*`
WHERE
  event_name = 'session_start'
  AND traffic_source.medium = 'organic'
  AND traffic_source.name IS NOT NULL
  AND traffic_source.name != '(not set)'
GROUP BY 1, 2, 3
ORDER BY 4 DESC
If you see campaign names in the results, those are UTM-tagged sessions misclassified as organic. Fix the tags at the source, then reprocess historical data if BigQuery exports allow it.
This single issue explains a surprising number of "organic traffic spikes" that disappear when you look harder.
Estimating Competitor Organic Traffic

You don't have access to a competitor's GA4 or GSC. What you have is keyword ranking data, estimated search volumes, and CTR models built from clickstream panels. That's the foundation every third-party tool uses, and understanding the mechanics tells you exactly when to trust the numbers and when to treat them as rough directional signals.
Third-Party Tools and Their Limits
Semrush, Ahrefs, and Similarweb all use some variation of the same approach: estimate which keywords a domain ranks for, apply a position-based CTR curve to those rankings, multiply by search volume, and sum it up. The differences lie in their keyword databases, their panel sizes, and how frequently they update their CTR curves.
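The mechanics are easy to sketch. The curve values below are illustrative, not any vendor's actual numbers; real tools maintain per-intent and per-device curves:

```python
# Illustrative position-to-CTR curve (pre-AI-Overview shape, made up for the sketch).
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.018}

def estimate_monthly_traffic(rankings):
    """rankings: (monthly_search_volume, position) pairs for one domain."""
    return sum(volume * CTR_BY_POSITION.get(position, 0.01)
               for volume, position in rankings)

# Three hypothetical keywords: 10,000 searches at #1, 5,000 at #3, 2,000 at #8
estimate_monthly_traffic([(10_000, 1), (5_000, 3), (2_000, 8)])  # ≈ 3,350 visits
```

Every downstream number inherits the error in all three inputs: keyword coverage, position accuracy, and the curve itself. That compounding is why the error rates get so large.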
Accuracy degrades sharply for sites under roughly 50,000 monthly visits. Below that threshold, the keyword database coverage is sparse, ranking position estimates are less reliable, and the margin of error balloons. For non-English markets, especially Southeast Asia, Eastern Europe, and LATAM, the panels are thinner and the estimates are even less reliable. A 2024 accuracy study comparing third-party tool estimates to verified GSC data found median error rates of 32% for mid-size sites and over 50% for sites under 20,000 monthly visits.
Semrush tends to outperform Ahrefs for large sites with diverse, broad keyword sets, partly because of its larger keyword database. Ahrefs is generally stronger for backlink-heavy analyses and tends to be more conservative in its traffic estimates, which can actually be useful when you want a floor rather than a ceiling.
Similarweb uses a different methodology, blending clickstream panel data with ISP data and web crawls. It's often better for estimating total site traffic across all channels but less precise for organic specifically. Use it as a sanity check, not a primary source.
Adjusting for Zero-Click SERPs
Here's the problem nobody's talking about loudly enough. The CTR curves these tools use were largely calibrated on pre-2023 SERP data, when position 1 might earn around 28% CTR for a competitive informational query.
That world is gone.
With AI Overviews appearing at the top of the SERP for a growing share of informational queries, position-1 CTR for those query types has dropped 15–30% in studies published between 2024 and 2026. Users get an answer from the AI Overview and never click. Third-party tools haven't fully corrected their models for this yet, which means their estimates are systematically overstated for informational-heavy keyword profiles.
Here's an adjusted CTR model you can apply manually:
Adjusted Traffic = Tool Estimate × (1 − SERP Feature Discount)
Where: SERP Feature Discount = (% of target keywords triggering AI Overview or Featured Snippet) × 0.25
Walk through an example. A competitor has 500 ranking keywords. You run a SERP feature audit (Semrush's SERP Features filter or a manual sample) and find that 40% of those keywords trigger an AI Overview or Featured Snippet. The discount is 40% × 0.25 = 10%. If the tool estimates 80,000 monthly visits, your adjusted estimate is 80,000 × (1 − 0.10) = 72,000.
That 10% might sound small, but for a site with heavy informational content, AI Overview coverage can reach 60–70% of the keyword set, pushing the discount to 15–17.5%. On a 500,000-visit estimate, that's 75,000–87,500 sessions you'd be over-attributing to a competitor.
The triangulation method: pull estimates from two tools (say Semrush and Ahrefs), apply the zero-click discount to both, and report the range. If Semrush says 80,000 adjusted and Ahrefs says 65,000 adjusted, your estimate is "65,000–80,000 monthly organic visits." That's a defensible range. A single number from a single tool, unadjusted, is not.
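The adjustment and triangulation steps are a one-liner each. A sketch using the worked example's numbers (the second tool's raw figure is hypothetical):

```python
def adjusted_estimate(tool_estimate: float,
                      serp_feature_share: float,
                      click_loss_factor: float = 0.25) -> float:
    """Apply the zero-click discount: estimate × (1 − share × 0.25)."""
    return tool_estimate * (1 - serp_feature_share * click_loss_factor)

# Worked example: 80,000 tool estimate, 40% AI Overview / snippet coverage
adjusted_estimate(80_000, 0.40)  # ≈ 72,000

# Triangulation: adjust both tools' raw estimates, then report the range.
# 72,000 is a hypothetical second-tool raw figure for the same domain.
estimates = [adjusted_estimate(80_000, 0.40), adjusted_estimate(72_000, 0.40)]
low, high = min(estimates), max(estimates)  # ≈ 64,800 and 72,000
```

Reporting `low–high` instead of either endpoint is what makes the estimate defensible.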
Always present competitor estimates with explicit ±30–50% error bars in stakeholder decks. It makes you look more credible, not less.
Traffic Forensics: Diagnosing Drops
A sudden organic traffic drop is one of the most stressful things an SEO team deals with. It's also one of the most frequently misdiagnosed. Before you start panicking about algorithm updates, you need to establish whether the drop is real or a measurement artifact. The checklist below is the sequence that technical SEOs at agencies run, in order, every time.
- Pin the exact drop date in GA4. Use the date comparison feature to find the first day the decline appears. A sharp single-day cliff suggests a technical event or algorithm update. A gradual slope over weeks suggests content erosion or competitive displacement.
- Cross-check GSC clicks on the same date range. This is the most important diagnostic step. If GSC also drops, the traffic loss is real. If GA4 drops but GSC is flat, you have a tracking problem, not a traffic problem. Don't spend three days auditing your content when the issue is a broken GA4 tag.
- Check the Google Search Status Dashboard and MozCast. Correlate your drop date with confirmed algorithm updates. If MozCast shows a temperature spike of 90°+ on or near your drop date, an update is the likely cause.
- Run the Index Coverage report in GSC. Look for a spike in "Excluded" pages, particularly in the "Crawled, currently not indexed" or "Discovered, currently not indexed" categories. A sudden exclusion spike points to a crawl or indexing issue, not an algorithm penalty.
- Audit SERP feature capture for your top 10 queries. Did a competitor steal a Featured Snippet? Did an AI Overview appear for a query that was previously a clean blue-link result? SERP feature displacement is a real traffic loss that won't show up as a ranking drop because your position may be unchanged.
- Check server response codes via log analysis. A spike in 5xx errors during the drop window means your server was returning errors to Googlebot, which suppresses crawling and can cause temporary ranking losses even after the server issue is resolved.
- Run the UTM audit query from Section 2. Rule out misattribution before concluding the drop is organic. A new paid campaign launched the same week with a broken UTM tag is a common culprit that looks exactly like an organic drop in GA4.
The three drop archetypes map to this decision tree:
- Algorithm update: GSC clicks drop, impressions hold or rise, average position worsens across a broad keyword set.
- Technical issue: GSC impressions collapse (not just clicks), server logs show crawl errors or 5xx spikes, index coverage shows new exclusions.
- Measurement error: GA4 drops but GSC is flat. UTM audit reveals a new campaign tag collision. The traffic didn't go anywhere; the attribution broke.
If GSC and GA4 both drop, the traffic loss is real. If only GA4 drops, it's a tracking problem. If only impressions drop, it's an indexing or crawl issue.
Memorize that. It will save you hours of misguided investigation.
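The decision tree above reduces to three boolean checks. A minimal sketch; in practice each flag comes from the corresponding checklist step, and "dropped" means a decline beyond normal weekly variance:

```python
def classify_drop(ga4_dropped: bool,
                  gsc_clicks_dropped: bool,
                  gsc_impressions_dropped: bool) -> str:
    """Map the three signals to the most likely drop archetype."""
    if ga4_dropped and not gsc_clicks_dropped:
        return "measurement error"   # attribution broke; traffic didn't move
    if gsc_impressions_dropped:
        return "technical issue"     # crawl/indexing: visibility collapsed
    if gsc_clicks_dropped:
        return "algorithm update or SERP displacement"  # clicks down, impressions hold
    return "no confirmed drop"

classify_drop(True, False, False)  # → "measurement error"
classify_drop(True, True, True)    # → "technical issue"
```

The ordering matters: rule out measurement artifacts first, then indexing, and only then reach for algorithm explanations.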
One more thing worth mentioning: thin or duplicate pages that pass unnoticed during stable periods become disproportionately vulnerable during broad core updates. A content audit that identifies these pages before an update hits is worth far more than a post-mortem after the fact.
Converting Traffic Into Revenue

Traffic numbers are interesting. Revenue numbers get budget approved. If you're presenting organic performance to leadership or a client, the conversation needs to end with money, not sessions. Here's how to build that calculation in a way that's defensible rather than optimistic.
The conversion-adjusted traffic value formula:
Organic Revenue = Monthly Organic Sessions × CVR × AOV (or Lead Value)
Example: 50,000 monthly organic sessions, 2.1% conversion rate, $180 average order value. That's 50,000 × 0.021 × $180 = $189,000/month attributed to organic search. Annualized, that's $2.27 million. That's a number that gets attention in a budget meeting.
The paid-search equivalent calculation adds another layer of credibility:
Traffic Value = Organic Sessions × Blended Average CPC of Ranking Keywords
Pull your top queries from GSC, match them against Google Keyword Planner CPC estimates, calculate a blended average. If your blended CPC is $2.40 across 50,000 sessions, your traffic value is $120,000/month in avoided paid search spend. This is the number that resonates with CFOs who don't understand SEO but absolutely understand "we're getting $120,000 worth of traffic we're not paying for."
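Both formulas condensed into code, with the article's example inputs; the CVR, AOV, and blended CPC are of course site-specific:

```python
def organic_revenue(sessions: int, cvr: float, aov: float) -> float:
    """Monthly revenue attributed to organic: sessions × CVR × AOV."""
    return sessions * cvr * aov

def traffic_value(sessions: int, blended_cpc: float) -> float:
    """Avoided paid-search spend: sessions × blended average CPC."""
    return sessions * blended_cpc

organic_revenue(50_000, 0.021, 180)  # ≈ $189,000/month attributed revenue
traffic_value(50_000, 2.40)          # ≈ $120,000/month avoided spend
```

Present both: revenue speaks to growth stakeholders, traffic value to cost-focused ones.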
Neither of these numbers should be presented as a single point estimate. Build a sensitivity table that varies the key inputs:
| CVR \ AOV | $100 | $180 | $300 |
|---|---|---|---|
| 1.0% | $50,000 | $90,000 | $150,000 |
| 2.1% | $105,000 | $189,000 | $315,000 |
| 3.0% | $150,000 | $270,000 | $450,000 |
(All figures based on 50,000 monthly organic sessions)
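The table regenerates from its assumptions in a few lines, which makes updating the deck each quarter trivial:

```python
SESSIONS = 50_000  # monthly organic sessions (the table's fixed assumption)
CVRS = [0.010, 0.021, 0.030]
AOVS = [100, 180, 300]

# revenue[(cvr, aov)] = monthly revenue for that scenario
revenue = {(cvr, aov): round(SESSIONS * cvr * aov)
           for cvr in CVRS for aov in AOVS}

# Print in the same CVR-by-AOV layout as the table above
print("CVR \\ AOV | " + " | ".join(f"${aov}" for aov in AOVS))
for cvr in CVRS:
    cells = " | ".join(f"${revenue[(cvr, aov)]:,}" for aov in AOVS)
    print(f"{cvr:.1%} | {cells}")
```

Swap the lists for your own scenarios; the same inputs can feed the dashboard parameters directly.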
Presenting a range instead of a single number does something counterintuitive: it makes you more trustworthy, not less. Anyone who's been in a boardroom knows that a single precise number from an SEO report is almost certainly made up. A range with labeled assumptions signals that you understand the math.
To automate this, build a Looker Studio dashboard connected to your GA4 and GSC BigQuery exports. The four core tiles you need: Organic Sessions (GA4 filtered), Conversion Rate (GA4 goal completions / sessions), Estimated Revenue (sessions × CVR × AOV, with CVR and AOV as editable parameters), and Traffic Value (sessions × blended CPC, updated monthly from keyword planner data). This dashboard refreshes automatically, takes about 20 minutes to build, and eliminates the manual spreadsheet rebuild every reporting cycle.
Organic Traffic FAQs

How do I check organic traffic for any website?
For your own site, GA4 and Google Search Console are the ground truth sources. In GA4, go to Acquisition → Traffic Acquisition and filter by Organic Search channel group. In GSC, use the Performance report filtered to Web search type. For competitor sites, use Semrush or Ahrefs to pull estimated organic traffic, then apply the zero-click SERP discount described in this guide to correct for AI Overview and Featured Snippet coverage. Never rely on a single tool or a single number for competitor estimates.
What is the most accurate tool for checking organic traffic?
For your own site, no third-party tool beats first-party data. GA4 combined with GSC is more accurate than any external estimator, full stop. For competitor research, Semrush edges out Ahrefs for large sites with broad, diverse keyword sets based on 2024 accuracy benchmarks, while Ahrefs tends to be more conservative and useful as a floor estimate. Similarweb is better for total traffic across all channels but less precise for organic specifically. The most accurate approach is always two tools, cross-referenced, with a zero-click discount applied.
Why does GSC differ from GA4 organic numbers?
GSC and GA4 measure different things, so they will always differ. GSC counts clicks: one user clicking one URL for one query equals one click. GA4 counts sessions, which reset at midnight and after 30 minutes of inactivity, meaning one user can generate multiple sessions from one click. Additional factors include bounce-before-ping (users who leave before GA4's JavaScript fires), bot filtering differences between client-side and server-side detection, and cross-device attribution gaps. A 15–20% gap is normal. Neither is wrong. They're just measuring different units of the same underlying behavior.
How do I estimate a competitor's organic traffic?
Use two tools (Semrush and Ahrefs are the standard pair), pull their organic traffic estimates for the target domain, and apply the AI Overview/zero-click discount: multiply the estimate by (1 − (percentage of target keywords with AI Overview or Featured Snippet × 0.25)). Report the result as a range, not a single number, and explicitly note the ±30–50% error margin in any stakeholder-facing document. For markets outside English-language search, widen that error bar further.
What causes sudden organic traffic drops?
Sudden organic traffic drops fall into five categories: a broad core algorithm update (check Google Search Status Dashboard and MozCast for correlation), a technical crawl issue (5xx errors, robots.txt changes, accidental noindex tags), UTM misattribution where a new paid campaign is stealing organic credit in GA4, SERP feature displacement where an AI Overview or competitor Featured Snippet captures clicks that previously went to your result, or a manual penalty in GSC. Use the forensic checklist in this guide to isolate which category applies before taking any corrective action.
Run Your First Audit Today
If you do nothing else after reading this, do this: open GSC and GA4 side by side for the same 90-day window and calculate the gap percentage between GSC clicks and GA4 organic sessions. If the gap is under 20%, your tracking is reasonably healthy and you can focus on the content and competitive work. If it's over 20%, you have a measurement problem that needs to be fixed before any other optimization work is meaningful.
This matters more than most teams realize. The average SEO team is optimizing against numbers that are 15–25% off from reality. That means their top-performing pages might not be their actual top performers, their conversion rate benchmarks are wrong, and their ROI calculations are built on a shaky foundation.
You can't grow what you can't measure correctly. That's not a motivational poster, it's a practical constraint.
An organic traffic check isn't a one-time project. It's a quarterly hygiene habit, like reconciling your bank statement, except the stakes are your entire acquisition strategy. Run the reconciliation formula. Apply the zero-click discount to your competitor estimates. Build the Looker Studio dashboard. Then do it again in 90 days and compare.
That's the whole framework.