Google Analytics 4 counts every visit, including automated ones. Bots, crawlers, and spam referrers inflate your traffic numbers, distort engagement metrics, and make it harder to understand real visitor behavior.
GA4's built-in bot filtering catches some known bots, but many slip through. In our analysis of real websites, about 60% had meaningful bot traffic that GA4 didn't filter.
Page Analytics shows two numbers for bot traffic on your dashboard. They measure different things:
Estimated bot traffic is the total percentage of sessions that look automated. This includes everything suspicious: unusual screen resolutions, geographic anomalies, missing referrers, and other patterns. Some of these are near-certain bots, others are just unusual but not conclusive.
Filterable bot traffic is the subset that Page Analytics can safely remove from your reports. Only high-confidence patterns qualify: screen resolutions essentially never seen on real devices today (like 1280x1200 or the long-obsolete 800x600) and similar clear-cut signals. These have near-zero risk of accidentally excluding real visitors.
The gap between the two numbers is intentional. A site might show 35% estimated bot traffic but only 14% filterable. The remaining 21% looks suspicious but could include real visitors from unusual locations or devices. Page Analytics errs on the side of keeping real traffic in your reports rather than risking false positives.
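The estimated-vs-filterable split above can be sketched as a two-tier classifier. This is an illustrative sketch, not Page Analytics' actual implementation; the signal names, fields, and rules are assumptions chosen to mirror the examples in this article:

```python
# Illustrative sketch of the estimated-vs-filterable split.
# Field names and rules are assumptions, not Page Analytics' real logic.

HIGH_CONFIDENCE_RESOLUTIONS = {"1280x1200", "800x600", "1024x1024"}

def classify_session(session: dict) -> str:
    """Return 'filterable', 'suspicious', or 'clean' for one session."""
    res = session.get("screen_resolution", "")
    w, _, h = res.partition("x")
    square = bool(w) and w == h  # square viewports are a headless-browser default
    if res in HIGH_CONFIDENCE_RESOLUTIONS or square:
        return "filterable"  # high confidence: safe to exclude from reports
    # Weaker signals count toward the estimate but are never filtered
    if session.get("source") == "(not set)" and not session.get("engaged"):
        return "suspicious"
    return "clean"

def bot_percentages(sessions: list[dict]) -> tuple[float, float]:
    """Return (estimated %, filterable %) over a list of sessions."""
    labels = [classify_session(s) for s in sessions]
    n = len(sessions)
    filterable = labels.count("filterable") / n * 100
    estimated = filterable + labels.count("suspicious") / n * 100
    return estimated, filterable
```

A session at 1024x1024 would be filterable; one with source "(not set)" and no engagement counts toward the estimate only, since it could be a real visitor.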
Bot traffic usually stands out once you know what to look for. These are the most common patterns:
Unusual screen resolutions are the strongest signal. Bots often run in headless browsers with odd or square screen resolutions like 1280x1200, 800x600, or 1024x1024. Real visitors almost never have these. In GA4, check Reports > Tech > Tech details and filter by screen resolution.
Bots often originate from specific cities or countries and show near-zero engagement rates. Look for cities like Lanzhou or Singapore appearing with 0% engagement rate and significant session counts. In GA4, go to Reports > Demographics > Demographic details and add engagement rate as a metric.
Sessions with no source attribution and low engagement often indicate automated traffic. Check Reports > Acquisition > Traffic acquisition for "(not set)" entries with suspiciously low engagement rates.
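Once you've exported session rows by city (for example via the GA4 Data API), the geographic check above is easy to automate. A minimal sketch; the thresholds (100 sessions minimum, 2% maximum engagement rate) are illustrative assumptions you'd tune for your own traffic:

```python
# Flag city clusters with significant traffic but near-zero engagement.
# Thresholds are illustrative assumptions, not recommended values.

def suspicious_cities(rows, min_sessions=100, max_rate=0.02):
    """rows: iterable of (city, sessions, engaged_sessions) tuples.
    Returns cities with enough traffic and an engagement rate at or
    below max_rate -- the pattern described above."""
    flagged = []
    for city, sessions, engaged in rows:
        if sessions >= min_sessions and engaged / sessions <= max_rate:
            flagged.append(city)
    return flagged
```

Running this over a demographics export surfaces the same city clusters you'd spot manually in Reports > Demographics > Demographic details.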
GA4 lets you exclude specific traffic using report filters, exploration segments, or data filters, but it takes work:
1. Identify the bot signature (e.g., screen resolution 1280x1200)
2. In any standard report, add a filter that excludes that dimension value
3. For traffic you control, GA4 also offers data filters (Admin > Data settings > Data filters), but these only cover internal traffic (matched by IP) and developer traffic, so they can't target signatures like screen resolution
4. Repeat for each bot signature you find
For ad-hoc analysis, you can also use GA4 explorations with segments to exclude known bot dimensions. This doesn't change your reports, but lets you see clean data in a specific exploration.
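Outside the GA4 UI, the GA4 Data API supports the same kind of ad-hoc exclusion: `runReport` accepts a `dimensionFilter` with a `notExpression`. A sketch that builds the request body as plain JSON; the resolution list is an example, and you'd still need your own property ID and credentials to actually send it:

```python
def exclusion_report_body(bot_resolutions):
    """Build a GA4 Data API runReport body that excludes the given
    screen resolutions via a notExpression dimension filter."""
    return {
        "dateRanges": [{"startDate": "28daysAgo", "endDate": "today"}],
        "dimensions": [{"name": "date"}],
        "metrics": [{"name": "sessions"}, {"name": "engagementRate"}],
        "dimensionFilter": {
            "notExpression": {
                # Keep only rows whose screenResolution is NOT in the list
                "filter": {
                    "fieldName": "screenResolution",
                    "inListFilter": {"values": list(bot_resolutions)},
                }
            }
        },
    }
```

Like an exploration segment, this leaves your reports and raw data untouched; the exclusion exists only in the API response.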
Page Analytics detects bot traffic automatically for each of your websites. It combines pattern matching (known bot fingerprints) with AI analysis to identify suspicious traffic.
Page Analytics runs a monthly scan of your GA4 data and builds a bot profile for each website. The scan looks at screen resolutions, geographic patterns, traffic sources, and engagement metrics to identify bot signatures.
On your dashboard, each website shows a bot traffic indicator when bots are detected. The percentage tells you how much of your traffic is likely automated.
In the browser extension, open the settings panel (gear icon) on any page. If a bot profile exists for that site, you'll see a "Filter bot traffic" toggle. Turn it on, and all your reports for that site will exclude detected bot traffic automatically.
The filter applies GA4 dimension filters behind the scenes, excluding traffic from known bot screen resolutions and suspicious geographic sources. Your raw GA4 data stays untouched.
These patterns appear across many different websites, regardless of industry or size:
| Pattern | What it means | Confidence |
|---|---|---|
| Screen resolution 1280x1200 | Not a real monitor resolution. Classic bot fingerprint. | High |
| Screen resolution 800x600 | Outdated resolution rarely seen on real devices since 2010. | High |
| Square resolutions (e.g., 1024x1024) | No mainstream monitors are square. Headless browser default. | High |
| City clusters with 0% engagement | Significant traffic from a city with no real interaction. | Medium |
| Source "(not set)" with low engagement | No referrer and no engagement often means automated visits. | Low |
When you enable bot filtering in Page Analytics, your reports show cleaner numbers. Specifically:
- Session and user counts drop (the bot sessions are excluded)
- Engagement rate goes up (bots had 0% engagement)
- Geographic and source breakdowns become more accurate
- Heatmaps and click data remain unchanged (bots rarely trigger real click events)
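The direction of these shifts follows directly from the arithmetic. A worked sketch with made-up numbers (10,000 sessions, 14% filterable bots, all with zero engagement), not figures from any real property:

```python
# Made-up example numbers, not data from a real property.
total_sessions = 10_000
engaged_sessions = 5_000   # 50% engagement rate before filtering
bot_sessions = 1_400       # 14% filterable, all with zero engagement

# Removing bot sessions shrinks the denominator but not the
# engaged-session count, so the engagement rate rises.
filtered_total = total_sessions - bot_sessions
filtered_rate = engaged_sessions / filtered_total

print(f"{filtered_rate:.1%}")  # 58.1% -- up from 50.0%
```

The bigger the filterable share, the larger the jump in engagement rate after filtering.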
GA4 excludes traffic from known bots and spiders (based on the IAB bot list), but this only catches a fraction of automated traffic. Many bots mimic real browsers and slip through.
No. Page Analytics applies filters at the reporting level only; your GA4 property keeps all data as-is.
Bot profiles are rebuilt monthly. They're also refreshed when you add a new website or reconnect your account.
Page Analytics only filters with high confidence by default. The excluded patterns (like screen resolution 1280x1200) have near-zero false positive rates. If you're unsure, toggle the filter off and compare the numbers.