Google Analytics: How to Exclude Search Engine Bots and Spiders


17 November 2016


Sadly, you can’t always stop spam bots from reaching your website. Fortunately, Google Analytics does allow you to exclude a large proportion of spam and search bot traffic from your reporting, increasing the value and reliability of the visits being recorded. This makes your data far more trustworthy, so the customer patterns and the peaks and troughs you see in your Google Analytics graphs reflect genuine behaviour.

Google Analytics runs on your site using JavaScript, and until recently most search and spam bots could not execute JavaScript, so they never triggered the tracking code. As bots have become smarter, this is no longer the case. Most spam bots are now excluded from Google Analytics automatically, but there are still plenty of bots that spam websites and strain servers, and it is these that still show up in your analytics data.

How to exclude search engine bots in Google Analytics

For those of you worried about how much of your data reflects genuine human activity, there is a way to clean up this data going forward. Quite simply, the power to stop bot data showing in your analytics lies in the ‘Exclude all hits from known bots and spiders’ checkbox within the View admin section of Google Analytics.

Steps to exclude all hits from known bots and spiders:

1. Create a ‘test’ view within your Google Analytics Property

This will allow you to make changes whilst keeping all your original data safe in the ‘master’ view, should you need to return to it in future. It will also help you identify the change that takes place once the exclusion box has been checked. Once you are satisfied with the results, exclude bots from your main view. If you prefer to script this step, see the sketch below.

[Screenshot: creating a ‘test’ view in Google Analytics]
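If you manage several properties, the view can also be created with the Google Analytics Management API rather than through the interface. Below is a minimal sketch in Python using the google-api-python-client library; the account ID, property ID and service-account key file are hypothetical placeholders, and the service account must be granted edit permission on your Analytics account.

```python
# Minimal sketch: creating a 'test' view with the Google Analytics
# Management API (v3). ACCOUNT_ID, PROPERTY_ID and 'service-account.json'
# are hypothetical placeholders for your own values.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = [
    'https://www.googleapis.com/auth/analytics.edit',      # manage views
    'https://www.googleapis.com/auth/analytics.readonly',  # reporting, used later
]
credentials = service_account.Credentials.from_service_account_file(
    'service-account.json', scopes=SCOPES)
analytics = build('analytics', 'v3', credentials=credentials)

ACCOUNT_ID = '12345678'        # hypothetical Analytics account ID
PROPERTY_ID = 'UA-12345678-1'  # hypothetical web property ID

# Views are called 'profiles' in the Management API.
test_view = analytics.management().profiles().insert(
    accountId=ACCOUNT_ID,
    webPropertyId=PROPERTY_ID,
    body={'name': 'Test View'},
).execute()

print('Created view %r with id %s' % (test_view['name'], test_view['id']))
```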

2. Exclude Bots and Spiders

Go to View Settings under the Admin section of your Google Analytics view, and tick the ‘Exclude all hits from known bots and spiders’ box. You will then start filtering out bot and spider traffic, which will make reporting on human visits and activity much clearer. The same setting can also be changed programmatically, as sketched below.

[Screenshot: the ‘Exclude all hits from known bots and spiders’ checkbox in View Settings]
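In the Management API, this checkbox corresponds to the botFilteringEnabled field on the view (profile) resource. The sketch below reuses the analytics service object and the hypothetical IDs from the previous snippet.

```python
# Minimal sketch, reusing the 'analytics' service object and IDs from the
# snippet above. The checkbox maps to the 'botFilteringEnabled' field on
# the view (profile) resource.
updated = analytics.management().profiles().patch(
    accountId=ACCOUNT_ID,
    webPropertyId=PROPERTY_ID,
    profileId=test_view['id'],
    body={'botFilteringEnabled': True},
).execute()

print('Bot filtering enabled:', updated['botFilteringEnabled'])
```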

3. Annotate in Google Analytics

Add an annotation to your Google Analytics graph so that you can pinpoint when the bot exclusion filter was turned on and explain any traffic drop that follows. (Note that annotations can only be added through the Google Analytics interface; the Management API does not support them.)

[Screenshot: adding an annotation in Google Analytics]


What can we expect after filtering bot and spider traffic?

It’s safe to say that you may notice a drop in traffic, but the size of the drop depends entirely on how much of your traffic was made up of bots and spiders in the first place. If your website was subject to high volumes of spam visits in Google Analytics, the drop will be larger than for a site that received only a small volume of spam visits. Comparing the ‘test’ view with the ‘master’ view will help you identify where traffic has decreased, and will make reporting much more reliable going forward. You can even quantify the difference with a quick query against both views, as sketched below.
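To put a number on that difference, you can pull session totals for both views from the Core Reporting API and compare them. Another minimal sketch, reusing the analytics service object from the snippets above; the two view IDs are hypothetical placeholders for your own ‘master’ and ‘test’ view IDs.

```python
# Minimal sketch, reusing the 'analytics' service object from above.
# MASTER_VIEW_ID and TEST_VIEW_ID are hypothetical placeholders.
MASTER_VIEW_ID = 'ga:11111111'  # numeric ID of the 'master' view
TEST_VIEW_ID = 'ga:22222222'    # numeric ID of the 'test' view

def sessions(view_id):
    """Total sessions for a view over the last 30 days (Core Reporting API v3)."""
    result = analytics.data().ga().get(
        ids=view_id,
        start_date='30daysAgo',
        end_date='today',
        metrics='ga:sessions',
    ).execute()
    return int(result['totalsForAllResults']['ga:sessions'])

master = sessions(MASTER_VIEW_ID)
test = sessions(TEST_VIEW_ID)
print('Master view sessions: %d' % master)
print('Test view sessions:   %d' % test)
if master:
    print('Approximate bot share: %.1f%%' % (100.0 * (master - test) / master))
```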

If you’d like more advice on blocking unwanted traffic from Google Analytics, please contact the Varn team who will be happy to advise.

Article by: Aimee, Head of Innovation
