Google Analytics: How to Exclude Search Bots and Spiders from Your Reports
Reporting on website traffic in Google Analytics can be a constant battle of good and evil. Are you aware that some of your traffic data could be coming from robots? These bots and spiders could be skewing the data in your Google Analytics reports. What if there was a way to exclude bot and spider traffic, so that your reporting in Google Analytics became far more reliable?
How much of your data could be bot traffic?
Sadly, you can’t always stop spam bots from reaching your website. Fortunately, however, Google Analytics does allow you to exclude a large percentage of spam and search bot traffic from your reporting, increasing the value and reliability of the visits being recorded. This will make your data much more trustworthy, validating the customer patterns and the peaks and troughs you see in your Google Analytics graphs.
How to exclude search engine bots in Google Analytics
For those of you who are worried about how much of your data reflects genuine human activity, there is a way to clean up this data going forward. Quite simply, the power to stop bot data showing within your analytics lies in the ‘Exclude all hits from known bots and spiders’ checkbox, found in the View admin section of Google Analytics.
Steps to exclude all hits from known bots and spiders:
1. Create a ‘test’ view within your Google Analytics Property
This will allow you to make changes whilst keeping all your original data safe in the ‘master’ view, should you need to return to it in future. It will also help you identify the change that takes place once the exclusion box has been ticked. Once you are satisfied with the results, apply the same exclusion to your main view.
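If you manage lots of properties, creating the ‘test’ view can also be scripted. The sketch below assumes the Google Analytics Management API v3 (which covers Universal Analytics views) and the `google-api-python-client` library; the account and property IDs, the view name and the `create_test_view` helper are all placeholders for illustration, not part of Google’s documentation.

```python
def test_view_body(name="Bot filter test view"):
    # Minimal request body for a new view (profile); 'name' is the key field.
    # The view name here is a placeholder -- choose whatever suits your account.
    return {"name": name}


def create_test_view(credentials, account_id, web_property_id):
    """Sketch: create a new view under an existing Universal Analytics property."""
    # Imported lazily so the rest of this sketch runs without the library.
    from googleapiclient.discovery import build  # pip install google-api-python-client

    analytics = build("analytics", "v3", credentials=credentials)
    return (
        analytics.management()
        .profiles()
        .insert(
            accountId=account_id,          # e.g. "12345678" (placeholder)
            webPropertyId=web_property_id,  # e.g. "UA-12345678-1" (placeholder)
            body=test_view_body(),
        )
        .execute()
    )
```

You would call `create_test_view` with OAuth credentials that have edit access to the account; the returned profile resource includes the new view’s ID, which you will need for any follow-up changes.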
2. Exclude Bots and Spiders
Go to View Settings under the Admin section of your Google Analytics view, and tick the ‘Exclude all hits from known bots and spiders’ box. Google Analytics will then start filtering out known bot and spider traffic, making reporting on human visits and activity much clearer.
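For those comfortable with scripting, the same checkbox can be toggled programmatically: the Google Analytics Management API v3 exposes it as the `botFilteringEnabled` field on a view (profile). A minimal sketch, assuming the `google-api-python-client` library; the IDs and the `enable_bot_filtering` helper are placeholders, not part of Google’s documentation:

```python
def bot_filter_body(enabled=True):
    # Request body that toggles the known-bots exclusion on a view;
    # this mirrors the 'Exclude all hits from known bots and spiders' checkbox.
    return {"botFilteringEnabled": enabled}


def enable_bot_filtering(credentials, account_id, web_property_id, profile_id):
    """Sketch: patch a view so Google Analytics drops known bot/spider hits."""
    # Imported lazily so the rest of this sketch runs without the library.
    from googleapiclient.discovery import build  # pip install google-api-python-client

    analytics = build("analytics", "v3", credentials=credentials)
    return (
        analytics.management()
        .profiles()
        .patch(
            accountId=account_id,          # placeholder account ID
            webPropertyId=web_property_id,  # placeholder property ID
            profileId=profile_id,           # placeholder view (profile) ID
            body=bot_filter_body(True),
        )
        .execute()
    )
```

Using `patch` rather than `update` means only the bot-filtering setting is changed, leaving the rest of the view’s configuration untouched.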
3. Annotate in Google Analytics
Add an annotation to your Google Analytics graph, so that you can identify any traffic drop that may take place after the bot exclusion filter has been turned on.
What can we expect after filtering bot and spider traffic?
It’s safe to say that you may notice a slight drop in traffic, but the size of the drop depends entirely on how much of your traffic was made up of bots and spiders in the first place. If your website was subject to high volumes of spam visits in Google Analytics, the drop in traffic will be larger than for a site that experienced only a small volume of spam visits. Comparing the ‘test’ view with the ‘master’ view will help you identify where traffic has decreased, and make reporting much more reliable going forward.
If you’d like more advice on blocking unwanted traffic from Google Analytics, please contact the Varn team who will be happy to advise.