Small websites with fewer than 10K visitors are largely visited by robots – some good, some bad.
What is certain is that they are not human, and bad bots artificially inflate website traffic, producing misleading metrics. If your company uses Google Analytics to measure traffic, make sure your growth is not the result of bot traffic. Robots that create artificial impressions lead to real problems, and there's no sense acting on faulty metrics. So what can you do?
How to Fix Bad Bots
There are a few steps you can take: first, recognize the bots; second, filter them out. Search Warrant can remove bot traffic from your logs and metrics to give you an accurate dataset for 2016. Your final filters will look something like this:
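To make the recognize-then-filter idea concrete, here is a minimal sketch of the same logic applied to raw server logs. The bot signatures and log format below are illustrative assumptions, not a definitive catalogue of bad bots, and this is not the Google Analytics filter itself:

```python
# Substrings commonly found in bot user-agent strings (illustrative list only).
BOT_SIGNATURES = ("bot", "crawler", "spider", "scraper")

def is_bot(user_agent: str) -> bool:
    """Step 1 – recognize: flag a hit whose user-agent matches a bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def filter_human_hits(log_entries):
    """Step 2 – filter: keep only entries that do not look like bot traffic."""
    return [entry for entry in log_entries if not is_bot(entry["user_agent"])]

# Hypothetical log entries for demonstration.
hits = [
    {"path": "/", "user_agent": "Mozilla/5.0 (Windows NT 10.0)"},
    {"path": "/", "user_agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    {"path": "/pricing", "user_agent": "AhrefsBot/7.0"},
]

human_hits = filter_human_hits(hits)
print(len(human_hits))  # only the non-bot hit remains
```

Real-world bot detection also leans on IP ranges and behavior patterns, but a user-agent pass like this catches the honest crawlers that identify themselves.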
There are more robust ways to remove robot traffic from your website, but filtering it in Google Analytics will do the trick for most of you. At the very least, consider removing bot traffic from your logs. Recall the mantra from our last post – what's measured gets managed – and be sure to measure real people, not robots. Unless the robots are buying a lot of product; in that case, we can help you find and target more robots.