How To Banish Bad Bots From Your Site Analytics
One study estimates that bots account for 56% of website traffic -- and it's likely their influence is skewing your analytics. Columnist Ben Goodsell outlines the threat and explains how to mitigate it.
Many marketers have a vague awareness of server log files, but few know that they can be used to clean up the analytics data you’re using to make decisions about your site.
You can do that by using them to identify bad bots, which increasingly execute JavaScript, inflate analytics numbers, drain your server resources, and scrape and duplicate your content.
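As a rough illustration of the log-file idea, here is a minimal Python sketch that scans access-log lines and flags user agents matching known bot names. The Apache "combined" log format and the bot-name patterns below are assumptions for the example, not a definitive list; sophisticated bad bots often spoof legitimate browser user agents, so real filtering also relies on IP and behavioral checks.

```python
import re
from collections import Counter

# Illustrative patterns only -- real bad-bot lists are much longer and
# change constantly; many bad bots spoof legitimate user agents entirely.
BAD_BOT_PATTERNS = re.compile(r"(scrapy|python-requests|AhrefsBot|SemrushBot)", re.I)

# In the Apache "combined" log format, the user agent is the last quoted field.
UA_RE = re.compile(r'"([^"]*)"$')

def flag_bad_bots(log_lines):
    """Return a Counter of user-agent strings that match bad-bot patterns."""
    hits = Counter()
    for line in log_lines:
        m = UA_RE.search(line.strip())
        if m and BAD_BOT_PATTERNS.search(m.group(1)):
            hits[m.group(1)] += 1
    return hits

sample = [
    '1.2.3.4 - - [10/Oct/2014:13:55:36 -0700] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; AhrefsBot/5.0)"',
    '5.6.7.8 - - [10/Oct/2014:13:55:40 -0700] "GET /page HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0) Chrome/39.0"',
]
print(flag_bad_bots(sample))
```

The same hit counts could then be exported and cross-referenced against your analytics reports to estimate how much traffic the flagged agents contributed.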
The Incapsula 2014 bot traffic report looked at 20,000 websites (of all sizes) over a 90-day period and found that bots accounted for 56% of all website traffic, and that 29% of all traffic was malicious in nature.
The report also found that the more you build your brand, the larger a target you become.
There are services that automate far more advanced techniques than those I discuss in the full article, but the column, available in full on our sister site Search Engine Land, is a starting point for understanding the basics and cleaning up your reports using Excel.