One study estimates that bots account for 56% of website traffic — and it’s likely their influence is skewing your analytics. Columnist Ben Goodsell outlines the threat and explains how to mitigate it.
Many marketers are vaguely aware of server log files, but few know they can be used to clean up the analytics data you rely on to make decisions about your site.
You can do that by using them to identify bad bots, which increasingly execute JavaScript, inflating analytics numbers, consuming your server resources, and scraping and duplicating content.
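To make that concrete, here is a minimal sketch of one common log-file check: flagging hits whose user agent claims to be Googlebot but whose IP fails Google's documented reverse-DNS verification. This is not the article's Excel-based method; the log path and the Apache/Nginx "combined" format regex are assumptions about a typical setup.

```python
# Sketch: flag "fake Googlebot" hits in an Apache/Nginx combined-format
# access log. The file name "access.log" is illustrative.
import re
import socket

LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def claims_googlebot(user_agent: str) -> bool:
    """True if the user-agent string claims to be Googlebot."""
    return "googlebot" in user_agent.lower()

def verified_googlebot(ip: str) -> bool:
    """Reverse-DNS lookup, then forward-confirm the hostname resolves
    back to the same IP (Google's documented verification method)."""
    try:
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        # DNS failure: treat as unverified.
        return False

def suspect_hits(log_path: str):
    """Yield entries that claim to be Googlebot but fail verification."""
    with open(log_path) as f:
        for line in f:
            m = LOG_LINE.match(line)
            if m and claims_googlebot(m["user_agent"]) and not verified_googlebot(m["ip"]):
                yield m["ip"], m["request"], m["user_agent"]

if __name__ == "__main__":
    for ip, request, ua in suspect_hits("access.log"):
        print(f"fake Googlebot? {ip} {request!r} {ua!r}")
```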
The Incapsula 2014 bot traffic report looked at 20,000 websites of all sizes over a 90-day period and found that bots accounted for 56% of all website traffic; malicious bots alone made up 29%.
The report also showed that the more you build your brand, the larger a target you become.
There are services that automate much more advanced techniques than those I discuss in the full article, but the column is a starting point for understanding the basics and cleaning up your reports using Excel. Check out the full article on our sister site, Search Engine Land.
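As a taste of the kind of cleanup the full article walks through in Excel, here is a rough scripted equivalent: removing sessions attributed to bot hostnames from an exported report. The column names and bot domains are hypothetical stand-ins for whatever your own log review turns up.

```python
# Sketch: strip known bot traffic from a hypothetical analytics export
# with "Network Domain" and "Sessions" columns; adjust names to match
# your actual report.
import pandas as pd

# Hostnames identified as bots during your log-file review (examples only).
BOT_DOMAINS = {"example-crawler.net", "cheap-proxy-hosting.com"}

report = pd.read_csv("analytics_export.csv")

is_bot = report["Network Domain"].isin(BOT_DOMAINS)
print(f"Removing {int(report.loc[is_bot, 'Sessions'].sum())} bot sessions")

# Keep only the rows not attributed to a known bot domain.
clean = report[~is_bot]
clean.to_csv("analytics_export_clean.csv", index=False)
```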