New report: Almost all bad bots are highly sophisticated and hard to detect
According to anti-bot service Distil Networks, 88 percent of all malicious bots are now "advanced persistent bots."
There is one bright spot in Distil Networks' third annual report on bad bots, out this week: Human-generated internet traffic outnumbered bot traffic last year for the first time since 2013. So carbon-based life forms have retaken the lead online.
But it’s downhill from there, according to “The 2016 Bad Bot Landscape Report: The Rise of Advanced Persistent Bots.”
About 28 percent of all web traffic comes from non-malicious bots, the report said, while bad bots represent another 18 percent of the total.
Good bots include search engine spiders, Facebook pulling in external content or the Internet Archive adding images of websites to its collection. Bad bots can fraudulently inflate web traffic or load ads, conduct competitive data mining, harvest financial and personal data, attempt brute-force logins, send spam, wage man-in-the-middle attacks, commit transaction fraud and more.
One of the biggest takeaways from this new report, co-founder and CEO Rami Essaid told me, is that about 88 percent of bad bots are now highly sophisticated, which the report brands for the first time as advanced persistent bots, or APBs. That's up from 77 percent in 2014.
“A lot of previous bots were written for specific purposes and script-driven,” he said, but websites have become more complex, and bad bots have adapted.
Although highly sophisticated bots are increasing as a percentage of overall bad bots, it turns out that bad bot traffic overall decreased from 23 percent in 2014 to about 18 percent last year. Good bots also decreased, from 36 percent to 28 percent. The reasons, according to the report:
“First, there has been a significant influx of new internet users, especially from China, India, and Indonesia. Second, bot operators continue to improve their software, creating more advanced persistent bots (APBs). Bad bot operators are opting for quality over quantity.”
Sites’ bot traffic varies by industry. But small digital publishers that rank between 50,001 and 150,000 on Alexa are being hit the hardest. Distil found that an astounding 56 percent of their traffic comes from bad bots.
The report said that internet service providers Comcast and Time Warner are no longer on the Top 20 Bad Bot Originators list, as they were in 2013 and 2014. Essaid attributed this to better anti-bot protection on the residential computers served by the two major ISPs, so that fewer home-based computers were spewing bots after being hijacked into botnets.
The report's data comes from the traffic on Distil clients' websites, which generated trillions of site requests. Essaid said this represents somewhere between 0.1 and 1 percent of all web traffic, enough to be "statistically significant."
While Distil's traffic is focused on the US, there is also representation from other countries, which he noted tend to be "a couple of years behind us in trends."
He acknowledged that a report like this is self-serving, in that the solution to the presented problem is his company’s services. “But the solution doesn’t have to be us,” he said, adding that it could be other anti-bot services, or sites could “get smarter about bots.”
One way, he suggested, is for site marketers to include bots in their thinking. So if there’s A/B testing, he said, it might be best to do the testing over a longer period of time so you can “look for abnormalities.”
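As a rough illustration of that suggestion — the function name, data and threshold below are hypothetical, not from the report — a site running an A/B test over a longer window could compare each day's conversion rate against the window's median and flag days that deviate sharply, one possible sign of bot-inflated traffic:

```python
from statistics import median

def flag_abnormal_days(daily_rates, tolerance=0.5):
    """Return the days whose conversion rate deviates from the
    window's median by more than `tolerance` (a fraction of the
    median). `daily_rates` is a list of (day, rate) pairs."""
    rates = [rate for _, rate in daily_rates]
    midpoint = median(rates)
    return [day for day, rate in daily_rates
            if abs(rate - midpoint) > tolerance * midpoint]

# Hypothetical two-week test window: day 14 spikes, possibly bots.
window = [("day%d" % i, 0.04) for i in range(1, 14)] + [("day14", 0.20)]
print(flag_abnormal_days(window))  # ['day14']
```

This is only a sketch of the general idea — a real implementation would segment by test variant and use a more robust anomaly measure — but it shows why a longer window helps: the abnormal day stands out only against enough normal days to establish a baseline.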
As for efforts by some brands and networks to emphasize performance-based advertising to get around bots, Essaid pointed out that bots can “already fill out a form or [give] an email address.” Or even make a purchase with a stolen credit card, if their makers thought it was worth the effort.
Opinions expressed in this article are those of the guest author and not necessarily MarTech.