The security firm Incapsula recently released a report on the impact of bots on Internet traffic. Bot traffic is down 10 percent from 2013 but still represents 56 percent of all web visits.
The good news is that bad bot traffic is down; the bad news is that it still has a major impact on website traffic. As CIOInsights.com reported, “Bad bot traffic declined by 10 percent compared to last year. However, it accounted for 29 percent of all Website visits and threatened both big and small sites. Impersonator bots increased during the last three years but were the only type to grow consistently. Small sites (under 1,000 visits per day) received the most bot traffic.”
Before going on, let’s take a moment to understand exactly what bots are. DigitalMarketingGlossary.com (proving there is a website for everything) says bot traffic is “part of online traffic and activities artificially generated by automated bots and spiders.”
Just like “The Wizard of Oz” had good witches and bad witches, there are good bots and bad bots. Digital Marketing Glossary offers examples of both:
Good bots provide numerous useful Internet services. Examples of “good bots” include:
- search engine spider bots
- bots used for availability and response time measurement
- e-reputation spider bots
- advertising measurement bots
Bad bots are used for:
- email address harvesting
- automated account sign-up (to create multiple email accounts)
- content spinning
- blogs and comments spam
- click and impression fraud (see botnet traffic)
The bots CIOs must fear most are impersonator bots. The Incapsula report says these “are the works of more capable hackers experienced enough to use their own tools, be it a modified versions of existing malware kits or new scripts coded from scratch.”
Impersonator bots are out there masquerading as “DDoS bots having browser-like characteristics, rogue bots masked by proxy servers, and still others attempting to masquerade as accepted search engine crawlers,” according to Incapsula. The bad news is that these bots have seen consistent growth over the last three years; last year alone they grew 22 percent.
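One widely documented way to unmask a bot that claims to be a legitimate search crawler is a reverse-DNS lookup followed by a forward-DNS confirmation: the IP should resolve to an official crawler hostname, and that hostname should resolve back to the same IP. The sketch below illustrates the idea; the function name and the allowed-domain list are illustrative, and the resolver functions are injectable so the logic can be exercised without live DNS.

```python
import socket

def verify_search_crawler(ip,
                          allowed_suffixes=(".googlebot.com", ".google.com"),
                          reverse=socket.gethostbyaddr,
                          forward=socket.gethostbyname_ex):
    """Return True only if `ip` reverse-resolves to an official crawler
    domain AND that hostname forward-resolves back to the same IP."""
    try:
        host, _aliases, _addrs = reverse(ip)   # reverse DNS (PTR) lookup
    except OSError:
        return False                           # no PTR record: treat as suspect
    if not host.endswith(allowed_suffixes):
        return False                           # hostname isn't a crawler domain
    try:
        # forward confirmation: hostname must resolve back to the same IP
        return ip in forward(host)[2]
    except OSError:
        return False
```

An impersonator bot can forge its User-Agent string freely, but it cannot make an arbitrary IP reverse-resolve into a search engine’s domain, which is why this two-step check is harder to fool than header inspection alone.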
It’s fairly simple to see what kind of bot traffic your sites are getting, and you can use Google Analytics to filter out the impact. According to SearchEngineWatch.com, it’s as simple as going into the Admin settings: click View Settings in the View panel, scroll toward the bottom of the options near Site Search Settings, and under Bot Filtering check the box that says “Exclude all hits from known bots and spiders.” That filters out all the known bots and spiders, according to the site.
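The Analytics checkbox only cleans up your reporting. If you also want a rough sense of bot share in your own server logs, a crude first pass is to drop requests whose User-Agent self-identifies as a bot. This is an illustrative sketch only: the token list is a tiny sample rather than a real registry of known crawlers, and by definition it will miss impersonator bots that forge human-looking headers.

```python
# Small sample of substrings that commonly appear in self-identified
# bot User-Agent strings (illustrative, not exhaustive).
BOT_TOKENS = ("bot", "crawler", "spider", "slurp", "curl")

def is_known_bot(user_agent):
    """True if the User-Agent string self-identifies as a bot."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def human_hits(log_entries):
    """Keep only log entries whose User-Agent does not look like a bot.
    Each entry is assumed to be a dict with a 'user_agent' key."""
    return [e for e in log_entries if not is_known_bot(e.get("user_agent", ""))]
```

Comparing the size of the filtered list against the raw log gives a quick, back-of-the-envelope estimate of how many of your “visits” are actually declared bots.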
The Incapsula report says that, frankly, most traffic to websites is these bots; you’re just not getting the human impressions you thought you were. “So, while bot traffic to larger websites is at roughly 50 percent, smaller and medium websites—which represent the bulk of the Internet—are actually serving two to four bot sessions for every human visitor,” the report says, noting that most sites get fewer than 10,000 human visits a day regardless of size.
Bad bots don’t discriminate, either. As Incapsula says, they feast on sites big and small. It doesn’t matter whether a website is popular or gathering dust: about 30 percent of all traffic is going to be bad bots.
Incapsula has a good strategy for company websites that moves beyond the CAPTCHA technology that annoys most website visitors. (That’s the software that, for example, asks you to key in characters from a distorted image to prove you are human.) Instead of bot filtering that asks who you are, it will rely more on reputation and behavioral analysis; in effect, it asks why you are visiting a website.
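Behavioral analysis can start with something as simple as rate heuristics: humans rarely sustain dozens of requests per second from a single IP. The sketch below flags IPs that exceed a request-rate threshold within a sliding time window. The window and threshold are arbitrary demonstration values, not figures from the Incapsula report, and real systems combine many more signals than this.

```python
from collections import defaultdict

def flag_suspect_ips(requests, window_seconds=10, max_requests=20):
    """requests: iterable of (ip, timestamp) pairs, timestamps in seconds.
    Returns the set of IPs that made more than `max_requests` requests
    within any `window_seconds`-long window."""
    by_ip = defaultdict(list)
    for ip, ts in requests:
        by_ip[ip].append(ts)
    suspects = set()
    for ip, times in by_ip.items():
        times.sort()
        start = 0
        # Slide a window over the sorted timestamps for this IP.
        for end in range(len(times)):
            while times[end] - times[start] > window_seconds:
                start += 1
            if end - start + 1 > max_requests:
                suspects.add(ip)
                break
    return suspects
```

A scraper hammering a site at ten requests a second trips the threshold almost immediately, while an ordinary reader clicking through a few pages never comes close, which is the basic intuition behind behavior-based filtering.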
Odds are, though, that in relatively short order bots will be able to overcome even that, and a new frontier will open in the ongoing battle.