Bots account for two thirds of internet traffic

According to the report, e-commerce applications and login portals were the most common targets of advanced persistent bots.


A new report from cloud security provider Barracuda Networks reveals that automated traffic accounts for nearly 64% of internet traffic.

Of that automated traffic, good bots, which include search engine crawlers and social network bots, made up 25% of all traffic, while bad bots accounted for 39%.
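The reported shares add up cleanly: a quick sanity check (illustrative arithmetic only, using the figures quoted above) shows how the 25% good-bot and 39% bad-bot shares combine into the headline 64% automated figure.

```python
# Traffic shares as reported by Barracuda (percent of all internet traffic).
good_bots = 25   # search engine crawlers, social network bots
bad_bots = 39    # scrapers, attack scripts, persistent bots

automated = good_bots + bad_bots   # headline figure: nearly two thirds
human = 100 - automated            # remaining human traffic

print(f"Automated traffic: {automated}%")  # prints "Automated traffic: 64%"
print(f"Human traffic: {human}%")          # prints "Human traffic: 36%"
```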

Bad bots comprise basic web scrapers and attack scripts, as well as advanced persistent bots.

These advanced bots deliberately try to evade standard defenses and constantly attempt to carry out malicious activities under the radar. The report noted that among these persistent bots, those going after e-commerce applications and login portals were the most common.

The report, titled "Bot Attacks: Top Threats and Trends," offers detailed insights into the growing number of automated attacks, along with a breakdown of bad bot traffic by location.

The report showed that North America accounted for 67% of bad bot traffic, followed by Europe with 22% and Asia with 7.5%.

Interestingly, European bot traffic was more likely to come from hosting services (VPS) or residential IPs than North American traffic, which mostly originated from public data centers.

Nitzan Miron, VP of Product Management, Application Security at Barracuda, commented on the statistics: "While some bots like search engine crawlers are good, our research shows that over 60% of bots are dedicated to carrying out malicious activities at scale."

He said that when left unchecked, bad bots take over user accounts, skew analytics, steal data, hurt site performance, and degrade the customer experience.

The research further noted that most bot traffic came from the popular public cloud vendors AWS and Microsoft Azure, in roughly equal measure. This is likely because it is easy to set up a free account with either provider and then use that account to run bad bots.

Barracuda researchers noticed that bad bot traffic generally follows the standard workday, allowing the bots to hide within normal human traffic streams and avoid raising alarm bells.
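The workday pattern the researchers describe is visible in request timestamps. As a minimal illustrative sketch (not Barracuda's method), the hypothetical log timestamps below are bucketed by hour to show how one might measure what share of traffic falls inside a standard 9-to-5 window, where workday-mimicking bots cluster.

```python
from collections import Counter
from datetime import datetime

# Hypothetical access-log timestamps; in practice these would be parsed
# from real web server logs. Bots mimicking a workday cluster in business hours.
timestamps = [
    "2021-09-20T09:14:02", "2021-09-20T10:47:33", "2021-09-20T11:05:19",
    "2021-09-20T14:22:41", "2021-09-20T15:58:07", "2021-09-21T03:12:55",
]

# Count requests per hour of day.
per_hour = Counter(datetime.fromisoformat(ts).hour for ts in timestamps)

# Share of requests falling inside a standard 9-to-5 window; traffic that
# tracks the workday this closely blends into normal human usage patterns.
business = sum(n for hour, n in per_hour.items() if 9 <= hour < 17)
share = business / sum(per_hour.values())
print(f"{share:.0%} of requests fall in business hours")  # prints "83% ..."
```

A real detector would combine timing with many other signals (IP reputation, request rates, headers), but the hourly distribution is one of the simplest features to extract.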

Taking the above into context, Miron said: "It's critically important to detect and effectively block bot traffic by investing in Web Application and API Protection technology that can identify and stop bad bots whilst simultaneously improving user experience and overall security."

He further added that “Today, many security decision makers in business can be overwhelmed by the process of protecting against newer attack threats, like bad bots, but fortunately, all it requires is for them to invest in the right application security solution that includes anti-bot protection alongside machine-learning capabilities, to most effectively detect and block sophisticated bot attacks.”
