Automated web traffic is one of the most destructive forces threatening web application security. Bots quietly generate enormous volumes of requests and form submissions against many parts of a website, including endpoints that human visitors were never intended to reach. Unfortunately, the volume of automated web traffic keeps growing, and as it rises, so does the sophistication of the operators behind it. Before discussing the activities of online bot traffic in detail, it’s worth addressing some common misconceptions website owners have about automated web traffic.
1. Misconception: Bots Are Just Simple Automated Scripts
Online bots have advanced far beyond simple automated scripts, and they grow more capable every day. This is driven by the expanding range of technology and platforms available to bot operators, and by the fact that stronger defenses force operators to build more advanced bots to work around them. Most importantly, as bots become harder to distinguish from humans, the traffic they generate becomes more valuable to their operators.
Modern bots distribute their traffic across large-scale environments and route it through multiple proxies to disguise its origin. These advanced bots frequently issue requests from real browsers and execute the JavaScript that sites use to verify that a visitor is human. They bypass detection mechanisms such as CAPTCHA using either artificial intelligence or brute-force solvers, and some groups also employ farms of human agents to solve the challenges and pass the solutions back to the bot. These bots are built to integrate with such human-powered services seamlessly.
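To illustrate how low the barrier has become, here is a minimal sketch (in Python, using the Playwright browser-automation library) of a bot that rotates through a proxy pool, drives a real headless browser so that any JavaScript checks execute normally, and harvests data from a page. The proxy addresses, target URL, and page selector are hypothetical placeholders, not real targets or recommended values.

```python
# Minimal sketch of a browser-driving scraper bot: rotate proxies and run a
# real browser so the site's JavaScript (including "are you human" checks)
# executes as it would for a normal visitor.
# Proxy addresses, URL, and selector below are hypothetical placeholders.
from playwright.sync_api import sync_playwright

PROXIES = ["http://203.0.113.10:8080", "http://203.0.113.11:8080"]  # example pool
TARGET = "https://shop.example.com/product/42"

def fetch_price(proxy: str) -> str | None:
    with sync_playwright() as p:
        # A genuine Chromium instance, launched behind the given proxy.
        browser = p.chromium.launch(headless=True, proxy={"server": proxy})
        page = browser.new_page()
        page.goto(TARGET)
        price = page.text_content(".product-price")  # data a scraper might harvest
        browser.close()
        return price

for proxy in PROXIES:  # spread the requests across the proxy pool
    print(fetch_price(proxy))
```

The point is not the scraping itself but how ordinary the tooling is: everything above is off-the-shelf automation software, which is exactly why request-level signatures struggle to tell it apart from a real visitor.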
2. Misconception: Online Bots Are Just a Web Security Problem
Managing automated web traffic is often assumed to be the sole responsibility of the web security department and the information security officer. In practice, the security team handles only certain types of automated traffic, such as the bots behind credit card fraud. Other types, such as price aggregators scraping product data, are business concerns better managed by the departments they actually affect.
Which roles decide how to handle the different types of automated web traffic, and the challenges it raises, varies from business to business; they typically include the heads of departments such as eCommerce, Platform, Ops, and Marketing. The ideal management solution therefore surfaces enough detail about automated traffic for all of these decision-makers to see the full picture and manage the elements specific to their roles, rather than depending on a black-box, security-only system.
3. Misconception: Online Bot Operators Are Just Individual Hackers
At the top sit large organizations that operate entire automated web traffic networks. Below them is a group of organizations that scrape data for legitimate purposes, such as price aggregators. Alongside these, lone hackers develop software to run harmful scams themselves or sell it to companies that want to spy on their competitors.
There is a lot of money to be made from some types of automated web traffic. Knowing this, sophisticated unethical organizations employ technical experts as well as other talent to handle the organizational and strategic work that surrounds the bot activity itself. There is also a growing trend of third-party services that deliver automated traffic activity on demand.
4. Misconception: Only the Big Boys Need to Worry About Online Bots
Sometimes, it feels like there are only two types of bots:
Targeted bots: ones focusing on specific high-profile websites.
Generic bots: ones exploiting known weaknesses across many sites.
This leads owners of medium-sized websites into a false sense of security: because they have some general protection in place, they assume bot operators will never target their site. Unfortunately, this is far from the truth. Realistically, smaller sites tend to have fewer defenses, making them easier targets. Moreover, bot frameworks built for easy scaling, along with the resources and services that surround them, make a very wide range of websites cheap to target. Small and medium-sized commercial online presences have been shown to be targeted by automated traffic just as heavily as the big players.
5. Misconception: I Have a WAF, I Don’t Need to Worry About Bot Activity
Web Application Firewalls (WAFs) are valuable tools that form a fundamental part of a security stack. Unlike network firewalls, which operate at the TCP/IP level, WAFs work at the HTTP level and process every incoming request, matching it against a set of static blocking rules and rejecting requests that fail the checks. This makes them very effective at fending off vulnerability exploitation attempts such as SQL injection attacks.
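To make that concrete, here is a minimal sketch (in Python) of the kind of static, per-request rule matching a WAF performs. The patterns are simplified illustrations; real rule sets, such as the OWASP ModSecurity Core Rule Set, are far more extensive.

```python
import re

# Simplified stand-ins for WAF signatures; real rule sets are far larger.
SIGNATURES = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),   # classic SQL injection probe
    re.compile(r"(?i)<script[\s>]"),            # reflected XSS attempt
    re.compile(r"\.\./\.\./"),                  # path traversal
]

def waf_allows(path: str, query_string: str, body: str) -> bool:
    """Judge each request in isolation against static signatures."""
    payload = " ".join([path, query_string, body])
    return not any(sig.search(payload) for sig in SIGNATURES)

print(waf_allows("/products", "id=42", ""))                        # True: looks clean
print(waf_allows("/products", "id=42 UNION SELECT password", ""))  # False: blocked
```

Note that every decision here is made per request: the rules never ask what the same client did a second ago.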
However, WAFs are not effective at identifying bot traffic, because spotting automated web traffic is a radically different challenge. By design, a WAF scans traffic for illegitimate requests that try to exploit security weaknesses in the web application. A bot detection system instead has to scan traffic for legitimate-looking requests that aim to exploit weaknesses in the application's business logic. Typically, that means analyzing the sequence of requests a client makes and looking for patterns of behavior that differ from those of legitimate users.
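By contrast, a behavioral detector reasons over sequences of requests rather than single ones. The sketch below assumes a hypothetical per-client request log and flags clients whose request rate or breadth of pages visited falls outside what a human plausibly produces; the thresholds are illustrative assumptions, not recommended values.

```python
from collections import defaultdict

WINDOW_SECONDS = 60
MAX_HUMAN_REQUESTS = 40      # illustrative threshold, not a recommended value
MAX_HUMAN_UNIQUE_PAGES = 25  # humans rarely sweep a catalogue this fast

# client_id -> list of (timestamp, path); in practice this comes from access logs
request_log: dict[str, list[tuple[float, str]]] = defaultdict(list)

def record(client_id: str, timestamp: float, path: str) -> None:
    request_log[client_id].append((timestamp, path))

def looks_automated(client_id: str, now: float) -> bool:
    """Judge a client by its recent behavior, not by any single request."""
    recent = [(t, p) for t, p in request_log[client_id] if now - t <= WINDOW_SECONDS]
    unique_pages = {p for _, p in recent}
    return len(recent) > MAX_HUMAN_REQUESTS or len(unique_pages) > MAX_HUMAN_UNIQUE_PAGES
```

Every request in this sketch would sail past the signature checks shown earlier; it is only the pattern across many requests that gives the bot away.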
Hopefully, understanding these misconceptions helps you better protect yourself, and your site, from any unwanted automated web traffic. Make sure to take a long, hard look at your web security so that you aren’t vulnerable.
Photo by Christina Morillo from Pexels