Good bots vs. bad bots

Why deep learning provides unique value and opportunities.

Most of us are familiar with the concept of bots: small pieces of software designed to perform simple, automated tasks. It is commonly known in the technology world that more than half of a website's traffic comes from bots.

The many faces of bots

Bots can be categorized as good bots or bad bots. Good bots are those that make web searches more accurate, chat you through an online order, or perform many other useful tasks.

The not-so-good bots are designed to slow down your site, negatively impacting customer satisfaction. These bad bots can scrape data, especially pricing data from a competitor's website, or simply steal personal and financial data such as credit card information.

However, labeling a bot as good or bad can be highly subjective. What is “good” for one company may not be good for another.

Much like the battle between computer viruses and anti-virus software, bad bots are becoming increasingly sophisticated as our ability to detect and differentiate good bots from bad bots improves.

For example, one of the telltale features of bad bots used to be volume. We detected bots by looking for a large number of requests and a pattern of repeated visits. Today's bad bots have learned to avoid large volumes; instead, they focus on quality.

Another revealing sign of bad bots was a large number of requests from a single IP address. In response, bad bots no longer attack from the same IP address; in fact, most bad bots now attack from a set of IP addresses.

In addition, bad bots increasingly mimic human behavior, in the hope that even when examined they will be classified as “human”. There is now even a term for these bots: Advanced Persistent Bots (APBs).

Analytics for bot detection

The increasing sophistication of bad bots is an ongoing challenge for companies whose digital presence is essential to their business. This is where analytics can provide unique value and opportunity.

To understand why, we first need to understand the traditional approach of detecting bots based on IP counts.

For example, as a shopper I might visit a site for 15 minutes, sending about 20 page requests. However, if there are 200 page requests in a 10-minute window from the same computer, it is likely a bot, since a human cannot browse that fast.
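As a rough illustration, this volume check can be written in a few lines of Python. This is a minimal sketch, not production logic; the 200-requests-per-10-minutes threshold comes from the example above, and the request-log shape is an assumption.

    from collections import defaultdict, deque

    WINDOW_SECONDS = 10 * 60   # the 10-minute window from the example above
    MAX_REQUESTS = 200         # assumed threshold; humans browse far slower

    recent = defaultdict(deque)  # IP address -> timestamps of recent requests

    def is_volume_bot(ip, timestamp):
        """Flag an IP that exceeds MAX_REQUESTS within WINDOW_SECONDS."""
        q = recent[ip]
        q.append(timestamp)
        # Drop requests that have fallen out of the sliding window.
        while q and timestamp - q[0] > WINDOW_SECONDS:
            q.popleft()
        return len(q) > MAX_REQUESTS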

Bots have also been detected based on the geographic location of IP addresses. Each company tends to have a targeted customer segment. For example, a company may have 95% of its customers based in the US, so if one of those customers travels to Australia and browses from there, it is not suspicious.

However, if there is suddenly a surge of hundreds of web requests from clients in Australia, normally a relatively quiet locale for that company, this may be a bot. Traditional bot-detection mechanisms tend to focus on volume, originating IP address, and fundamental statistical methods such as sums or averages.
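That geographic heuristic reduces to exactly the kind of fundamental statistics just mentioned. The sketch below flags a region whose request count jumps far above its historical average; the baseline counts and the three-standard-deviation cutoff are illustrative assumptions, not values from the article.

    from statistics import mean, stdev

    # Hypothetical hourly request counts per region (historical baseline).
    history = {"US": [950, 1020, 980, 1000], "AU": [12, 9, 15, 11]}

    def is_geo_anomaly(region, current_count, k=3.0):
        """Flag a region whose traffic jumps far above its usual level."""
        baseline = history[region]
        mu, sigma = mean(baseline), stdev(baseline)
        return current_count > mu + k * sigma

    print(is_geo_anomaly("AU", 400))   # True: a normally quiet locale spikes
    print(is_geo_anomaly("US", 1050))  # False: within normal variation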

Unfortunately, these traditional techniques are losing ground in detecting bad bots, which become more advanced every day. But analytics, especially deep learning, can introduce a wholly new approach that makes bot detection and mitigation more effective.

Why deep learning

Deep learning, according to Wikipedia, is “a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using multiple processing layers, with complex structures or otherwise, composed of multiple non-linear transformations.”

In other words, deep learning can recognize complex human patterns with tremendous speed, foster a style of learning from those patterns, and then detect suspicious behavior.

Deep learning can be especially effective at recognizing complex bots thanks to recent advances in neural networks. The more complex a bot, the more it resembles a human. Neural networks provide the ability to correlate a significantly larger number of variables, across multiple layers, creating a completely new style of behavioral learning that is more dynamic and continuous, more like that of a human.
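To make this concrete, here is a minimal sketch of such a multi-layer network in Keras. The four behavioral features and the random toy data are hypothetical stand-ins; the article does not prescribe any specific architecture or feature set.

    import numpy as np
    from tensorflow import keras

    # Hypothetical behavioral features per session:
    # [request rate, pages per session, avg dwell time, IP reuse score]
    X = np.random.rand(1000, 4)
    y = np.random.randint(0, 2, size=1000)  # toy labels: 1 = bot, 0 = human

    # Multiple processing layers of non-linear transformations,
    # as in the definition quoted above.
    model = keras.Sequential([
        keras.layers.Dense(16, activation="relu", input_shape=(4,)),
        keras.layers.Dense(8, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),  # P(session is a bot)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)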

Neural networks also offer predictions much closer to real time. As a new complex bot emerges, deep learning systems will not only be able to quickly learn the bot's new behavior patterns and how they differ from a real human visit, but also continue learning as the bot changes its behavior, so that previous insights can still be leveraged.
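That continuous learning can be sketched as periodic fine-tuning on fresh labeled traffic rather than training once and freezing the model. Using the toy model above, an update step might look like this; the batch source is hypothetical.

    def update_on_new_traffic(model, new_X, new_y):
        """Fine-tune the existing model on a fresh batch of labeled sessions,
        keeping earlier insights while absorbing new bot behavior."""
        model.fit(new_X, new_y, epochs=1, batch_size=32, verbose=0)
        return model

    # Call whenever a new batch of labeled sessions arrives, e.g.:
    # update_on_new_traffic(model, X_batch, y_batch)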
