====== Bad Bot Trap ======
  
We use Fail2Ban to block bad bots.

===== robots.txt =====

Misbehaving bots may access areas of your web site even if you tell them not to.  That's what we key on here.
  
:!: First, change to your web root folder.  Possibly ''public_html''.
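Then tell robots not to crawl the trap folder in ''robots.txt''. A minimal sketch, assuming the trap lives at ''/bot-trap/'' as created below:

<file>
# Well-behaved bots honor this and never fetch the trap
User-agent: *
Disallow: /bot-trap/
</file>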
  
===== Web Site Header =====
  
We use a tiny image for embedding the hidden link.  ''blank.png'' should be just a single-pixel transparent image; its exact content is unimportant (and it may already exist).  If it doesn't already exist, you can get ''blank.png'' like this:
  
<file>
</file>
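If you'd rather generate the image locally, a sketch using ImageMagick (assuming the ''convert'' tool is installed):

<file>
# Create a 1x1 fully transparent PNG
convert -size 1x1 xc:transparent blank.png
</file>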
  
Edit your HTML header (the ''<head>'' section of ''index.html'', ''templates/yourtemplatename/index.php'', etc.) and add this line (modify as necessary):
  
<file>
</file>
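A typical sketch of the hidden link (an assumption; adjust to your template), with ''blank.png'' assumed to sit in the web root:

<file>
<!-- Invisible to humans; crawlers that ignore robots.txt will follow it -->
<a href="/bot-trap/"><img src="/blank.png" alt="" width="1" height="1" border="0"></a>
</file>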
  
===== bot-trap Folder =====

Now we create the ''bot-trap'' folder and an ''index.html'' file in it, so that trap hits don't pollute the error logs.  Change the URL to your own domain and the permissions as necessary:
  
<file>
mkdir bot-trap
cat << EOF >> bot-trap/index.html
<html>
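  <!-- Example body (an assumption; adapt freely). -->
  <!-- Replace example.com with your own domain. -->
  <head><title>Bot trap</title></head>
  <body>
    <p>You have reached a trap meant for misbehaving web robots.
       If you are human, please continue to
       <a href="http://www.example.com/">www.example.com</a>.</p>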
  </body>
</html>
EOF

chown -R apache:apache bot-trap
</file>

===== Fail2Ban =====

Add another regex to the Fail2Ban ''badbots'' filter:
  
<file>
vi /etc/fail2ban/filter.d/apache-badbots.conf

failregex = ^<HOST> -.*"(GET|POST).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
            ^<HOST> -.*"GET /bot-trap/
</file>

Be sure to enable the ''apache-badbots'' stanza in ''/etc/fail2ban/jail.local'' and restart Fail2Ban:

<file>
service fail2ban restart
</file>
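If the stanza isn't already present, it looks roughly like this (a sketch; the log path, ports, and ban settings are assumptions for a typical Apache setup):

<file>
[apache-badbots]
enabled  = yes
filter   = apache-badbots
action   = iptables-multiport[name=BadBots, port="http,https"]
logpath  = /var/log/httpd/access_log
bantime  = 86400
maxretry = 1
</file>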

===== Test Fail2Ban Filter =====

Modify your log path as necessary:

<file>
fail2ban-regex ../logs/access_log /etc/fail2ban/filter.d/apache-badbots.conf
</file>

Check the Fail2Ban log:

<file>
tail -f /var/log/fail2ban.log
</file>

:!: If Fail2Ban fails to parse your log files at all, try setting ''backend=polling'' in ''jail.local''.
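For example (assuming you set it in the ''[DEFAULT]'' section; it can also be set per jail):

<file>
[DEFAULT]
backend = polling
</file>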

===== Parse IPTables Rules for List of Banned IPs =====

<file>
iptables -nL | grep "DROP       all" | tr -s ' ' | cut -d " " -f4 | grep -v '0.0.0.0/0' | sort -n | uniq > botlist.txt
</file>
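You can also ask Fail2Ban directly which addresses a jail has banned (assuming the ''apache-badbots'' jail from above):

<file>
fail2ban-client status apache-badbots
</file>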