Bad Bot Trap

Reference

See also Fail2Ban

http://www.kloth.net/internet/bottrap.php

We use Fail2Ban to block bad bots.

:!: First, change to your web server's document root folder (often public_html).

Edit robots.txt and add a Disallow line under User-agent: *:

cd public_html

vi robots.txt

User-agent: *
Disallow: /bot-trap/

Then create the trap directory itself:

mkdir bot-trap
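If you prefer to script this step, here is a minimal sketch; it runs in a throwaway directory so it can be tried anywhere (run the equivalent from your real web root), and it assumes robots.txt either doesn't exist yet or lacks the rule:

```shell
# Demonstration in a throwaway directory; run the equivalent from your web root.
cd "$(mktemp -d)"
mkdir -p bot-trap
# Append the Disallow rule only if robots.txt doesn't already contain it.
grep -q 'Disallow: /bot-trap/' robots.txt 2>/dev/null || \
  printf 'User-agent: *\nDisallow: /bot-trap/\n' >> robots.txt
cat robots.txt
```

If robots.txt already contains a User-agent: * block, add the Disallow line inside that block instead of appending a second one.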

blank.png is just a single-pixel transparent image; its content is unimportant, and your site may already have one. If not, you can fetch one like this:

cd images
wget http://www.sonoracomm.com/images/blank.png
cd ..

Edit your site's HTML (index.html, templates/yourtemplatename/index.php, etc.) and add this invisible link near the top of the <body> section. Human visitors never see the one-pixel transparent link, but bots that ignore robots.txt will follow it:

vi index.html

<a href="/bot-trap/"><img src="images/blank.png" border="0" alt=" " width="1" height="1"></a>

Now create an index page inside the trap directory so that trapped requests don't pollute the error logs with 404s. Change the URL to your own domain:

cat << EOF > bot-trap/index.html
<html>
  <head><title> </title></head>
  <body>
    <p>This is a spambot trap.  You shouldn't normally ever see this...</p>
    <p><a href="http://www.sonoracomm.com/">Home Page</a></p>
  </body>
</html>
EOF

Add a second failregex line to the Fail2Ban apache-badbots filter:

vi /etc/fail2ban/filter.d/apache-badbots.conf

failregex = ^<HOST> -.*"(GET|POST).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
            ^<HOST> -.*"GET /bot-trap/"
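Before restarting, you can sanity-check the new pattern: fail2ban-regex /path/to/access_log /etc/fail2ban/filter.d/apache-badbots.conf tests it against a real log. The offline sketch below approximates the second failregex with grep -E against a made-up log line (IP, timestamp, and user agent are hypothetical):

```shell
# Hypothetical combined-log line; only the request path matters for the match.
logline='203.0.113.7 - - [27/Sep/2013:11:51:00 -0700] "GET /bot-trap/ HTTP/1.1" 200 312 "-" "EvilHarvester/1.0"'
# fail2ban substitutes <HOST> with an address-matching group; [0-9.]+ stands in for it here.
echo "$logline" | grep -Eq '^[0-9.]+ -.*"GET /bot-trap/' && echo 'trap hit matched'
```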

Be sure to enable the apache-badbots stanza in /etc/fail2ban/jail.local and restart Fail2Ban:

service fail2ban restart
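For reference, a minimal apache-badbots stanza might look like the following. The values are illustrative only: logpath in particular varies by distribution (e.g. /var/log/apache2/access.log on Debian/Ubuntu), and maxretry = 1 reflects the idea that a single trap hit is enough to ban:

```ini
; Illustrative jail stanza for /etc/fail2ban/jail.local -- adjust paths and times.
[apache-badbots]
enabled  = true
filter   = apache-badbots
logpath  = /var/log/httpd/access_log
maxretry = 1
bantime  = 86400
```

Once Fail2Ban is back up, fail2ban-client status apache-badbots should list the jail and any currently banned addresses.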

Parse IPTables Rules for List of Banned IPs

iptables -nL | grep "DROP       all" | tr -s ' ' | cut -d ' ' -f 4 | grep -v '0.0.0.0/0' | sort -n | uniq > botlist.txt
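The pipeline is easy to get subtly wrong, so here it is exercised against a canned iptables -nL excerpt (addresses are from the documentation ranges); sort runs before uniq so the duplicate, non-adjacent entry collapses:

```shell
# Canned 'iptables -nL' excerpt with a deliberate duplicate (example addresses).
sample='Chain fail2ban-BadBots (1 references)
target     prot opt source               destination
DROP       all  --  198.51.100.23        0.0.0.0/0
DROP       all  --  192.0.2.99           0.0.0.0/0
DROP       all  --  198.51.100.23        0.0.0.0/0'
# Same pipeline as above, minus the redirect, reading from the sample text.
banned=$(echo "$sample" | grep "DROP       all" | tr -s ' ' | cut -d ' ' -f 4 | grep -v '0.0.0.0/0' | sort -n | uniq)
echo "$banned"
```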
networking/linux/bad_bot_trap.1380304310.txt.gz · Last modified: 2013/09/27 11:51 by gcooper