

Bad Bot Trap

http://www.kloth.net/internet/bottrap.php

Used here with fail2ban:

:!: First, change to your web root folder (often public_html).

Edit robots.txt and add a Disallow line under User-agent: *:

cd public_html

vi robots.txt

User-agent: *
Disallow: /bot-trap/
mkdir bot-trap

blank.png is just a single-pixel transparent image; its exact content is unimportant (and it may already exist in your images folder). You can get blank.png like this:

cd images
wget http://www.sonoracomm.com/images/blank.png
cd ..
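If you would rather not download the image, a one-pixel transparent PNG can also be generated locally. The Python sketch below builds a minimal RGBA PNG from scratch using only the standard library; the output filename is the only assumption:

```python
import struct
import zlib

def chunk(ctype, data):
    """Build one PNG chunk: length, type, data, CRC over type+data."""
    c = ctype + data
    return struct.pack(">I", len(data)) + c + struct.pack(">I", zlib.crc32(c) & 0xFFFFFFFF)

# IHDR: 1x1 image, 8-bit depth, color type 6 (RGBA), default compression/filter/interlace
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 6, 0, 0, 0)

# One scanline: filter byte 0, then a single fully transparent RGBA pixel
raw = b"\x00" + b"\x00\x00\x00\x00"

png = (b"\x89PNG\r\n\x1a\n"
       + chunk(b"IHDR", ihdr)
       + chunk(b"IDAT", zlib.compress(raw))
       + chunk(b"IEND", b""))

with open("blank.png", "wb") as f:  # assumed target filename
    f.write(png)
```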

Edit your HTML (index.html, templates/yourtemplatename/index.php, etc.) and add this line near the top of the <body> section (a link in the <head> would not be rendered):

vi index.html

<a href="/bot-trap/"><img src="images/blank.png" border="0" alt=" " width="1" height="1"></a>

Now create an index page for the trap directory, so that hits there don't generate 404 errors in the logs:

cat << EOF > bot-trap/index.html
<html>
  <head><title> </title></head>
  <body>
    <p>This is a spambot trap.  You shouldn't normally ever see this...</p>
    <p><a href="http://www.sonoracomm.com/">Home Page</a></p>
  </body>
</html>
EOF
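To see what fail2ban will be matching once the trap is in place, this Python sketch scans Apache-style (combined format) access log lines for requests to /bot-trap/ and collects the client IPs; the sample log lines and the assumed log format are illustrative only:

```python
import re

# Capture the client IP from any GET request for /bot-trap/ in a
# combined-format access log line (assumed format).
BOT_TRAP = re.compile(r'^(\S+) - \S+ \[[^\]]+\] "GET /bot-trap/')

def trapped_ips(lines):
    """Return the set of client IPs that hit the bot trap."""
    return {m.group(1) for line in lines if (m := BOT_TRAP.match(line))}

# Hypothetical log excerpt: one bot hitting the trap, one normal visitor
log = [
    '203.0.113.7 - - [26/Sep/2013:11:13:00 -0700] "GET /bot-trap/ HTTP/1.1" 200 123 "-" "EvilBot/1.0"',
    '198.51.100.2 - - [26/Sep/2013:11:14:00 -0700] "GET /index.html HTTP/1.1" 200 456 "-" "Mozilla/5.0"',
]
print(trapped_ips(log))  # prints {'203.0.113.7'}
```

fail2ban does essentially this continuously against your access log, then bans the matching IPs.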

Add another regex to the fail2ban filter:

failregex = ^<HOST> -.*"(GET|POST).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
            ^<HOST> -.*"GET /bot-trap/
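For completeness, a minimal jail using this filter might look like the following sketch; the jail and filter names, log path, and ban settings are assumptions to adapt to your installation:

```ini
# /etc/fail2ban/jail.local (sketch; names and paths are assumptions)
[apache-badbots]
enabled  = true
filter   = apache-badbots
logpath  = /var/log/httpd/access_log
maxretry = 1
bantime  = 86400
```

You can verify the filter against your real log with fail2ban-regex, e.g.: fail2ban-regex /var/log/httpd/access_log /etc/fail2ban/filter.d/apache-badbots.conf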
networking/linux/bad_bot_trap.1380215636.txt.gz · Last modified: 2013/09/26 11:13 by gcooper