http://www.kloth.net/internet/bottrap.php
Used here with **fail2ban**: well-behaved crawlers honor the robots.txt Disallow and never visit /bot-trap/, so any client that requests it is misbehaving and can be banned.
  vim robots.txt

  User-agent: *
  Disallow: /bot-trap/
  mkdir /var/www/web1/web/bot-trap
  vim /var/www/web1/web/templates/ja_purity/index.php
Add this link in the template header. The blank.png file should be a single-pixel transparent image; the image itself is unimportant, but the hidden link is what matters.
  <a href="/bot-trap/"><img src="images/blank.png" border="0" alt=" " width="1" height="1"></a>
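If you still need the transparent pixel, ImageMagick can generate one. This is a minimal sketch, assuming ImageMagick is installed and that the template's images/ directory is the right destination for your site (adjust the path as needed):

  # Create a 1x1 fully transparent PNG (output path is an assumption; put it
  # wherever your template's images/blank.png is expected to live)
  convert -size 1x1 xc:none /var/www/web1/web/templates/ja_purity/images/blank.png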
Now create an index page in the trap directory so requests to it do not pollute the error logs:
  vim /var/www/web1/web/bot-trap/index.html

  <html>
  <head><title> </title></head>
  <body>
  <p>This is a spambot trap. You shouldn't normally ever see this...</p>
  <p><a href="http://www.sonoracomm.com/">Home Page</a></p>
  </body>
  </html>
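As a quick check, assuming curl is available, you can request the trap page directly to confirm it serves a 200 instead of generating 404s in the error log (hostname taken from the example above):

  curl -I http://www.sonoracomm.com/bot-trap/index.html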
Add another regex to the fail2ban filter:
  failregex = ^<HOST> -.*"(GET|POST).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
              ^<HOST> -.*"GET /bot-trap/
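For reference, a minimal sketch of a matching jail entry, assuming the regex above was added to the stock apache-badbots filter and that Apache logs to /var/log/apache2/access.log (the filter name, log path, and ban time are assumptions; adjust them for your setup):

  [apache-badbots]
  enabled  = true
  port     = http,https
  filter   = apache-badbots
  logpath  = /var/log/apache2/access.log
  maxretry = 1
  bantime  = 86400

Before reloading fail2ban, you can test the filter against your access log with fail2ban-regex:

  fail2ban-regex /var/log/apache2/access.log /etc/fail2ban/filter.d/apache-badbots.conf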