See also Fail2Ban
Reference: http://www.kloth.net/internet/bottrap.php
We use Fail2Ban to block bad bots. The trap works because well-behaved crawlers honor the robots.txt Disallow and never visit /bot-trap/, while the link pointing there is invisible to human visitors, so any client that requests it is almost certainly a misbehaving bot.
First, change to your web root folder (possibly public_html). Edit robots.txt and add a Disallow line under User-agent: *:
cd public_html
vi robots.txt

User-agent: *
Disallow: /bot-trap/
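To verify the change is being served, you can fetch the file over HTTP (a quick sanity check, not in the original page; substitute your own domain):

curl http://www.sonoracomm.com/robots.txt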
Create the trap directory:

mkdir bot-trap
blank.png should be just a single-pixel transparent image and is unimportant (and may already exist). You can get blank.png like this:
cd images
wget http://www.sonoracomm.com/images/blank.png
cd ..
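If the download fails or you prefer to generate the image locally, ImageMagick can create a one-pixel transparent PNG (this assumes the ImageMagick convert tool is installed; it is not part of the original instructions):

convert -size 1x1 xc:transparent images/blank.png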
Edit your HTML (index.html, templates/yourtemplatename/index.php, etc.) and add this line near the top of the <body> (an <a> element is not valid inside <head>):
vi index.html

<a href="/bot-trap/"><img src="images/blank.png" border="0" alt=" " width="1" height="1"></a>
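When a bot ignores robots.txt and follows the hidden link, the request shows up in the Apache access log as something like the following (an illustrative line in Common Log Format; the exact fields depend on your LogFormat, and the IP and date here are made up). This is the kind of line the failregex added below must match:

10.1.2.3 - - [12/Oct/2010:09:15:31 -0700] "GET /bot-trap/ HTTP/1.1" 200 178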
Now we create an index page in the trap directory so that hits there don't pollute the error logs. Change the URL to your own domain:
cat << EOF >> bot-trap/index.html
<html>
<head><title> </title></head>
<body>
<p>This is a spambot trap. You shouldn't normally ever see this...</p>
<p><a href="http://www.sonoracomm.com/">Home Page</a></p>
</body>
</html>
EOF
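To confirm the trap page is served, you can request it yourself; keep in mind that once the filter below is active, this same request would get your own IP banned, so test before enabling the jail (or from a whitelisted address). Substitute your own domain:

curl http://www.sonoracomm.com/bot-trap/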
Add another regex to the Fail2Ban badbots filter:
vi /etc/fail2ban/filter.d/apache-badbots.conf

failregex = ^<HOST> -.*"(GET|POST).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s)"$
            ^<HOST> -.*"GET /bot-trap/"
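For the new pattern to have any effect, the apache-badbots jail must be enabled and Fail2Ban restarted. Below is a minimal sketch of a jail entry, assuming an Apache access log at /var/log/httpd/access_log; the log path and the ban settings are assumptions that vary by distribution and are not part of the original page:

vi /etc/fail2ban/jail.local

[apache-badbots]
enabled  = true
port     = http,https
filter   = apache-badbots
logpath  = /var/log/httpd/access_log
maxretry = 1
bantime  = 86400

You can dry-run the filter against your live log with fail2ban-regex before restarting the service:

fail2ban-regex /var/log/httpd/access_log /etc/fail2ban/filter.d/apache-badbots.conf
service fail2ban restart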