I just noticed something funny. Normally, the daily routine of a website owner is all about getting more visitors (preferably from search engines) and getting less spam.
Now, I made a mistake with this project and got hit hard by a spambot. The project in question features a bugtracker module, and I honestly thought that limiting bug reports to registered users and protecting the signup form with a math CAPTCHA would suffice to lock spammers out. I could not have been more wrong.
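For context, a math CAPTCHA of the kind mentioned above usually boils down to something like the following sketch: the server generates a random arithmetic question, stashes the answer server-side (typically in the session), and compares it against what the form submits. The function names here are hypothetical, not taken from the actual project.

```python
import random

def make_captcha():
    # Pick two small operands and build the question shown on the signup form.
    a, b = random.randint(1, 9), random.randint(1, 9)
    question = f"What is {a} + {b}?"
    answer = a + b  # stored server-side (e.g. in the session), never sent to the client
    return question, answer

def check_captcha(expected, submitted):
    # Compare the stored answer with the form field; reject non-numeric input.
    try:
        return int(submitted.strip()) == expected
    except (ValueError, AttributeError):
        return False
```

As the incident shows, this kind of challenge is trivial for any bot that can parse "What is X + Y?", which may explain why the spambot never failed it once.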
What I now call "my special friend 220.127.116.11" found me, created dozens of user accounts (strangely, without failing the CAPTCHA once), waited a week or so to let them sink in, and then began pumping spam posts into the bugtracker like there was no tomorrow.
It took me quite some time to track down all the fishy accounts, block them, and delete all the pages they had created, all the while wondering: who goes to all the trouble of writing such bot software? Is there human assistance involved in beating the CAPTCHA? And why on earth post mind-boggling amounts of spam pages that are guaranteed to catch the administrator's attention and be deleted within days?
My friend 18.104.22.168 is now on my blocklist, earning him an HTTP 403 FORBIDDEN for everything he requests (which does not seem to stop him from trying every five minutes, though). Email addresses in the .ch or .in domains can no longer be used for signing up, and the spambot module now keeps a close eye on the registration form.
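The two countermeasures reduce to a couple of simple checks. Here is a minimal sketch, assuming a request handler that sees the client IP and a signup validator that sees the email address; the structure and names are illustrative, not the project's actual code.

```python
# IPs that get a blanket HTTP 403 FORBIDDEN.
BLOCKED_IPS = {"18.104.22.168"}

# Top-level domains barred from the registration form.
BLOCKED_EMAIL_TLDS = {"ch", "in"}

def handle_request(client_ip):
    # Blocklisted clients get 403 for everything they request.
    if client_ip in BLOCKED_IPS:
        return 403
    return 200

def signup_allowed(email):
    # Reject signups whose email address ends in a banned TLD.
    tld = email.rsplit(".", 1)[-1].lower()
    return tld not in BLOCKED_EMAIL_TLDS
```

A flat set lookup like this is enough for a single persistent offender; anything larger would call for CIDR ranges or the web server's own access rules instead.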
It goes without saying that after cleaning up the mess, my deepest desire was to meet whoever controls 22.214.171.124 in person and discuss the practical applications of pineapples and the difference between "top goes in first" as opposed to "bottom first".
The whole incident has an interesting aftermath, though. According to the website's statistics, I have been getting search engine visitors looking for some bizarre e-books that are most certainly not the topic of that website. Further investigation showed that these visitors were indeed trying to request some of the now-deleted pages. Obviously, I caught 126.96.36.199 late, and since he used random copy-and-paste text blocks to defeat automatic spam filtering, the Google bot happily indexed his trash.
I really don't know what to make of this. On one hand, I'm happy about the extra visitors; on the other, the way I got them is rather dubious.