[Elgg 1.8-1.12 & 2.X & 3.X: http:blacklist] v1.9.2

Release Notes

Changelog:

  • Separate release for Elgg 1.9 to fix a BC-breaking issue in the 'route' plugin hook ($return['handler'] is now $return['identifier']).
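
    For plugin authors who hook into 'route' themselves, a minimal sketch of the renamed key in a handler (the function name and the 'blog' check are hypothetical examples, not part of this plugin):

    // Elgg 1.8 passed the page handler name as $return['handler'];
    // Elgg 1.9+ passes it as $return['identifier'] instead.
    function my_route_handler($hook, $type, $return, $params) {
        if ($return['identifier'] == 'blog') {
            // adjust routing here if needed
        }
        return $return;
    }
    elgg_register_plugin_hook_handler('route', 'all', 'my_route_handler');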

Feedback appreciated! Does it work for you? Any noticeable reduction in spam? How many blocked/redirected accesses per day / per week? Positive feedback would be helpful too, as I would also like to know if it works perfectly fine.

  • Number of accesses blocked or redirected: 7

    Too bad most of the bots get through. I've set the threat score to 3 and the period in days to 33. It would be awesome to get an option to blacklist a spam bot through Elgg so we could systematically inoculate against all of them.

  • To block spam bots, you can use the following in .htaccess:

    # Block bot postings as antispam: deny POST requests...
    RewriteCond %{REQUEST_METHOD} POST
    # ...whose referer is not a page on your own domain...
    RewriteCond %{HTTP_REFERER} !yourdomain\.com [OR]
    # ...or that send an empty user agent string
    RewriteCond %{HTTP_USER_AGENT} ^$
    # Respond with 403 Forbidden
    RewriteRule ^(.*)$ - [F,L]

  • @Darth Vader: the http_blacklist plugin alone is very likely not enough to reduce the number of spam accounts noticeably. The reason is that different types of sites (or sites with different themes) get visited by different spammers (human or bot). Project Honeypot seems most effective against harvesters (bots that collect email addresses), malicious search engines, dictionary attackers (brute force attacks) and traditional email spammers. For forum / comment spam there are not that many entries (yet?) in their blacklist. They also don't offer any option to report spammers' IPs directly but only offer embedding of honeypot code. This means you could add a special hidden input field, for example on the login or register page, that only spam bots will fill in. The IPs of such bots would then get added to their database. But I have to admit that I've not yet tried this out myself.

    For a higher success rate in keeping spammers out, I would suggest using the Spam Login Filter plugin in any case. It also allows reporting of spammers (though you need to register on their site to get an API key to be able to do so). The more people report spammers, the better. If no one reports them, they are not in the blacklist and they won't get blocked on any other site (unfortunately, one person will always have to deal with them first manually). But reporting your spammers will also help to reduce the spammers on your own site in the future, as the same IPs are often re-used. So, reporting them once will block them on future attempts to register.

    @Gerard: could you provide some details / explanations for these lines? What are they supposed to do exactly?

  • Sure, when a form is submitted, an HTTP(S) POST request is required. This can be done using a browser, or a bot can use curl or a direct socket connection. A browser always carries a user agent string, and the user agent condition (the last RewriteCond) checks whether one is present. But a bot can also construct a user agent string, like spiders usually do.

    The referer condition checks whether the page was accessed through another page on your site, which is normal user behavior, but not for a bot! Even when bots are just following links on your website, they still do not carry a referer header. It is possible to fake the referer in the request, but unless you are dealing with a clever and dedicated spammer, they don't. Having had these rules in place for 6 months now, none of them fit that description of smart and dedicated yet. But once they do, I'll just think of another rule that will hold them off for another 6 months.

    They usually go straight for the kill (direct access to the form). So if there is no referer or no user agent, it cannot be a browser and can only be a bot that is trying to penetrate the site. It doesn't matter which form it is: registration, guest comments, contact forms, or a brute force attack on the login. They are all detected and blocked with a 403.

    Apply the rules and try, for instance:

    curl -X POST http://www.yourdomain.com/register

    where yourdomain must be what it suggests: the name of your site.

    You will get something like "You don't have permission to access /register on this server".
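
    For comparison, a sketch of the opposite case: a request that sets both a user agent (-A) and an on-site referer (-e) passes these rewrite conditions, so it reaches Elgg itself (which will typically still reject it for other reasons, such as a missing form token):

    # Counter-test: neither blocking condition matches now
    curl -X POST -A "Mozilla/5.0" -e "http://www.yourdomain.com/" http://www.yourdomain.com/register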

    If you are using the API web services, you could add this line as the second condition:

    RewriteCond %{REQUEST_URI} !/services/

    which exempts your API from the blocking rules and so allows POST requests from remote websites (Elgg's web service endpoints live under /services/). Most Elgg web services are HTTP GET based, but some requests require a POST.
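
    Put together, the whole block would then look something like this (a sketch: the /services/ pattern assumes Elgg's default web service endpoint under /services/api/rest/, so adjust it if your install differs):

    # Block bot POSTs as antispam, but let API web service calls through
    RewriteCond %{REQUEST_METHOD} POST
    RewriteCond %{REQUEST_URI} !/services/
    RewriteCond %{HTTP_REFERER} !yourdomain\.com [OR]
    RewriteCond %{HTTP_USER_AGENT} ^$
    RewriteRule ^(.*)$ - [F,L]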

Stats

  • Category: Spam
  • License: GNU General Public License (GPL) version 2
  • Updated: 2019-04-07
  • Downloads: 1947
  • Recommendations: 7
