WordPress Security Enhancements Plugin

For the last few weeks I’ve been working on a security plugin for WordPress called iDapt. The plugin is still under heavy development, but I wanted to do a quick write-up on what I’m currently working on.

Small intro: the upcoming plugin is one that everybody should be able to install and use. One of the core requirements from the start has been that even if you don’t own your own server or have deep computer/security knowledge, you can still use the plugin and understand its defense mechanisms. Since most detection can be done in more advanced ways, I wanted to create a piece of software that is easy to install and use, yet powerful enough to detect attackers even while they are (possibly) still in their information-gathering stage.

I’m implementing a few methods to detect (possible) attackers. There are three that I would like to write about.

Blocking (in)valid logins the cool way

The first simple feature I’m working on is a way of dealing with the insane number of login attempts that WordPress allows. WordPress does not offer a lockout after a certain number of failed login attempts, nor any kind of notification when an account is being brute-forced. In a nutshell, WordPress offers no protection against the brute-force login attacks that take place every single day. Since I’ve seen around 3,000 unique IPs trying to find their way into my admin account in the last week, I figured it’s time for a decent solution.

Fig 1. Geolocation of last week’s attacks (~14 May 2013 until 21 May 2013)

Since we see so many unique IPs, it is nearly impossible to base invalid-login blocking on IPs. Although iDapt will still log these activities, preventing them that way is kinda difficult. Therefore the simple solution is as follows: after X failed login attempts the user account will be blocked, not allowing any logins from any IP. An email will be sent to the owner of the account with a link to unblock it, so they can log in again. This might sound like a hassle, but let’s be honest here: how many people do not have email within reach these days?
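To make that concrete, here is a minimal sketch of how such a lockout could hook into WordPress; the five-attempt threshold, the key names, and the email text are my own placeholders, not the actual iDapt code:

```php
<?php
// Minimal sketch: count failed attempts per account, not per IP.
add_action( 'wp_login_failed', function ( $username ) {
    $key   = 'idapt_fails_' . md5( $username );
    $fails = (int) get_transient( $key ) + 1;
    set_transient( $key, $fails, 3600 ); // reset the counter after an hour

    if ( 5 === $fails && ( $user = get_user_by( 'login', $username ) ) ) {
        // Block the account and tell the owner how to unblock it.
        update_user_meta( $user->ID, 'idapt_blocked', 1 );
        wp_mail(
            $user->user_email,
            'Your account has been blocked',
            'Too many failed logins on your account. Follow your unblock link to re-enable it.'
        );
    }
} );

// Refuse logins for blocked accounts, no matter which IP they come from.
add_filter( 'wp_authenticate_user', function ( $user ) {
    if ( $user instanceof WP_User && get_user_meta( $user->ID, 'idapt_blocked', true ) ) {
        return new WP_Error( 'idapt_blocked', 'This account is blocked. Check your email.' );
    }
    return $user;
} );
```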

Another method that will be used to block these high-volume login attacks is central violation logging.

Central violation logging

OK, this is not really new, but I want to write about it because I think it’s one of the cooler features. A lot of attackers these days use a large number of proxies. This makes blocking them quite difficult if you are an individual trying to block based on a violation count per IP. Example: you get 100 login attempts on your personal account via 100 proxies. A normal brute-force detection plugin logs these as 100 separate single attempts from 100 different IPs, so nobody gets blocked, while the attacker still got their 100 tries. One attempt from one IP is not worth much on its own. But it is very likely the attacker will move on to another blog, and if this data is logged at a central location, we can detect that one IP has now tried to log in on two different blogs. Get where I’m going? So the more people who use iDapt, the faster attacks will be blocked by detecting these attack patterns. I hope the name iDapt makes a bit more sense now ;)
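The reporting half of this could look something like the sketch below; the endpoint URL and payload format are invented for illustration and are not the real iDapt API:

```php
<?php
// Rough sketch: report each failed login to a central service, which
// can then correlate the same IP across many blogs.
add_action( 'wp_login_failed', function ( $username ) {
    wp_remote_post( 'https://example.com/idapt/report', array(
        'timeout' => 2, // fire-and-forget; don't slow the login page down
        'body'    => array(
            'ip'       => $_SERVER['REMOTE_ADDR'],
            'site'     => home_url(),
            'username' => $username,
        ),
    ) );
} );
```

The central service would then aggregate violation counts per IP across all reporting blogs and could serve a block list back to each installation.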

Honeyfiles, a theory on protecting against vulnerability scanners

A theory I have – well, probably somebody else came up with it before me – is to create a ‘honeyfile’. This is a file that sits on your web server and triggers an alert when visited. It’s important that this file is never reached by visitors who are using your blog/website the way they should. Instead, I only want dodgy crawlers to find it. I know that legit crawlers will obey my robots.txt – if they don’t, I don’t think they can be called legit. So I print a link on my page and hide it from human eyes with a bit of CSS, something like:

<a href="{file}.php" style="display:none">potato</a>

The link is now “invisible” to the user. The only ones who will find it are non-legit (is that even a word?) crawlers or people poking around. In any case, I would like to know about it. Just to make sure it won’t be visited by legit bots, I add the following to my robots.txt file:

User-agent: *
Disallow: {file}.php

Now, this is just a theory and I have no proof that it works yet. But I see no real reason why it wouldn’t.
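For the curious, the honeyfile itself could be as simple as this sketch; the email address is a placeholder and the alert mechanism is just one option:

```php
<?php
// Honeyfile sketch: anyone requesting this file ignored robots.txt or
// followed the hidden link, so record the visit and alert the owner.
$ip = $_SERVER['REMOTE_ADDR'];
$ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : 'unknown';

error_log( sprintf( 'Honeyfile hit from %s (%s)', $ip, $ua ) );
mail( 'owner@example.com', 'Honeyfile triggered', "Visit from $ip\nUser agent: $ua" );

// Answer with a plain 404 so the visitor learns nothing.
header( 'HTTP/1.1 404 Not Found' );
```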

Anyway, that is what I’m working on. In case you are interested in becoming one of the first testers, please give me a buzz so we can talk.

Cheers, Ruben.
