How to mitigate CVE-2018-6389 – the load-scripts.php DoS “attack” in WordPress

A somewhat sensationalist blog post by Barak Tawily claims that WordPress is vulnerable to a DoS attack because of the load-scripts.php file, which concatenates JavaScript files on the fly.

Barak Tawily shows how you can ask that file to include every JS file present in a WordPress installation. This creates a huge file that puts some load on your server and, if requested often enough, will keep your server from doing anything else.
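For illustration, a single unauthenticated request of roughly this shape is enough to make WordPress look up and concatenate a long list of scripts. The handles below are just a short, illustrative sample and example.com is a placeholder; the published proof of concept strings together essentially every registered handle:

https://example.com/wp-admin/load-scripts.php?c=1&load[]=common,wp-a11y,quicktags,colorpicker,editor,wp-ajax-response,jquery-core,jquery-migrate,plupload,utils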

This kind of thing should really be mitigated at the server or network level rather than at the application level – and the server and network are outside of WordPress’s control.

When Barak Tawily reported this as a bug, he got a response from the WordPress team that this is not something that should be fixed at the application level, but at the server or network level. As this is no different from any other heavy query – or any query backed by enough requests – I can see the WordPress team’s point of view here. It would be really easy to mitigate in WordPress for a lot of sites by only allowing authenticated users access to the file, but that would not help much for all the sites out there with open user registration.

By the way: when I ran Tawily’s proof of concept, I was far more likely to DoS the machine the attack ran on. My server chugged along just fine. I don’t know what kind of server Tawily tested against, but mine was doing fine. For this attack to have any effect on my server, it would have to be a DDoS – and a DDoS could easily take down my server just by requesting the front page, unless I had really good DDoS protection.

Whether you agree with the WordPress team or not, it is not going to be fixed in WordPress. Here are 3 possible ways to fix it at the server level:

1: Disable concatenation of JS and CSS files

  1. You should really use HTTPS. If you don’t, you shouldn’t have a web site in the first place.
  2. When you use HTTPS, there’s no reason to not use HTTP/2.
  3. With HTTP/2, there’s no need to concatenate your files. It is actually an anti-pattern.

What load-scripts.php does is concatenate scripts – which we don’t really want. By setting a constant in wp-config.php, you can tell WordPress not to use this file, or the load-styles.php file that does the same thing for CSS:

define( 'CONCATENATE_SCRIPTS', false );

Now we have no use for the load-scripts.php file anymore – or the accompanying load-styles.php file. That means we can add a directive to our Nginx web server configuration to deny all access to them:

location ~ /wp-admin/load-(scripts|styles)\.php {
    deny all;
}

That’s all that’s needed. Now you’ve improved the speed of your wp-admin and mitigated this pseudo attack vector.

2: Implement rate-limiting

This is something you probably should have in place regardless of this specific case. Rate limiting requests gives you some basic protection against any DoS attack. In Nginx, we can limit the number of requests a visitor can make in a given time period. We can apply the limit in a location directive, meaning we can easily restrict it to PHP files only – which of course covers load-scripts.php and load-styles.php. There is probably no legit reason why any visitor needs to make more than 1 request per second to your site.

To configure rate limiting in Nginx, we need two directives: limit_req_zone to create a zone – a bucket that queues the requests – and limit_req to apply the restriction in a location.

The first one, limit_req_zone, is applied at the http level of your configuration, and limit_req can be applied at either the server or location level. This means your Nginx configuration can look something like this:

limit_req_zone $binary_remote_addr zone=php:10m rate=1r/s;

server {
    […]
    location ~ \.php$ {
        limit_req zone=php burst=10 nodelay;
        […]
    }
}

The limit_req_zone directive takes three arguments:

  • The key to which the limit is applied. Here we are using $binary_remote_addr, the visitor’s IP address in binary format, which takes less space to store than the string format provided by $remote_addr. If you are behind a load balancer, make sure the key holds the forwarded client IP address, so you don’t restrict the load balancer itself – see the sketch after this list.
  • The zone definition – a name and a size. The name of our zone above is php, and the zone size is 10MB, which I’m told is enough to hold about 160,000 addresses.
  • The maximum rate. Here we set 1 request per second. If your server can’t handle that, you really need to get proper hosting.
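On the load balancer point: one way to make $binary_remote_addr reflect the real client is Nginx’s realip module. A minimal sketch, assuming a single proxy at the placeholder address 10.0.0.1 that passes the client IP in X-Forwarded-For:

# At the http level: trust the proxy and restore the client IP from X-Forwarded-For
set_real_ip_from 10.0.0.1;      # placeholder – your load balancer’s address
real_ip_header X-Forwarded-For;
real_ip_recursive on;

With this in place, $binary_remote_addr – and therefore the rate limit key – refers to the actual visitor instead of the load balancer.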

Any visitor who requests any PHP file more often than once per second will receive a 503 HTTP response. But what if you have a page that immediately fires a few AJAX requests to lazy load parts of the page? Well, read on.

The limit_req directive only needs one argument, but it accepts up to three, and the extra ones are really useful. The arguments are:

  • The zone name. Here we are using the php zone, which we created with the limit_req_zone directive before.
  • The burst argument lets us queue up a number of requests – here we allow 10 – before 503 responses are sent. Normally Nginx will work through that queue at the maximum rate defined, but …
  • If nodelay is used, Nginx handles the queued requests immediately, while still only freeing one slot in the queue per the maximum rate.

Our configuration above will allow a client to send a burst of about 10 requests immediately and receive the responses as fast as Nginx can handle them. But once that burst allowance is used up, the next request within the same second will get a 503 response.
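If you would rather reject with an explicit “Too Many Requests” status instead of 503, Nginx (1.3.15 and later) lets you override the rejection code with limit_req_status inside the same location:

location ~ \.php$ {
    limit_req zone=php burst=10 nodelay;
    limit_req_status 429;   # send 429 Too Many Requests instead of the default 503
    […]
}

Some clients and monitoring tools treat 429 as a clearer signal to back off than a generic 503.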

3: Protect wp-admin with a static IP VPN

If you get yourself a VPN with a static IP, you can protect the entire wp-admin area – which includes the load-scripts.php and load-styles.php files – and only allow requests from that IP. I’ve written about that earlier in Restrict access to the WordPress dashboard by IP address in Nginx.
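As a rough sketch of what that looks like in Nginx – assuming 203.0.113.10 stands in for your VPN’s static IP, and that the details (including keeping things like admin-ajax.php reachable for plugins that use it on the front end) are covered in the linked post:

location ^~ /wp-admin/ {
    allow 203.0.113.10;   # placeholder – your VPN’s static IP
    deny all;

    # ^~ stops the server-level PHP location from matching here,
    # so the PHP (fastcgi) handling has to be repeated inside this block
    location ~ \.php$ {
        […]
    }
}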

1 Comment

  1. Without a front-end cache, WordPress will dutifully assemble the files on each request. The mitigation methods in this article work, but the denial of service is still applicable on unpatched systems. You mentioned that running the DoS stresses the attack machine – that’s because you are dealing with the download stream, and you are probably also parsing the data. You can write a PHP script that makes a cURL request but discards the response. Use the CURLOPT_RANGE option to “peek” into the response. This allows a single host to reach the maximum response-to-request cost ratio. The target machine still has to spend the disk I/O and memory to handle the request. In the attack script, add a timestamp or scramble the query parameter sequence to sidestep caching.

Comments are closed.