DNS changes not propagating

I had an issue recently where a new subdomain I’d added for a site wasn’t accessible. While debugging, I ran nslookup sub.example.com my.webhost.dns and it returned the correct IP address of the server the subdomain was meant to point to. But when I ran nslookup sub.example.com 8.8.8.8 (that’s Google’s public DNS server), it couldn’t find the domain.

Eventually I tracked down the problem, and it was something very simple: the domain wasn’t actually set to use my webhost’s DNS servers. Instead, I had it configured to use CloudFlare’s DNS servers – so the records I was updating at my webhost were never being consulted.

So if you have this problem, make sure you double-check that the DNS server(s) you’re updating are actually set as the authoritative nameservers for the domain. It might seem obvious, but it’s easy to overlook (at least it was for me!)
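
If you’re not sure which servers those are, you can query the domain’s NS records directly (example.com here is a placeholder for your domain):

nslookup -type=NS example.com 8.8.8.8

If the nameservers listed aren’t the ones you’ve been updating records on, you’ve found your problem.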


Nginx not gzipping files

I had a problem recently where Nginx wasn’t gzipping responses, despite having the necessary lines in my nginx.conf. While looking for a solution, I found quite a few posts covering various reasons why gzip might not be working, but none that fitted my case.

So, I thought I might as well share what the problem / solution was in my case, plus the other reasons why Nginx may not be gzipping files. Hopefully this will be helpful to anyone else trying to figure out why their Nginx gzip configuration isn’t working.

Mime type not in gzip_types

As part of the gzip configuration, you need to specify what mime types should be gzipped.

gzip_types text/css text/javascript text/xml text/plain;

If you had something like the above, but your javascript was actually served with a mime type of application/javascript, then it wouldn’t be gzipped, because application/javascript isn’t listed in the mime types you want gzipped.

So the solution here is just to ensure you include all mime types you want gzipped after the gzip_types directive.
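
For example, a more complete list might look like this (adjust it to the mime types your site actually serves – note that text/html responses are always gzipped by default and don’t need listing):

gzip_types text/css text/javascript text/xml text/plain application/javascript application/json application/xml image/svg+xml;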

Item size too small

Normally, as part of gzip configuration you will include a minimum size that the response must be for it to get gzipped. (There’s little benefit in gzipping already very small files).

gzip_min_length 1100;

It can be easy to forget this and think that gzip isn’t working, when actually it is – you’re just checking with a small file that falls below the minimum length and so shouldn’t be gzipped.
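
To rule this out, test with a file you know is over the minimum length, and check the response headers for Content-Encoding: gzip. For example, with curl (the URL is a placeholder):

curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip' http://www.example.com/css/style.css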

Using old HTTP version

This was the problem in my case. By default, Nginx will only gzip responses where the HTTP version being used is 1.1 or greater. This will be the case for nearly all browsers, but the problem comes when you have a proxy in front of your Nginx instance.

In my case, my webhost uses Nginx, which then proxies requests to my Nginx instance. And I’ve mirrored this setup in my development environment. The problem is that by default Nginx will proxy requests using HTTP/1.0.

So the browser was sending the request using HTTP/1.1; the frontend Nginx received it, then proxied it to my backend Nginx using HTTP/1.0. My backend Nginx saw that the HTTP version didn’t meet the default gzip minimum of 1.1, and so sent the response back uncompressed.

In this case you either need to update the proxy_http_version directive of the proxying server to use 1.1, or set gzip_http_version to 1.0 in your backend’s config.
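
For example, one of the following (the first goes in the frontend / proxying server’s config, the second in the backend’s gzip config):

# frontend: proxy to the backend using HTTP/1.1
proxy_http_version 1.1;

# or backend: allow gzipping of HTTP/1.0 requests
gzip_http_version 1.0;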

Client side software deflating

I think this is likely to be a rather unusual situation, but I found it described here: nginx gzip enabled but not gzipping. Basically, they had some security software installed on the client machine they were testing from. This software was deflating and inspecting all responses before passing them on to the browser.

The same thing could happen if there was a proxy between you and the server that deflates any gzipped responses before sending them on to you. But I think it would be very rare to have a proxy configured like that.

There could also be other reasons why Nginx might not be gzipping responses. For example, it could be that you’re using a gzip_disable directive that matches the client’s User-Agent, or that you have gzip off; somewhere later in your config. But I think the items above are likely to be the main reasons why Nginx isn’t (or looks like it isn’t) gzipping files when it should be.
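
For instance, a directive like this (often added to older configs to work around broken gzip support in IE6) disables gzipping for any matching User-Agent:

gzip_disable "msie6";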


Animation event not firing in MS Edge? This might be why

Recently I’ve been working on a widget that makes use of this hack using animation events as an alternative to DOM Mutation events. The nice thing about this method is that it lets you add the event listener on the element you want to get the ‘node inserted’ event for. Whereas with DOM mutation events, you must add the listener to the parent node. In cases where you don’t know where the node will be inserted, this means attaching the mutation listener to the body, and you have to filter all mutation events to try and find the one for your element. With the animation event method you don’t have that problem.
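As a minimal sketch, the hack looks something like the following (the class name here is just for illustration, the animation name matches the keyframes shown further down, and older browsers also need the prefixed properties and event names such as webkitAnimationStart):

/* CSS: attach a near-instant animation to the elements you want to watch */
.watched-element {
    animation-duration: 0.001s;
    animation-name: nodeInserted;
}

// JS: animationstart fires when a watched node is inserted into the DOM
document.addEventListener('animationstart', function (event) {
    if (event.animationName === 'nodeInserted') {
        // event.target is the node that was just inserted
    }
}, false);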

Anyway, to get on to the main point of this post: I was having a big problem where my widget worked fine in all browsers (that support CSS3 animations) apart from MS Edge. It seemed very strange that something which worked in older IEs wouldn’t work in Edge. The problem was that the animation event was never fired when the node was inserted. But when I tried the jsFiddle example from the backalleycoder post, that worked fine in Edge.

After much debugging, I found the issue. I had my keyframes like this:

@keyframes nodeInserted {
    from {  
        outline-color: #000; 
    }
    to {  
        outline-color: #111;
    } 
}
@-moz-keyframes nodeInserted {  
}
@-webkit-keyframes nodeInserted { 
    from {  
        outline-color: initial; 
    }
    to {  
        outline-color: initial;
    }  
}

@-ms-keyframes nodeInserted {
    from {  
        outline-color: #000; 
    }
    to {  
        outline-color: #111;
    } 
}

@-o-keyframes nodeInserted {  
    from {  
        outline-color: #fff; 
    }
    to {  
        outline-color: #000;
    }  
}

Initially I had the unprefixed @keyframes empty, but when playing with the jsFiddle example I found that MS Edge didn’t like an empty @keyframes, nor did it like a @keyframes animating the values from initial to initial. The problem with my CSS was that after defining the unprefixed @keyframes in a format Edge will fire an animation event for, I then had a webkit prefixed @keyframes using the initial values it doesn’t like.

MS Edge was picking up the webkit prefixed @keyframes and using that declaration, since it comes later in the stylesheet than the unprefixed version. So the solution was simply to move the unprefixed @keyframes down to the bottom.
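
With that change, the relevant part of the stylesheet ends up something like this (the other prefixed versions omitted for brevity):

@-webkit-keyframes nodeInserted {
    from { outline-color: initial; }
    to { outline-color: initial; }
}

@keyframes nodeInserted {
    from { outline-color: #000; }
    to { outline-color: #111; }
}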

It seems a bit silly that MS Edge will pick up the webkit prefixed declaration, but doesn’t pick up the later ms prefixed declaration. But I guess that’s the kind of weirdness you come to expect from MS.

This foxed me for quite a while, so I hope this helps anyone else coming across the same problem.


Script to test / benchmark SQL queries

I’m not particularly knowledgeable on the subject of optimising SQL queries, so the easiest way for me to optimise a query is to write a few variations and then test them against one another. To this end I’ve developed a PHP class to do the testing and benchmarking. I think that even if I was highly knowledgeable about optimising queries, I would still want to test my queries to ensure that my theory held true in practice.

For a useful benchmark you need to execute the queries with a range of data that simulates the real data the queries would be run against. They also need to be executed in a random order and multiple times, so that the results can be averaged and are reasonably reliable. That’s what this class does, along with providing a summary of the results in CSV format.

It should be noted that this class does not set up or modify any tables for testing with – it just allows you to supply a range of data to be included within the queries themselves, such as testing with a range of different values in a WHERE clause.
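
To illustrate the general approach, here’s a minimal sketch of the methodology (this is not the class itself – $pdo, the query variants and the parameter values are all placeholder assumptions):

<?php
// Run each query variant many times, in random order, with a range of
// parameter values, then average the timings into a CSV summary.
// $pdo is assumed to be an existing PDO connection.
$queries = array(
    'join'     => 'SELECT p.* FROM posts p JOIN users u ON u.id = p.user_id WHERE u.id = ?',
    'subquery' => 'SELECT * FROM posts WHERE user_id IN (SELECT id FROM users WHERE id = ?)',
);
$params = array(1, 42, 1000); // simulate a range of real-world values
$runs   = 50;

// build a randomly ordered list of runs
$order = array();
foreach (array_keys($queries) as $name) {
    $order = array_merge($order, array_fill(0, $runs, $name));
}
shuffle($order);

$times = array_fill_keys(array_keys($queries), array());
foreach ($order as $name) {
    $start = microtime(true);
    $stmt  = $pdo->prepare($queries[$name]);
    $stmt->execute(array($params[array_rand($params)]));
    $stmt->fetchAll();
    $times[$name][] = microtime(true) - $start;
}

// CSV summary of average execution time per variant
echo "query,avg_seconds\n";
foreach ($times as $name => $samples) {
    printf("%s,%f\n", $name, array_sum($samples) / count($samples));
}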

Continue reading


Issues compiling PHP dependencies

I decided to update PHP, and had a few problems compiling its dependencies. So I thought I’d share the problems I had and the solutions here for future reference, and maybe they might help someone else as well.

Continue reading


Does HTTP2 really simplify things?

I recently watched a webinar from NGINX on ‘What’s new in HTTP/2?’. In the webinar they go over how HTTP/2 differs from version 1, and what benefits it has. One of the benefits is that it allows you to use SSL with no performance hit compared to plain HTTP/1.1. The other benefit they go into is that it simplifies your work process. However, I’m not sure there’s much truth in this claimed simplification.

Continue reading


Accessing a local site on a VM with NAT (shared IP) from another computer

I develop my websites using Linux in a VMWare Virtual Machine, on a Windows host PC. I have the VM configured to use NAT for the Network connection, which means that the VM does not have its own specific IP address on the network, but rather shares the host machine’s IP address.

[Image: VMware Virtual Machine Network adapter settings set to NAT]

This is no problem for testing sites locally from the host machine (on a Linux VM, use ifconfig to get the VM’s IP address, then add it along with your test domains to the host machine’s hosts file). However, it does cause issues if you want to test your sites from other machines on the network, as they have no way of connecting to your VM. In my case I had a Windows tablet that I wanted to test my site on.
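
For example, if ifconfig on the VM reports an inet address of 192.168.158.128 and your test domain is www.example.dev (both placeholders), you’d add a line like this to the host’s hosts file (C:\Windows\System32\drivers\etc\hosts on Windows):

192.168.158.128    www.example.dev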

Continue reading


Switching gallery / attachment image srcs to a static domain / CDN in WordPress

For the majority of sites that I manage, I use a www. subdomain as the main URL, which dynamic pages are served from. For static assets, such as images, javascript, CSS, etc. I use a static subdomain. This makes it easy to send long expires headers for everything on the static subdomain, and keep cookies restricted to the www subdomain only, resulting in a faster website.

Normally I write my posts in HTML, meaning I explicitly set the src of any images included in the post to point to the static subdomain. However, I realised today that on one of my sites I’m using the [gallery] shortcode. With this, WordPress automatically creates an image gallery from the images attached to the post, and of course it just uses the standard www. subdomain for the image srcs.

Thankfully, you can filter the src that WordPress uses for attachment images very easily. In my theme’s functions.php, I added the following:

add_filter('wp_get_attachment_image_src', 'staticize_attachment_src', 10, 4);
/**
 * Point attachment image srcs at the static subdomain.
 *
 * @param array|false  $image         Either array with src, width & height, icon src, or false.
 * @param int          $attachment_id Image attachment ID.
 * @param string|array $size          Registered image size to retrieve the source for or a flat
 *                                    array of height and width dimensions. Default 'thumbnail'.
 * @param bool         $icon          Whether the image should be treated as an icon. Default false.
 */
function staticize_attachment_src($image, $attachment_id, $size, $icon)
{
	if (is_array($image) && !empty($image[0])) {
		// index 0 of the array is the image src
		$image[0] = str_replace('http://www.', 'http://static1.', $image[0]);
	}
	return $image;
}

You need to hook into the wp_get_attachment_image_src filter. The callback has 4 parameters – the first is the important one in our case, $image. This is a numerically indexed array, with index 0 being the src of the image. So you can just do a find and replace on the src in the callback to replace the default subdomain with your static subdomain.

Bear in mind that if your site uses / allows HTTPS, then you’ll need to modify the code above appropriately.
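
For example, a sketch of the replacement handling both protocols (assuming your static subdomain is served over HTTPS too):

$image[0] = str_replace(
	array('http://www.', 'https://www.'),
	array('http://static1.', 'https://static1.'),
	$image[0]
);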


Moving the WP Super Cache Plugins folder

WP Super Cache is a plugin for WordPress that speeds up your site by making use of server side caching. The plugin itself can be extended through the use of Super Cache plugins. Ideally you should move the Super Cache plugins folder from its default location (wp-content/plugins/wp-super-cache/plugins/), because otherwise whenever you upgrade WP Super Cache the upgrade will overwrite its plugins directory – along with any custom Super Cache plugins you’ve put there.

I suggest moving the folder to wp-content/wpsc-plugins/. You’ll then need to edit the wp-content/wp-cache-config.php file to change the location where WP Super Cache looks for the plugins. Find the line that says:

$wp_cache_plugins_dir = WPCACHEHOME . 'plugins';

and change it to:

$wp_cache_plugins_dir = WP_CONTENT_DIR . '/wpsc-plugins';

Differences between caching methods with Geo targeted content

I recently released a version of the Geo Text WordPress plugin with support for the WP Super Cache and W3 Total Cache page caching plugins. As part of this I added a page to describe why caching plugins are problematic and the different methods the plugin uses to enable compatibility with caching plugins: Geo Text WordPress Plugin: W3TC and WPSC compatibility.

After writing that text, I thought some simple diagrams might help illustrate the reason why standard caching is not useful when you have geo targeted content, and the way that page key modification and fragment caching / dynamic caching can solve this issue.

Continue reading
