Why Firefox 4 won't be Acid3 compliant

The Acid Tests gauge how well a browser conforms to web standards. They are often used to rate progress on beta builds of new browser versions, as well as to nudge older browsers (Internet Explorer) towards compliance.

So you may be surprised to learn that Firefox 4 is unlikely to be fully Acid3 compliant. According to a comment on Slashdot by Mozilla's Boris Zbarsky and a follow-up post by Alexander Limi, the three remaining items are to do with SVG fonts, which have only been partially implemented in Opera and WebKit (Safari and Chrome) – just enough for those browsers to pass Acid3.

Currently the IE9 beta scores 95/100 on the Acid3 test, and it's unlikely to hit the full 100 either – which raises the question: just how important is the Acid3 test these days anyway?


Transparent .png images in IE6

A lot of the web design I'm involved with these days requires either animating images over backgrounds, or creating composite layouts made up of multiple layered images. These techniques mean placing images over one another and blending them (usually with shadows) together. This approach requires images with alpha (partial) transparency. The alpha channel is a separate stream of data that specifies the level of transparency for each pixel in the image. In web browsers this functionality is available as a 32-bit .png – often labelled "PNG-24 plus transparency" in image editors – with 8 bits each for alpha, red, green and blue.
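As a simple illustration of the layering involved (the class names and image paths here are hypothetical):

/* Composite layout: a transparent shadow .png blended over a background */
.panel {
    position: relative;
    background: url(scene.jpg) no-repeat;
}
.panel .shadow {
    position: absolute;
    top: 0;
    left: 0;
    background: url(shadow.png); /* 32-bit .png; alpha blends per pixel */
}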

Now for the kicker. Alpha transparency isn’t available in IE6, and unfortunately as I write this, corporate customers still dictate that we support Microsoft’s finest browser creation.

As has been documented for some time, certain workarounds exist for displaying transparent images in IE6. The original sleight method used the proprietary AlphaImageLoader filter, and variations that allow for background images and CSS – such as supersleight and Angus Turnbull's PNG fix – appeared as web 2.0 took off and we started to make more handsome and detailed web pages.
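For reference, the filter itself is a proprietary CSS property applied per element – a minimal sketch, with a hypothetical class name and image path:

/* IE6 only: suppress the normal background, then let the filter paint
   the alpha .png; sizingMethod can be 'crop', 'image' or 'scale' */
.logo {
    background: none;
    filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='/img/logo.png', sizingMethod='scale');
}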

A recent discussion with a work colleague highlighted one major flaw with this approach – AlphaImageLoader does not support background-position – and as every decent front-end developer knows, you can't sprite images without this bit of handy CSS (see the sketch below). I've also recently discovered that according to Stoyan Stefanov at Yahoo!, AlphaImageLoader is in fact also very bad for performance.
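To see why that's a showstopper, here is a typical sprite rule (selectors and offsets are illustrative) – AlphaImageLoader always paints the whole image and has no equivalent of the offset:

/* One image, many icons: background-position slides the visible window */
.icon {
    width: 16px;
    height: 16px;
    background: url(sprites.png) no-repeat;
}
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }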

With this issue in mind, I came across a different technique for loading alpha .png images in IE6. Drew Diller's DD_belatedPNG inserts a VML element into the DOM, which can then be styled using a full alpha .png. It also appears to be faster than AlphaImageLoader – though if the browser doesn't have access to JavaScript, AlphaImageLoader would appear to be the only option.
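Usage is a single call inside a conditional comment, so no other browser ever downloads the script – a sketch based on the script's documented fix() call, with a hypothetical selector:

<!--[if IE 6]>
<script src="DD_belatedPNG.js"></script>
<script>
    // fix() accepts a CSS selector covering both <img> elements
    // and elements with .png backgrounds
    DD_belatedPNG.fix('img, .png-bg');
</script>
<![endif]-->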

With IE6 usage now below 5%, perhaps we don't have to wait much longer to leave these dirty little hacks behind us.


A jQuery function that performs an .each in reverse

Sometimes you need to start at the end of a set of selected elements. The .each function runs a callback starting from the beginning, but there's nothing similar to loop backwards. The reverse plug-in function below does just that.

jQuery.fn.reverse = function(fn) {
    // Walk the matched set from last to first, calling fn with the
    // same (index, element) signature as jQuery's .each
    var i = this.length;
    while (i) {
        i--;
        fn.call(this[i], i, this[i]);
    }
    return this; // allow chaining, as .each does
};

Usage:

$('#product-panel > div').reverse(function(i, e) {
    alert(i);
    alert(e);
});
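
For one-off cases you can get the same effect without a plug-in by reversing the underlying array – a sketch, assuming the same markup:

$($('#product-panel > div').get().reverse()).each(function(i, e) {
    // .get() returns a plain array of DOM elements, so the native
    // Array.reverse applies; i then counts up through the reversed set
    alert(i);
});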

Performance: packing or minifying your javascript?

One thing I love about writing jQuery plug-ins is that every line of code counts towards performance. It's like real programming again. There are lots of different ways to improve performance, but one must-have is reducing the size of the JavaScript that comes down the wire from the server.

There are two options for reducing JavaScript file size – packing and minifying. Packing uses a JavaScript-based compression scheme, such as Dean Edwards' Packer, to compress the source to its smallest possible size, whilst a decent minifier removes whitespace, renames variables and internal functions, and removes unreachable code.
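As a toy illustration (the function here is hypothetical), a minifier turns readable source like this:

function calculateTotal(itemPrice, itemQuantity) {
    // local names exist purely for readability
    var totalPrice = itemPrice * itemQuantity;
    return totalPrice;
}

into something like:

function calculateTotal(a,b){return a*b}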

A while ago I came across this post by John Resig pointing out that although packing reduces the file size, it ultimately takes longer to execute, because the unpacking process is repeated every time the script is used.

After reading this I started using the Yahoo! YUI Compressor to minify all of my plug-ins, and it worked well, typically reducing file sizes by 70% or more. Looking through the jQuery 1.4 release notes, I noticed that the jQuery team have switched to the Google Closure Compiler. This really is a great tool, because the code is compiled and rewritten instead of just run through a regex filter. You can use it in your build process as a Java jar, or online using the Closure Compiler Service. In tests I have been getting a consistently higher compression ratio with Google Closure, with the added benefit that I know the minified code is syntactically correct.
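For reference, both tools run from the command line as Java jars – a sketch, with illustrative file names and jar versions:

# YUI Compressor: minify a plug-in
java -jar yuicompressor-2.4.2.jar myplugin.js -o myplugin.min.js

# Closure Compiler: compile and rewrite the same file
java -jar compiler.jar --js myplugin.js --js_output_file myplugin.min.js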
