Images are a key ingredient of compelling content, but they also bloat our web pages. According to Tammy Everts of SOASTA, in 2015 the average page weighed about 2.1MB, and roughly 62% of that was images. Who here thinks that has actually improved in the ensuing two years?
Given that I manage a site that uses a lot of images in its content, I was excited when I read about Google's new Guetzli JPEG encoder. It claims to create JPEG files that are 20-30% smaller than those produced by the current standard encoder (libjpeg), while still being JPEG compatible and offering better quality. Sounds almost too good to be true!
You can learn more about the technology behind Guetzli in the announcement post.
> Guetzli [guɛtsli] — cookie in Swiss German — is a JPEG encoder for digital images and web graphics that can enable faster online experiences by producing smaller JPEG files while still maintaining compatibility with existing browsers, image processing applications and the JPEG standard.
The encoder is currently available on GitHub and isn't yet incorporated into any standard image editing software. Binaries are available for Linux and Windows, but I tested it on a Mac. If you already have Homebrew installed, `brew install guetzli` takes care of everything.
Using the tool via the command line is pretty simple:

`guetzli original.jpg optimized.jpg`
You can specify a quality with the `-quality` flag. This works just like libjpeg's quality setting, a number between 0 and 100 (though Guetzli currently limits you to a minimum of 85).
You can also add a `-verbose` flag, which will print details to your console as it works to optimize your image. If you are like me, you won't know what any of the output means, but I highly recommend using it anyway; otherwise you may end up thinking the encoder has crashed and burned when it is just taking its time.
The instructions say to use unoptimized images (PNG or JPEG) as originals. Be warned, though: encoding can take a really long time. I timed a basic encode of this image from Unsplash and it took 17 minutes and 35 seconds to complete on a MacBook Pro that is less than a year old. Yes, you read that right. My fan kicked in during the process like I was running Adobe Flash!
As the Google announcement itself warns:

> While Guetzli creates smaller image file sizes, the tradeoff is that these search algorithms take significantly longer to create compressed images than currently available methods.

The point is: be patient!
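If you want to see how long an encode takes on your own machine, prefixing the command with the shell's built-in `time` is the simplest way to measure it (filenames here are just placeholders):

```bash
# Time a single encode; expect minutes rather than seconds for large, unresized originals
time guetzli -quality 90 large-original.jpg large-optimized.jpg
```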
To see how effective the Guetzli encoder really is, I grabbed some images (mostly from Unsplash, my personal favorite source of freely licensed imagery), as well as some from recent articles, and ran them through the encoder.
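If you'd like to run a similar batch of tests yourself, a short shell loop over a folder of images is enough (a rough sketch; the quality value and output naming are just arbitrary choices for illustration):

```bash
# Encode every JPEG in the current directory, writing guetzli-<name>.jpg next to each original
for img in *.jpg; do
  guetzli -quality 90 "$img" "guetzli-$img"
done
```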
While my tests were limited, I tried several iterations.
First, I ran a couple of large JPEGs through the encoder without specifying a quality (meaning they were compressed at the encoder's default quality).
Image | Original File Size | Compressed Image | Compressed File Size |
---|---|---|---|
Original | 11.9MB | Compressed | 3.1MB |
Original | 12MB | Compressed | 3.9MB |
So, the results were impressive! Rather than a 30% savings, we ended up with images that are roughly 30% of the original size, with (to my eye) no discernible difference in quality.
So what about setting a quality? The lowest quality that the encoder will allow is 85, so let's test on another sample image from Unsplash using a quality setting of 90.
Image | Original File Size | Compressed Image | Compressed File Size |
---|---|---|---|
Original | 7MB | Compressed | 1.3MB |
The resulting file is about 19% the size of the original, and, honestly, I can't tell the difference in quality.
Next, let's try this out on PNGs. I tested it on a couple of PNGs from recent articles here on TDN (most of our authors deliver their images as PNGs, and I resize and compress them before publication). Note that Guetzli's output is still a JPEG, even when the source is a PNG. Here are the results for these two.
Image | Original File Size | Compressed Image | Compressed File Size |
---|---|---|---|
Original | 1.1MB | Compressed | 270KB |
Original | 2.9MB | Compressed | 938KB |
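As an aside, if you want to check the before-and-after numbers on your own files, a quick listing from the shell does the job (filenames are placeholders):

```bash
# Show human-readable sizes for the source PNG and the Guetzli output side by side
ls -lh diagram.png diagram-optimized.jpg
```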
The savings here are smaller, both in absolute terms and (compared to the last test) percentage-wise, but that isn't surprising given the smaller initial file sizes, and the reduction is still hugely significant. I should note, though, that the compression process seemed to take substantially longer on these, even given the smaller initial file sizes.
To make things a little more realistic, I resized all of the images as I would for the web, using an image processor script in Photoshop to bring everything down to a maximum width of 750px. Normally I would set the quality to about an 8, in Photoshop terms, so I did that. I then ran the script a second time with the quality set to maximum; those maximum-quality files are what I fed to Guetzli.
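If you don't have Photoshop handy, the resize step can be scripted as well; for example, macOS ships with the `sips` utility, which can batch-resize from the command line (a sketch of an alternative approach, not the workflow I actually used):

```bash
# Resize every JPEG in the current folder to a width of 750px (aspect ratio is preserved),
# writing the results into ./resized (uses macOS's built-in sips utility)
mkdir -p resized
for img in *.jpg; do
  sips --resampleWidth 750 "$img" --out "resized/$img"
done
```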
What I want to do here is compare the file size and quality that I would get through my normal process of formatting images for the web versus using Guetzli compression. Here are the results.
Photoshop Image | Photoshop File Size | Guetzli Image | Guetzli File Size |
---|---|---|---|
Photoshop | 139KB | Compressed | 188KB |
Photoshop | 127KB | Compressed | 176KB |
Photoshop | 78KB | Compressed | 111KB |
Photoshop | 156KB | Compressed | 193KB |
Photoshop | 102KB | Compressed | 152KB |
Well, Guetzli didn't win here, but its files are only moderately larger, with no loss in quality versus an 8 (on Photoshop's 12-point scale). If you need to maintain very high image quality, the trade-off might be worth it. However, let's try what should be a fairer battle: Guetzli at a quality of 85 (out of 100) versus the same Photoshop files.
Photoshop Image | Photoshop File Size | Guetzli Image | Guetzli File Size |
---|---|---|---|
Photoshop | 139KB | Compressed | 111KB |
Photoshop | 127KB | Compressed | 106KB |
Photoshop | 78KB | Compressed | 61KB |
Photoshop | 156KB | Compressed | 127KB |
Photoshop | 102KB | Compressed | 115KB |
These results are more in line with the 20-30% savings the project claims, with the odd exception of the last file, which is actually about 13% larger. Interesting. The quality is extremely comparable, though to my eye the Photoshop files seemed ever so slightly crisper. Of course, I could increase the Guetzli quality, but that would cut into the file-size savings.
One of the other benefits of working this way, however, is that Guetzli runs significantly faster on these smaller source files, so the file-size savings don't come with as much of a time cost. The image that took over 17 minutes above took just 25 seconds here.
The net result based upon my basic tests is that using the Guetzli encoder is worth it from a file size standpoint. It's especially useful if you are handling large, high-resolution images.
Even if you are just dealing with standard web images, as I am, Tammy Everts's data (again, from 2015) suggests we could theoretically cut the 1,310KB of images on an average page down to about 1,048KB (a 20% savings: 1,310KB × 0.8 ≈ 1,048KB). That would bring images down to roughly 50% of the original page weight rather than 62% - a substantial savings, but not a clear-cut solution to the page bloat problem.
Optimizing large, high-resolution images has a huge payoff, but it comes at the price of 10 to 15 minutes (or more, depending on size) of compression time per image. For smaller images, the trade-off is less significant, but so is the payoff. Plus, at the moment, no standard tooling supports this compression, which means it is a manual process.
The length of time the compression takes seems to be inherent to Guetzli's approach, so, while it may improve slightly, I don't expect it to change dramatically down the line (at least, not soon). However, perhaps we'll see some tooling come along that will simplify its use.
Header image courtesy of Tom Sens
Brian Rinaldi is a Developer Advocate at Progress focused on the Kinvey mobile backend as a service. You can follow Brian via @remotesynth on Twitter.