An important performance issue is optimizing content images. These images are uploaded by content editors, usually through a CMS (at least on bigger websites), and they should end up crunched: small files at good-enough quality.
What are the best solutions for this? Smush.it is a good online tool, but it's very buggy at times, and it would be better to use an automated solution tied to the CMS.
The only experience I've had with this is with photos (which are usually the problem with editors). Instead of serving the images directly, we would route every image request through a compression module that used ImageMagick to resize the image to the correct dimensions and compress it to a pre-defined quality setting.
That worked a lot better than Smush.it for things like photos, because editors would usually upload images straight from a camera (very high quality and huge), and Smush.it won't do lossy compression or resizing on them.
The code was something we built in-house, though, and I don't think it was open-sourced (and it's certainly not available as a service).
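For anyone wanting to build something similar, here's a minimal sketch of that kind of endpoint, assuming ImageMagick's convert binary is on the server; the paths, the width cap, and quality 75 are placeholders, not the original in-house code:

    <?php
    // Illustrative on-the-fly compressor, requested as e.g. /img.php?src=photo.jpg&w=800
    $src    = basename($_GET['src'] ?? '');          // never trust a raw path from the query string
    $width  = min((int) ($_GET['w'] ?? 800), 1600);  // cap the requested width
    $source = __DIR__ . '/uploads/' . $src;
    $cached = __DIR__ . '/cache/' . $width . '_' . $src;

    if (!is_file($cached)) {
        // Resize down to the requested width and recompress at a pre-defined quality.
        exec(sprintf(
            'convert %s -resize %dx -strip -quality 75 %s',
            escapeshellarg($source),
            $width,
            escapeshellarg($cached)
        ));
    }

    header('Content-Type: image/jpeg');
    readfile($cached);

Caching the result means ImageMagick only runs the first time a given size is requested.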
I've used a few on-the-fly resizing tools in the past, as I often found images in the CMS that had been uploaded straight from a high-spec camera. As much as you try to educate your clients to optimize before uploading, it's always good to have a safety net.
The off-the-shelf tools I've used weren't perfect as far as file size is concerned, but they are definitely a starting point when images are much larger, pixel-wise, than required. Both of the tools below cache the generated images to save your server's CPU.
ASP.NET: ImageGen by Doug Robar
PHP: TimThumb by Tim McDaniels and Ben Gillbanks
These are the only two I've used; I'm keen to hear if anyone has alternatives, particularly in .NET.
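If it helps, both tools are driven from the query string; a TimThumb-style request looks roughly like this (parameter names from memory, so double-check them against the TimThumb docs):

    /timthumb.php?src=/uploads/photo.jpg&w=600&q=80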
Via my template, I can generate an image to show on the site and set the quality parameter there.
I'm sure there are similar solutions for all languages.
You can also build a homebrew solution in PHP (or what have you) where any requested image is routed to your script via .htaccess, and the script reduces its quality to something predefined.
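A rough sketch of that idea, with GD and made-up file names (the rewrite rule and quality 70 are only examples):

    <?php
    // compress.php — homebrew re-compressor, reached via an .htaccess rule such as:
    //   RewriteEngine On
    //   RewriteRule ^images/(.+\.jpe?g)$ compress.php?file=$1 [L]
    // Assumes a flat images/ folder; re-encodes the requested JPEG at a predefined quality.
    $file  = basename($_GET['file'] ?? '');
    $path  = __DIR__ . '/images/' . $file;

    $image = imagecreatefromjpeg($path);   // read the original upload from disk
    header('Content-Type: image/jpeg');
    imagejpeg($image, null, 70);           // send it back at quality 70
    imagedestroy($image);

You'd want to add caching on top of this, otherwise every request re-encodes the image.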
We've decided that no tool can decide whether the size and quality of an image are correct; only the web editors can do that. Hence we are aiming for a non-technical approach:
Every XX weeks:
Download all images from image libraries
Run images through lossless compressor
Upload all images to image libraries
We'll also write instructions for the web editors, letting them know that they should run all images through the compressor before putting them online.
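For the periodic pass, a crude batch script along these lines could do the lossless step, assuming the downloaded library sits in ./library and jpegtran and optipng are installed (folder name and options are just examples):

    <?php
    // Walk the downloaded image library and losslessly recompress JPEGs and PNGs in place.
    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator(__DIR__ . '/library', FilesystemIterator::SKIP_DOTS)
    );

    foreach ($files as $file) {
        $path = $file->getPathname();
        $ext  = strtolower($file->getExtension());

        if ($ext === 'jpg' || $ext === 'jpeg') {
            // jpegtran: optimized Huffman tables, metadata stripped, pixels untouched.
            $tmp = $path . '.tmp';
            exec(sprintf('jpegtran -copy none -optimize -outfile %s %s',
                escapeshellarg($tmp), escapeshellarg($path)));
            if (is_file($tmp)) {
                rename($tmp, $path);
            }
        } elseif ($ext === 'png') {
            // optipng recompresses the PNG in place without changing a single pixel.
            exec(sprintf('optipng -quiet -o2 %s', escapeshellarg($path)));
        }
    }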
The hard part is "the same quality". By definition a JPEG will never (technically) be the same quality as a PNG, because JPEG is lossy. It would probably be possible to detect whether a PNG has an alpha channel (which would make JPEG unsuitable) and, if not, try compressing the image as a JPEG to see the resulting size. A human would still need to decide whether the image still looks good, though (particularly if there is any text in it).
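A hedged sketch of that check, using PHP's Imagick extension (the alpha test and quality 85 are assumptions; the visual judgement still has to be a human's):

    <?php
    // Skip PNGs that carry an alpha channel; otherwise try a JPEG re-encode and compare sizes.
    function jpegCandidate(string $pngPath): ?string
    {
        $img = new Imagick($pngPath);

        if ($img->getImageAlphaChannel()) {
            return null;                        // transparency present, JPEG is unsuitable
        }

        $img->setImageFormat('jpeg');
        $img->setImageCompressionQuality(85);
        $jpeg = $img->getImageBlob();

        if (strlen($jpeg) >= filesize($pngPath)) {
            return null;                        // no size win, keep the PNG
        }

        $out = preg_replace('/\.png$/i', '.jpg', $pngPath);
        file_put_contents($out, $jpeg);
        return $out;                            // someone still has to look at it
    }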
If you want to go manual, try IrfanView. It's free, has a ton of filters and plugins, lets you save to whatever format you like, and you can enhance, add special effects, resize, crop, etc.
I compared its output against Smush.it and PunyPNG: the file sizes were the same, but the quality was a bit better.
On Mac OS X there's a small tool that does the job: ImageOptim. Just right-click the image file and it's done, for both JPEGs and PNGs.
And I love ImageAlpha for optimizing 24-bit PNGs down to 8-bit PNGs with alpha transparency that still works in IE.
If you have Photoshop, you can choose File > Save for Web.
You then have the choice of JPEG, GIF, or PNG, with a quality slider from 0-100,
and you have options for transparency, progressive rendering, etc.
You can see the original image and compare it to the compressed versions before saving.