Need to losslessly optimize large PNGs (5MB and bigger)

OK, I need some suggestions for tools that can losslessly compress large images.

I have a batch of .png and .jpg files that are 5MB and larger. About 441 images exceed 5MB, some are in the range of 10-30MB, and about 3 are 60MB and up.

So if anyone can suggest tools to losslessly compress these, that would be great. I’d rather not deal with lossy compression if possible, as these are game maps and smearing/distortion of any kind isn’t acceptable.

I’ve tried PNGGauntlet and PNGOptimizer so far, and both are too slow.

Are you using a Mac or PC?

I’m using a PC. If the tools have a GUI, all the better :stuck_out_tongue:

The problem is that image compression tools are generally bound to a single CPU core, so I’m afraid you won’t be able to speed them up much, and the difference between the various tools is probably small. With file sizes of 5MB and up, I think you need to ask yourself whether compression is worth it. What are you expecting to save, and to what extent will that affect the user experience?

Lossless pretty much means there is no loss in quality at all, so it won’t negatively affect the user experience. What am I expecting to save? Sometimes it shaves off a minuscule 3-4 percent, but most of the time it averages around 50-70 percent in savings.

Instead of discouraging me, please suggest some tools. And yes, I am willing to check out paid applications/services.

Sorry, I missed the intended use of these images on reading your post the first time.

I’m not discouraging you, I’m just saying you need to realize that compression is a time-consuming task, especially with images that are multiple megabytes in size. You just won’t get around this being “too slow”. Thankfully, as of version 3.1, PNGGauntlet does support parallel processing of images (one image per thread), which will reduce total compression time if you have a few threads to spare.

All APIs and web tools I’m aware of (kraken.io, PunyPNG, TinyPNG, smush.it, etc.) are either lossy or have file size limitations that won’t work for you. If you want a GUI on Windows, I’d say PNGGauntlet with its parallel processing is probably best for the job.

Personally, I’d probably fire up a multi-core cloud server somewhere and have it crunch the images with the help of a wrapper that assigns a compression job to each core. That way, at least this won’t bother you when you’re using your PC for other tasks.

I have a 4-core VPS (it’s CentOS 6 with ZPanel installed) that I can use for that, but how would I go about setting up such a compression job?

An example with OptiPNG: “Bash: how to simply parallelize tasks?” on Stack Overflow.

If you want to use OptiPNG, you’ll have to build it from source first, since it’s not in any of the main repositories.

yum -y install gcc
cd /tmp
wget -O optipng-0.7.4.tar.gz "http://prdownloads.sourceforge.net/optipng/optipng-0.7.4.tar.gz?download"
tar -xzvf optipng-0.7.4.tar.gz
cd optipng-0.7.4
./configure
make
make install
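
Once it’s built, it may be worth a dry run on a single file before queueing the whole batch, just to confirm the binary works and to get a feel for how long one image takes. Something along these lines should do it (the file name is just a placeholder; -simulate reports the result without writing anything):

cd /tmp/png
# dry run on one image: shows the potential savings but writes no output file
optipng -o2 -simulate some-map.png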

If you’d rather use PNGout, just issue:

yum -y install pngout

That’s assuming you have the EPEL repository. If not:

rpm -Uvh http://dl.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
yum install optipng --enablerepo=epel
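
You can confirm the repo is actually visible to yum afterwards with something like:

# list enabled repositories and look for EPEL
yum repolist enabled | grep -i epel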

If you didn’t have the EPEL repo, you may want to disable it again afterwards by setting “enabled=0” in /etc/yum.repos.d/epel.repo.
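
If you’d rather do that from the shell than edit the file by hand, a one-liner along these lines should work (assuming the stock epel.repo layout):

# flip every enabled=1 line in the EPEL repo file to enabled=0
sed -i 's/^enabled=1/enabled=0/' /etc/yum.repos.d/epel.repo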

Anyway, assuming you’ve installed OptiPNG and your PNGs are in a directory like “/tmp/png”:

cd /tmp/png
find . -iname "*png" -print0 | xargs -0 --max-procs=4 -n 1 optipng -dir /tmp/png-opt/ &

The first part locates all PNG files in the current directory; the xargs bit takes that input and builds a queue of files to pass on to OptiPNG, where --max-procs is the number of processes to start (i.e. cores to use). The -dir option tells OptiPNG to save the compressed images to a /tmp/png-opt directory. You can pass other options, like the optimization level, if you like. The ampersand at the end puts the whole job in the background, so you get your prompt back while it runs.
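
For instance, if you want to squeeze a bit more out of each file, you could bump the optimization level. Something like the following should work, though -o7 is just an example (and considerably slower per file than the default), and the mkdir is only there in case OptiPNG won’t create the output directory for you:

# same pipeline, but with a higher optimization level
mkdir -p /tmp/png-opt
find . -iname "*png" -print0 | xargs -0 --max-procs=4 -n 1 optipng -o7 -dir /tmp/png-opt/ &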

No guarantees, of course, but I’ve just tested it on an 8-thread VPS (CentOS 6.5) and it seems to be working rather well.
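
When the queue has finished, a quick size comparison of the two directories will tell you how much you’ve actually saved, e.g.:

# total size of the originals vs. the optimized copies
du -sh /tmp/png /tmp/png-opt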

Hope that helps.

Thanks rob, I’ll give that a shot tomorrow :slight_smile:

I’d refine that slightly…

cd /tmp/png
nohup bash -c "find . -iname '*png' -type f -print0 | xargs -0 --max-procs=4 -n 1 optipng -dir /tmp/png-opt/" &

Quote the pattern (single quotes are safest), otherwise *png will be expanded by the shell if any matching files happen to be in the current directory, and you’ll only get part of what you expect.
-type f ignores any directories, sockets, etc. that end in “png”.
nohup (with the pipeline wrapped in bash -c so it runs as one command) means you can set it off when you leave the office, log out, and it’ll be finished in the morning.
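
And when you log back in, a rough progress check is just counting what has landed in the output directory, plus a glance at nohup’s log (nohup.out should end up in the directory you started the job from, so /tmp/png here):

ls /tmp/png-opt | wc -l   # how many optimized files have been written so far
tail /tmp/png/nohup.out   # OptiPNG's most recent output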

I think it’ll be crunching on images for a bit longer than that, lol. I have about 26K files to crunch, and the total size for all of them is about 11.4GB.