Score:1

How to convert a 32MB PNG file to ~200KB without losing its colors


I have a couple of images that are around 32MB each, and I want to reduce them to roughly 100KB (or some KB-range size) without affecting their colours.

The commands I am trying are:

muhammad@muhammad-mohsin:~/scans$ find . -iname '*.png' -exec mogrify -format jpg "*.png" {} +


muhammad@muhammad-mohsin:~/$ find . -type f -iname \*.png -delete


muhammad@muhammad-mohsin:~/$ find . -iname '*.jpg' -exec mogrify -define jpeg:extent=300kb -strip -quality 90 -scale 90% *.jpg {} +

Here, the first command converts the PNGs to JPGs, which reduces the size from 32MB to 5.8MB while everything stays the same. But when I use the third command, it removes the background colour in the image, making it grayscale and sort of blurry.

However, the text is still readable, but the colours and the background logo do not survive.

How can I achieve this with convert, mogrify, or any other tool? I have tried every possible thing so far.

This is part of the original image:

This is part of the changed image after the command:

Knud Larsen:
30MB png to 300kb jpg → Example: `convert Sample.png -resize 22% S300.jpg` .... then you have very good quality, but a smaller image.
LearningROR:
@KnudLarsen Thanks for it. Really helped me. Can we add some other option to make the image colours sharper/brighter etc.? I am using `convert image.png -resize 35% S300.jpg` and it returns a `763KB` size. We are close!
LearningROR:
@KnudLarsen Can we add a batch process for this command so all images in a folder and its subfolders will get this command applied to them? I am trying `find . -iname '*.png' -exec convert -resize 60% -quality 60 "*.jpg" {} +` but that does not work.
Score:4

It's because of how JPEG compression works. It attempts to round adjoining pixels that are similar to each other to similar values. This causes loss of detail, and blockiness.

This becomes more noticeable as you increase the compression level, which is exactly what you're doing. In addition, you're doing it in two steps:

  1. Lossless (PNG) to lossy (JPEG) compression.
  2. Lossy to lossy compression.

You will probably get a better result by going lossless to lossy at final quality, thus only applying lossy compression once, e.g. using `-define jpeg:extent=300kb -strip -quality 90 -scale 90%` in the first conversion.
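A minimal sketch of that single-pass approach (the `~/scans` path comes from the question; the 300kb cap, quality 90, and 90% scale are the values from the question and may well need tuning for your images):

```shell
#!/bin/sh
# One-step sketch: go straight from lossless PNG to a size-capped JPEG,
# so lossy compression is applied only once. Guarded so it is a no-op
# when the directory or ImageMagick is missing.
if [ -d "$HOME/scans" ] && command -v mogrify >/dev/null 2>&1; then
  find "$HOME/scans" -iname '*.png' -exec mogrify -format jpg \
    -define jpeg:extent=300kb -strip -quality 90 -scale 90% {} +
fi
```

Note that `-iname` matches case-insensitively, so `.PNG` files are picked up in the same pass.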

Furthermore, you say nothing about the dimensions of the image or its level of detail. It may not be feasible to get it down to 300kB and retain the desired quality.

To get rid of background blotches, you can try to apply thresholds to your document in some image editing software, forcing anything less than a certain shade of gray to be white, for instance.
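ImageMagick itself can apply such a threshold; this sketch assumes an input file called `scan.jpg` and an 80% cutoff, both of which are illustrative and should be tuned per document:

```shell
#!/bin/sh
# Sketch: push near-white pixels to pure white so faint background
# blotches vanish before the final JPEG compression. Guarded so it is
# a no-op when the tool or the example file is missing.
if command -v convert >/dev/null 2>&1 && [ -f scan.jpg ]; then
  convert scan.jpg -white-threshold 80% cleaned.jpg
fi
```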

However, no matter what you do, compressing from a 30MB lossless format to a 300kB lossy format will lead to visibly reduced quality.

LearningROR:
Thanks so much for the detailed response. What do you suggest in this case, now that the JPGs are down to 5.8MB? Can we use any compression tool to make the size a bit lower?
LearningROR:
I have all the files as PNGs. How can I apply `jpeg:extent=300kb -strip -quality 90 -scale 90%` in a single step in this case?
vidarlo:
You can use it in your first conversion. And yes, you can try with e.g. a 2MB target size, and it will lead to a better result than a 300kB target.
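Concretely, the roomier target could look like this (filenames are illustrative; `jpeg:extent` is the same option used in the question):

```shell
#!/bin/sh
# Sketch: single conversion with a 2MB size target instead of 300kb.
# Guarded so it is a no-op when the tool or the example file is missing.
if command -v convert >/dev/null 2>&1 && [ -f input.png ]; then
  convert input.png -define jpeg:extent=2mb -strip -quality 90 output.jpg
fi
```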
LearningROR:
Thank you. Makes sense. +1 for good response. :)
Peter Cordes:
*encodes them as copies of each other.* - JPEG *only* does DCT quantization, without intra-prediction like "copy the block from 29 pixels in this direction". Maybe you're thinking of h.264 / h.265 I frames (https://en.wikipedia.org/wiki/High_Efficiency_Image_File_Format)? I guess a JPEG encoder doing trellis quantization might try to encode nearby blocks the same way so the final lossless compression step could save more bits?
Peter Cordes:
Or maybe you mean on a very local scale, wanting to round off high frequency DCT components to zero, making adjacent pixels more like each other. That similar-pixel effect is *not* the result of literally encoding them as copies of each other, but actually of encoding with mostly low spatial frequency coefficients.
vidarlo:
@PeterCordes Thanks for the correction :)
Peter Cordes:
"*attempts to round adjoining pixels that are similar to eachother to similar values*" is still not how it actually works. That can be part of the result, but it's not that simple. JPEG's lossy compression happens in the frequency domain after [an 8x8 DCT](https://en.wikipedia.org/wiki/Discrete_cosine_transform#Compression_artifacts), not in terms of actual adjacent pixels. So you can get ["ringing" artifacts](https://en.wikipedia.org/wiki/Ringing_artifacts) around sharp edges, and the separate 8x8 block processing is why low qual JPEGs get blocky.
Peter Cordes:
See also [Two EXACTLY the same .jpg images with one image more than twice the file size of the other - Why?](https://photo.stackexchange.com/a/125291) for more description of how JPEG compresses, and what makes some images harder or easier to compress without significant distortion. (I wrote that answer with technical details, but aimed at an audience that didn't already know how image-compression worked.)
Score:2

> I am trying `find . -iname '*.png' -exec convert -resize 60% -quality 60 "*.jpg" {} +` but that does not work.

Ref. https://superuser.com/questions/71028/batch-converting-png-to-jpg-in-linux

$ ls -1 *.png | xargs -n 1 bash -c 'convert -quality 60 "$0" "${0%.*}.jpg"'

Converts my example 31MB.png to 1.4MB.jpg. You may have to repeat with e.g. `$ ls -1 *.PNG | ...` etc. for uppercase extensions.

Ref. comment by @steeldriver: "slightly better is `xargs -d '\n' -n 1`"

LearningROR:
Thank you, Mr. Knud. I had already done that with `find . -iname '*.png' -exec mogrify -resize 60% -quality 60 -format jpg *.png {} +`, but I am accepting and upvoting your answer for putting me in the right direction.
steeldriver:
Note that `ls -1 | xargs -n 1` will break if any of the filenames contains whitespace. You can make it *slightly* better using `xargs -d '\n' -n 1` which will work except for filenames containing newlines - you could handle those as well using null delimiters ex. `printf '%s\0' *.png | xargs -0 -n 1 ... `. However since you are forking a new bash shell for every file I wonder if the whole thing would be just as easily done using a shell loop `for f in *.png; do ... "$f" "${f%.*}.jpg"; done`
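That shell-loop alternative might look like this sketch (the quality value is copied from the answer above; everything else is illustrative):

```shell
#!/bin/sh
# Sketch of the shell-loop variant: one conversion per file with no
# per-file bash fork; "${f%.*}" strips the extension safely even when
# the filename contains spaces. No-op if ImageMagick is missing.
if command -v convert >/dev/null 2>&1; then
  for f in *.png; do
    [ -e "$f" ] || continue   # skip the literal '*.png' when no files match
    convert -quality 60 "$f" "${f%.*}.jpg"
  done
fi
```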
Knud Larsen:
@steeldriver : OK, `ls -1` etc. etc. is just one possible set of options. I can add your suggestion to the answer.