If I upload a PNG, I’d like it to remain a PNG, personally… and @codinghorror – I did actually reproduce it – see my original post – it is converting all images to JPEG, apparently
This usually implies the file is not optimized, and therefore far larger than it should be. Have you tried running the PNG through a lossless optimizer first?
Isn’t it the png to jpg quality setting that automatically compresses PNG files? You could try setting it to 100.
I just grabbed it from the Assets for Site Design thread and uploaded it. I could do that and try again… could we just optimize the PNGs and not convert to JPEG so that we could preserve transparency?
I’m on vacation so it’s hard for me to test things at this time, but I strongly suspect that if you ran the PNG through a lossless optimizer (google for it, if you need to), you’ll get a much smaller image that won’t be converted due to size. I could be wrong, but this is my best technical guess based on what I know at this moment.
This is not time-sensitive and can wait until after the holidays.
Could optimizing the PNG server-side, rather than converting it to JPEG, be an option?
Anyone reading this has all the information necessary to test the hypothesis.
We will get this sorted; it is very likely down to image optimization. I am very surprised that our optimization is stripping transparent pixels.
It’s filesize-based, for the most part. If the image is wildly smaller as a JPG then that’s what we choose. Only a perversely egregious PNG would cause this to happen.
Edit: that does NOT appear to be the case (13kb PNG), so if this is happening, it is a different bug @sam
$ du -sh transparent_logo.png
28K     transparent_logo.png
Weird, it is 13kb from imgur so I suspect it’s not optimized.
Either way both 13kb and 26kb are plenty small, so my hypothesis was not correct.
@robbyoconnor yeah something is not right … this PNG is tiny … 12.2k
Converting this to JPEG should not happen… yet it happens.
I think it’s because the jpg is only 5kb, so that’s viewed as a significant filesize savings (by percentage perhaps). We should disable this optimization step when the total size is under, say, 100kb?
On it… yeah I would say it has to have a threshold of absolute savings in bytes as well… minimum saving of say 100k.
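A minimal sketch of what such a dual-threshold check might look like — the function name, the 50% ratio, and the 100 KB floor are illustrative assumptions, not Discourse’s actual code:

```python
def should_convert_to_jpeg(png_bytes: int, jpeg_bytes: int,
                           min_ratio: float = 0.5,
                           min_absolute_saving: int = 100 * 1024) -> bool:
    """Convert only when the JPEG is both proportionally AND
    absolutely much smaller than the PNG (illustrative thresholds)."""
    saving = png_bytes - jpeg_bytes
    # Require a large relative saving (JPEG at most half the PNG size)...
    if jpeg_bytes > png_bytes * min_ratio:
        return False
    # ...and a large absolute saving, so tiny files are left alone.
    return saving >= min_absolute_saving

# The ~12.2 KB logo from this thread: big percentage saving,
# but only ~7 KB absolute, so it stays a PNG.
print(should_convert_to_jpeg(12_500, 5_000))       # False
# A bloated 2 MB PNG that shrinks to 300 KB would still convert.
print(should_convert_to_jpeg(2_000_000, 300_000))  # True
```

The point of the second condition is exactly the bug above: a 5 KB JPEG is a huge *percentage* saving over a 12 KB PNG, but a trivial absolute one, so it should never trigger a lossy, transparency-destroying conversion.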
$ optipng transparent_logo.png -o7
** Processing: transparent_logo.png
2400x600 pixels, 8 bits/pixel, 17 colors (16 transparent) in palette
Input IDAT size = 12340 bytes
Input file size = 12715 bytes

Trying:
  zc = 9  zm = 8  zs = 3  f = 5   IDAT size = 12340

transparent_logo.png is already optimized.
I completely missed the fact that it was converted to JPEG somehow
Note that recovering here is a bit tricky. After you get the commit deployed you are going to have to either:
- Change 1 byte of data in your PNG, or
- Enter your container, hunt down the upload, and destroy it.
When you upload a file we tie the SHA1 hash of the upload to all the previous processing, so you can not force old uploads through the new pipeline.
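A rough sketch of why that is, assuming a cache keyed by the content hash (the `process_upload` function and the stored values are hypothetical, not Discourse’s real implementation):

```python
import hashlib

# Hypothetical cache of pipeline results, keyed by SHA1 of the raw bytes.
processed: dict[str, str] = {}

def process_upload(data: bytes) -> str:
    key = hashlib.sha1(data).hexdigest()
    if key in processed:
        return processed[key]        # identical bytes -> old result reused
    result = f"optimized:{key[:8]}"  # stand-in for the real pipeline output
    processed[key] = result
    return result

original = b"\x89PNG...logo bytes..."
a = process_upload(original)
b = process_upload(original)            # same SHA1, so cached result returns
c = process_upload(original + b"\x00")  # 1 byte changed -> new SHA1, reprocessed
print(a == b, a == c)  # True False
```

Changing a single byte changes the SHA1, which is why that is enough to push the file through the (fixed) pipeline again.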
Ok so grab latest @robbyoconnor and give it a whirl. Good find, this was indeed an edge case in our upload code!
This resolves the bug.
Thank you so much!!