I am trying to remove the 'Optimize images' warning from Google PageSpeed Insights.
The validator says I can get roughly a 40-50% size reduction for each JPEG on my website. So I installed jpegoptim on Ubuntu and first tried lossless optimization. It achieved only about a 7% reduction, so when I uploaded the images to the website and re-ran the validation, the 'Optimize images' warning was still there.
Then I tried lossy optimization at 90% quality. That saved about 16%, but it's still not enough. And when I go down to 50% quality to reach the 40-50% reduction, the images look bad.
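For reference, here is roughly how I'm calling it from a script, sketched with Python's subprocess (the paths and the --max value are just placeholders):
import subprocess

# Rough sketch: batch-run jpegoptim over some JPEGs (paths are placeholders).
# --strip-all drops metadata losslessly; --max=85 caps quality for the lossy pass.
for path in ["img/photo1.jpg", "img/photo2.jpg"]:
    subprocess.run(["jpegoptim", "--strip-all", "--max=85", path], check=True)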
I tried https://compressor.io with lossy optimization and got a really good reduction. Their about section says they use jpegoptim for .jpg, so where could the problem be? I need to integrate jpegoptim into my app; I can't run every single JPEG through compressor.io by hand.
I want to get rid of this PageSpeed warning because it looks important, but I don't want to put bad-quality images on the website. Do you have any experience with this?
Thank you.
You can use GTmetrix. In their results they will give you a compressed version of each image.
I use http://compressjpeg.com/
Neither will reduce the quality of the images.
I have done research comparing JPEG and JPEG 2000 and couldn't reach the same or better perceptual quality with JPEG 2000 than with JPEG at extreme compression levels (for the web).
Although it is advertised as roughly 20+% better perceptual quality at the same file size as original JPEG, with the tools available for re-encoding existing JPEGs (or even lossless PNGs), original JPEG still came out ahead. JPEG 2000 managed arguably better results only for huge images, which are not widely used on the web (Full HD and bigger).
ImageMagick, GraphicsMagick and OpenJPEG all produced identical results (I assume because they all use JasPer to encode JPEG 2000) and all lacked encoding options. Using Kakadu or online/basic converters didn't help either. As things stand, tools like imagemin with its plugins can produce much better-quality JPEGs than JPEG 2000 when maximally compressed for the web. So JPEG 2000, being useful mostly for Safari, has no point as an additional format to encode, since regular JPEG gives better results.
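For illustration, the kind of encode I was comparing can be sketched with Pillow, assuming a build with OpenJPEG support (file names and the rate are placeholders):
from PIL import Image

im = Image.open("source.png")  # a hypothetical lossless source
# JPEG 2000: quality_layers=[50] requests roughly 50:1 compression
im.save("out.jp2", quality_mode="rates", quality_layers=[50])
# Baseline JPEG at an aggressive web quality, for comparison
im.convert("RGB").save("out.jpg", "JPEG", quality=40, optimize=True)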
What am I doing wrong, and are there any other tools/tricks with more advanced JPEG 2000 encoding options that could finally beat JPEG?
It's not just you. JPEG 2000 isn't that much better. It's difficult to encode well, and we've got more mature JPEG encoders nowadays (MozJPEG, Guetzli).
JPEG 2000 is wavelet-based, which gives it better scores on the PSNR metric. However, JPEG 2000 excels at exactly the thing the PSNR metric over-rewards: blurring. That makes JPEG 2000 look great in the low-quality range, where the older JPEG breaks down, but it's not helpful in practice for higher-quality images. Wavelets also struggle with text and sharp edges, just like the older JPEG.
JPEG 2000 has very nice progressive loading, but Safari stopped supporting progressive rendering at some point.
The only practical use for JPEG 2000 is delivering photographic-style images with an alpha channel, because JPEG can't do that at all, and JPEG 2000 still wins compared to PNG.
I crawled the images in the Google Image Search window,
but the images are too small, so I wanted to increase their size.
I increased the size using PIL, but the picture came out broken (the image quality is too low).
How can I increase the image size with good quality?
I used PIL this way:
from PIL import Image

im = Image.open('filename')
# Naive upscale to 500x500: resize() interpolates the existing pixels
# but cannot recover detail the small thumbnail never contained
im_new = im.resize((500, 500))
im_new.save('filename2')
No, I think you may have misunderstood the real problem.
The images you got are just thumbnails, so they contain very little information. Trying to improve their quality
with some algorithm will make hardly any difference. Probably only machine-learning tricks could make the photos a little nicer.
In my opinion, what you need to do is fetch the original images behind the Google search results rather than use the thumbnails. You can do this by analysing the image search results more deeply. Good luck :)
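For illustration, at the time the result links carried the original URL in an imgurl query parameter, so something like this sketch could pull it out (the href below is made up):
from urllib.parse import parse_qs, urlparse

# The href is a made-up example of an old-style Google Images result link
href = "/imgres?imgurl=https://example.com/photo_full.jpg&imgrefurl=https://example.com/page"
params = parse_qs(urlparse(href).query)
full_size_url = params.get("imgurl", [None])[0]  # the original image, not the thumbnail
print(full_size_url)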
With all the fuss about WebP and how cool it is, I'm still getting JPEG thumbnails in image search on google.com in the year 2016, even though the Chrome browser says in its HTTP headers that it accepts WebP images: accept:image/webp,image/*,*/*;q=0.8
Why is that so?
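For context, this is roughly how a server would use that header to pick a variant; a minimal hypothetical sketch, not Google's actual code:
# Hypothetical handler: serve WebP only when the client advertises support
def pick_thumbnail(accept_header: str) -> str:
    if "image/webp" in accept_header:
        return "thumb.webp"
    return "thumb.jpg"

# Chrome's Accept header from above selects the WebP variant
print(pick_thumbnail("image/webp,image/*,*/*;q=0.8"))  # -> thumb.webp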
Answering myself.
The reason may be "just not adopted yet", although it may as well be "not worth adopting", because of the following:
WebP gives better overall quality, but at low-quality encoder settings its distortions differ from classical JPEG's:
JPEG produces uniform distortions all over the picture: on hard edges, soft edges and soft gradients;
WebP handles soft gradients and hard edges better than JPEG, but it produces more distortion on soft edges. Because of that, the image looks deformed.
Example: moon in the following image: http://xooyoozoo.github.io/yolo-octo-bugfixes/#pont-de-quebec-at-night&jpg=s&webp=s
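To reproduce the comparison, encoding the same photo at a low setting in both formats is enough; a minimal Pillow sketch (the file names are placeholders, and the two codecs' quality scales are not directly comparable):
from PIL import Image

# Encode one photo at a low setting in both formats to compare distortion styles
im = Image.open("photo.png").convert("RGB")  # placeholder input
im.save("low_q.jpg", "JPEG", quality=30)
im.save("low_q.webp", "WEBP", quality=30)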
As a side note: WebP is used for video thumbnails on YouTube, but since the source there is video, using WebP is more acceptable in that scenario than it is for re-encoding existing JPEG thumbnails.
I've used the BigTIFF library (derived from libtiff, to handle files larger than 4GB) to generate an 8GB image. It can be saved and displayed successfully; however, it's still too large to be transferred or shared. So I'm trying to compress it with the JPEG support in BigTIFF (or libtiff), but it doesn't seem to work for me.
// Select JPEG compression for the output TIFF; the JPEG pseudo-tags
// below only exist once COMPRESSION_JPEG has been set
TIFFSetField(out, TIFFTAG_COMPRESSION, COMPRESSION_JPEG);
TIFFSetField(out, TIFFTAG_JPEGQUALITY, 30);      // codec-specific quality setting
TIFFSetField(out, TIFFTAG_JPEGCOLORMODE, JPEGCOLORMODE_RGB);
As above, I set the JPEG quality tag to 30, and the program compiles correctly. However, the result turned out to be an image with a JPEG quality of 75 and a compression ratio of 0.99.
Does anyone have any idea what's going on here, or any suggestions?
Thanks,
sunhmy
There is no such thing as a standardized JPEG quality value. "Quality" settings are simplifications that JPEG encoders use to select quantization tables.
One encoder may use a range of 0-100.
Another might use 1-8.
You would need to find out what BigTIFF uses as its range of quality values; its 30 may well correspond to your 75.
You might want to play around with different values to find out.
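One quick way to probe a writer's scale, sketched here with Pillow's libtiff-backed JPEG-in-TIFF support (assuming such a build; the input file is a placeholder):
import io
from PIL import Image

im = Image.open("page.tif")  # placeholder input
# Save the same image at several "quality" settings and compare sizes
for q in (10, 30, 50, 75, 95):
    buf = io.BytesIO()
    im.save(buf, format="TIFF", compression="jpeg", quality=q)
    print(q, "->", len(buf.getvalue()), "bytes")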
I need high-resolution images, but that is not the problem. I have many high-resolution images that are truly free online, but their file size is a hindrance to using them on the web. For instance, a good 1920px by 1080px image weighs in at 1.2MB, yet I see images of similar dimensions, e.g. http://farm4.staticflickr.com/3241/3096487740_ebb4ea9819_o.jpg, that weigh a lot less (175KB).
Is there a suitable and reliable web service that can help me reduce the weight of a big image like http://www.rgbstock.com/filedownload/krappweis/nX2zg5e.jpg?
Thanks.
I have found a solution. When looking to reduce the weight of an image, don't google 'reduce size'. Also, read carefully what each free image-compression web service actually offers; some will ruin (and have ruined) my images.
I finally found this web service: http://compressnow.com/
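For anyone who prefers a scriptable route, a plain lossy re-save with Pillow achieves the same kind of reduction; a minimal sketch (the file names and quality value are placeholders):
from PIL import Image

im = Image.open("big.jpg")  # placeholder input
# Re-encode at a lower quality; optimize/progressive shave a bit more off
im.save("big_small.jpg", "JPEG", quality=80, optimize=True, progressive=True)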