Tomcat images broken on Windows - Linux

I have a Tomcat server running on Linux. When I view a PNG in Chrome on Windows, the image looks like this:
http://imgur.com/x08QkUD. For contrast, the same image on any Unix system: http://imgur.com/OIk84Cb
As you can see, it is totally corrupted. Viewed on any Unix system the image looks just fine (without all those weird yellow lines).
If I look at it in Firefox on Windows, the browser responds with "Cannot display this image because it contains errors".
Here are the request and response headers for this image (they are the same as on a Unix system):
Request Method:GET
Status Code:200 OK
Response Headers
Accept-Ranges:bytes
Content-Length:15432
Content-Type:image/png
Date:Tue, 01 Sep 2015 17:21:23 GMT
ETag:W/"15432-1441113486000"
Last-Modified:Tue, 01 Sep 2015 13:18:06 GMT
Server:Apache-Coyote/1.1
Request Headers
Accept:image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate, sdch
Accept-Language:de-DE,de;q=0.8,en-US;q=0.6,en;q=0.4
Cache-Control:no-cache
Connection:keep-alive
Host:148.251.217.3
Pragma:no-cache
Referer:http://148.251.217.3/
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.135
Is there anything I can change in Tomcat to fix this weird behavior?
I am thinking of MIME type, compression, ...
I basically have the stock configuration that ships with Ubuntu 15.04 right now.
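For reference, if compression or MIME settings are the suspects, the places to look in a stock Tomcat install are the HTTP Connector in conf/server.xml and the mime-mapping entries in conf/web.xml. A minimal sketch follows; the attribute values are illustrative assumptions, not a confirmed fix, and simply keep compression restricted to text types so binary images pass through untouched:
<!-- conf/server.xml: illustrative Connector settings (values are assumptions) -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           compression="on"
           compressionMinSize="2048"
           compressableMimeType="text/html,text/xml,text/plain,text/css,application/javascript" />
<!-- the attribute is spelled compressibleMimeType in newer Tomcat versions -->
<!-- conf/web.xml: the PNG mapping that ships with stock Tomcat -->
<mime-mapping>
    <extension>png</extension>
    <mime-type>image/png</mime-type>
</mime-mapping>
Since the Content-Length above (15432) matches the size encoded in the weak ETag, the full file does appear to be delivered, so it is also worth comparing a checksum of the PNG on the server with a checksum of the copy the Windows browser downloads, to see whether the bytes are altered in transit.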

Related

gzip compression not working with IIS 8.5

I have a Server 2012 R2 box running IIS. I've tried enabling compression for several sites running on that box, but I can't figure out why it won't work. My request headers all include Accept-Encoding, but the response headers are always Transfer-Encoding: chunked and Vary: Accept-Encoding. The following steps have been performed to try to get gzip compression working:
Dynamic and Static compression have been enabled on each site and at the machine level
Both compression methods are installed from Server Manager
httpCompression and urlCompression nodes have been manually added to the web.config files (roughly as sketched below)
Mime types are defined for compression
frequentHitThreshold has been set to 1, so all content should be compressed after the first attempt to access it
A trace has been done to see why compression isn't occurring. The only information I have is the code DYNAMIC_COMPRESSION_NOT_SUCCESS with a reason of 1.
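For reference, the manually added web.config nodes look roughly like this; the element names are standard IIS schema, but the MIME types and attribute values here are illustrative assumptions rather than a copy of the actual configuration:
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  <httpCompression>
    <dynamicTypes>
      <add mimeType="text/*" enabled="true" />
      <add mimeType="application/json" enabled="true" />
      <add mimeType="*/*" enabled="false" />
    </dynamicTypes>
    <staticTypes>
      <add mimeType="text/*" enabled="true" />
      <add mimeType="application/javascript" enabled="true" />
      <add mimeType="*/*" enabled="false" />
    </staticTypes>
  </httpCompression>
</system.webServer>
One thing worth checking: the httpCompression section is locked at the server level by default, so entries for it in a site-level web.config may be silently ignored unless the section is unlocked in applicationHost.config.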
Here are the headers:
GET http://redactedservername:8082/ HTTP/1.1
Host: redactedservername:8082
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.102 Safari/537.36
DNT: 1
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8
Cookie: ASP.NET_SessionId=gnqovt55ggt22lycufudc0ns
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/html; charset=utf-8
Vary: Accept-Encoding
Date: Wed, 22 Jun 2016 14:00:57 GMT
Transfer-Encoding: chunked
What other steps can be performed to get compression to work?
Compression was working, but ESET Antivirus was doing its job of monitoring web traffic. This modified the response and I didn't get gzip content encoding as expected. Disabling ESET and testing again showed that compression was functioning.

Windows Azure GZIP for ASP.Net MVC Web Site

I've been tearing my hair out trying to get GZIP compression to work for a standard Azure Web Site which is using ASP.NET MVC 5.
Does anybody have a definitive guide to setting up the web.config or applying a custom attribute?
I've tried everything I usually use locally, such as supplying the GZIP/DEFLATE Content-Encoding header, but to no avail.
I'm beginning to think it doesn't work on a standard web site, but I thought I should ask first.
Cheers
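For what it's worth, standard Azure Web Sites run on IIS, so the declarative route is the same urlCompression/httpCompression configuration sketched in the IIS question above; a minimal, illustrative web.config fragment (an assumption about the usual approach, not a confirmed fix for this case):
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>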
-- Update.
I am not getting any errors. As far as I can tell, I am checking the Accept-Encoding request header and adding a Content-Encoding: gzip header on the response. However, when I check the headers in Chrome/Firefox on the live site, it does not seem to be working.
e.g. Locally
GET / HTTP/1.1
Host: localhost:34249
Connection: keep-alive
Cache-Control: no-cache
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Pragma: no-cache
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.125 Safari/537.36
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-GB,en;q=0.8,en-US;q=0.6,fr;q=0.4
And the local response headers:
Cache-Control:public, max-age=15
Content-Encoding:gzip
Content-Length:22900
Content-Type:text/html; charset=utf-8
Date:Wed, 13 Aug 2014 20:18:47 GMT
Expires:Wed, 13 Aug 2014 20:19:00 GMT
Last-Modified:Wed, 13 Aug 2014 20:18:45 GMT
Vary:*,Accept-Encoding
X-Frame-Options:DENY
X-SourceFiles:=?UTF-8?B?QzpcV29ya1xCbG9nXEJsb2c=?=
On Azure...
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-GB,en;q=0.8,en-US;q=0.6,fr;q=0.4
Cache-Control:no-cache
Connection:keep-alive
Cookie:ARRAffinity=2f6641fa653941a3835129cf27dd73a8f366413851cb89c13e53d88bf9cadc19
Host:<bla>
Pragma:no-cache
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.125 Safari/537.36
And the response headers on Azure:
Cache-Control:public, max-age=8
Content-Length:44549
Content-Type:text/html; charset=utf-8
Date:Wed, 13 Aug 2014 20:20:03 GMT
Expires:Wed, 13 Aug 2014 20:20:12 GMT
Last-Modified:Wed, 13 Aug 2014 20:19:57 GMT
Vary:*,Accept-Encoding
X-Frame-Options:DENY
As you can see, there is no Content-Encoding header. Let me know if you need any more information. Cheers, J
Apologies, I thought I had already closed the question. It turned out I was going through a proxy which was stripping out GZIP in this particular example.
SyntaxC4 - many thanks for the cheat sheet; I'm sure it will become useful later if I require IP blocking.
Cheers,
J

Cannot gzip compress static files in Nodejs

Gzipping static files does not work the way I expect. I have tried both gzippo and express.compress(); both gzip the files only once, and there is no Content-Encoding: gzip if I refresh the page again.
Here is how I set up my Express app:
var gzippo = require('gzippo');
var express = require('express');
var app = express();
app.use(express.compress());
app.use(app.router);
app.use(gzippo.staticGzip(__dirname + '/www'));
This is what the Chrome Network panel shows after a page refresh (edited to include the full request headers):
GET / HTTP/1.1
Host: myhost.com
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.146 Safari/537.36
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Cookie: __utma=161759779.498387976.1381482631.1394444924.1394395346.80; __utmb=161759779.3.10.1394395346; __utmc=161759779; __utmz=161759779.1394444924.79.7.utmcsr=gtmetrix.com|utmccn=(referral)|utmcmd=referral|utmcct=/reports/myhost.com/5iXAs1ej/retest
If-None-Match: "25020-1394452200000"
If-Modified-Since: Mon, 10 Mar 2014 11:50:00 GMT
If I refresh again, it shows the following (edited with the full response headers):
HTTP/1.1 304 Not Modified
x-powered-by: Express
accept-ranges: bytes
etag: "25020-1394452200000"
date: Mon, 10 Mar 2014 10:51:45 GMT
cache-control: public, max-age=0
last-modified: Mon, 10 Mar 2014 11:50:00 GMT
If I edit the page, I get Content-Encoding: gzip again, but only one time.
I don't know if there is something wrong with my Express setup.
NOTE: I serve my page as: res.sendfile(__dirname + '/www/index.html');
All is well. The first time, your server sends the file with gzip compression. The second time, the normal ETag caching mechanism comes into play: since the file has not been modified, the server tells the browser "you already have the right version of this file", so there is no need to send a response body at all, just headers, and therefore no need for any Content-Encoding header.
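To make that concrete, here is a minimal, hypothetical route (not part of the original app; the ETag value is simply the one from the headers above) that behaves the same way:
var express = require('express');
var app = express();
app.use(express.compress());              // gzip compressible response bodies when the client accepts it
app.get('/demo.txt', function (req, res) {
  var etag = '"25020-1394452200000"';     // fixed ETag, for illustration only
  if (req.headers['if-none-match'] === etag) {
    res.statusCode = 304;
    return res.end();                     // no body, so no Content-Encoding header
  }
  res.set('ETag', etag);
  res.set('Content-Type', 'text/plain');
  res.send(new Array(2048).join('.'));    // a body large enough to be worth compressing
});
app.listen(3000);
The first request gets a 200 with Content-Encoding: gzip (assuming the client sent Accept-Encoding: gzip); every request after that, until the ETag changes, gets a 304 with headers only.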

Running Rendr Examples Results in HTTP 502 Error When Links Clicked

I have built and run Rendr's example apps on Ubuntu 13.10 using Node v0.8.6. When I click on the Repos or Users links, I get an HTTP 502 - Bad Gateway error, but when I refresh the page (load from the server) it works (200 OK) and the repos or users are displayed.
Here is the server output for the working case (page refresh):
127.0.0.1 - - [Fri, 31 Jan 2014 22:47:56 GMT] "GET /repos HTTP/1.1" 200 - "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML,
like Gecko) Ubuntu Chromium/32.0.1700.102 Chrome/32.0.1700.102
Safari/537.36"
And here is the failure case (link navigation):
127.0.0.1 - - [Fri, 31 Jan 2014 22:48:07 GMT] "GET /api/-/users HTTP/1.1" 502 - "http://localhost:3030/users" "Mozilla/5.0 (X11; Linux
x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu
Chromium/32.0.1700.102 Chrome/32.0.1700.102 Safari/537.36"
Any ideas or pointers to what the problem might be?
Thanks.
The solution for this problem is here: https://github.com/airbnb/rendr/issues/266

Express.js Node Framework - Not caching

My Node.js app uses Express, and it has a route that sends a JSON file with Tweet data. I want it to be cached for 20 seconds. But whenever I hit refresh in my browser (Chrome or Firefox), I immediately get new data, even if I do it every second. Note that the data does change more often than every 20 seconds, but I still want a 20-second cache.
Here is my route.
app.get('/tweet-stats.json', function(req, res) {
res.set('Cache-Control', 'public, max-age=20');
res.set('Expires', new Date(Date.now() + 20000));
res.set('Last-Modified', new Date(Date.now()));
res.set('Content-Type', 'application/json');
res.send(publicTweetStatus());
});
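One detail that may matter (an observation, not necessarily what robertklep suggested): res.set() stringifies those Date objects with their default toString(), which is why Expires and Last-Modified in the response below are not in RFC 1123 HTTP-date format, and browsers differ in how tolerant they are of that. A sketch of the same route emitting standard HTTP-dates:
app.get('/tweet-stats.json', function(req, res) {
  res.set('Cache-Control', 'public, max-age=20');
  res.set('Expires', new Date(Date.now() + 20000).toUTCString());  // RFC 1123 format
  res.set('Last-Modified', new Date().toUTCString());
  res.set('Content-Type', 'application/json');
  res.send(publicTweetStatus());  // same helper as in the original route
});
Also note that pressing the refresh button usually makes browsers revalidate with the server regardless of freshness (Chrome, for example, sends Cache-Control: max-age=0 on reload), so testing with ordinary navigation gives a truer picture of the cache behaviour.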
Here are the request and response headers from FireFox (FireBug):
Response Headers
HTTP/1.1 200 OK
X-Powered-By: Express
Cache-Control: public, max-age=20
Expires: Fri May 10 2013 06:52:11 GMT+0000 (UTC)
Last-Modified: Fri May 10 2013 06:51:51 GMT+0000 (UTC)
Content-Type: application/json
Content-Length: 209
Connection: keep-alive
Request Headers
GET /tweet-stats.json HTTP/1.1
Host: mydevelopmenturl.com
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:20.0) Gecko/20100101 Firefox/20.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
How can I make browsers cache this for 20 seconds before checking with the server again?
UPDATE
So, I tried robertklep's suggestion and it worked in some browsers/OSs and not others:
Ubuntu Chrome - No Cache!!!!!!!!!!!!
Ubuntu FireFox - No Cache!!!!!!!!!!!!!!!
Windows 7 - Chrome - Cache
Windows 7 - FireFox - Cache
Windows 7 - IE 9 - Cache
Windows 7 - Opera - No Cache!!!!!!!!!!!!!!!!!
iOS Safari - Cache
Mac OSX - Safari - Cache
Mac OSX - Chrome - Cache
Mac OSX - Firefox - Cache
Why the differences?
