I'm having trouble getting compression to work with ServiceStack. I return .ToOptimizedResult from my server, and I get a log entry that tells me that the header is added:
ServiceStack.WebHost.Endpoints.Extensions.HttpResponseExtensions:
DEBUG: Setting Custom HTTP Header: Content-Encoding: deflate
However the content returned is not compressed. I've checked using both Fiddler and Network inspector in Chrome.
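A minimal sketch of the kind of service method involved, assuming ServiceStack v3 (the ServiceStack.WebHost.Endpoints namespace in the log suggests v3); the Hello DTOs here are hypothetical, and exact namespaces vary across ServiceStack versions:

using ServiceStack.ServiceHost;       // v3 namespaces; they differ in later versions
using ServiceStack.ServiceInterface;

public class Hello         { public string Name { get; set; } }
public class HelloResponse { public string Result { get; set; } }

public class HelloService : Service
{
    public object Any(Hello request)
    {
        var response = new HelloResponse { Result = "Hello, " + request.Name };
        // ToOptimizedResult serializes the DTO, compresses it with gzip or deflate
        // based on the request's Accept-Encoding header, and sets the matching
        // Content-Encoding header on the response.
        return RequestContext.ToOptimizedResult(response);
    }
}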
Sorry to all
It seems that my antivirus (BitDefender) was decompressing the data to scan for viruses, even though I had disabled the AV. When testing on other computers, the output is compressed.
Related
I observed that the Content-Encoding response header was missing, notably Content-Encoding: gzip. I'm using static content compression. The dynamic content compression feature was never installed. I installed it, enabled it, and tested again. This time, Content-Encoding: gzip appeared in the response. The question is: why does the response header appear for dynamic content compression but not for static content compression? I'm fairly certain that IIS is applying gzip to static content. Here's why:
I have an IIS URL Rewrite outbound rule that modifies the response on an HTML page. The outbound rule yielded Error 500.52, URL Rewrite Module error: "Outbound rewrite rules cannot be applied when the content of the HTTP response is encoded ('gzip')." The rule is not the issue, just evidence that gzip is reportedly being applied. I disabled the rule. That's clue #1.
Clue #2 is that I enabled Failed Request Tracing and observed that not only was static compression being applied, but the StaticFileModule was storing the compressed file in the following location: C:\INETPUB\TEMP\IIS TEMPORARY COMPRESSED FILES\MY WEBSITE\$^_GZIP_D^\INETPUB\WWWROOT\TEST.HTML.
I read the Microsoft document on IIS HTTP Compression and (I could be wrong) I didn't see any language that suggests gzip can be employed with static compression. Based on the two clues above, though, gzip is being employed with static compression.
So I go back to the original problem: the Content-Encoding response header is missing for static content compression, yet the evidence suggests that IIS is not only compressing static content but compressing it with gzip. Is this simply a bug? Is it by design?
Static compression will add the Content-Encoding header when it works.
If you enable Failed Request Tracing and trace the static compression module, you will see why it might not: static compression won't kick in for a static file that doesn't get hit frequently.
If you replay the request dozens of times, you will then see the header.
Be careful: there is also a minimum file size for compression. You can modify that value in IIS Manager -> server node -> Configuration Editor -> system.webServer/httpCompression -> minFileSizeForComp.
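Both knobs can also be set from code. A minimal sketch using Microsoft.Web.Administration (run elevated; the values shown are illustrative, not recommendations): it lowers minFileSizeForComp and sets frequentHitThreshold to 1 so a static file is compressed on its first hit.

using Microsoft.Web.Administration; // reference Microsoft.Web.Administration.dll (ships with IIS)

class CompressionSettings
{
    static void Main()
    {
        using (var serverManager = new ServerManager())
        {
            Configuration config = serverManager.GetApplicationHostConfiguration();

            // Minimum response size (in bytes) before IIS compresses; the default is 2700.
            ConfigurationSection httpCompression = config.GetSection("system.webServer/httpCompression");
            httpCompression["minFileSizeForComp"] = 256; // illustrative value

            // Static compression only kicks in for "frequently hit" files:
            // frequentHitThreshold hits within frequentHitTimePeriod.
            ConfigurationSection serverRuntime = config.GetSection("system.webServer/serverRuntime");
            serverRuntime["frequentHitThreshold"] = 1; // compress on the first hit

            serverManager.CommitChanges();
        }
    }
}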
We're using Standard offering of Verizon CDN in Azure. From the documentation it's clear that Verizon gives priority to other compression schemes over Brotli if the client supports multiple ones (https://learn.microsoft.com/en-us/azure/cdn/cdn-improve-performance#azure-cdn-from-verizon-profiles):
If the request supports more than one compression type, those compression types take precedence over brotli compression.
The problem is that our origin gives priority to Brotli. So for a request with an Accept-Encoding: gzip, deflate, br header made directly to the origin, the response comes back with a Content-Encoding: br header. However, the same request going through the CDN comes back with Content-Encoding: gzip.
Azure's documentation isn't clear on what occurs here. Does the POP node decompress the resource, re-compress it with gzip, and cache that? Does it decompress and cache, then compress on the fly based on the request's headers? I posed the question to Azure support and sadly didn't get a definitive answer.
I finally got a conclusive answer from Verizon. The Via header sent from the CDN's POP node to the origin was effectively disabling compression at the origin (this page explains it well: https://community.akamai.com/customers/s/article/Beware-the-Via-header-and-its-bandwidth-impact-on-your-origin?language=en_US). Handling that on our web server (either stripping the header or configuring the web server to compress regardless) solved the issue. So if the client supports Brotli and the origin prefers Brotli, Verizon's CDN caches and serves the Brotli-compressed content.
In other words, Microsoft's documentation is misleading and incomplete.
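For what it's worth, if the origin is IIS (an assumption on my part; the origin's server software isn't stated in this thread), the "compress regardless" option corresponds to the noCompressionForProxies attribute, which by default makes IIS skip compression for requests carrying a Via header. A minimal sketch using Microsoft.Web.Administration:

using Microsoft.Web.Administration;

class AllowProxyCompression
{
    static void Main()
    {
        using (var serverManager = new ServerManager())
        {
            Configuration config = serverManager.GetApplicationHostConfiguration();
            ConfigurationSection httpCompression = config.GetSection("system.webServer/httpCompression");
            // By default IIS skips compression when a Via header marks the request as
            // proxied; turning this off makes the origin compress CDN-forwarded requests too.
            httpCompression["noCompressionForProxies"] = false;
            serverManager.CommitChanges();
        }
    }
}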
I'm using Firebug and NetExport on Linux to save all web browser communication (mostly HTTP and HTTPS requests and responses). However, in my .har file I see messages like this:
The resource from this URL is not text: http://...
Instead of these messages I want to see the actual, full binary content (not even a single bit transformed or changed or lost). How do I get that?
I have root access on the local machine. A solution using Chrome or Firefox is fine.
Please don't recommend that I download binary files manually, there are too many of them, and I need to time the downloads perfectly, with the correct set of cookies (which may expire by the time I download manually). Please don't recommend non-Linux solutions, I have access only to Linux systems. Please don't recommend Wireshark (or tcpdump), because it can't save decrypted HTTPS traffic if I don't have the private key of the server.
In about:config I've set extensions.firebug.cache.mimeTypes to a space-separated list of MIME types, restarted Firefox, and everything got saved.
application/x-shockwave-flash image/gif image/jpeg image/png application/octet-stream
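With that preference set, binary response bodies end up in the .har file base64-encoded (content.text flagged with "encoding": "base64" in the standard HAR 1.2 layout). A minimal extraction sketch, assuming a standards-conformant HAR file; the capture.har input name and body_NNNN.bin output names are hypothetical:

using System;
using System.IO;
using System.Text.Json;

class HarExtract
{
    static void Main(string[] args)
    {
        string harPath = args.Length > 0 ? args[0] : "capture.har"; // hypothetical file name
        using JsonDocument doc = JsonDocument.Parse(File.ReadAllText(harPath));
        int i = 0;
        foreach (JsonElement entry in doc.RootElement.GetProperty("log").GetProperty("entries").EnumerateArray())
        {
            JsonElement content = entry.GetProperty("response").GetProperty("content");
            if (!content.TryGetProperty("text", out JsonElement text)) continue; // body not captured
            // Binary bodies are stored base64-encoded and flagged with "encoding": "base64".
            bool isBase64 = content.TryGetProperty("encoding", out JsonElement enc)
                            && enc.GetString() == "base64";
            byte[] body = isBase64 ? Convert.FromBase64String(text.GetString())
                                   : System.Text.Encoding.UTF8.GetBytes(text.GetString());
            File.WriteAllBytes($"body_{i++:D4}.bin", body);
        }
    }
}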
Please note that some documents are still missing from the .har file, I get this:
Reload the page to get source for: http://...
Our IIS server has Dynamic and Static HTML Compression enabled, but when I browse to our website and view the Response Headers in Fiddler, I only see the "Content-Encoding: gzip" header for one resource (a flash file).
Why would the other response types not have this header? Does it mean that compression is NOT working for the other responses?
The only way to be 100% sure that compression is active is to compare the size of the downloaded resource against the original file on the server. The network tab of the Firebug extension can help you here.
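A quick way to check from code is to request the resource without automatic decompression and inspect what actually comes over the wire. A minimal sketch; the URL is a placeholder:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class CompressionCheck
{
    static async Task Main()
    {
        // Disable automatic decompression so we see the wire-level headers and byte count.
        var handler = new HttpClientHandler { AutomaticDecompression = System.Net.DecompressionMethods.None };
        using var client = new HttpClient(handler);
        var request = new HttpRequestMessage(HttpMethod.Get, "https://example.com/page.html"); // placeholder URL
        request.Headers.TryAddWithoutValidation("Accept-Encoding", "gzip, deflate");
        HttpResponseMessage response = await client.SendAsync(request);
        byte[] body = await response.Content.ReadAsByteArrayAsync();
        Console.WriteLine("Content-Encoding: " + string.Join(", ", response.Content.Headers.ContentEncoding));
        Console.WriteLine("Bytes on the wire: " + body.Length);
    }
}

If Content-Encoding comes back empty and the byte count matches the original file size, compression is not being applied, or something in between is stripping it.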
It looks like our company network was actually stripping out the Content-Encoding header. (I have no idea why). When I browse from home the gzipping seems to work fine. This post on StackExchange.com helped me figure it out.
We have a problem with an IIS5 server.
When certain users/browsers click to download .zip files, binary gibberish text sometimes renders in the browser window. The desired behavior is for the file to either download or open with the associated zip application.
Initially, we suspected that the wrong content-type header was set on the file. The IIS tech confirmed that .zip files were being served by IIS with the mime-type "application/x-zip-compressed".
However, an inspection of the HTTP packets using Wireshark reveals that requests for zip files return two Content-Type headers.
Content-Type: text/html; charset=UTF-8
Content-Type: application/x-zip-compressed
Any idea why IIS is sending two Content-Type headers? This doesn't happen for regular HTML or image files. It does happen with ZIP and PDF files.
Is there a particular place we can ask the IIS tech to look? Or is there a configuration file we can examine?
I believe (and I may be wrong) that under HTTP 1.1 multiple header definitions can be sent, and the most specific one takes precedence.
So in your example it is sending text/html first and then application/x-zip-compressed; the second one is the more specific, and if that can't be handled on the client, the more general one (the first, in this case) is used.
I have read through http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html and it sort of points to this, though I'm not sure that is what is actually happening.
Of course, I may be totally wrong here.
Make sure that you don't have any ISAPI filters or ASP.net HTTP modules set up to rewrite the headers. If they don't check to see if the header already exists, it will be appended rather than replaced. We had issues a while ago with an in-house authentication module not correctly updating the headers so we were getting two Authorization headers, one from IIS and one from our module.
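A minimal sketch of the pattern (the module and header below are illustrative, not the actual in-house code): an ASP.NET IHttpModule that replaces a header instead of appending a duplicate.

using System.Web;

public class HeaderFixModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PreSendRequestHeaders += (sender, e) =>
        {
            HttpResponse response = app.Response;
            // Buggy pattern: AppendHeader adds a second Content-Type header
            // alongside the one IIS already set.
            // response.AppendHeader("Content-Type", "application/x-zip-compressed");

            // Safe pattern: assigning ContentType replaces the existing value.
            response.ContentType = "application/x-zip-compressed";
        };
    }

    public void Dispose() { }
}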
What software has been installed on the server to work with .zip files?
It looks like IIS picks up MIME translations from the registry; perhaps the zip software you use has registered the MIME type. This doesn't explain why IIS would respond with two Content-Type headers, so any ISAPI filter and any other MIME table is suspect.
This may be related to this knowledge base article. It suggests that IIS may be gzipping the already-zipped file, but some browsers just pass it straight to a secondary application, giving you bad data (as it has been zipped twice). If you change the MIME type of the .zip extension to application/octet-stream, this may not happen.
It sounds like there may be an issue with your IIS configuration, though it's not possible to tell from your post whether that is the case.
You can have MIME types configured at several levels in IIS. My IIS 5 knowledge is a bit rusty; as far as I can remember, this behavior is the same in IIS 6. I tried to simulate this in an IIS 6 environment, but only ever received one MIME type, depending on the Accept header.
I have set the header for zip files on the site to application/x-zip-compressed, and for the file itself I have explicitly set it as well:
tinyget -srv:dev.24.com -uri:/helloworld.zip -tbLoadSecurity
WWWConnect::Connect("server.domain.com","80")
IP = "127.0.0.1:80"
source port: 1581
REQUEST: **************
GET /helloworld.zip HTTP/1.1
Host: server.domain.com
Accept: */*
RESPONSE: **************
HTTP/1.1 200 OK
Content-Length: 155
Content-Type: text/html
Last-Modified: Wed, 29 Apr 2009 08:43:10 GMT
Accept-Ranges: bytes
ETag: "747da786a6c8c91:0"
Server: Microsoft-IIS/6.0
Date: Wed, 29 Apr 2009 10:47:10 GMT
PK??
? ? ? helloworld.txthello worldPK??ΒΆ
? ? ? ? helloworld.txtPK?? ? ? < 7 ? hello world sample
WWWConnect::Close("server.domain.com","80")
closed source port: 1581
However, I don't feel this proves much. It does, however, raise a few questions:
What are all the MIME maps that have been set up on the server? (Ask the server admin for the metabase.xml file, so you can make sure he has not missed some setting.)
Are those clients on a network that is under your control? Probably not; I wonder what proxy server might be sitting in between your server and the clients.
What do the IIS logs look like for that request? I am specifically interested in the Accept header.
I wonder what Fiddler will show.
I've encountered a similar problem. I was testing downloads on IIS 6 and couldn't figure out why a zipped file called test.zip was displaying as text in IE8 (it was fine in other browsers, where it would download).
Then I realised that for the test I'd compressed a very small text file. My guess is that IE sniffed the file, saw the text (which was pretty much uncompressed because of the small size) and decided it was plain text.
I tried again with a larger file and the download prompt appeared OK in IE8.
May not be relevant to your case, but thought I'd mention it.
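If sniffing is the culprit, one mitigation (my addition, not something tested in this thread) is to tell IE8+ to trust the declared type and force a download. A sketch of an ASP.NET handler; the handler name and file path are hypothetical:

using System.Web;

public class ZipHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/x-zip-compressed";
        // nosniff tells IE8+ not to second-guess the declared Content-Type.
        context.Response.AddHeader("X-Content-Type-Options", "nosniff");
        // Content-Disposition forces a download prompt instead of inline rendering.
        context.Response.AddHeader("Content-Disposition", "attachment; filename=test.zip");
        context.Response.WriteFile(context.Server.MapPath("~/files/test.zip")); // hypothetical path
    }

    public bool IsReusable
    {
        get { return true; }
    }
}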
Tim