How do I know if IIS is really compressing my HTML?

Our IIS server has Dynamic and Static HTML Compression enabled, but when I browse to our website and view the Response Headers in Fiddler, I only see the "Content-Encoding: gzip" header for one resource (a flash file).
Why would the other response types not have this header? Does it mean that compression is NOT working for the other responses?

The only way to be 100% sure that compression is active is to compare the size of the downloaded resource against the original file on the server. The network tab of the Firebug extension can help you here.
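You can also check from the command line. Here is a minimal sketch using curl (the URL is a placeholder; substitute your own page):

# Request with compression: look for Content-Encoding: gzip in the headers
# and note the transfer size reported at the end.
curl -s -H "Accept-Encoding: gzip" -D - -o /dev/null -w "downloaded: %{size_download} bytes\n" http://www.example.com/
# Request without compression, for comparison; the size should be larger.
curl -s -D - -o /dev/null -w "downloaded: %{size_download} bytes\n" http://www.example.com/

If the first transfer is noticeably smaller and carries the Content-Encoding header, compression is active.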

It looks like our company network was actually stripping out the Content-Encoding header (I have no idea why). When I browse from home, the gzipping seems to work fine. This post on StackExchange.com helped me figure it out.

Related

Preventing file download (any file, even CSS and JS) in IIS8

Is there any way to set up IIS so that no file can be downloaded (without XML configuration, using the IIS UI only)?
I know I can remove all MIME types, but that is inconvenient; I'm looking for a better option.
You cannot stop downloads of CSS or JS files, because the server cannot differentiate between a normal request and a download request when it comes to CSS or JS.
Say you have a CSS or JS file included in an HTML page. That file will need to be downloaded by the browser, so on the server side (IIS) you cannot distinguish a direct download request from a normal browser request. What you can do is deny the request when the Referer header doesn't match your site: if someone takes the URL and pastes it directly into another browser, the Referer header will not be present, and similarly, if someone else hotlinks to your resources (CSS, JS files), you can stop that too.
You can deny downloads based on the Referer by using URL Rewrite, as sketched below.
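For illustration, here is a minimal sketch of such a rule as it would appear in web.config (example.com and the rule name are placeholders; the same rule can also be built through the URL Rewrite UI in IIS Manager instead of editing XML):

<system.webServer>
  <rewrite>
    <rules>
      <!-- Block .css/.js requests whose Referer is missing or points elsewhere. -->
      <rule name="DenyHotlinkedAssets" stopProcessing="true">
        <match url=".*\.(css|js)$" />
        <conditions logicalGrouping="MatchAll">
          <add input="{HTTP_REFERER}" pattern="^https?://(www\.)?example\.com/" negate="true" />
        </conditions>
        <action type="CustomResponse" statusCode="403"
                statusReason="Forbidden" statusDescription="Hotlinking denied" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>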

Why is compression not working in ServiceStack?

I'm having trouble getting compression to work with ServiceStack. I return .ToOptimizedResult from my server, and I get a log entry that tells me that the header is added:
ServiceStack.WebHost.Endpoints.Extensions.HttpResponseExtensions:
DEBUG: Setting Custom HTTP Header: Content-Encoding: deflate
However the content returned is not compressed. I've checked using both Fiddler and Network inspector in Chrome.
Sorry to all
It seems that maybe my antivirus (BitDefender) decompresses the data to scan for viruses, even though I disabled the AV. When testing on other computers, the output is compressed.
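If you suspect something on the local machine is interfering, a command-line check from another host takes the browser and any local scanners out of the equation; a quick sketch, with a placeholder URL:

# Ask for compressed content and dump the response headers;
# if compression is working you should see Content-Encoding: deflate (or gzip).
curl -s -H "Accept-Encoding: deflate, gzip" -D - -o /dev/null http://your-server/api/endpoint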

Vary header when content is not gzipped on IIS 7 as origin for a CDN

I'm trying to set up my IIS server as an origin server for a CDN. I have already solved some issues, for example that IIS doesn't serve gzipped content to proxies (if the request carries a Via header), and the frequentHitThreshold problem.
My CDN supplier pointed out another problem with IIS: it doesn't return a "Vary" header if the client doesn't request the content gzipped. According to them, the problem is that if for some reason the first client to request the content doesn't want it gzipped, the CDN never requests a new version of the file, since the Vary header doesn't indicate that it should keep two different versions depending on "Accept-Encoding".
My only solution so far is to add "Vary: Accept-Encoding" as a custom header, but since IIS automatically adds this Vary header when gzip is requested, I end up with multiple values like "Vary: Accept-Encoding, Accept-Encoding".
Does anyone have a solution to this? Or can you confirm that it's a real issue?
This is a real issue: the IIS gzip module overwrites existing Vary headers. Please vote on this MS Connect issue. A related article is here.
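You can reproduce the behavior from the command line; a quick sketch against a placeholder origin URL:

# First request without Accept-Encoding: no Vary header comes back.
curl -s -D - -o /dev/null http://origin.example.com/site.css | grep -i "^vary"
# Request with Accept-Encoding: IIS appends its own Vary: Accept-Encoding,
# duplicating any custom Vary header you configured.
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" http://origin.example.com/site.css | grep -i "^vary"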
This issue is now addressed by an official patch to IIS. For the download and further info, visit http://support.microsoft.com/kb/2877816
Erez Benari, IIS PM

What's the best way to troubleshoot Akamai headers these days?

Traditionally, I would inspect the Akamai headers by installing a Firefox extension called akamaiheaders.xpi. Unfortunately, I think the last version of Firefox to support this was 3.
As I understand it, this plugin would add special headers to all HTTP requests that Firefox made, which would prompt Akamai to add a bunch of headers to the response (telling me whether the file was cached, where it got it from, etc.). Then, using a tool like HTTPFox or Firebug, I could easily see which assets were cached and which ones were not.
I've searched all over, but I can't find anything as simple and easy to use as that. Does anyone know of anything out there that allows me to track all the Akamai headers for all the assets my browser loads that works in either FF, Chrome, or Safari?
You can use curl and/or wget for this:
curl -H "Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" -IXGET http://www.oxfordpress.com/
or
wget -S -O /dev/null --header="Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" http://www.oxfordpress.com/
If you want to test the staging environment, you need to remember to send the Host header, e.g.:
curl -H "Host: www.oxfordpress.com" -H "Pragma: ..." -IXGET http://oxfordpress.com.edgesuite-staging.net/
One way or another, it's always about sending the proper Pragma headers and then reading the response headers.
A list of the Pragma headers, as well as an explanation of the X-Cache response header, can be found here: http://webspherehelp.blogspot.com/2009/07/understanding-akamai-headers-to-debug.html
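If you only care about the debug headers, you can filter the response down to them; a small sketch in the same style as the commands above (placeholder URL):

curl -IXGET -H "Pragma: akamai-x-cache-on, akamai-x-check-cacheable, akamai-x-get-cache-key" http://www.example.com/ | grep -i "^x-"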
I know this question is old, but since I came across it in my search today I thought I'd add an answer for the next person who comes along.
There are a couple of extensions in the Chrome store for this now:
Akamai debug headers which just adds headers to your net panel in web inspector
Exceda Akamai Headers Extension which seems to also work for purging cache.
Akamai debug headers is the one I chose and it's working well so far.
You can use a local proxy (e.g. Fiddler or Charles Proxy, my personal favorite) and add the following header to outgoing requests:
Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no
If you're using Chrome or Chromium, you can use the Header Hacker or Pragma Header extensions. With either one, you will have to add the Pragma headers manually.
If you can find the akamaiheader.xpi file, you can just open it and change the maxVersion in install.rdf to 9.*
.xpi files are just ZIP files; on most machines you can add .zip to the filename and double-click it.
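For example, a sketch of how you might repack it from a shell, assuming GNU sed and the usual zip/unzip tools (the file name is taken from above; adjust it to match yours):

cp akamaiheader.xpi akamaiheader-patched.xpi   # work on a copy
unzip akamaiheader-patched.xpi install.rdf     # extract just the manifest
# Bump the maximum supported Firefox version.
sed -i 's|<em:maxVersion>[^<]*</em:maxVersion>|<em:maxVersion>9.*</em:maxVersion>|' install.rdf
zip akamaiheader-patched.xpi install.rdf       # put the edited manifest back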
To debug Akamai headers in the Chrome browser, try this extension: CDN Headers & Cookies (Chrome Web Store)
https://chrome.google.com/webstore/detail/cdn-headers-cookies/obldlamadkihjlkdjblncejeblbogmnb
Note: Enable 'Load Akamai Headers' in the settings (click the 'Lego minifig Head' icon, click the gear, and check on 'Load Akamai Headers').
It has been suggested on the Akamai community.
https://community.akamai.com/community/web-performance/blog/2015/03/31/using-akamai-pragma-headers-to-investigate-or-troubleshoot-akamai-content-delivery
They have a new version of the XPI out, which you can download in Luna. There's also another plugin which adds a 'content source' pane to Firebug for a quick reference of what on the page was Akamaised.
As I say, to download both plugins you need to log in to Luna and look under 'Support' > 'More Tools' > 'Browser Extensions'. The XPI isn't publicly accessible.
YMMV, but as far as I recall being told by colleagues, the Exceda plugin duplicated HTTP requests, which can be a bit messy whilst debugging.
For Chrome, I find that ModHeader, plus setting up a profile where the Pragma headers are sent, works fine.

How to detect which content is not secure on a mixed content SSL page?

I've added an SSL certificate to an existing site, and now in IE I get a mixed content warning. Problem is, I don't know what non-secure content IE is warning me about. It's a simple HTML page with a few Flash movies, a few images, and a loaded CSS and JS file.
How can I find out which content is not secured?
Edit:
I found the culprit: it's the JS file AC_RunActiveContent.js, used to display the Flash movies. So, does anyone have an idea how to prevent SSL mixed content when using AC_RunActiveContent.js?
This means that something is requesting content using the http protocol specifically, or you have an absolute path to an image or other content that begins with http instead of https.
A few tips: Use relative paths everywhere you can. If you must use an absolute path, and it's to a server you own, use https. If you're loading stuff from off your site, you're probably stuck with the mixed-content warning.
This also goes for your scripts: check the JS and the CSS templates and make sure they're not the guilty parties; if they are, change them to use relative paths, or to request items via https instead of http (assuming you're positive that the server they reference supports https; if it doesn't, you're stuck).
There are a few other details, this might be helpful.
OK, so here is the solution to my particular problem. It was the codebase value in my code that needed to be https as well (I didn't think it would trigger the warning, as my Flash movies were displaying correctly, oh well)...
AC_FL_RunContent( 'codebase','https://download.macromedia.com/pub/shoc...
Link to Adobe info on this: Security Information error in Internet Explorer
I use the Firefox console; it reports the http resources it blocks from fetching on a mixed content page.
Search your source for "http:" references. Another great tool to help you out is Fiddler, with which you can see what's being downloaded when your page is requested.
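That search is easy to script; a minimal sketch, assuming your site's files live under the current directory:

# List every insecure absolute reference in markup, styles and scripts.
grep -rn "http://" --include="*.html" --include="*.css" --include="*.js" .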
