Using CFContent on IE with SSL - iis

CFContent does not work with IE on an SSL site. I have been looking for a fix but have not found one. As an alternative, I was going to have CFContent download the Excel sheet to a directory and then use cflocation to forward to that file, but I really have no use for all of those files on my hard drive.
I even tried using CFHeader:
<CFHEADER NAME="Content-Disposition" VALUE="inline; filename=emp.xls">
<CFCONTENT TYPE="application/vnd.ms-excel">
Any ideas?

Typically, issues like this with Internet Explorer and SSL have to do with caching.
Make sure that Internet Explorer does not have the setting "Do not save encrypted pages to disk" enabled (checked). You can find that setting in IE under Internet Options > Advanced, in the Security section.
Make sure that you are not sending no-cache headers with the response, like these for example:
<cfheader name="PRAGMA" value="NO-CACHE" />
<cfheader name="CACHE-CONTROL" value="NO-CACHE" />
Note that some hardware/web servers can also send these no-cache headers if configured to do so.
You can use a network monitoring tool like Fiddler to "see" these headers.
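Putting that advice together, a minimal CFML sketch for serving the spreadsheet over SSL (the filename, MIME type, and path are placeholders, not from the original post): a cacheable Cache-Control value instead of no-cache, an attachment Content-Disposition, and cfcontent's deletefile attribute so the generated file does not accumulate on disk.

```cfml
<!--- Allow IE to write the encrypted response to its cache so the download can complete --->
<cfheader name="Cache-Control" value="private" />
<!--- "attachment" prompts a save dialog with the suggested filename --->
<cfheader name="Content-Disposition" value="attachment; filename=emp.xls" />
<!--- deletefile="yes" removes the temporary file after it is served --->
<cfcontent type="application/vnd.ms-excel"
           file="#ExpandPath('emp.xls')#"
           deletefile="yes" />
```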

How to Use eTag on IIS for text/html Pages

I have a website which sits on a non-public domain and is delivered via a proxy on a different domain. We're having some trouble with caching of content: this is an Umbraco site, and making changes updates the pages if you hit the domain directly, but not through the proxy.
I've been informed that the proxy honours response headers and that setting an ETag would fix the issue. Having looked into this, I can see that IIS sets the ETag by default, and I can see it working on static content (.js and .css files, for example).
However, if I visit a page on the site, for example /uk/products/product I don't see the eTag header.
Is this expected behaviour, should it only be working with those static content files or can I set this on the page to tell the proxy that it should recache?
The ETag HTTP response header is an identifier for a specific version of a resource. It lets caches be more efficient and saves bandwidth, as a web server does not need to resend a full response if the content has not changed. Additionally, ETags help prevent simultaneous updates of a resource from overwriting each other ("mid-air collisions").
If the resource at a given URL changes, a new ETag value must be generated.
Static content does not change from request to request. The content that gets returned to the Web browser is always the same. Examples of static content include HTML, JPG, or GIF files.
IIS automatically caches static content (such as HTML pages, images, and style sheets), since these types of content do not change from request to request. IIS also detects changes to the files when you make updates, and IIS flushes the cache as needed.
To enable caching in IIS you can use the IIS output caching feature:
1) Open IIS Manager and select the site.
2) Select the Output Caching feature from the middle pane.
3) Select Edit Feature Settings from the Actions pane.
4) Check the Enable cache and Enable kernel cache boxes and click OK.
If you want to set the ETag to blank, you can do so by adding the code below to the web.config file:
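The same settings can also be applied in configuration rather than through the UI; a minimal web.config sketch, assuming the site-level defaults are otherwise acceptable:

```xml
<configuration>
  <system.webServer>
    <!-- Equivalent of checking "Enable cache" and "Enable kernel cache"
         in the Output Caching feature settings dialog -->
    <caching enabled="true" enableKernelCache="true" />
  </system.webServer>
</configuration>
```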
<system.webServer>
<httpProtocol>
<customHeaders>
<add name="ETag" value="" />
</customHeaders>
</httpProtocol>
</system.webServer>
Refer to the articles below for more detail:
Caching
To use or not to use ETag, that is the question.
Configure IIS Output Caching
I've read that IIS after version 7 automatically enables ETags; however, I ran a Pingdom speed test and the report advised me to enable ETags. Either that report is inaccurate, or the information I read about IIS 7 and newer is not correct.

How to disable HTTP response headers that aren't set in the applicationHost.config or the web.config?

Hi, I have a Windows server with IIS 10 where I am hosting AngularJS web apps with the help of iisnode. I've been trying to harden the response headers and I find that I'm doubling up on some of them.
For instance, x-frame-options is listed twice: the first value is SAMEORIGIN, which is not set in either applicationHost.config or web.config but is being added from somewhere I'm not aware of; the second x-frame-options is DENY, which is what I expect it to be, as I did set that in applicationHost.config.
A similar thing is happening with cache-control: in the browser (Chrome) it is shown as public, max-age=0, no-store when I only set no-store in applicationHost.config. So again, cache-control: public, max-age=0 is being set somewhere else, not by me.
Please can anyone tell me how to turn off these unwanted response headers?
I have searched IIS and Googled, but I keep getting pointed back toward applicationHost.config or web.config. Thanks in advance.
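One thing worth checking in a setup like this: headers added higher in the configuration hierarchy (or by IIS defaults) can be stripped again with a <remove> element before re-adding the value you want. A sketch, with the caveat that this only affects headers set through the config hierarchy — a header emitted by the Node application itself (for example by middleware that sets X-Frame-Options: SAMEORIGIN by default) has to be disabled in the application code instead:

```xml
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <!-- Drop any inherited/duplicate value before adding the intended one -->
      <remove name="X-Frame-Options" />
      <add name="X-Frame-Options" value="DENY" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
```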

Where is this redirect being set?

I am trying to figure out where some redirects are being set for a website I'm managing. It is an IIS server, and the application is mainly in ColdFusion. I have found with certainty that the IIS settings do not specify any redirects. No ColdFusion settings that I've found allow redirects from the server side. There are no <cflocation> tags or header changes in the ColdFusion files that would cause this redirect. Finally, there are no redirects in any of the .htaccess files. What other possible causes/locations could there be for a 301 redirect being set?
Thanks!
Redirects can also be done with cfheader:
<cfheader statuscode="301" statustext="Moved permanently" />
<cfheader name="location" value="http://www.mysite.com/new-location-for-content/" />
Or there can be a Javascript redirect as well, something like:
document.location='http://www.mysite.com/new-location-for-content/';
Or there may be a bit of ASP, etc., etc.
More examples here: http://www.somacon.com/p145.php
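On IIS itself, a 301 can also come from the httpRedirect element in web.config or applicationHost.config, which is easy to overlook when only the application code is searched. A sketch, reusing the example destination URL from above as a placeholder:

```xml
<system.webServer>
  <!-- "Permanent" produces HTTP 301; this applies to every request
       within the scope of the config file it appears in -->
  <httpRedirect enabled="true"
                destination="http://www.mysite.com/new-location-for-content/"
                httpResponseStatus="Permanent" />
</system.webServer>
```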

What's the best way to troubleshoot Akamai headers these days?

Traditionally, I would inspect the Akamai headers by installing a Firefox extension called akamaiheaders.xpi. Unfortunately, I think the last version of Firefox to support this was 3.
As I understand it, this plugin would add special headers to all HTTP requests that Firefox made, which would prompt Akamai to add a bunch of headers to the response (telling me whether the file was cached, where it got it from, etc.). Then, using a tool like HTTPFox or Firebug, I could easily see which assets were cached and which ones were not.
I've searched all over, but I can't find anything as simple and easy to use as that. Does anyone know of anything out there that allows me to track all the Akamai headers for all the assets my browser loads that works in either FF, Chrome, or Safari?
You can use curl and/or wget for this:
curl -H "Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" -IXGET http://www.oxfordpress.com/
or
wget -S -O /dev/null --header="Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" http://www.oxfordpress.com/
If you want to test the staging environment, you need to remember to send the Host header, e.g.:
curl -H "Host: www.oxfordpress.com" -H "Pragma: ..." -IXGET http://oxfordpress.com.edgesuite-staging.net/
One way or another, it's always about sending the proper Pragma headers and then reading the response headers.
List of Pragma headers as well as explanations for X-Cache response header can be found here: http://webspherehelp.blogspot.com/2009/07/understanding-akamai-headers-to-debug.html.
I know this question is old, but since I came across it in my search today I thought I'd add an answer for the next person who comes along.
There are a couple of extensions in the Chrome store for this now:
Akamai debug headers which just adds headers to your net panel in web inspector
Exceda Akamai Headers Extension which seems to also work for purging cache.
Akamai debug headers is the one I chose and it's working well so far.
You can use a local proxy (e.g. Fiddler or Charles Proxy, my personal favorite) and add the following header to outgoing requests:
Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no
If you're using Chrome or Chromium, you can use the Header Hacker or Pragma Header extensions. With either one, you will have to add the Pragma values manually.
If you can find the akamaiheader.xpi file, you can just open it and change the maxVersion in install.rdf to 9.*
.xpi files are just ZIP files, and on most machines you can just add .zip to the filename and doubleclick on it.
To debug akamai headers, for the Chrome browser, try this extension: CDN Headers & Cookies - Chrome Web Store
https://chrome.google.com/webstore/detail/cdn-headers-cookies/obldlamadkihjlkdjblncejeblbogmnb
Note: Enable 'Load Akamai Headers' in the settings (click the 'Lego minifig Head' icon, click the gear, and check on 'Load Akamai Headers').
It has been suggested on the Akamai community.
https://community.akamai.com/community/web-performance/blog/2015/03/31/using-akamai-pragma-headers-to-investigate-or-troubleshoot-akamai-content-delivery
They have a new version of the XPI out which you can download in Luna. There's also another Plugin which adds a 'content source' pane into Firebug for a quick reference of what on the page was Akamaised.
As I say, to download both plugins you need to login to Luna and look under 'Support' > 'More Tools' > 'Browser Extensions'. The XPI isn't publicly accessible.
YMMV but as far as I recall being told by colleagues the Exceda plugin duplicated HTTP requests which can be a bit messy whilst debugging.
For Chrome I find ModHeader + Setting up a profile where the Pragma headers are sent works fine.

JBoss Seam Excel and HTTPS

Consider the following test view using Seam's Excel library:
<e:workbook type="csv">
<e:worksheet name="Export" >
<e:cell value="1" row="0" column="0"/>
<e:cell value="2" row="1" column="1"/>
</e:worksheet>
</e:workbook>
I'd like to secure the parameters of a more complicated version via HTTPS. The unsecured view generates the file fine. When I change the scheme to "https" in view.page.xml, instead of my CSV file the browser is redirected to http://localhost/seam/docstore/document.seam with the conversation id in the query string. Other pages secured using HTTPS (e.g. login) are working fine.
Any suggestions on resolving or better diagnosing the problem?
Thanks!
It could be related to your security-constraint settings in web.xml. Take a look here.
If Excel is trying to access resources outside of the restricted area, you won't be able to generate the file.
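One arrangement to check is whether the Seam document store URL is covered by the same transport guarantee as the secured pages. A web.xml sketch — the /seam/docstore/* pattern is an assumption based on the document.seam URL the browser was redirected to:

```xml
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Seam document store</web-resource-name>
    <!-- Covers the /seam/docstore/document.seam URL seen in the redirect -->
    <url-pattern>/seam/docstore/*</url-pattern>
  </web-resource-collection>
  <!-- Require HTTPS for generated documents as well as regular pages -->
  <user-data-constraint>
    <transport-guarantee>CONFIDENTIAL</transport-guarantee>
  </user-data-constraint>
</security-constraint>
```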