Force file to download with .htaccess

So I am using the following rule in my .htaccess:
AddType SCHM wsc
<FilesMatch "\.(wsc)$">
ForceType SCHM
Header set Content-Disposition attachment
</FilesMatch>
But when I go to the file's location it doesn't force the download.

Since the question is already answered in the comments, this is just to provide an answer the way Stack Overflow intends.
As noted in the comments, this can be solved using Apache 2's mod_headers. Since Content-Disposition is not part of the core HTTP standard, you may want to set an additional header to achieve your goal:
<FilesMatch "\.(wsc)$">
Header set Content-Type application/octet-stream
Header set Content-Disposition attachment
</FilesMatch>
Another thing you should consider is that your browser may cache the server's response. The browser will still send the request, but the request will contain a note that the browser already has the file from a given date. If the file hasn't changed since that date, the server will not send the new headers to your browser. This means that if you change the .htaccess, you may not see any impact until you disable caching in your browser or change the file's timestamp.
You can also add
Header set X-Content-Type-Options "nosniff"
for better compatibility (and maybe security). It prevents the browser from doing MIME-type sniffing, which would ignore the declared content type. See here for more information.
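Putting the pieces together, the whole block might look like the following sketch; the <IfModule> wrapper is an addition here, assumed useful so the .htaccess doesn't error out on a server where mod_headers isn't loaded:

```apache
<IfModule mod_headers.c>
<FilesMatch "\.(wsc)$">
# Generic binary type so the browser does not try to render the file
Header set Content-Type application/octet-stream
# Ask the browser to save the response rather than display it
Header set Content-Disposition attachment
# Prevent MIME sniffing from overriding the declared type
Header set X-Content-Type-Options "nosniff"
</FilesMatch>
</IfModule>
```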

RFC 2616 says in section 19.5.1, Content-Disposition:
If this header is used in a response with the application/octet-stream content-type, the implied suggestion is that the user agent should not display the response, but directly enter a `save response as...' dialog.
http://www.w3.org/Protocols/rfc2616/rfc2616-sec19.html#sec19.5.1

Related

.htaccess send headers if homepage requested

I am using the plain domain name URL to display the homepage, and within .htaccess I have
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=3600, private, must-revalidate"
</FilesMatch>
to send the Cache-Control header for HTML files. Unfortunately, when just the domain is requested, the FilesMatch obviously does not match, but I need to send the cache header anyway. ExpiresByType is not a solution, as some text/html responses need to stay uncached (for example, a PHP script sending text/html data). Sending the header from PHP is also not an option, as the homepage HTML is cached in the filesystem and served by Apache directly (via an .htaccess rule) without touching PHP. I am sure there must be some easy solution, but I'm not able to figure it out right now... thanks for help
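One possible sketch, assuming Apache 2.4 (where the <If> directive is allowed in .htaccess context): match on the request URI instead of the filename, so the bare-domain request is covered as well:

```apache
# Apache 2.4+: match the homepage request itself, not a file name
<If "%{REQUEST_URI} == '/'">
Header set Cache-Control "max-age=3600, private, must-revalidate"
</If>
```

On Apache 2.2 this directive is unavailable, and the rule would have to live in the server/vhost config instead.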

How to disable HTTP response headers that aren't set in the applicationHost.config or the web.config?

Hi, I have a Windows server with IIS 10 on which I am hosting AngularJS web apps with the help of iisnode. I've been trying to harden the response headers, and I find that I'm doubling up on some of them.
For instance, x-frame-options is listed twice. The first time it is loaded it is SAMEORIGIN, which is not set in either applicationHost.config or web.config but is being added from somewhere I'm not aware of; the second x-frame-options is shown as DENY, which is what I expect it to be, as I did set this using applicationHost.config.
A similar thing is happening with cache-control: in the browser (Chrome) it is shown as public, max-age=0, no-store when I only set no-store using applicationHost.config. So again, cache-control: public, max-age=0 is being set somewhere else, not by me.
Please can anyone tell me how to turn off these unwanted response headers?
I have searched IIS and Googled, but I keep getting pointed back toward applicationHost.config or web.config. Thanks in advance.
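For headers that IIS itself injects (X-Powered-By being the classic example), one common removal mechanism is the customHeaders section of web.config; a sketch:

```xml
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- Strip the header IIS adds by default -->
        <remove name="X-Powered-By" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>
```

Note this only covers headers added by IIS; a duplicate header set by the Node application itself (e.g. by security middleware in the app) would have to be disabled in the application's own code.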

How to configure Websphere 7 to always include "X-Content-Type-Options:nosniff" in response headers?

I'm working on an application scan using OWASP and got this report. What I think is that there should be a way to configure WAS to include the header in all response headers. Thanks in advance for all your answers.
Vulnerability:
The Anti-MIME-Sniffing header X-Content-Type-Options was not set to 'nosniff'. This allows older versions of Internet Explorer and Chrome to perform MIME-sniffing on the response body, potentially causing the response body to be interpreted and displayed as a content type other than the declared content type.
Current (early 2014) and legacy versions of Firefox will use the declared content type (if one is set), rather than performing MIME-sniffing.
Suggested Solution:
Set the "X-Content-Type-Options: nosniff" header for resources (javascript, css, etc.) that are directly served by the web server. This can be done through server configuration so this might involve documentation updates.
Affected URLs / resources:
https://css-acme-tst.usmt0520.lpc.lawson.com/sso/domain.js
https://css-acme-tst.usmt0520.lpc.lawson.com/sso/login.css
What I did so far:
I placed the tags right after the previously commented-out LoadModule line for modules/mod_headers.so and restarted my app server, but I still get the same response headers.
LoadModule headers_module modules/mod_headers.so
<Directory mod_headers.c>
Header always set X-Content-Type-Options nosniff
</Directory>
Try putting it into an IfModule block instead, or into the VirtualHost.
I have tried this one and it works fine:
LoadModule headers_module modules/mod_headers.so
<IfModule mod_headers.c>
Header always set X-Content-Type-Options nosniff
</IfModule>

Creating a great .htaccess file that handles shared resources well

I am trying to create an ideal .htaccess file that will tick all the boxes for taking advantage of server compression, but only where it makes sense, and caching of resources on public proxies, again where it makes sense. I have been feeling my way through the process and I think I am pretty much there, but I suspect there might be a bit of finessing left to do and I thought I'd invite suggestions. I have my suspicions it's not there yet because of a great tool I have discovered and I have to share that with you to begin with.
www.pingdom.com has a great suite of website analysis tools, many of which are free to use, and personally I think the best is http://tools.pingdom.com/fpt/. This shows you the load time of every element of your page, but more importantly, under its 'Performance Grade' tab it offers a breakdown of where things could be better. Now, I use a number of jQuery resources that are served by Google (and others), and I understand these should exist on many proxy servers. I'm not sure how to say that in my .htaccess file (although I have tried), and sure enough, Pingdom's analysis includes the following feedback:
The following publicly cacheable, compressible resources should have a
"Vary: Accept-Encoding" header:
•http://jmar777.googlecode.com/svn/trunk/js/jquery.easing.1.3.js
•http://kwicks.googlecode.com/svn/branches/v1.5.1/Kwicks/jquery.kwicks-1.5.1.pack.js
Well, I thought I'd done that, but then again, perhaps it's up to the servers that actually serve those resources to set those headers, and maybe there's nothing I can do about it? Is that so? Anyway, here is my .htaccess file at the moment. Please note I have the caching set insanely low because I am still experimenting with / learning it. I will adjust this up before I go live.
suPHP_ConfigPath /home/mydomain/public_html
<Files php.ini>
order allow,deny
deny from all
</Files>
<ifModule mod_deflate.c>
<filesMatch "\.(js|css|php|htm|html)$">
SetOutputFilter DEFLATE
</filesMatch>
</ifModule>
# 1 HOUR
<filesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf|htm|html)$">
Header set Cache-Control "max-age=3600, public"
</filesMatch>
# PHP - NO CACHING WANTED DUE TO USING SESSION COOKIES
<filesMatch "\.(php)$">
Header set Cache-Control "private"
</filesMatch>
# STORE BOTH COMPRESSED AND UNCOMPRESSED FILES FOR JS & CSS
<IfModule mod_headers.c>
<FilesMatch "\.(js|css|xml|gz)$">
Header append Vary Accept-Encoding
</FilesMatch>
</IfModule>
You can see I am trying to do a 'Vary: Accept-Encoding' towards the end of the file, but I'm not sure if this is what's needed. How do I tell clients to fetch jQuery and the like from the proxies those files are undoubtedly stored at, and is there anything else I can do to make my .htaccess file deliver my content faster and in a more search-engine-friendly way?
Thank you for your thoughts.
Edit:
It seems my questions here were not clear enough so here goes with some clarification:
1) Is the jQuery library, hosted at Google, something whose proxy availability is somehow under the control of my .htaccess settings, because I make remote reference to it in my PHP, and if so, how should I say, in my .htaccess file, 'please cache that library in a proxy for a year or so'?
2) How, too, should I specify that Google-hosted files should be provided compressed and uncompressed via 'Vary: Accept-Encoding'? At a guess I'd say both issues are under Google's control and not mine, so to make that absolutely explicit...
3) Are the compression choices and proxying of files like the jQuery library under my control, or under (in this case) Google's?
4) Generally, is anything in my .htaccess file expressed in a sub-optimal (long-winded) way, and how could I shorten/compact it?
5) Is anything in the .htaccess file sequenced in a way that might cause problems - for example, I refer to CSS under three separate rules - does the order matter?
(End of Edit).
Is the jQuery library, hosted at Google, something whose proxy availability is somehow under the control of my .htaccess settings, because I make remote reference to it in my PHP, and if so, how should I say, in my .htaccess file, 'please cache that library in a proxy for a year or so'?
This assertion is not correct. The browser decides whether to cache, and whether to download, based on the header exchange for that request only. So if a page load involves requests to multiple sites, then your .htaccess file(s) only influence how it caches your files. How it caches Google's is up to Google to decide. So, for example, a request to http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js received the response headers:
Age:133810
Cache-Control:public, max-age=31536000
Date:Fri, 17 Feb 2012 21:52:27 GMT
Expires:Sat, 16 Feb 2013 21:52:27 GMT
Last-Modified:Wed, 23 Nov 2011 21:10:59 GMT
Browsers will normally cache for a year but may decide to revalidate on reuse:
If-Modified-Since:Wed, 23 Nov 2011 21:10:59 GMT
And in this case ajax.googleapis.com will reply with a 304 Not Modified and the following headers:
Age:133976
Date:Fri, 17 Feb 2012 21:52:27 GMT
Expires:Sat, 16 Feb 2013 21:52:27 GMT
This short request/response dialogue will typically take ~50 ms, since this content is CDN-delivered.
You might wish to rework your other supplemental questions in this light, since some don't apply.

HTTP Headers - Cache Question

I am making a request to an image, and the response headers that I get back are:
Accept-Ranges:bytes
Content-Length:4499
Content-Type:image/png
Date:Tue, 24 May 2011 20:09:39 GMT
ETag:"0cfe867f5b8cb1:0"
Last-Modified:Thu, 20 Jan 2011 22:57:26 GMT
Server:Microsoft-IIS/7.5
X-Powered-By:ASP.NET
Note the absence of the Cache-Control header.
On subsequent requests in Chrome, Chrome knows to go to the cache to retrieve the image. How does it know to use the cache? I was under the impression that I would have to tell it with the Cache-Control header.
You have both an ETag and a Last-Modified header. It probably uses those, but for that to happen it still needs to make a conditional request with If-None-Match or If-Modified-Since, respectively. (Without any Cache-Control or Expires header, browsers may also apply a heuristic freshness lifetime based on Last-Modified and reuse the cached copy without revalidating at all.)
To set Cache-Control you have to specify it yourself. You can do it in web.config, in IIS Manager for selected folders (static, images, ...), or set it in code. The HTTP 1.1 standard recommends one year in the future as the maximum expiration time.
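For static content, the web.config route might look like the following sketch, using IIS's clientCache element; the one-year max-age follows the HTTP 1.1 recommendation mentioned above:

```xml
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Sends Cache-Control: max-age=31536000 (one year) for static files -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
```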
Setting the expiration date one year in the future is considered good practice for all static content on your site. Not having it in the headers results in If-Modified-Since requests, which can take longer than first-time requests for small static files. In these calls the ETag header is used.
When you have Cache-Control: max-age=31536000, plain HTTP responses will outnumber If-Modified-Since calls, and because of that it is good to remove the ETag header, resulting in smaller response headers for static files. IIS doesn't have a setting for that, so you have to do response.Headers.Remove("ETag"); in PreSendRequestHeaders().
And if you want to optimize your headers further, you can remove X-Powered-By: ASP.NET in the IIS settings, and the X-AspNet-Version header (although I don't see it in your response) in web.config - enableVersionHeader="false" on the system.web/httpRuntime element.
For more tips I suggest great book - http://www.amazon.com/Ultra-Fast-ASP-NET-Build-Ultra-Scalable-Server/dp/1430223839
