I was wondering: is this possible in .htaccess?
I'm currently caching .js, .css and all image files via PHP (and serving the cached copy only if the file has not been modified, by checking filemtime()).
However, someone suggested it's possible via .htaccess and much faster, so I was hoping someone could shed some light... I've looked around and found various snippets, but none that cover what I'm after.
If you've got mod_expires installed on your Apache server, you can put something like this in your .htaccess file. This example is PHP oriented (actually grabbed from the Drupal 7 .htaccess file) but should serve as a good starting point.
FileETag MTime Size
<IfModule mod_expires.c>
    # Enable expirations.
    ExpiresActive On
    # Cache all files for 2 weeks after access (A).
    ExpiresDefault A1209600
    <FilesMatch \.php$>
        # Do not allow PHP scripts to be cached unless they explicitly send cache
        # headers themselves. Otherwise all scripts would have to overwrite the
        # headers set by mod_expires if they want another caching behavior.
        ExpiresActive Off
    </FilesMatch>
</IfModule>
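If you only want to target the asset types from the question (.js, .css and images) rather than setting a site-wide default, a minimal sketch along the same lines could look like this (assuming mod_expires is available; the one-month window is just an example value, not something the question specifies):
<IfModule mod_expires.c>
    ExpiresActive On
    # Cache stylesheets, scripts and common image types for one month after access
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
    ExpiresByType image/gif "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
</IfModule>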
Related
For all files except PDFs in a specific directory, as well as its sub-directories on the server, I would like to set the expiration header to 10 hours. How can I do this in the .htaccess file?
<Directory "/foldername">
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType * "access plus 10 hours"
    </IfModule>
</Directory>
I understand Directory cannot be used in .htaccess, but how can I do this?
You could put the .htaccess into the specific directory and do it in there without any further restriction. Or use an If condition to apply this based on what the request URI starts with.
ExpiresByType * won't work; according to the documentation, it needs an actual MIME type as its argument.
But ExpiresDefault also exists, and allows you to specify the default expiry for all files.
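Putting that together, a sketch of a .htaccess placed directly in that folder (assuming mod_expires is available) might be:
<IfModule mod_expires.c>
    ExpiresActive On
    # Default: everything in this directory and its sub-directories expires 10 hours after access
    ExpiresDefault "access plus 10 hours"
    # Exception: don't send expiration headers for PDFs
    <FilesMatch "\.pdf$">
        ExpiresActive Off
    </FilesMatch>
</IfModule>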
I have adapted the .htaccess on my WordPress site and made additions such as the activation of GZIP.
<IfModule mod_deflate.c>
SetOutputFilter DEFLATE
</IfModule>
Then I put a rule in the subfolder that should prohibit access to certain pages.
<FilesMatch "connection\.php|data\.php|protection\.php">
order allow,deny
deny from all
</FilesMatch>
Of course, I wanted the rules created in the root folder to also apply alongside the additional rules in the subfolders (each, of course, only in its respective folder). Today I read an article that made me suspicious: it said that a .htaccess file in a subfolder completely replaces a .htaccess file from a parent folder, rather than adding its new directives on top and overriding only those directives that are defined again in the subfolder. So I tried it out, and in my opinion it is not true, because according to a GZIP tester the files also get zipped in the folder where I don't add this:
<IfModule mod_deflate.c>
SetOutputFilter DEFLATE
</IfModule>
Here is a diagram from the page where I found this article.
It's in German, but I think you will understand it. (Verzeichnis = root folder, Unterverzeichnis = subfolder)
The question is: which is right? Did I make a mistake, and do I have to re-list the root folder's rules in every subfolder's .htaccess file whenever I want to extend them, or was the text on that website simply wrong?
From the official Apache docs (https://httpd.apache.org/docs/current/howto/htaccess.html#how):
The configuration directives found in a .htaccess file are applied to the directory in which the .htaccess file is found, and to all subdirectories thereof. However, it is important to also remember that there may have been .htaccess files in directories higher up. Directives are applied in the order that they are found. Therefore, a .htaccess file in a particular directory may override directives found in .htaccess files found higher up in the directory tree. And those, in turn, may have overridden directives found yet higher up, or in the main server configuration file itself.
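Applied to your setup, that means the two files are merged rather than one replacing the other; a sketch reusing your own snippets:
# /.htaccess (root folder) - applies here and to every subfolder
<IfModule mod_deflate.c>
    SetOutputFilter DEFLATE
</IfModule>

# /subfolder/.htaccess - only adds or overrides what it actually contains;
# the DEFLATE filter from the root file still applies in this folder
<FilesMatch "connection\.php|data\.php|protection\.php">
    Order allow,deny
    Deny from all
</FilesMatch>
So what you observed with the GZIP tester is the expected behaviour: only directives that are set again in a subfolder's .htaccess override the parent's; everything else keeps applying.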
In my .htaccess file there is this:
<FilesMatch "\.(js|css|pdf|txt)$">
Header set Cache-Control "max-age=7257608"
</FilesMatch>
Now, if I alter a CSS file, the CSS changes for me when I refresh the page, but other users still get the old CSS file because of the cache. What can I do on my side to let users' browsers recognize that there is a change in the CSS file?
Generally, rather than setting a cache age in the .htaccess, make sure you're configured to use If-Modified-Since, which is documented in the Apache Caching Guide, using the mod_cache extension:
Generally, it's as simple as this, with exceptions written for secured resources:
LoadModule mem_cache_module modules/mod_mem_cache.so
<IfModule mod_mem_cache.c>
CacheEnable mem /
MCacheSize 4096
MCacheMaxObjectCount 100
MCacheMinObjectSize 1
MCacheMaxObjectSize 2048
</IfModule>
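If you'd rather keep the Cache-Control approach you already have, one hedged alternative (assuming mod_headers is enabled) is to make browsers revalidate the stylesheets on every request; they still cache the file, but get a cheap 304 Not Modified response unless it actually changed:
<IfModule mod_headers.c>
    <FilesMatch "\.css$">
        # "no-cache" still allows caching, but forces an If-Modified-Since / ETag
        # revalidation before the cached copy is reused
        Header set Cache-Control "no-cache"
    </FilesMatch>
</IfModule>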
I have a files directory for my image storage in my web root folder, and I want to know how to secure that folder. I prevent people from uploading scripts to it by checking file extensions: if a file is not an image, it will not be saved to that folder.
But faking extensions is easy. What happens if someone manages to upload a script to my files directory and access it from the browser?
So I need a way to prevent scripts from running inside that folder and only allow images to be served.
I know .htaccess can do that, but I don't know how to set it up. My .htaccess file is like this:
AddHandler cgi-script .php .pl .py .jsp .asp .htm .shtml .sh .cgi
Options -ExecCGI
ForceType application/octet-stream
<FilesMatch "(?i)\.(gif|jpe?g|png)$">
ForceType none
</FilesMatch>
Options All -Indexes
But it is not working: I saved a PHP file in that folder, then tried to access it from the browser, and I can still access it. Do you know how to make this work? Or if you have a more secure approach, please tell me.
Thank you
I think that it isn't working because you have only added an extra handler; you haven't removed the other handlers.
It is easiest to put another .htaccess file in the folder you want to protect (rather than messing with the match directive) that contains:
# Fix PHP, you should do matching commands for JSP and ASP, & html
RemoveType application/x-httpd-php php
# .... add the other remove-handler statements here .... #
# Optionally make these equivalent to text files.
# UPDATE: Taken this out as you don't want people to see PHP files at all
#AddType text/html php
# To disable cgi and server side includes & indexes
# You need to check the setup of Apache, some of the file types
# listed should already be handled as CGI (.pl, .py, .sh)
Options -ExecCGI -Includes -Indexes
# Completely block access to PHP files
<FilesMatch "\.(php|phps|html|htm|jsp|asp)$">
    Order allow,deny
    Deny from all
</FilesMatch>
# Add in any additional types to block
That covers PHP and CGI; you should add matching commands for JSP and ASP.
UPDATE: Added code to completely block access to PHP files - sorry, thought initially that you simply didn't want them executing. Also note that I've commented out the line that turns PHP files into text files.
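If the server runs Apache 2.4 or newer, the Order/Deny pair above has been superseded; an equivalent sketch of the same block using the newer authorization syntax would be:
<FilesMatch "\.(php|phps|html|htm|jsp|asp)$">
    # Apache 2.4+ replacement for "Order allow,deny" / "Deny from all"
    Require all denied
</FilesMatch>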
I've recently tried to optimize my site for speed and bandwidth. Amongst many other techniques, I've used GZIP on my .css and .js files.
Using PuTTY, I compressed the files on my site and then used:
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP:Accept-encoding} gzip
    RewriteCond %{HTTP_USER_AGENT} !Konqueror
    RewriteCond %{REQUEST_FILENAME}.gz -f
    RewriteRule ^(.*)\.(css|js)$ $1.$2.gz [QSA,L]
    <FilesMatch \.css\.gz$>
        ForceType text/css
    </FilesMatch>
    <FilesMatch \.js\.gz$>
        ForceType text/javascript
    </FilesMatch>
</IfModule>
<IfModule mod_mime.c>
    AddEncoding gzip .gz
</IfModule>
in my .htaccess file so that they get served properly because all my links are without the ".gz".
My problem is, I can't work on the GZIP files in Dreamweaver. Is there a plugin or extension of some sort that allows Dreamweaver to temporarily uncompress these files so it can read them?
Or is there a way that I can work on my local copies as regular files, and have them automatically compressed server-side when they are uploaded?
Or is there a different code editor I should be using that would completely get around this?
Or just a different technique for doing this?
I hope this question makes sense.
Thanks
Dreamweaver does not have the capability built in to natively work with zipped or gzipped files. After you pull down a file from your server, you would need to extract the file(s), make your edits, and then re-pack the file(s) to upload them. If you do not have an application locally to do this, I'd suggest 7-Zip: http://7-zip.org/
A server-side solution could also be used, but I guess you'd have to have a caching mechanism on the server that first checks whether a newer version of a file exists; if it does, gzip it, and if not, move on to serving the file. Perhaps ask a new question specific to serving gzipped files using the server language of your choice; I'm sure there are a number of solutions out there.
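One other technique worth considering (a sketch, assuming mod_deflate is enabled on your server): drop the pre-compressed .gz files entirely and let Apache compress responses on the fly, so the .css and .js files on disk stay plain text that Dreamweaver can edit directly:
<IfModule mod_deflate.c>
    # Compress text-based responses as they are served; the files on disk stay uncompressed
    AddOutputFilterByType DEFLATE text/css text/javascript application/javascript
</IfModule>
The trade-off is a little CPU per request instead of pre-compressed files, which for typical CSS/JS sizes is usually negligible.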