I've placed a simple cache control in my .htaccess file:
#cache css and javascript files for one week
<FilesMatch ".(js|css)$">
Header set Cache-Control "max-age=604800"
</FilesMatch>
When I test the desktop site with Google's PageSpeed Insights tester: https://developers.google.com/speed/pagespeed/insights ... it shows the JavaScript and images are being cached properly. However, when I test my mobile website, the caching isn't working. My .htaccess file is contained in the public_html directory alongside all my desktop files (i.e. public_html/index.html, public_html/images/, public_html/css/, public_html/.htaccess etc.). My mobile site is contained here: public_html/mobile/.
Would I need to add a second .htaccess file to the mobile directory to make it work?
Thanks.
The best option is to use the .htaccess file from HTML5 Boilerplate. It is highly optimised for caching, gzip compression, cross-domain AJAX and a lot of other features.
Also check whether mod_deflate is enabled or not.
You don't need any additional .htaccess file; rules in a single file at the root of your directory also apply to every subdirectory, including /mobile/, unless a deeper .htaccess overrides them.
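For reference, here is a minimal sketch of what such a combined file might contain, reusing the same one-week max-age as your original rule; the IfModule guards are just a defensive assumption in case either module is missing on your host:
# Cache CSS and JavaScript for one week (applies to /mobile/ as well, since
# root .htaccess rules are inherited by subdirectories)
<IfModule mod_headers.c>
  <FilesMatch "\.(js|css)$">
    Header set Cache-Control "max-age=604800"
  </FilesMatch>
</IfModule>
# Compress common text-based responses if mod_deflate is available
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>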
I have a ton of PDF files in different folders on my website. I need to prevent them from being indexed by Google using .htaccess (since robots.txt apparently doesn't prevent indexing if other pages link to the files).
However, I've tried adding the following to my .htaccess file:
<Files ~ "\.pdf$">
Header append X-Robots-Tag "noindex, nofollow, noarchive, nosnippet"
</Files>
to no avail; the PDF files still show up when googling "site:mysite.com pdf", even after I've asked Google to re-index the site.
I don't have the option of hosting the files elsewhere or protecting them with a login system; I'd really like to simply get the htaccess file to do the job. What am I missing?
From the comment you made on another answer, I understand that you want to remove files/folders that Google has already indexed. You can forbid access to them entirely, which also stops anyone from reaching them directly.
First, let me give you a workaround; after that I'll explain the approach that takes a bit longer.
<Files "path/to/pdf/* ">
Order Allow,Deny
Deny from all
Require all denied
</Files>
This way every file inside that directory is forbidden over HTTP. You can still access the files programmatically on the server (for sending as an attachment, deleting, and so on), but visitors will not be able to view them.
You can then write a server-side script that reads the file internally and outputs it, instead of exposing a direct URL (assuming the data is sensitive, as it seems to be).
Example
// $filePath is the server-side path to the protected PDF (hypothetical example path)
$filePath = '/home/youruser/public_html/pdfs/document.pdf';
$contents = file_get_contents($filePath);
header('Content-Type: ' . mime_content_type($filePath));
header('Content-Length: ' . filesize($filePath));
echo $contents;
Indexing vs. forbidding (for background)
Preventing indexing only stops the folder/files from being indexed by Google or other search-engine bots; anyone visiting the URL directly will still be able to view the file.
Forbidding access means no external entity, user or bot, will be able to see or access the file/folder at all.
If you have only recently forbidden access to your PDF folder, the files may still appear in Google's results until Googlebot revisits your site and finds them missing, or until you mark that specific folder noindex.
You can read more about crawl rate at https://support.google.com/webmasters/answer/48620?hl=en
If you still want them removed sooner, you can visit Google Search Console and request removal: https://www.google.com/webmasters/tools/googlebot-report?pli=1
Just paste this in your .htaccess file, using set instead of append:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
I'm trying to limit access to a directory based on the results of a PHP script. I have the following in my .htaccess file in the folder where the files are located:
RewriteCond %{REQUEST_URI} !=league_access.php
RewriteRule .* league_access.php
I have also tried:
RewriteEngine on
RewriteRule .* league_access.php
If you go to the directory http://www.bowling-tracker.com/bowl/league_documents/1/ you will note that it fires the league_access.php script (it currently just prints "Running the Test Script / Restricted access" to the page).
So that is acting correctly.
However, if you go to http://www.bowling-tracker.com/bowl/league_documents/1/test.html you will see that you're granted access to the page (rather than being sent to the league_access.php script).
This website is on FastComet (public hosting company) so I cannot change server settings or files except the .htaccess file.
Any help to resolve this would be greatly appreciated.
Thanks....
FastComet Team here! Part of our shared hosting environment is utilizing NginX as a reverse proxy to the Apache web service. This configuration gets the advantages of both services at the same time and ensures a better performance of your project. NginX is processing all requests for static content, such as PDF files or HTML pages. Here's a list of all file types that will be processed by the NginX service:
3gp|gif|jpg|jpeg|png|ico|wmv|avi|asf|asx|mpg|mpeg|mp4|pls|mp3|mid|wav|swf|flv|html|htm|js|css|exe|zip|tar|rar|gz|tgz|bz2|uha|7z|doc|docx|xls|xlsx|pdf|iso
However, if the request is for dynamic content, such as a PHP script, it will be passed from NginX to the Apache service. You are correctly setting the rule in question in the .htaccess file of your website, but this file is only read by the Apache service, not NginX. In other words, if there is a request for static content, such as a PDF file here:
http://www.bowling-tracker.com/bowl/league_documents/1/Rules_Thurs_Night_Mixed.pdf
or an HTML page here:
http://www.bowling-tracker.com/bowl/league_documents/1/test.html
it will be processed by NginX without considering the .htaccess rules that you have set. There is an easy way of resolving that: exclude HTML, HTM and PDF file types from NginX processing for your domain, or even for your entire hosting account. This way, those requests will be processed by the Apache web server instead of NginX. In that case, the .htaccess rules that you apply will be taken into consideration and they will work without any issues.
In my Yii project I have Changelog and Licence text files. I know about RBAC and have applied it on every controller, but how can I prevent guest users from viewing these text files? As of now, anyone can view them.
I have used this in my .htaccess file:
<Files ~ "(.txt)">
Order allow, deny
Deny from all
</Files>
But this only works for .txt files, and the Changelog and Licence files have no extension.
You can block access to all the files without extension using this rule in your site root .htaccess or Apache config/vhost file:
RewriteEngine On
# If the request is for a valid file
RewriteCond %{REQUEST_FILENAME} -f
# if there is no extension then block
RewriteRule ^[^.]+$ - [F]
Your question is a little broad, so the answer is a little general, but there are a couple of approaches:
Option 1: remove the Changelog and Licence files. If these are the Yii install's changelog and licence, they don't need to be left on the server; just ensure you're complying with the licence requirements.
Option 2
You mentioned "guest user", and .htaccess is not going to integrate well with Yii's notion of authorised users. You could move the files into a folder with a .htaccess containing the single line Deny from all; this blocks everyone except the PHP executed on your server.
You can then create a method/action in a controller which just echoes the file contents (file_get_contents or readfile). Wrap this in your authentication so only non-guest users are able to use the method (a sketch follows below).
If there are only two static files, then maybe just an action for each. If there are many files whose names change, accept an id in the controller, pass it to a model that uses scandir, check the file really exists, and send the output to your view.
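A rough sketch of such an action, assuming Yii 1.x; the protected/files/ location and the action name are placeholders, not something from the original answer:
// Hypothetical controller action (Yii 1.x assumed); paths and names are placeholders
public function actionChangelog()
{
    // Only authenticated (non-guest) users may view the file
    if (Yii::app()->user->isGuest) {
        throw new CHttpException(403, 'You are not allowed to view this file.');
    }
    // The file lives in a folder that .htaccess (Deny from all) keeps off the web
    $filePath = Yii::app()->basePath . '/files/CHANGELOG';
    if (!is_file($filePath)) {
        throw new CHttpException(404, 'File not found.');
    }
    header('Content-Type: text/plain');
    readfile($filePath);
    Yii::app()->end();
}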
Option 2.1
Instead of a folder with a .htaccess, you could also move the files to the parent of the web host's base directory, if you have that access. This means your web server cannot serve the files, but PHP can still reach them via local paths.
Option 3
In .htaccess you can use AuthType Basic, which makes the web server prompt the user for a username and password as configured in the .htaccess. This is problematic, as the interface is not user-friendly and it is very difficult to integrate with your web app's user database.
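For completeness, a minimal sketch of that Basic auth configuration; the realm name and the path to the .htpasswd file are placeholders:
# HTTP Basic authentication (placeholder realm and password-file path)
AuthType Basic
AuthName "Restricted files"
AuthUserFile /home/youruser/.htpasswd
Require valid-user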
Option 4
.htaccess can support other AuthTypes, but option 2 becomes much easier at this point.
I am extremely new to the concept of .htaccess, and I wanted to know how I could use it to allow a file to be used by a script in a .html file in the same directory as the .htaccess and the file, while blocking anyone who tries to navigate to the file directly instead of viewing it through the .html page. Thanks!
Update: Please see the comments below!
Update 2: It seems that there is no way to achieve what I wished. That's OK, though. I just used a bunch of obfuscation, and that seems to work well.
You want to restrict access to a (script) file using .htaccess so that a visitor can't link to the script file directly. Assuming this worked as described, the visitor would load the HTML file, the HTML file would render and request the script file... which would then be blocked. So this isn't the way to go, I reckon.
I would suggest changing the HTML file to PHP where possible and including the script with a PHP include/require. This way the server-side code determines what content is served.
Once you're including the file server-side, you can prevent direct access to the file by placing the code below inside your .htaccess:
#Prevent Users From Accessing .inc* files in .htaccess
<Files ~ ".inc">
Order allow,deny
Deny from all
</Files>
In the above example, direct access to .inc files is denied. Change the file extension to suit your needs.
Inside your index.php file you'll need to include the file containing your script with something like:
include 'filewithscript.inc';
This should solve your problem.
I have an XML file on the server containing connection details for the database server. I don't want anyone to be able to access it via URL, but PHP should be able to load the file.
Two ways:
Simply move all those kinds of files outside the webroot, for example to /application instead of /public_html/myapplication. You only need publicly accessible pages (index.php etc.) inside the webroot (see the sketch after these two options).
Or, if that's not possible/too hard, add this to a .htaccess in the folder that contains the XML file (but that folder cannot contain files that should be publicly accessible):
Order Allow,Deny
Deny from All
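A minimal sketch of the first approach, assuming a hypothetical db-config.xml stored in /application, one level outside the webroot; the path and element names are assumptions about your setup:
<?php
// The XML now lives outside public_html, so it cannot be requested by URL,
// but PHP can still read it from the filesystem (hypothetical path and elements)
$configPath = '/home/youruser/application/db-config.xml';
$config = simplexml_load_file($configPath);
$dbHost = (string) $config->host;
$dbUser = (string) $config->username;
$dbPass = (string) $config->password;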
You could use a .htaccess file: http://httpd.apache.org/docs/1.3/howto/htaccess.html
But why put it in XML? Put it in PHP as variables; then even if someone visits the page directly, they won't be able to see the values.
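For example, a hypothetical config.php along these lines produces no output when requested in a browser, so the credentials stay hidden (all values are placeholders):
<?php
// Plain PHP variables; visiting this file directly shows a blank page
$dbHost = 'localhost';
$dbUser = 'app_user'; // placeholder
$dbPass = 'secret';   // placeholder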