I've been looking around for a good answer to this, but I can't seem to find an answer that really works for me. It's a pretty simple question.
I'd like to restrict users from accessing a specific file like this:
https://www.example.com/styles/homestyles.css
However, I would like homestyles.css to remain accessible to the website itself. All the solutions I have found so far block all access to homestyles.css, which causes the website to load with no styling.
I've tried using Options -Indexes in my .htaccess file, which returns a 403 error when the user tries to access https://www.example.com/styles, though files inside the /styles directory are still accessible.
What is the best way to do this using .htaccess?
Good day. I've got a problem here. I'm trying to forbid access to a folder, and I succeeded, but now I can't use it anymore: my web design breaks when I include it. How can I fix this?
I have tried Options -Indexes, but it does not block access if someone knows the file name, e.g. folder/css/index.css.
How can I block a user from accessing it without breaking my web design?
By blocking access to the folder, you stopped the browser from being able to access it, which it needs to do in order to display it.
You won't ever be able to totally stop users from directly accessing the CSS file if they're determined enough. The browser has to download the CSS file to render the page, and if the browser can download it, so can the user.
There are steps you can take to make it harder though. You could obfuscate/minify the CSS code to make it harder for people to understand - http://www.cssobfuscator.com
You could also check to make sure the referrer header is your website to prevent hotlinking - see http://www.htaccesstools.com/hotlink-protection/
Ultimately though, it will always be accessible and visible to your users
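To make the referrer check concrete, a minimal hotlink-protection sketch in .htaccess might look like this (example.com is a placeholder for your own domain; note the Referer header is trivially spoofed, so this only deters casual access):

```apache
RewriteEngine on
# Allow requests with an empty Referer (some browsers and proxies strip it)
RewriteCond %{HTTP_REFERER} !^$
# Block requests whose Referer is not your own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Forbid direct or hotlinked requests for CSS files
RewriteRule \.css$ - [F,NC]
```

Requests made while browsing your own pages send your domain as the Referer and pass through; requests typed directly into the address bar usually send none, which is why the empty-Referer exemption is needed to keep your own site working.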
You can forbid access to the folder using the following in your /.htaccess
RewriteEngine on
# Allow only your own IP address (escape the dots, e.g. 203\.0\.113\.7)
RewriteCond %{REMOTE_ADDR} !^your_ip_address_here$
# Return 403 Forbidden to everyone else
RewriteRule ^folder/css/index\.css$ - [F,L,NC]
That means only your IP address will be able to access the protected folder.
My question pertains specifically to the two pages below, but relates more generally to methods for using clean URLs without an .htaccess file.
http://www.decitectural.com/
and
http://www.decitectural.com/about/
The pages above are hosted on Amazon S3, which does not support .htaccess files. As a result, I have found no easy way to create a clean-URL rewrite scheme that sends all requests to an index file, which in turn interprets the URL using JavaScript and loads the correct page (with AJAX or, as on decitectural, with simple div visibility toggling).
To work around this, I usually edit the Amazon S3 bucket properties and set both the index page and the error page to the index.html file. In this case, index.html is served even when an invalid path (such as /about/) is requested. This has, for the most part, been a functioning solution... That is, until I realized that index.html was being served with a 404 status, which stops Google from indexing it.
This has led me to seek out an alternative solution to this problem. Currently, as a temporary fix, I am actually creating the /about/ directory on the server with a duplicate of the index.html file in it. This works, but obviously is not a real solution to the problem.
I would appreciate any advice on how to set up a clean URL routing scheme on S3 or in any instance where an .htaccess file can't be used.
Here's a few solutions: Pretty URLs without mod_rewrite, without .htaccess
Also, I guess you can run a script to create the files dynamically from an array or database so it generates all your URLs:
/index.html
/about/index.html
/contact/index.html
...
And hook the script on every edit, run it via cron, or run it manually. Not the best in terms of performance, but hey, it should work.
I think you are going about it the wrong way. S3 gives you complete control of the page structure of your site. If you want your link to be "/about", just upload a file called "about", and you're done. (Set the headers so that the browser knows it's HTML.)
Yes, it will break if someone links to "/about/" or "/about.html". But pretty much any site will break if you mess with their links in odd ways. You will have to be vigilant when linking to your own site, because you won't have any rewrite rules to clean up for you. But you should have automation doing that.
For all kinds of reasons, I have a dir inside app/webroot/ that needs protection (/files). I'm not familiar with the inner workings of CakePHP, because I hired someone for this project. This person, however, can not supply an answer. It's also not feasible to move the directory.
What I tried was placing a .htaccess inside app/webroot/files/ and linking to an .htpasswd file outside the regular file tree. This does not work; I'm getting a 401 error. Placing the .htpasswd inside the same dir doesn't change anything.
From other questions I have gathered that I need to modify the other .htaccess files used by CakePHP, but it's not clear to me how.
The one other question that looks like mine is about protecting the entire webroot dir, which is not what I need.
I also tried securing the files with a PHP download script that checked the session, but that often fails for my clients, and I'm not sure why.
The HTTP 401 status (note: 'status', not 'error') is the 'Unauthorized' status, which is precisely the status you need for unauthorized people.
This sort of suggests to me that the protecting works, but that checking the authorization credentials to allow access fails.
It might help if you post your .htaccess code (leave out any sensitive data of course ;) )
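For comparison, a typical password-protection setup for app/webroot/files/.htaccess looks something like this (the path to the .htpasswd file below is an assumption; AuthUserFile must be an absolute filesystem path, and a wrong or unreadable path is a common cause of failed authorization):

```apache
# app/webroot/files/.htaccess -- sketch, adjust the absolute path below
AuthType Basic
AuthName "Restricted files"
# Must be an absolute server path, not a URL or a relative path
AuthUserFile /home/youraccount/.htpasswd
Require valid-user
```

It's also worth checking that the rewrite rules in CakePHP's own app/webroot/.htaccess aren't routing requests for /files through index.php before this per-directory file gets a chance to apply.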
I have a list of documents here: www.example.com/documents
I want to keep the documents folder in the public_html / htdocs folder (not above it). However, I don't want people to be able to navigate to www.example.com/documents, or for Google to index its content. But I still need to link to the documents across the site (mainly within a logged-in area).
Any suggestions?
There's a chance I misunderstood the question, but I think you'd like to disable directory listing. If so, just put
Options -Indexes
in your .htaccess file. This tells Apache not to create that fancy file list when the URI http://example.com/directory/ is requested, so the user will get a 403 error instead. Requests to files within the directory are unaffected.
You can also do various fancy things with the directory listing by using the mod_autoindex directives like IndexIgnore.
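For example, instead of disabling the listing entirely, you could keep it but hide selected entries (the patterns here are illustrative):

```apache
# Keep directory listings, but hide these entries from them
IndexIgnore *.log *.bak .htaccess
```

Note that IndexIgnore only hides entries from the generated listing; the files themselves remain accessible to anyone who knows their names.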
What are the different approaches to securing a directory?
including an index page so contents can't be viewed
the problem with this is that people can still access the files if they know the filename they're after
including an htaccess file to deny all
this seems to be the best approach, but are there any cases where an .htaccess file can be bypassed, or where .htaccess is simply not available?
restricting folder access
this is also a nice solution, but the problem is, the folder I'm trying to secure should be viewable and writable by the program.
Are there any other ways that folder security can be done?
Best practice for Apache is to use .htaccess to restrict access. This only restricts access through the web server, but that should be what you need. You can add authentication on top of this, but for most needs you can simply deny all access, which hides the directory completely.
Another method that works well alongside using .htaccess to deny direct access is to use .htaccess in your root directory to rewrite URLs. This means that a request such as /example/listItems/username/ted can be rewritten as a call to a PHP or other file such as:
/application/index.php?module=listItems&username=ted
The advantage of doing this is that the webserver does not give out paths to any directories so it is much more difficult for people to hack around looking for directories.
If you want to protect a directory of images you could also use htaccess to redirect to a different directory so that /images/image5.png is actually a call to :
/application/images/image5.png
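As a sketch, the rewrite for the listItems example above might look like this in the root .htaccess (the /application/index.php path and the parameter names are taken from the example; adjust them to your own layout):

```apache
RewriteEngine on
# Map /example/listItems/username/ted to the real script internally
RewriteRule ^example/listItems/username/([^/]+)/?$ /application/index.php?module=listItems&username=$1 [L,QSA]
```

Because this is an internal rewrite (no [R] flag), the browser's address bar still shows the pretty URL, and the real script path is never exposed.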
You could also try placing your protected directory outside your www dir, in a location that is not web-visible at all. If your app needs to read/write data, point it at that location and set the permissions so that only the app has the rights to do so.