Securing files and folders with .htaccess

I have a couple of files on my server that contain sensitive information. Only the server should be allowed to edit these files; no one else should be able to read or access them. They are stored as .txt files.
I've stored them in a separate folder, and added a .htaccess file with:
<Files *>
Deny from all
</Files>
My question is whether it's secure enough to store sensitive information behind .htaccess like this, or whether someone can get around it and access the files.
Thanks

.htaccess is about as secure as you can get on a server-side basis.
All .ht* files are inaccessible to the public by default, so no one can edit or view the .htaccess file itself unless they access it through FTP, SSH, etc. In other words, the .htaccess file is as secure as your server is.
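For reference, a stock Apache install normally ships with a block along these lines in its main configuration, which is what makes the .ht* files themselves unreachable over HTTP (this is the Apache 2.4 form; 2.2 uses Order/Deny directives instead):
<Files ".ht*">
Require all denied
</Files>
On the same note, Deny from all in your own snippet is Apache 2.2 syntax; on Apache 2.4 the native equivalent inside your <Files *> block is Require all denied.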

Related

Preventing indexing of PDF files with htaccess

I have a ton of PDF files in different folders on my website. I need to prevent them from being indexed by Google using .htaccess (since robots.txt apparently doesn't prevent indexing if other pages link to the files).
However, I've tried adding the following to my .htaccess file:
<Files ~ "\.pdf$">
Header append X-Robots-Tag "noindex, nofollow, noarchive, nosnippet"
</Files>
to no avail; the PDF files still show up when googling "site:mysite.com pdf", even after I've asked Google to re-index the site.
I don't have the option of hosting the files elsewhere or protecting them with a login system; I'd really like to simply get the htaccess file to do the job. What am I missing?
From the comment made on another answer, I understand that you want to remove files/folders that Google has already indexed. You can temporarily forbid access with the following, provided you are happy to stop anyone from accessing the files directly.
First, a workaround; after that I'll explain the approach that takes a bit longer.
<Files "path/to/pdf/* ">
Order Allow,Deny
Deny from all
Require all denied
</Files>
This way every file inside that directory is forbidden over HTTP. You can still work with the files programmatically (for example to send one as an attachment or delete it), but users will not be able to view them directly.
You can then write a server-side script that reads the file internally and outputs it, instead of exposing a direct URL (assuming the data is sensitive, as it sounds like it is).
Example:
<?php
// $filePath points at a PDF inside the protected directory (hypothetical path)
$filePath = __DIR__ . '/pdfs/example.pdf';
$contents = file_get_contents($filePath);
header('Content-Type: ' . mime_content_type($filePath));
header('Content-Length: ' . filesize($filePath));
echo $contents;
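A slightly fuller sketch of how such a script might be wired up (view-pdf.php, the pdfs/ folder name, and the f parameter are all hypothetical; the basename() call keeps the parameter from escaping the protected folder):
<?php
// view-pdf.php?f=report.pdf - hypothetical entry point for the protected folder
$name = basename(isset($_GET['f']) ? $_GET['f'] : ''); // strip any directory parts from the request
$filePath = __DIR__ . '/pdfs/' . $name;                // 'pdfs/' is the forbidden folder
if ($name === '' || !is_file($filePath)) {
    http_response_code(404);
    exit;
}
header('Content-Type: ' . mime_content_type($filePath));
header('Content-Length: ' . filesize($filePath));
readfile($filePath);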
Indexing vs. forbidding (background; you don't strictly need this now)
Preventing indexing only stops the folder/files from being indexed by Google or other search engine bots; anyone visiting the URL directly will still be able to view the file.
Forbidding means no external user or bot will be able to see or access the file/folder at all.
If you have recently forbidden access to your PDF folder, it may still show up in Google until Googlebot revisits your site and finds the files missing, or until you serve noindex for that specific folder.
You can read more about crawl rate at https://support.google.com/webmasters/answer/48620?hl=en
If you still want the already-indexed results removed, you can request removal in Google Search Console: https://www.google.com/webmasters/tools/googlebot-report?pli=1
Just paste this into your .htaccess file, using set instead of append:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>

deny access to directory listing using htaccess

I want to stop a particular directory from showing a list of its files in the browser. For example, if I go to the URL localhost/myproject/assets, it shows a listing of all the files in it; I want to deny that. At the same time, if a logged-in user accesses a specific file in it, for example localhost/myproject/assets/uploads/img/1.jpg, it should still be accessible.
Also, how do I deny access to localhost/myproject/assets/uploads/img/1.jpg if that 1.jpg was uploaded by some other user?
I'm new to Laravel. Any help is much appreciated, thanks.
You could add the following to the .htaccess file in the folder. This might help.
Options -Indexes
You cannot deny access to a jpg uploaded by another user.
If you are using Apache, you can place a .htaccess file in the folder you want to block. Then you can use deny from all to block all requests to that folder.
This works because a .htaccess file can be placed in any directory under your web root and applies to the directory it is in and its subdirectories.

Is there a way to restrict external users from accessing my server files

Is there a way to restrict external users from accessing my server files?
For example, when I access this directory, http://puptaguig.net/evaluation/js/, it shows the 404 page (though it's not obvious), but when I tried to view controls.js at http://puptaguig.net/evaluation/js/controls.js it opened up.
IndexIgnore *
<Files .htaccess>
order allow,deny
deny from all
</Files>
I just want to keep these files inside my server directory secured from outside viewing, for various reasons. But how?
Best regards.
You could chmod 000 them and then no one would see or access them. But you can't have people accessing the files and not seeing them at the same time; it can't be done.
You can add the lines below to your httpd.conf or .htaccess; this will block access to your JavaScript files:
<Files ~ "\.js$">
Order allow,deny
Deny from all
Satisfy All
</Files>
The only way I can think of to manage this is to deny access to your js files by putting a .htaccess in the siegheil/js/ folder that says something along the lines of:
deny from all
Or simply put your code in a folder above the document root of the site itself.
After that, use something like Minify to retrieve the js files from the back end (PHP or some other server-side language) and have the minified/obfuscated code placed in another folder or output directly from the script.
With all that said, in the end the js code must be downloaded one way or another for the browser to run it, so it is impossible to prevent people from looking at your code and figuring out what it does if they really want to.
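A minimal sketch of that back-end approach (js.php and the /var/www/private/js/ path are hypothetical, and this stands in for what a tool like Minify would do for you):
<?php
// js.php?file=controls - streams a whitelisted script that lives outside the web root
$allowed = array(
    'controls' => '/var/www/private/js/controls.js', // hypothetical location outside the document root
);
$key = isset($_GET['file']) ? $_GET['file'] : '';
if (!isset($allowed[$key])) {
    http_response_code(404);
    exit;
}
header('Content-Type: application/javascript');
readfile($allowed[$key]);
The page then loads the script via <script src="js.php?file=controls"></script> instead of linking to the .js file directly.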
You were able to access http://puptaguig.net/evaluation/js/controls.js but not http://puptaguig.net/evaluation/js/ because most Apache installs prevent an anonymous user from viewing the directory contents, and only permit access to specific files in the directory.
There is no way to "hide" client-side JS, because without access to those files your users will not be able to run your script. As suggested by @General Redneck, you can obfuscate and minify your js using tools like Minify or UglifyJS, but those can potentially be un-minified (minification is still a good idea for performance reasons). Ultimately you are fighting against the "open" nature of the web. I'd suggest putting a license on your code, and keeping an open mind : )
If you really need something to be secure, try accomplishing the essential functionality (which you want to keep private) with a backend language like PHP or ASP.NET and feeding the relevant data to your JS script.
You should create an .htaccess file in the relevant directory that has
Options -Indexes
in it. This will prevent listing of the directory and cause a 403 error to be raised for directory requests. Your application can then handle that however it wants, to display whatever you want.
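For example (a sketch; the error page path is hypothetical), the same .htaccess can send that 403 to a page of your choosing:
Options -Indexes
ErrorDocument 403 /errors/forbidden.html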

Prevent user from accessing the uploaded file

I have a module which enables users to upload photos to a certain path like
domain/media/img/uploadedFiles/
I would like the user to be able to upload a photo to this location, but not to be able to reach the uploaded photo by typing
domain.com/media/img/uploadedFiles/filename
I have managed to stop the files in that path from being listed by using a .htaccess file, but if the user knows the name of the uploaded file he can still reach that file.
Thanks
Assuming you're using Apache, you can block access to files in .htaccess too. For example:
<Files private.html>
Order allow,deny
Deny from all
</Files>
To prevent users from accessing any files in the directory, try putting an .htaccess file containing this inside the directory, which sets the default state to deny:
Order Allow,Deny
For more examples of specifying what resources you want to protect, see http://httpd.apache.org/docs/2.2/sections.html
See http://httpd.apache.org/docs/2.2/mod/mod_authz_host.html for more information on access control with Apache.
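Note that Order and Deny come from the Apache 2.2 access-control syntax that the links above describe. On Apache 2.4 the native equivalent for the uploads directory's .htaccess is simply:
Require all denied
The 2.2 directives keep working on 2.4 only when mod_access_compat is loaded.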

How to prevent a settings XML file from being downloaded by entering its URL, but allow PHP to read it

I have an XML file on the server containing connection details for the database server. I don't want anyone to be able to access it via URL, but PHP should be able to load the file.
Two ways:
Simply move all files of that kind outside the webroot, for example to /application instead of /public_html/myapplication. Only the pages that need to be publicly accessible (index.php etc.) belong inside the webroot. (See the sketch after the .htaccess snippet below for how PHP can still read the file.)
Or, if that's not possible or too hard, add this to a .htaccess in the folder that contains the XML file (note that the folder then cannot contain any files that should remain accessible):
Order Allow,Deny
Deny from All
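For the first option, a minimal sketch of the PHP side (the settings.xml name, the /application folder, and the XML structure are hypothetical):
<?php
// index.php lives in public_html/; settings.xml lives outside it, next to the webroot
$settings = simplexml_load_file(dirname(__DIR__) . '/application/settings.xml');
$dbHost = (string) $settings->database->host; // hypothetical XML structure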
You could use a .htaccess file: http://httpd.apache.org/docs/1.3/howto/htaccess.html
But why put it in XML? Put it in PHP as variables; then even if someone visits the page they won't be able to see the values.
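A minimal sketch of that idea (config.php and the values are hypothetical; requesting the file directly just produces a blank page because nothing is echoed):
<?php
// config.php - the values exist only as PHP variables, so a direct request shows nothing
$dbHost = 'localhost';
$dbUser = 'app_user';   // hypothetical
$dbPass = 'change-me';  // hypothetical
Other scripts then pull the values in with require 'config.php'; (ideally from a copy kept outside the webroot) instead of parsing an XML file.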
