Are all .dotfiles secure from HTTP requests, or only .htaccess/.htpasswd?

Following an answer to this question, I'm starting to look at keeping multiple .htaccess files for my different environments. The gist of it is that you create a file for each environment (.htaccess-dev, .htaccess-prod, etc.) so you can track them all in Git, then symlink .htaccess to whichever file you want to use in a given environment. Simple enough, and easy to rebuild if it gets destroyed.
Before I implement this, though, I wanted to do my due diligence: I can't find anything about the security of dotfiles beyond .htaccess/.htpasswd. If I had .htaccess-dev and .htaccess-prod on my production server, would they be accessible through a browser? Are there any other security considerations I should be aware of?

There's probably something like this inside your server configuration (older Apache, 2.2 syntax):
<FilesMatch "^\.ht">
    Order allow,deny
    Deny from all
</FilesMatch>
Or maybe this (newer Apache, 2.4 syntax):
<Files ".ht*">
    Require all denied
</Files>
Or even this (nginx):
location ~ /\.ht {
    deny all;
}
As the first line of each snippet suggests, these rules restrict access to any file whose name starts with .ht. However, there's no guarantee that this configuration is present; it just happens to be in the default config of some web servers.
In short, there's nothing magical about .htaccess files being inaccessible; it's all in your config file. In your case, your alternative htaccess files happen to match the rule, but you're probably better off writing similar rules for the other files you want to deny access to, so you make it explicit that you do want these stored but don't want them published.
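For example, a minimal sketch of such an explicit rule in Apache 2.4 syntax (the filenames are the ones from the question):
<FilesMatch "^\.htaccess-(dev|prod)$">
    Require all denied
</FilesMatch>
That way the alternative files are denied by name, and the protection no longer depends on them happening to match a default ^\.ht rule.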

Related

What is this file in .htaccess?

I'm really wondering why my .htaccess has the code below. Can anyone tell me what this code is?
<Files 403.shtml>
    order allow,deny
    allow from all
</Files>
deny from 212.92.53.18
It is definitely not malware.
At least, not in the sense that it's intended for malicious purposes...
If you are using cPanel and have used its IP Deny Manager to block access from 212.92.53.18, then this will have been written to your .htaccess file automatically, with the intended purpose of blocking that IP (and any others you may wish to enter):
<Files 403.shtml>
    order allow,deny
    allow from all
</Files>
deny from 212.92.53.18
Do you use cPanel, and if so, do you remember doing that?
Allowing the 403 page for everyone simply prevents a loop: if you block an IP using the 'deny from' method, then serving the 403 error page to that IP would also be blocked, which would itself trigger another 403, and so on. Explicitly allowing all access to the specific 403 file overrides the block for that one file and breaks the loop.
<Files 403.shtml>
    order allow,deny
    allow from all
</Files>
I used it myself on an old domain. It simply says "allow anyone to access the file named 403.shtml", which is the forbidden-access error page. Of course, you would usually use this if you had created a custom 403.shtml page.
The denied IP in this case would not see the custom 403.shtml and would instead get a white screen of death.
So this is not, in any way, shape, or form, malware related.
UPDATE: This answer was based on speculation using the facts provided when it was originally posted. The overall consensus seems to be that this modification of the .htaccess file is most likely the result of using server management software such as cPanel, so it's not, on its own, an indication of malware infection.
The contents of that .htaccess are a bit odd.
<Files 403.shtml>
    order allow,deny
    allow from all
</Files>
deny from 212.92.53.18
The <Files 403.shtml> part refers to the 403.shtml file, and it seems to be allowing a custom 403 Forbidden response page (an assumption based on the file naming) to be sent. The order allow,deny and the related allow from all explain it: the site seems to be blocking traffic in some way but wants that 403.shtml to come through?
But the deny from 212.92.53.18 is quite specific, and odd as a result. That is basically blocking any and all access from 212.92.53.18.
Now, typing that out, it seems like the .htaccess is set to explicitly deny access from the address 212.92.53.18, which would send a 403 response code, and the <Files 403.shtml> block allows the actual 403 Forbidden page to be sent?
But still, it seems odd that a directive blocking traffic from one single IP address would sit in an .htaccess file like that.
EDIT: I did a Google search for <Files 403.shtml> (because if you know Apache configs, that is a highly odd directive) and it seems like this might be part of some malware? Look at this page as well as this page and this other page.
Seems like this could be part of an XSS backdoor? Perhaps the .htaccess is in a malware directory, and the deny from 212.92.53.18 is stopping the infected server from accessing itself?
ANOTHER EDIT: Okay, putting on my thinking cap (and drawing on personal experience with web malware) and looking at the specificity of the deny from 212.92.53.18, I think I know what the deal is. This is part of a malware infection. I bet that 212.92.53.18 is a node on a botnet, because you can curl -I it and visit it in a browser, and it seems to be an active server. Most client IP addresses just won't do that; who has a web server exposed on a basic ISP connection, right? Unless the machine is infected. So the 403.shtml is not actually a real 403 Forbidden page but part of the malware. Meaning, a connection made FROM 212.92.53.18 would trigger 403.shtml (a server-side-include HTML file) that could be used for unauthorized access. I mean, when did anyone in 2014 last see active .shtml files on legit servers, right? It's all PHP, Python, Java, or Ruby nowadays.
This?
<Files 403.shtml>
    order allow,deny
    allow from all
</Files>
deny from xx.xx.xx.xx
Hacker? Backdoor? Malware? A Ukrainian DoS attack?
Of course it IS NOT. It's nothing of the sort.
It is automatically generated by cPanel when the "IP Blocker" is used.
cPanel writes it to your .htaccess file.
The 'deny from' is simply the IP specified when using the cPanel IP Blocker tool. cPanel is clever enough to know a little more is needed than just a simple 'deny' IPv4 entry.
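In other words, here is an annotated sketch of what the tool generates (xx.xx.xx.xx stands for whichever address was entered; Apache doesn't allow trailing comments, so each note sits on its own line):
# Applies only to the custom 403 error page itself:
<Files 403.shtml>
    # Evaluate allow lines before deny; everyone may fetch the error page
    order allow,deny
    allow from all
</Files>
# The address entered in the IP Blocker tool:
deny from xx.xx.xx.xx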
Probably it's a horrific hack and malware: Ukrainian/Russian/Indonesian hackers. In July 2016 they attacked a lot of PrestaShop sites through a vulnerability in image file uploads. They upload that 403.shtml to the root and then destroy the server and files. I have checked that my site is on their web page that lists hacked websites. Some nights they block access to your site with a DDoS attack to try to get the MySQL and FTP passwords. In PrestaShop you have to upgrade urgently to 1.6.1.16 or upload some protection files. Unfortunately, I have done that, but they don't stop and keep trying to block my webshop.
The only other option is that you set an IP block in cPanel yourself, in which case the trick is what Giacomo1968 says in their answer. Congratulations.

restrict access to all php files besides three of them

I made a site a year ago using PHP, when I had a lot less experience. My teacher and I were analysing the code today, and there seems to be a security issue. He wants me to fix it before he gives me the points I need.
I've got an index.php and an edit.php file in the root directory, and a login page in /php/login.php (which, looking back, I find a very silly place to put a login file; I would probably swap edit.php's and login.php's directories if I were to rewrite my site).
Basically, I want these three files to be accessible externally, and all other PHP files restricted from the outside, so that it's impossible to make an AJAX call to /php/phpsavefile.php from outside the system (which is the security issue I mentioned). edit.php makes the AJAX call to /php/savefile.php.
I think this is what I need to get the job done:
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
<Files index.php>
    Order Allow,Deny
    Allow from all
</Files>
But how can I add three files instead of just one after <Files and before >?
I've also tried a second approach:
Order Deny,Allow
Deny from all
This doesn't seem to work, because an AJAX call is just a regular HTTP request as well, so it also gets a 403 response.
Another approach I tried was putting the restricted PHP files inside a folder called "private"
in the same directory where "httpdocs" lives (the parent folder of the webroot). My teacher had told me about an admin folder that no one can access but the site itself. I tried including the restricted PHP files from the private folder, but it didn't seem to include them properly...
Any help or tips for this novice at .htaccess would be appreciated :-)
Edit:
.htaccess allow access to files only from includes
Ray's comment said:
Of course, because they are requested by the client. You can't "allow the client" and "not allow the client" to serve files.
I suppose this is true, but how can I prevent people from calling my AJAX file?
I secured it by checking if the user was logged in.
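For the record, a minimal sketch of the multi-file variant asked about above: <FilesMatch> takes a regular expression, so one pattern can cover all three filenames (this assumes the 2.2-style directives from the attempts in the question):
Order Deny,Allow
Deny from all
<FilesMatch "^(index|edit|login)\.php$">
    Order Allow,Deny
    Allow from all
</FilesMatch>
Note that Files/FilesMatch sections match by filename wherever it occurs, so /php/login.php is covered, but so would be any other file with one of those names elsewhere in the tree.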

Is there a way to restrict the external users to access my server files

Is there a way to restrict external users from accessing my server files? For example, when I access the directory http://puptaguig.net/evaluation/js/ it shows the 404 page (though it's not obvious), but when I tried to view controls.js at http://puptaguig.net/evaluation/js/controls.js it opened up.
IndexIgnore *
<Files .htaccess>
    order allow,deny
    deny from all
</Files>
I just want to make these files inside my server directory secure from outside viewing, for various reasons... but how?
Best regards.
siegheil/js? Should be siegheil/ns for sure?
You could chmod them 000, and then no one would see them or access them. But you can't have people accessing them and not seeing them at the same time; it can't be done.
You can add the lines below to your httpd.conf or .htaccess; this will prevent access to your JavaScript files:
<Files ~ "\.js$">
    Order allow,deny
    Deny from all
    Satisfy All
</Files>
The only way I can think of to manage this is to deny access to your js files by throwing a .htaccess into the siegheil/js/ folder that says something along the lines of:
deny from all
or simply put your code in a folder above the document root of the site itself.
After that, you use something like minify to retrieve the js files from the backend (PHP or some other server-side language) and have the minified/obfuscated code placed in another folder or output directly from the script.
With all that said, in the end the js code must be downloaded one way or another to be run by the browser, so it is impossible to prevent people from looking at your code and figuring out what it does if they really want to.
You were able to access http://puptaguig.net/evaluation/js/controls.js but not http://puptaguig.net/evaluation/js/ because most Apache installs prevent an anonymous user from viewing the directory contents, and only permit access to specific files in the directory.
There is no way to "hide" client-side JS, because without access to those files your users will not be able to run your script. As suggested by @General Redneck, you can obfuscate and minify your JS using tools like minify or UglifyJS, but those can potentially be un-minified (minification is still a good idea for performance reasons). Ultimately you are fighting against the "open" nature of the web. I'd suggest putting a license on your code, and keeping an open mind : )
If you really need something to be secure, try accomplishing the essential functionality (which you want to keep private) with a backend language like PHP or ASP.NET and feeding the relevant data to your JS script.
You should create an .htaccess file in the relevant directory that has
Options -Indexes
in it. This will prevent listing of the directory and will cause a 403 error to be raised. Your application can then handle that however it wants, to display whatever you want.
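A sketch of that combination, assuming a hypothetical custom error page at /errors/forbidden.html:
Options -Indexes
ErrorDocument 403 /errors/forbidden.html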

Allow safe FTP upload

I'd like to allow my friend to upload some photos for me over FTP to my server (shared host). It's a trusted friend, but I'd still like to block the execution of any PHP or similar scripts.
How can I use .htaccess (in a directory above the one I allow FTP access to) to block everything except a list of approved extensions (images), and to disallow .htaccess files (to prevent any further modifications)?
Does such a method still have security risks?
Thanks!
You should be able to use:
<FilesMatch ".+">
    Order Deny,Allow
    Deny from all
    # Allow from localhost, or whatever host should keep access:
    Allow from localhost
</FilesMatch>
<FilesMatch "\.(jpg|gif|stuff)$">
    Order Deny,Allow
    Allow from all
</FilesMatch>
EDIT
For preventing further modifications to .htaccess, you need to set filesystem permissions accordingly (which is OS dependent), since you are most likely giving your friend full FTP access (including delete/overwrite/append).
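For the script-execution half of the question, a common additional sketch is to switch the PHP engine off for the upload directory. This assumes PHP runs as mod_php (the module name in the IfModule guard is a guess; under CGI/FPM setups you would rely on the handler removal instead, and all of it depends on what AllowOverride permits):
<IfModule mod_php5.c>
    # Stop mod_php from executing anything in this directory
    php_flag engine off
</IfModule>
# Belt and braces: detach common PHP extensions from any handler
RemoveHandler .php .phtml .php5
RemoveType .php .phtml .php5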

Securing files and folders with htaccess

I have a couple of files on my server that contain sensitive information. Only the server should be allowed to edit these files; no one else should be able to read or access them. They are stored as .txt files.
I've stored them in a separate folder, and added a .htaccess file with:
<Files *>
    Deny from all
</Files>
My question is whether it's secure enough to store sensitive information behind .htaccess like this, or if someone can hack it and get access to the files?
Thanks
.htaccess is as secure as you can get, on a server-side basis.
All .ht* files are by default inaccessible to the public, so no one can edit or view the .htaccess file unless they access it through FTP etc. So the .htaccess file is as secure as your server is.
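For what it's worth, on Apache 2.4 the same per-directory lockdown is usually written with the newer authorization syntax (a sketch equivalent in intent to the <Files *> block above):
<Files "*">
    Require all denied
</Files>
And if you can, store the .txt files above the document root; then their safety doesn't depend on web server configuration at all.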
