Security: Is it good practice to give folders on the server names that are difficult to guess? - linux

Security question: Is it a good practice to give folders on the server names that are difficult to guess (8+ characters, not simply "admin" or "services")? I'm asking about folders that contain not just icons, .js files, or .css files, but .php files, and that are protected by an .htaccess file (deny from all).

No. Security through obscurity isn't.
Plus it's really irritating for anybody using the machine via a shell, FTP, etc.
What would it protect against? Regardless of names, folder access should be handled by the machine's and/or network's normal security mechanisms. If they get past those, it doesn't matter what your artifacts are named; you're pwned.

Good practice would be to keep your PHP files outside your web server's document root. For example, if your doc root is /var/www, you might keep just a single index.php file there, and all that file does is launch your app:
<?php
// index.php: the only file under the document root;
// the real application code lives outside it.
set_include_path('/something/besides/var/www');
require_once 'foo.php'; // resolved via the include path set above
require_once 'bar.php';
do_something();
This way, your web server doesn't even know that the PHP files exist, and can't serve them even if you have an accidentally misconfigured .htaccess.
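For concreteness, the resulting layout might look something like this (using the same illustrative paths as above):
/var/www/index.php                       (the only file under the document root)
/something/besides/var/www/foo.php       (application code, never served directly)
/something/besides/var/www/bar.php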

This is security through obscurity. While there is no harm in doing it, it doesn't give you anything in terms of security.

Related

How does htaccess / htpasswd affect file writing?

I've been developing a website on a local web server and I'm pretty happy with it. I'm about ready to deploy it, but I've been looking at how to limit folder access via .htaccess. My concern is that I grab some PHP variables from a document on the web server, and I'm worried that by denying access with .htaccess I'll also prevent the PHP file from reading and writing to this document. Is this the case? If so, how would I go about setting up a hierarchy in which my PHP can read and write to my document but people can't access the folder that it's in?
.htaccess is a means to configure a server on a per-directory basis.
If you are going to be writing files using PHP, then it is going to be doing so using the file system (unless you are using HTTP PUT or similar, but you'd know if you were), so the server configuration is irrelevant.
Apache will simply forward your requests to the PHP interpreter. Once the request is past Apache, all rewrites/folder restrictions have already been validated, which means PHP never knows about them (and it shouldn't).
.htaccess is a web server restriction; PHP doesn't care whether it is there or not, so you can fopen() and edit your files from PHP without problems. Of course, if you write your file to a (different) directory that is password-protected via .htaccess/htpasswd, the user will have to enter the password to read it over the web.
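To make that concrete, here is a minimal sketch (the data/ subdirectory and file name are made up, and data/ is assumed to exist and be writable by the web server user): a script served by Apache can still read and write a file in a directory blocked from the web by a deny from all .htaccess, because PHP goes through the filesystem rather than through Apache's access checks.
<?php
// Hypothetical example: data/ contains an .htaccess with "deny from all",
// so browsers cannot fetch data/settings.txt directly, yet this script
// can still read and write it through the filesystem.
$file = __DIR__ . '/data/settings.txt';

file_put_contents($file, "updated=" . date('c') . "\n", FILE_APPEND); // writing works
echo file_get_contents($file);                                        // reading works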

Is there a security difference between storing files outside of the DocumentRoot versus "deny from all" htaccess directives?

Knowing that a deny from all directive will apply to all sub-directories and files below it, and ignoring the obvious caveats of forgetting to copy the .htaccess file or making a typo when creating it...
Is there a risk in security between storing non-public files outside of the DocumentRoot versus placing an .htaccess file with a deny from all directive in each non-public directory in the DocumentRoot?
There are a few things to consider here:
.htaccess is only going to protect your files from access over the web. For example, suppose you have a typical FTP server setup with virtual users who are restricted to the document root. If an attacker gains access to your FTP server (which is not that far-fetched, given how insecure most FTP configurations are), they will have access to both the .htaccess file and any of your protected files that are in the document root.
That was just one example that may not apply to your environment, but the idea I'm really trying to get at is that .htaccess files don't give you that much depth in your security. They protect you in one context (access over the Internet) but not in others.
Your server administrator has the ability to disable specific .htaccess directives, to disable certain Apache modules (which your .htaccess file may use), and even to disable the use of .htaccess files entirely. If you don't have control over your Apache configuration (which I'm assuming is the case, since you're choosing to override it with an .htaccess file), you also don't really have control over whether your .htaccess file is going to be respected. It really comes down to your relationship with your host/server administrator and what they decide to allow.
Finally, if the .htaccess file is writable by the user your Apache server is running as, a determined hacker can modify that file. For example, if you're using WordPress, many popular themes demand write access to the .htaccess file so that they can control URL rewriting; I'd imagine some other content management systems do the same.
With all that said, using an .htaccess file (or directly altering your Apache configuration files) may still be a perfectly valid security measure for you. It depends on what your environment as a whole looks like -- how your server is configured, what you're trying to protect, etc. Hopefully I at least gave you some things to think about.
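If you do control the main Apache configuration, one option (a sketch; the directory path is illustrative) is to put the restriction there rather than in an .htaccess file, and to stop .htaccess files from being consulted at all:
# In httpd.conf or a vhost. "Require all denied" is the Apache 2.4 form;
# Apache 2.2 used "Deny from all".
<Directory "/var/www/private">
    Require all denied
    # Ignore any .htaccess files inside this tree
    AllowOverride None
</Directory>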

Upload file security >> Restricting names and extensions not enough? (cannot rename or move files)

"The most important safeguard is to keep uploaded files where they cannot be directly accessed by the users via a direct URL. This can be done either by storing uploaded files outside of the web root or configuring the web server to deny access to the uploads directory.
Another important security measure is to use system-generated file names instead of the names supplied by users when storing files on the file system. This will prevent local file inclusion attacks and also make any kind of file name manipulation by the user impossible"
I understand this; however, I am providing options for WordPress users to upload files to their image directory, so I cannot do either of these, AFAIK. The files need to go into the images directory and have a name of the user's choosing.
Here is what I am doing so far (a rough code sketch of these checks follows the list):
1) Only allowing files with names with one extension, and the extension must be from a trusted list.
2) Only allowing alphanumeric, spaces and underscores in the first part of the name and less than 30 chars.
3) Not allowing files with the name .htaccess to be uploaded
4) Only allowing admin access to the upload and using wp nonces
5) Checking mime type
6) Checking file size
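For reference, checks 1), 2), 3) and 6) might look roughly like this in PHP (a simplified sketch; the extension whitelist and size limit are just examples):
<?php
// Simplified sketch of the filename and size checks; whitelist and limit are illustrative.
function is_acceptable_upload(array $file): bool {
    $whitelist = ['jpg', 'jpeg', 'png', 'gif', 'css', 'ttf'];

    // Exactly one extension, and it must come from the trusted list
    $parts = explode('.', $file['name']);
    if (count($parts) !== 2 || !in_array(strtolower($parts[1]), $whitelist, true)) {
        return false;
    }
    // Base name: alphanumeric, spaces and underscores only, fewer than 30 characters
    if (!preg_match('/^[A-Za-z0-9_ ]{1,29}$/', $parts[0])) {
        return false;
    }
    // Reject .htaccess-style names and anything over ~2 MB
    if (strpos($file['name'], '.ht') === 0 || $file['size'] > 2 * 1024 * 1024) {
        return false;
    }
    return true;
}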
Some questions I have are:
If I deny uploading any file named '.htaccess' and am denying any file with a .php extension, shouldn't this prevent someone from uploading an image file with .php code embedded?
I understand that I can use php to copy images without malicious code, however I am planning to allow the upload of .ttf files and .css files as well.
I could scan those files with PHP for script tags, question marks ("<?"), etc. Is this advisable? If so, what would I search for beyond this?
If I am only allowing admins access and am using nonces and the above methodology, how secure is my code, and are there other things that I should be doing?
Any help is greatly appreciated!
I thought I would bump this - having a hard time finding much feedback here.
If you do a thorough scrubbing of file names, and only whitelist image, text and CSS files, what kind of security does that buy you?
Currently, I am uploading under a random name to a directory, scrubbing the name (one extension, whitelisted), re-saving in a public image directory, and only allowing access by WP admins.
You should disable PHP execution in the images directory. That would prevent a lot of the potential problems you've considered, without having to worry about having missed some tricky filename construction. Add php_flag engine off to the Apache configuration for that directory.
Unless you really need them for some reason, you should also disable .htaccess files, at least in the images directory. Everything you can do in an .htaccess file can be done in an Apache configuration file outside of any directory that might be writable by the web server. See the AllowOverride directive.
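As a sketch of what that could look like in the main Apache configuration (the directory path is illustrative, and php_flag assumes PHP runs under mod_php; with PHP-FPM/CGI you would instead remove the PHP handler for this directory):
<Directory "/var/www/wp-content/uploads/images">
    # Never execute PHP here, even if a .php file slips through
    php_flag engine off
    # Ignore any .htaccess file that ends up in the uploads tree
    AllowOverride None
</Directory>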

LAMP: Recommended Directory and File Permissions

My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownerships relative to other server users. My concern for now is how to set up permissions within my domain so that my users have read access to the files and folders they should, while my scripts retain read/write access where they need it.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Files where uploaded files from forms are stored
Files and folders where GD and cache files are being written into
Folders where my server-side scripts are stored (I used mainly PHP)
My WWW root folder (where index.php resides)
This is a perfect example of where you need the Principle of Least Privilege. Give the web server's user read-only access to read-only content, and allow writing only to the directories/files that absolutely need to be written. Explicitly deny access to things you don't want people to read (config files, .htaccess, anything with paths/IP addresses/passwords), and don't allow any extra processing if you're not using it (CGI executables, Server Side Includes).
The best way to do it is to start by denying everything and slowly opening things up as you go. First try serving static content, and see what the minimal set of Apache directives/modules and filesystem ownerships and permissions is to get it working. Then try some read-only PHP scripts. Then some read/write PHP scripts. Then DB connectivity, and so on; you get the idea. It's a very tedious process, and you want to plan ahead the sort of things you want to test; I tend to write long scripts with wget commands trying to do both good and bad things to the server. Make one change, restart, rerun the script, see what changed from the last time. Observe-modify-analyze, until you can't stand looking at it anymore ;)
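To make the "deny everything, then open up" idea concrete on the Apache side, a rough sketch (paths and patterns are illustrative, not a complete hardening recipe):
# Serve the public tree with no extra processing and no .htaccess overrides
<Directory "/var/www/example.com/public">
    Options None
    AllowOverride None
    Require all granted
</Directory>
# Block direct access to dotfiles and config-like files
<FilesMatch "^\.ht|\.(ini|conf|bak)$">
    Require all denied
</FilesMatch>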

cgi-bin directory contents: What else can be stored there, apart from the CGI scripts/executables?

What files should/should not be stored in the cgi-bin folder/directory on a web server?
Obviously, executable scripts/files that make up a web application, called from a web browser, can be stored there.
But is there a common industry opinion about what else can be stored there?
Is there a very strong reason why nothing other than the scripts/executables is allowed there?
My preference is to store all files belonging to an application under the cgi-bin directory/folder, in one subfolder per application.
For example directory cgi-bin/myapplication would contain:
the cgi scripts/executables
datafiles
configuration files
This simplifies installation and also simplifies the steps needed to run different versions of an application in parallel, e.g. for trialling a new version.
Concerns about access to non-script files can be addressed by using the correct user permissions and an Apache .htaccess file to control access to the directory and files.
It would seem that popular free applications are in favour of this everything-under-one-directory approach: recent versions of Bugzilla, the free defect and feature tracking tool (e.g. 3.4.4), are offered in this structure, while earlier versions (e.g. 2.x) installed Bugzilla components to at least three folders.
Drupal, the powerful and popular free content management system, also takes this everything-under-one-directory approach, albeit without using the cgi-bin folder; the approach is the same though.
What are your thoughts?
There is nothing special about the cgi-bin folder. It is like any publicly accessible web folder that has the "allow-script" flag set (or the equivalent for your web server), something that has become almost meaningless in the world of PHP/JSP and the like.
You should only store files that you wish to be public in any folder under your webroot. You probably don't want your data and configuration to be downloadable by any user on the internet, so don't keep them in /cgi-bin
Certain servers may try to execute any file in /cgi-bin when it is requested. This could cause problems, especially if text or data files are executed as shell scripts.
Applications like Drupal are intended to be easy for anyone to install, regardless of what permissions they may have on their web host. This is the main reason it keeps everything together. If you have the ability to put files where you want, it is always good practice to keep non-public files outside of the webroot. If you must keep them under the webroot, then ensure that you use your server's configuration to deny public access to the non-public files.
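As a rough sketch of that layout (paths are illustrative): only the scripts stay under the webroot, while data and configuration live elsewhere on the filesystem:
/var/www/cgi-bin/myapplication/app.cgi       (publicly reachable script)
/srv/myapplication/settings.conf             (configuration, outside the webroot)
/srv/myapplication/data/records.db           (data, outside the webroot)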

Resources