LAMP: Recommended Directory and File Permissions - linux

My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownerships relative to other server users. My concern for now is how to set up permissions within my domain so my users have read access to the files and folders they should, while my scripts still retain read/write access where they need it.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Files and folders where uploaded files from forms are stored
Files and folders that GD and cache files are written into
Folders where my server-side scripts are stored (I use mainly PHP)
My WWW root folder (where index.php resides)

This is a perfect example of where you need the Principle of Least Privilege. Allow the web server's user read-only access to read-only content, allow writing only to the directories/files that absolutely need to be written, explicitly deny access to things you don't want people to read (config files, .htaccess, anything with paths/IP addresses/passwords), and don't enable any extra processing you're not using (CGI executables, server-side includes).
The best way to do it is to start by denying everything and slowly open things up as you go. First try serving static content, and see what the minimal set of Apache directives/modules and filesystem ownerships/permissions is that gets it working. Then try some read-only PHP scripts. Then some read/write PHP scripts. Then DB connectivity, and so on; you get the idea. It's a very tedious process, and you want to plan ahead the sorts of things you want to test; I tend to write long scripts of wget commands that try to do both good and bad things to the server. Make one change, restart, rerun the script, and see what changed since last time. Observe-modify-analyze, until you can't stand looking at it anymore ;)
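As a concrete starting point, here is a minimal sketch of the values people usually end up with, assuming your shell/FTP user owns everything and Apache runs as a separate account such as www-data; the user name, paths and the uploads/cache directory names are only examples, so substitute whatever your host actually uses:
# Public, read-only content: owner may write, everyone else (including Apache) may only read
find /home/you/public_html -type d -exec chmod 755 {} \;
find /home/you/public_html -type f -exec chmod 644 {} \;

# PHP scripts need no execute bit: Apache reads them and hands them to the interpreter
chmod 644 /home/you/public_html/index.php

# Directories the web server must write to (form uploads, GD output, caches):
# either make the Apache account their owner ...
chown www-data /home/you/public_html/uploads /home/you/public_html/cache
chmod 755 /home/you/public_html/uploads /home/you/public_html/cache
# ... or, if chown isn't allowed on your shared host, loosen only these directories
# chmod 777 /home/you/public_html/uploads /home/you/public_html/cache   # world-writable: last resort
On hosts that run PHP as your own user (suPHP/suEXEC and the like), 755/644 everywhere is usually enough, because the scripts already own the directories they need to write to.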

Related

What is the actual difference between the executable and read permissions on a shared Linux server? That is, how exactly do they relate to what a web visitor can do with, for example, a PHP file? On GoDaddy shared hosting, for instance, under basic permissions, if "web user" is not readable but is executable, the same thing happens as when it is readable but not executable: the PHP file executes. Also, on a shared Linux server, what exactly does making a file writable for the web user (someone who doesn't have a server login but visits the page through a browser) do?
The basic answer is: nothing. Visitors to a website aren't directly accessing any of the files, PHP or otherwise. They send an HTTP request to the web server process running on the machine (e.g. Apache), which then loads the page, executes the PHP, and so on. So when you're changing permissions, the pertinent permissions are the ones the Apache account (which, depending on the distro, can be either nobody or www-data) has on those files. As for what the permission bits actually do, the Wikipedia page on filesystem permissions describes it quite well.
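If you have shell access and aren't sure which account that is on your server, a quick way to check (the process name is httpd on Red Hat-style systems and apache2 on Debian-style ones):
# List the accounts the web server's processes run under
ps -eo user,comm | grep -E 'apache2|httpd' | sort | uniq -c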
You can test this yourself if you have a Linux box. Take a directory with files in it, sudo chmod -R 744 it, and then look at it as a user other than the owner. ls -l will show the file names, but none of the other information about them, and you can't get at their contents either: opening a file in that directory with nano behaves as though you were creating a new file.
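A rough sketch of that experiment, assuming you can switch to some other unprivileged account (otheruser here is made up) and that you run it somewhere both accounts can reach:
cd /tmp                                # somewhere both accounts can reach
mkdir demo && echo "hello" > demo/notes.txt
chmod -R 744 demo                      # directory becomes rwxr--r--: others may list it but not traverse it
sudo -u otheruser ls -l demo           # file names appear, but sizes/owners can't be read (stat needs x)
sudo -u otheruser cat demo/notes.txt   # "Permission denied": reading contents requires x on the directory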
You have to remember that all this relies on what the web server wants to do, since everything has to go through the web server. It's not like reading a file from a disk. So when you request "index.php" or "index.cgi", you are not reading the contents of the file. The web server will see that the file you're requesting is a program, and it will run the program. Instead of outputting the contents of the file, it will output whatever the program outputs. This is simply a setting, and has nothing to do with permissions. Also, you do not have the ability to change this setting if you're using a shared hosting account.
on a shared linux server, what exactly does making a file writable [...] do?
You can't make a file "writable" with HTTP. Again, this is not like accessing a file system on a local drive. You can make a server-side program that handles file uploads, but again, this has nothing to do with permissions.
I hope this is what you meant. Let me know if you meant something else.

Are folder permissions on a web server adequate security?

I'm working on a project which uses a folder full of flat-file databases. I'd like to make sure these databases are only accessible to scripts running off the server, so I set the folder permissions to 700.
This results in all scripts functioning properly, but a 403 Forbidden whenever I try to access the database folder in my browser. This is good.
However, I'm wondering: am I missing something? Is there any way, short of gaining access to my FTP account, for an outside user to access this folder? Or can I rest easy?
The proper solution is storing them outside the document root. If you cannot do that, but know that Apache will be used, create a .htaccess in the folder with the following contents:
order deny,allow
deny from all
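That is the Apache 2.2 syntax, which the snippets on this page assume. If the host runs Apache 2.4 without the compatibility module, the equivalent would be an .htaccess containing Require all denied instead; for example, created from a shell (the path is only a placeholder):
cat > /path/to/databases/.htaccess <<'EOF'
Require all denied
EOF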
Using filesystem permissions may or may not work depending on the environment: in setups where the web server runs your scripts under the same uid as the system user that owns the files (suEXEC/suPHP, for example), your approach wouldn't work, because the server can read the folder just as you can.

Protecting folder and its files

I wish to protect the folder with the core files of my CMS, along with its subfolders and files, from being accessed via the web, and I tried an .htaccess file with this:
order deny,allow
deny from all
The problem I have is that I can protect that folder, but then some scripts from that folder or its subfolders no longer work properly.
I also tried this:
order deny,allow
deny from all
allow from 127.0.0.1
allow from 76.xx.xx.xx
In this case 76.xx.xx.xx is the static IP of the site.
Is there any way to prevent access to files in that folder but still have everything work OK?
Another question.
I also wish to secure my site further against hackers. Is there any way to prevent malicious files and code from being injected into my scripts/files, and/or to stop my site from executing files from other sites/hosts, so that it only works with local files?
I prefer an .htaccess file, but if needed I have access to WHM to edit other files (in that case I will need a step-by-step guide). I am running the site on a Linux VPS with CentOS 5.
The usual way to do this is to put the publicly accessible files in an Apache-accessible directory and everything else in a directory out of Apache's reach. For example:
/usr/
    local/
        mycms/
            public/
            lib/
/var/
    www/
        mycms -> symlink to /usr/local/mycms/public
Or better yet, make mycms an alias in Apache config, pointing at the public directory. This way, the files that should be accessible are, those that shouldn't be aren't, and you can still reference all your other files simply by ../lib/ etc.
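A sketch of both variants, using the layout above (the include file name and its location are assumptions; adjust to your distribution, and note that the symlink also needs Options FollowSymLinks enabled for /var/www):
# Variant 1: a symlink inside the document root
ln -s /usr/local/mycms/public /var/www/mycms

# Variant 2: an Alias in the Apache configuration instead of the symlink
cat >> /etc/httpd/conf.d/mycms.conf <<'EOF'
Alias /mycms /usr/local/mycms/public
<Directory /usr/local/mycms/public>
    order allow,deny
    allow from all
</Directory>
EOF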
I know this does not really answer your question literally, and if the CMS directory structure is not under your control, this may not be the best way to do it.
Another way is through rewrites - simply rewrite all requests to your CMS directory except for your CMS's entry script into requests for the entry script.
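For instance, a minimal .htaccess along those lines, assuming mod_rewrite is available and the entry script is called index.php (adjust both to your CMS):
cat > /var/www/mycms/.htaccess <<'EOF'
RewriteEngine On
# Send every request except the entry script itself to the entry script
# (add exceptions for static assets such as CSS/images if they live here too)
RewriteCond %{REQUEST_URI} !/index\.php$
RewriteRule .* index.php [L]
EOF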

cgi-bin directory contents: What else can be stored there, apart from the CGI scripts/executables?

What files should/should not be stored in the cgi-bin folder/directory on a web server?
Obviously, the executable scripts/files that make up a web application and are called from a web browser can be stored there.
But is there a common industry opinion about what else can be stored there?
Is there a very strong reason why nothing else apart than the scripts/executables is allowed there?
My preference is to store all files belonging to an application in the cgi-bin directory/folder, as a subfolder off it - for each application.
For example directory cgi-bin/myapplication would contain:
the cgi scripts/executables
datafiles
configuration files
This simplifies installation and also simplifies the steps needed to run different versions of an application in parallel, e.g. for trialling a new version.
Concerns about access to non-script files can be addressed by using the correct user permissions and also an Apache .htaccess file to control access to the directory and files.
It would seem that popular free applications are in favour of this everything-under-one-directory approach: recent versions of Bugzilla, the free defect and feature tracking tool (e.g. 3.4.4), are offered in this structure, while earlier versions (e.g. 2.x) installed Bugzilla components to at least three folders.
Drupal, the powerful and popular free content management system, also takes this everything-under-one-directory approach; it doesn't use the cgi-bin folder, but the idea is the same.
What are your thoughts?
There is nothing special about the cgi-bin folder. It is like any publicly accessible web folder that has script execution enabled (on Apache, a ScriptAlias or Options +ExecCGI directive; other servers have their equivalents) - something that has become almost meaningless in the world of PHP/JSP and the like.
You should only store files that you wish to be public in any folder under your webroot. You probably don't want your data and configuration to be downloadable by any user on the internet, so don't keep them in /cgi-bin.
Certain servers may try to execute any file in /cgi-bin if it is requested. This could cause problems, especially if text or data files end up being executed as shell scripts.
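That behaviour usually comes from a ScriptAlias directive in the server configuration, which tells Apache to execute everything under the mapped directory as a CGI program rather than serve it as content. You can look for it like this (the configuration paths are the common defaults and may differ on your system):
grep -Ri scriptalias /etc/httpd/ /etc/apache2/ 2>/dev/null
# A typical hit looks like:
#   ScriptAlias /cgi-bin/ "/var/www/cgi-bin/"
# i.e. any file placed under that directory is executed, not served, when its URL is requested.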
Applications like Drupal are intended to be easy for anyone to install, regardless of what permissions they may have on their web host. This is the main reason they keep everything together. If you have the ability to put files where you want, it is always good practice to keep non-public files outside the webroot. If you must keep them under the webroot, then make sure you use your server's configuration to deny public access to the non-public files.
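If the data and configuration files really do have to live under the webroot, one way to apply that last point is an .htaccess that blocks them by extension, in the same Apache 2.2 style used earlier on this page (the path and the extension list are only examples; match whatever your application actually uses):
cat > /path/to/webroot/myapplication/.htaccess <<'EOF'
<FilesMatch "\.(dat|ini|conf|log)$">
    order deny,allow
    deny from all
</FilesMatch>
EOF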

Not able to delete directory

I am having frequent problems with my web hosting (it's shared).
I am not able to delete or change permissions for a particular directory. The response is:
Cannot delete. Directory may not be empty
I checked the permissions and they look OK. There are hundreds of files in this folder which I don't want.
I contacted support and they solved it, saying it was a permission issue, but it reappeared. Any suggestions?
The server is Linux.
You can't rmdir a directory with files in it. You must first rm all files and subdirectories. Many times, the easiest solution is:
$ rm -rf old_directory
It's entirely possible that some of the files or subdirectories have permission limitations that might prevent them from being removed. Occasionally, this can be solved with:
$ chmod -R +w old_directory
But I suspect that's what your support people did earlier.
This could also be because your FTP client might not be showing hidden files (such as cache files, or any hidden files your application might create), and those hidden files are preventing you from deleting the directory. (Though, in your case, I am not sure this is the cause; it could be a permission issue with your hosting provider: the web server running as another user (like apache or www) combined with your directories having global write permissions.)
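If you do have SSH access (or your host's panel offers a terminal), two quick checks along those lines (the directory name is just an example):
ls -la stubborn_dir    # -a also lists dotfiles that many FTP views hide
ls -ln stubborn_dir    # numeric uid/gid makes files owned by the web server account (e.g. apache) easy to spot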
I assume that's a response from an FTP server?
Usually, a message from an FTP server means what it says. If it says the directory is not empty, there might be files in the directory that you cannot see, which may be due to one of the following:
Your PHP/JSP/ASP/whatever scripts may run under a different user account, thus creating files which you may not be able to see/delete
Does your hosting's web interface run under your FTP account? There might be conflicting permissions if you manage some files from the web interface and others later via FTP.
Hosting server/operating system files created unintentionally, e.g. from the hosting's web interface
If the files were created by a script, write a one-time throw-away script that deletes the files and that directory, then upload and execute it (it will run as the same user that created them).
And just to be sure: some FTP servers don't support deleting a non-empty directory directly; you need to delete all the files first. Is that the case here?
