I'm working on a project that uses a folder full of flat-file databases. I'd like to make sure these databases are accessible only to scripts running on the server, so I set the folder permissions to 700.
This results in all scripts functioning properly, but a 403 Forbidden whenever I try to access the database folder in my browser. This is good.
However, I'm wondering: am I missing something? Is there any way, short of gaining access to my FTP account, for an outside user to access this folder? Or can I rest easy?
The proper solution is to store them outside the document root. If you cannot do that, but know that Apache will be used, create a .htaccess file in the folder with the following contents:
Order deny,allow
Deny from all
Using filesystem permissions may or may not work depending on the environment: in an ideal setup the web server runs under the same uid as the system user who owns the files, and in that case your approach wouldn't work.
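For illustration, here is a minimal sketch of the outside-the-document-root approach, assuming a layout where the site lives in /home/user/public_html and the databases in /home/user/data (both paths are assumptions):

<?php
// Hypothetical layout: this script lives in /home/user/public_html,
// and the flat-file databases live in /home/user/data, one level above
// the document root, so the web server can never serve them directly.
$dataDir = dirname(__DIR__) . '/data';

// Scripts keep full read/write access through the filesystem.
$rows = file($dataDir . '/users.db', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
file_put_contents($dataDir . '/access.log', date('c') . "\n", FILE_APPEND);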
I'm working on a PHP project where a particular feature has to access files stored in an external directory: a Network Attached Storage (Linux) device. Let's say the path is /volume1/accounts, and this is mounted on the Linux server where my site is hosted, using Apache. I have to retrieve files from that directory. Is there a way in PHP to do that? My client says that it's already been mounted.
No matter what I do, I can't access it with these test snippets:
print "<pre>".print_r(scandir("/volume1/accounts/"), true)."</pre>";
print "<pre>".print_r(scandir("192.168.0.233/volume1/accounts"), true)."</pre>";
print "<pre>".print_r(scandir("192.168.0.233:/volume1/accounts"), true)."</pre>";
How am I supposed to do it? Please help me.
Generally, the PHP engine runs with the Apache server's privileges. If the mounted directory grants no permissions or ownership to the Apache user, PHP won't be able to list its files. Could you try creating a directory under /volume1/accounts/ and changing its ownership and permissions? If Apache runs with apache:apache ownership, change the directory's ownership to match.
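A small diagnostic sketch along these lines can confirm which user PHP actually runs as and whether that user can read the mount (the path comes from the question; the rest is an assumption, and the posix extension must be enabled):

<?php
$dir = '/volume1/accounts';

// Who is PHP actually running as? (requires the posix extension)
$user = posix_getpwuid(posix_geteuid());
print "<pre>Running as: " . $user['name'] . "</pre>";

// Can that user see the mount at all?
var_dump(is_dir($dir), is_readable($dir));

// If both are true, scandir() should work.
print "<pre>" . print_r(scandir($dir), true) . "</pre>";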
Assuming a user has access to all files within the public_html directory, doesn't this mean they could have access to the Node.js application code within it? Surely this is a massive security risk.
What is the normal way of handling this? Would you use file permissions to restrict the files, or place the Node directory outside of public_html and reference it somehow? If so, how?
Many thanks for any answers given!!
Yes, your server scripts should live outside public_html. Only files that you want to make available to the public should be placed under public_html.
Your node server script can refer to the "./public_html" or "../public_html" folder if it is stored in or above the folder containing the script, or it can even refer to "/path/to/public_html" if it is stored elsewhere on your filesystem.
I asked this question a while back and even though I put up several bounties, I never got much of an answer (see here). More generally, I want to know: is there any concept of security with suPHP? What's to stop anyone from going to
www.example.com/rm-f-r.php
or
www.example.com/return_some_image.php
Because those scripts get executed with the privileges of the user, access is essentially guaranteed.
EDIT: To elaborate on the above, my problem is a conceptual one. Assume we have a file at /home/user/test.php. Let this file do anything (rm -f -r /, fetch and return a picture, reboot the computer...). If I point my browser at that file (assuming the containing folder is an enabled site under Apache), how do I tell the browser to only let the owner of that file execute it?
EDIT 2: I never explicitly stated this, as I assumed suPHP is only used with Apache (i.e. web browsers), but I am talking about authenticating Linux users with only a browser. If we do not authenticate, then technically anyone has access to any script on the server. (With web sites this is not a problem, as pages generally have permissions set to 0644, so essentially the whole world can read them; PHP files, on the other hand, generally have permissions set to 0700.)
suPHP has the effect that the PHP runtime executes with the permissions of the user who authored the .php file. This means that a PHP program's author can only read and write files that he himself owns, or otherwise has access to.
If you put a PHP file on your website, you are making it publicly runnable by anyone who comes along to your website; using suPHP does not change this. Without logging in to your site, all web users are effectively anonymous, and there is no way to reliably identify an individual. suPHP only controls the local permissions the script will have when it is executed; it does not attempt to introduce any form of web user authentication or authorisation.
If you wish to control which users can actually run a script, you need to implement some login functionality and force users to log in to your site. Then add a check to the sensitive PHP script (or the Apache configuration) that aborts the request if the currently logged-in web user is not one you wish to execute that script.
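As a sketch of that per-script check, assuming a session-based login where the username is kept in $_SESSION['username'] (the session key and the whitelist below are assumptions, not part of suPHP):

<?php
session_start();

// Hypothetical whitelist of web users allowed to run this script.
$allowed = array('admin');

if (!isset($_SESSION['username']) || !in_array($_SESSION['username'], $allowed, true)) {
    http_response_code(403);   // abort the request for everyone else
    exit('Forbidden');
}

// Only authorised, logged-in users get past this point.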
My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownerships relative to other server users. My concern for now is how to set up permissions within my domain so that my users have read access to the files and folders they should, while my scripts retain read/write access where they need it.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Folders where files uploaded through forms are stored
Files and folders that GD output and cache files are written into
Folders where my server-side scripts are stored (I mainly use PHP)
My WWW root folder (where index.php resides)
This is a perfect example of where you need the Principle of Least Privilege. Give the web server's user read-only access to read-only content, and allow writing only to the directories and files that absolutely need to be written. Explicitly deny access to things you don't want people to read (config files, .htaccess, anything with paths/IP addresses/passwords), and don't allow any extra processing if you're not using it (CGI executables, Server Side Includes).
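As a rough sketch of such a scheme, applied from a one-off CLI script (the paths and modes here are illustrative assumptions, not a prescription):

<?php
// Mode choices assume PHP runs as the file owner (suPHP/suEXEC-style
// hosting); on setups where the web server runs as a separate user,
// writable directories need group access instead.
$modes = array(
    '/home/user/public_html'         => 0755, // www root: world-readable, owner-writable
    '/home/user/public_html/uploads' => 0700, // form uploads: owner only
    '/home/user/cache'               => 0700, // GD/cache output, outside the docroot
    '/home/user/config'              => 0700, // config files: no public access at all
);

foreach ($modes as $path => $mode) {
    if (!chmod($path, $mode)) {
        fwrite(STDERR, "chmod failed on $path\n");
    }
}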
The best way to do it is to start by denying everything and slowly open things up as you go. First try serving static content, and find the minimal set of Apache directives/modules and filesystem ownerships and permissions that gets it working. Then try some read-only PHP scripts. Then some read/write PHP scripts. Then DB connectivity, and so on; you get the idea... It's a very tedious process, and you want to plan ahead for the sorts of things you want to test; I tend to write long scripts with wget commands trying to do both good and bad things to the server (a PHP take on the same idea is sketched below). Make one change, restart, rerun the script, and see what changed since the last time. Observe-modify-analyze, until you can't stand looking at it anymore ;)
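A minimal sketch of such a probe script, here in PHP with curl rather than wget (the URLs and expected status codes are assumptions):

<?php
// Good and bad requests, each paired with the status code we expect.
$expect = array(
    'http://example.com/index.php'     => 200, // public page should render
    'http://example.com/data/users.db' => 403, // data files must be denied
    'http://example.com/.htaccess'     => 403, // config must never be served
);

foreach ($expect as $url => $want) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOBODY, true);    // a HEAD request is enough
    curl_exec($ch);
    $got = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    printf("%-40s want %d got %d %s\n", $url, $want, $got, $got === $want ? 'OK' : 'FAIL');
}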
What files should/should not be stored in the cgi-bin folder/directory on a web server?
Obviously, executable scripts/files that make up a web application and are called from a web browser can be stored there.
But is there a common industry opinion about what else can be stored there?
Is there a very strong reason why nothing else apart from the scripts/executables is allowed there?
My preference is to store all files belonging to an application in the cgi-bin directory/folder, as a subfolder of it, one per application.
For example directory cgi-bin/myapplication would contain:
the cgi scripts/executables
datafiles
configuration files
This simplifies installation and also simplifies the steps needed to run different versions of an application in parallel, e.g. for trialling a new version.
Concerns about access to non-script files can be addressed by setting the correct user permissions and also by using Apache .htaccess to control access to the directory and files.
It would seem that popular free applications are in favour of this everything-under-one-directory approach: current versions of Bugzilla, the free defect and feature tracking tool (e.g. 3.4.4), are offered in this structure, while earlier versions (e.g. 2.x) installed Bugzilla components to at least three folders.
Drupal, the powerful and popular free content management system, takes the same everything-under-one-directory approach; it doesn't use the cgi-bin folder, but the approach is the same.
What are your thoughts?
There is nothing special about the cgi-bin folder. It is like any publicly accessible web folder that has the "allow-script" flag set (or the equivalent for your web server), something that has become almost meaningless in the world of PHP/JSP and the like.
You should only store files that you wish to be public in any folder under your webroot. You probably don't want your data and configuration to be downloadable by any user on the internet, so don't keep them in /cgi-bin.
Certain servers may try to execute any file in /cgi-bin if it is requested. This could cause problems, especially if text or data files are executed as shell scripts.
Applications like Drupal are intended to be easy for anyone to install, regardless of what permissions they may have on their web host; this is the main reason it keeps everything together. If you have the ability to put files where you want, it is always good practice to keep non-public files outside of the webroot. If you must keep them under the webroot, then ensure that you use your server's configuration to deny public access to the non-public files.
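For instance, a .htaccess in the application folder, in the same style as the earlier answer, could deny direct requests for data and configuration files (the extension list here is just an assumption):

<FilesMatch "\.(db|dat|ini|conf)$">
    Order deny,allow
    Deny from all
</FilesMatch>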