Is there any security with suPHP?

I asked this question a while back and, even though I put up several bounties, I never got much of an answer (see here). More generally, I want to know whether there is any concept of security with suPHP. What's to stop anyone from going to
www.example.com/rm-f-r.php
or
www.example.com/return_some_image.php
Because those scripts get executed with the privileges of the user, access is essentially guaranteed.
EDIT: To elaborate on the above, my problem is a conceptual one. Assume we have a file at /home/user/test.php. Let this file do anything (rm -f -r /, fetch and return a picture, reboot the computer...). If I point my browser to that file (assuming the containing folder is an enabled site under Apache), how do I tell the server to only let the owner of that file execute it?
EDIT 2: I never explicitly stated this, as I assumed suPHP is only used with Apache (i.e. accessed through web browsers), but I am talking about authenticating Linux users with only a browser. If we do not authenticate, then technically anyone has access to any script on the server. (With static web pages this is not a problem, as they usually have permissions set to 0644, so essentially the whole world can see them; PHP files, on the other hand, generally have permissions set to 0700.)

suPHP has the effect that the PHP runtime executes with the permissions of the user who owns the .php file. This means that a PHP script can only read and write files that its author owns, or otherwise has access to.
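You can see this directly: under suPHP the process's effective uid matches the .php file's owner, whereas under plain mod_php it would be the web server account's uid. A quick sketch, assuming the POSIX extension is available:
<?php
// Under suPHP these two values match: the script runs as its owner.
// Under mod_php, posix_geteuid() would return Apache's uid instead.
echo getmyuid();      // uid of the user who owns this .php file
echo "\n";
echo posix_geteuid(); // uid the PHP process is actually running as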
If you put a PHP file on your website, you are making it publicly runnable by anyone who comes along to your website - using suPHP does not change this. Without logging in to your site, all web users are effectively anonymous, and there is no way to reliably identify an individual. suPHP only controls the local permissions the script will have when it is executed; it does not introduce any form of web user authentication or authorisation.
If you wish to control which users can actually run a script, you need to implement some login functionality and force users to log in to your site. Then add a check to the sensitive PHP script (or the Apache configuration) that aborts the request if the currently logged-in web user is not one you wish to allow to execute that script.
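A minimal sketch of such a check, where the session key and the allowed-user list are placeholders for whatever your own login system uses:
<?php
// sensitive.php - refuse everyone except explicitly allowed logged-in users.
// Assumes your login code has already put the username into $_SESSION['user'].
session_start();

$allowed = array('alice', 'bob'); // placeholder list of permitted web users

if (!isset($_SESSION['user']) || !in_array($_SESSION['user'], $allowed, true)) {
    http_response_code(403);
    exit('Forbidden');
}

// ...the sensitive work happens only past this point...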

Related

Executing a script uploaded through file upload in a public folder

We have to fix some security vulnerabilities in our system, and one of the items is to disable execution of scripts/exes uploaded through the file upload control.
We have an Excel upload facility. Let's say, hypothetically, that a hacker renames an .exe to .xls and uploads it (there are ways to block that, but ignore them for now). Also assume that:
the upload folder is within the public directory from which the website is served in IIS, OR
someone can access that file by specifying its full path through some API endpoint the hacker is aware of.
Now, given that there is an exe or a script accessible to the hacker through the above means, is it possible for the hacker to run that script/exe in some way that could harm the server where the site is hosted?
I am not really a security expert, so I can't think of how that could be possible. How can a hacker remotely run an exe/script on the server, given that they do not have any access to the server?
One of the things that you should definitely do is remove the script-execution permission from the IIS handlers; otherwise anybody can upload an ".asp", an ".aspx", or any other script-engine file and then execute it by requesting it. One simple way to test that is to create a "test.asp" file containing "<%= Now() %>"; if requesting it returns the date, then anybody can upload scripts and run them on your server.
The way to disable that in IIS 7+ is to add a configuration file in a parent directory and edit the permissions for handlers. For example, assuming a child folder called "public", you can drop in the following web.config to disable that:
<configuration>
  <location path="public">
    <system.webServer>
      <handlers accessPolicy="Read" />
    </system.webServer>
  </location>
</configuration>
You can then verify that it no longer executes the file and instead blocks it. If you want to allow the files to be downloaded, you'll need to configure the static file handler (and request filtering) to handle everything instead, but make sure you do that for that folder only, since you don't want people downloading your source code.
Running the script would require remote access to the server, either directly or by exploiting some bug in the website code (similar to SQL injection). The risk here is mostly in hosting malware, especially if you allow user uploads to be downloaded by other users. While getting malware onto a machine is not as simple as just renaming an executable to another file type (it still has to be run as an executable, rather than opened as an Excel spreadsheet, to function), it is possible to embed malware in various types of files such that the act of opening the file causes execution of the malware. In that sense, you really can't tell at a glance whether a file is malware or not: it could look like an Excel file, even open properly in Excel, and still wreak havoc. The only way to be safe is to scan all user-uploaded files with a good antimalware application.
As far as running something remotely goes, though, the kind of access to the server required to run the script would provide a much better avenue for mischief than your upload form anyway. So anyone who could manage that kind of access isn't going to bother exploiting you through your upload form, and anyone who uploads something malicious without that access can't really do anything with it.
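This question is about IIS, but the core mitigation is stack-agnostic: keep uploads out of any web-served (and therefore potentially handler-mapped) path, and never reuse the client's filename. A hedged PHP sketch of that idea - the form field name and storage path are made up:
<?php
// upload.php - store an uploaded spreadsheet outside the web root under a
// server-generated name, so no URL can ever reach it, let alone execute it.
$storageDir = '/var/app-data/uploads'; // NOT under the document root

$clientName = isset($_FILES['sheet']['name']) ? $_FILES['sheet']['name'] : '';
if (strtolower(pathinfo($clientName, PATHINFO_EXTENSION)) !== 'xls') {
    http_response_code(400);
    exit('Only .xls files are accepted.');
}

// The client-supplied filename never touches the filesystem.
$storedName = bin2hex(random_bytes(16)) . '.xls';
move_uploaded_file($_FILES['sheet']['tmp_name'], $storageDir . '/' . $storedName);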

open_basedir: how much can you trust it?

I have a VPS from GoDaddy and 3 sites on it. One of them is my main concern; I have taken every precaution in the code (100% handmade PHP/MySQL/JavaScript) to secure it and to have it run smoothly.
Now, for the first time, we are giving one of the other 2 accounts to a 3rd party to build a website. I do not know or trust them, so I want to secure my account as well as possible, meaning that I want to completely cut off their access to the filesystem, PHP, etc. of my account.
I have mod_security and suPHP built into Apache, but when I switched the PHP 5 handler from DSO to suPHP (Apache suEXEC is on), one folder's .htaccess went bananas and stopped working, resulting in "file not found". I read about some modifications I would have to make in the .htaccess files, and I would also have to fix file permissions (I tend to give 777 to folders containing user uploads - like photos and docs, which are fine for everyone to see, so no problem there - though of course I don't want anyone but me deleting them), because 777 folders/files will supposedly trigger an error.
So I did not want to get into that process, and I found the WHM > Security Center > PHP open_basedir Tweak, which states that
'PHP's open_basedir protection prevents users from opening files outside of their home directory with php'
Is this enough? I mean, will it only block opening my files? Will it protect me from someone copying his own malicious PHP files into my directory and then running them to wreak chaos?
I am kinda new at this (first time I have encountered this issue) and would like some feedback from your experience.
I can't be completely sure, but you seem to be concerned about one site you own and want to protect (the main one), and two other sites that are going to be built by a 3rd party you don't trust.
I think anyone would need way more info to tell you how they could access your "main" site. Basically, if the sites are running on the same instance of PHP/Apache, they share the same user; that means that any vulnerability found in any of the "dangerous" sites can affect your "main" site.
But you say they are on VPSes, so the sites should not be related at all, living in different virtual servers. In that case, there seems to be no reason to worry.
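To the open_basedir question itself: the restriction applies to PHP's own file functions. A minimal sketch of the behaviour, assuming open_basedir is set to /home/siteA for the untrusted account (all paths here are hypothetical):
<?php
// With open_basedir = /home/siteA in effect for this vhost:
readfile('/home/siteA/public_html/index.php'); // allowed - inside the tree
readfile('/home/siteB/secret.php');            // refused with a warning like:
// Warning: readfile(): open_basedir restriction in effect.
// File(/home/siteB/secret.php) is not within the allowed path(s): (/home/siteA)
Note, though, that open_basedir constrains only PHP's file functions. If exec() and friends are not also disabled (via disable_functions), a script can still touch the filesystem through spawned shell commands, so it is not a substitute for real per-user separation such as suPHP/suEXEC with correct ownership.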

Linux hosting permissions on a shared server

What is the actual difference between execute and read permissions on a shared Linux server? Meaning, how exactly does each relate to what a web visitor can do with, for example, a PHP file? Using GoDaddy shared hosting, for example, under basic permissions, if a file is not readable by the web user but is executable, the same thing happens as when it is readable but not executable: the PHP file executes. Also, on a shared Linux server, what exactly does making a file writable do for the web user - someone who doesn't have a server login but visits the page through a browser?
The basic answer is: nothing. Visitors to a website aren't directly accessing any of the files, PHP or otherwise. They send an HTTP request to the web server process running on the machine (e.g. Apache), which then loads the page, executes the PHP, etc. So when you're changing permissions, the pertinent ones are the permissions the Apache account (which, depending on the distro, can be nobody or www-data) has on those files. As for what the permissions actually do, this Wikipedia page describes it quite well.
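If you want to see which account that actually is on your host, a tiny probe script will tell you (a sketch assuming the POSIX extension is loaded; the filename is arbitrary):
<?php
// whoami.php - print the user the web server's PHP process runs as.
// Note: get_current_user() returns the script file's owner instead,
// which is not necessarily the same thing, so we ask the OS.
$account = posix_getpwuid(posix_geteuid());
echo 'PHP is running as: ' . $account['name'];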
You can test this yourself if you have a Linux box. Take a directory with files in it and sudo chmod -R 744 it. Then, as a user other than the owner, try ls -l in it. You'll be able to see the file names, but no other information about the files, and you won't be able to read their contents (opening any file in that directory with nano just gives a permission error).
You have to remember that all of this relies on what the web server chooses to do, since everything goes through the web server. It's not like reading a file straight from a disk. So when you request "index.php" or "index.cgi", you are not reading the contents of the file: the web server sees that the file you're requesting is a program, and it runs the program. Instead of outputting the contents of the file, it outputs whatever the program outputs. This is simply a server setting and has nothing to do with filesystem permissions. Also, you do not have the ability to change this setting if you're using a shared hosting account.
on a shared linux server, what exactly does making a file writable [...] do?
You can't make a file "writable" over HTTP. Again, this is not like accessing a file system on a local drive. You can write a server-side program that handles file uploads, but again, this has nothing to do with permissions.
I hope this is what you meant. Let me know if you meant something else.

Are folder permissions on a web server adequate security?

I'm working on a project which uses a folder full of flat-file databases. I'd like to make sure these databases are only accessible to scripts running off the server, so I set the folder permissions to 700.
This results in all scripts functioning properly, but a 403 Forbidden whenever I try to access the database folder in my browser. This is good.
However, I'm wondering: am I missing something? Is there any way - short of gaining access to my FTP account - for an outside user to access this folder? Or can I rest easy?
The proper solution is storing them outside the document root. If you cannot do that, but know that Apache will be used, create a .htaccess file in the folder with the following contents:
Order deny,allow
Deny from all
(Those are Apache 2.2 directives; on Apache 2.4 and later the equivalent is Require all denied.)
Using filesystem permissions may or may not work depending on the environment: in some setups (suEXEC/suPHP, for instance) the web server runs with the same uid as the system user that owns the files, and then your permissions-based approach wouldn't block anything.
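To illustrate the first suggestion, here is a minimal sketch of reading a flat-file database kept one level above the document root; the paths and filename are hypothetical:
<?php
// The databases live in /home/user/data while the document root is
// /home/user/public_html, so no URL can ever map to them directly.
$dbDir = dirname(__DIR__) . '/data'; // one level above the web root
$rows  = file($dbDir . '/users.db', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);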

LAMP: Recommended Directory and File Permissions

My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownership relative to other server users. My concern for now is how to set up permissions within my domain so that my users have read access to the files and folders they should, while my scripts retain read/write access where needed.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Files where uploaded files from forms are stored
Files and folders where GD and cache files are being written into
Folders where my server-side scripts are stored (I used mainly PHP)
My WWW root folder (where index.php resides)
This is a perfect example of where you need the Principle of Least Privilege. Allow the web server's user read-only access to read-only content, and allow writing only to the directories/files that absolutely need to be written. Explicitly deny access to things you don't want people to read (config files, .htaccess, anything with paths/IP addresses/passwords), and don't allow any extra processing you're not using (CGI executables, Server Side Includes).
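As a concrete sketch of those defaults, a hypothetical PHP upload fragment that keeps directories at 0755 and files at 0644 - readable, but never writable by others and never executable:
<?php
// Least-privilege modes for an upload area (names here are illustrative).
$uploadDir = __DIR__ . '/uploads';

if (!is_dir($uploadDir)) {
    mkdir($uploadDir, 0755, true); // owner rwx; everyone else read/traverse only
}

$target = $uploadDir . '/' . bin2hex(random_bytes(8));
if (move_uploaded_file($_FILES['doc']['tmp_name'], $target)) {
    chmod($target, 0644); // readable by the server, never executable
}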
The best way to do it is to start by denying everything and slowly open things up as you go. First try serving static content, and see what the minimal set of Apache directives/modules and filesystem ownerships and permissions is to get it working. Then try some read-only PHP scripts. Then try some read-write PHP scripts. Then DB connectivity, and so on; you get the idea... It's a very tedious process, and you want to plan ahead for the sorts of things you want to test; I tend to write long scripts of wget commands that try to do both good and bad things to the server. Make one change, restart, rerun the script, and see what changed since the last run. Observe, modify, analyze, until you can't stand looking at it anymore ;)
