executing a script uploaded through file upload in public folder - security

We have to fix some security vulnerabilities in our system, and one of the items is to disable execution of uploaded scripts/EXEs that come in through the file upload control.
We have an Excel upload facility. Let's say, hypothetically, a hacker renames an .exe to .xls and uploads it (there are ways to block that, but ignore that for now). Also assume that
the upload folder is within the public directory from which the website is served in IIS, OR
someone can access that file by specifying its full path through some API endpoint the hacker is aware of.
Now, given that there is an EXE or a script accessible to the hacker through the above means, is it possible for the hacker to run that script/EXE in some way that harms the server where the site is hosted?
I am not really a security expert, so I can't think of how that could be done. How could a hacker remotely run an EXE/script on the server, given that they do not have any access to the server?

One of the things that you should definitely do is remove the script permission from the IIS handlers; otherwise anybody can upload an ".asp", ".aspx", or other script-engine file and then execute it just by requesting it. One simple way to test this is to create a "test.asp" file containing "<%= Now() %>": if requesting it returns the current date, then anybody can upload scripts and run them on your server.
The way to disable that in IIS 7+ is to add a configuration file in a parent directory and edit the permissions for handlers. For example, assuming a child folder called "public", you can drop the following web.config into the parent to disable script execution there:
<configuration>
  <location path="public">
    <system.webServer>
      <handlers accessPolicy="Read" />
    </system.webServer>
  </location>
</configuration>
You can then test that it no longer executes the file and instead blocks it. If you want to allow downloading those files, you'll need to configure the static file handler (and request filtering) to handle everything instead, but make sure you do that for that folder only, since you don't want people downloading your source code.

Running the script would require remote access to the server, either directly or by exploiting some bug in the website code (similar to SQL injection). The risk here is mostly in hosting malware, especially if you allow user uploads to be downloaded by other users. While getting malware onto a machine is not as simple as just renaming an executable to another file type (it still has to be run as an executable rather than opened as an Excel spreadsheet, for instance, in order to function), it is possible to embed malware in various types of files such that the act of opening the file triggers execution of the malware. In that sense, you really can't tell at a glance whether a file is malware or not. It could look like an Excel file, even open properly in Excel, and still wreak havoc. The only way to be safe is to scan all user-uploaded files with a good antimalware application.
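As a rough illustration of that last point, here is a minimal sketch (in PHP, purely for illustration) of scanning an upload before accepting it. It assumes the ClamAV clamscan binary is installed on the server; the form field name "userfile" and the destination path are made up for the example:
<?php
// Minimal sketch (not from the answer above): reject an upload if ClamAV flags it.
$tmp = isset($_FILES['userfile']) ? $_FILES['userfile']['tmp_name'] : null;
if ($tmp === null || !is_uploaded_file($tmp)) {
    http_response_code(400);
    exit('No valid upload.');
}

// clamscan exits with code 0 when the file is clean and 1 when a virus is found.
exec('clamscan --no-summary ' . escapeshellarg($tmp), $output, $exitCode);
if ($exitCode !== 0) {
    unlink($tmp);
    exit('Upload rejected by the virus scan.');
}

// Store the file under a generated name, outside the public web root,
// so the web server can never be tricked into executing it.
move_uploaded_file($tmp, '/var/uploads/' . uniqid('upload_', true));
echo 'File accepted.';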
As far as running something remotely goes, though, the access to the server required to run the script would provide a much better avenue for mischief than your upload form anyway. So anyone who could manage that kind of access isn't going to bother exploiting you through your upload form, and anyone who uploads something malicious without that access can't really do anything with it.

Related

How to hide PHP code on the server from other person under root?

Good day!
There are PHP scripts, classes, and configs, all interconnected. I need to give a person access to the server so that he can work with these scripts (they are started under root), changing only the config files, while not being able to view the source code.
I've researched various free obfuscators, which convert the code into something like this:
<?php include(base64_decode('Li4vY29uZmlnLnBocA=='));include(base64_decode('cHJpdmF0ZS92ZW5kb3IvYXV0b2xvYWQucGhw'));$krc_5bf7f45b=[];foreach($bhi_6f9322e1 as $xol_e8b7be43){$xol_e8b7be43=explode(base64_decode('Og=='),$xol_e8b7be43);try{$uic_c59361f8=new \xee_d9cb1642\cko_659fc60();$uic_c59361f8->ldc_aa08cb10($xol_e8b7be43[0],$xol_e8b7be43[1]);$krc_5bf7f45b[]=$uic_c59361f8;}catch(Exception $wky_efda7a5a)
But what if the config files refer to variables by name, and after obfuscating the main working code those variables end up with different names? Surely I shouldn't force the user to run the corrected config through the obfuscator every time? So far, though, this seems to be the only option.
Is it possible on an Ubuntu server to somehow limit the ability to copy, view, or download certain files, or to use some other method of protecting/hiding them, while still keeping the ability to run the code? I thought about hiding the code somewhere deep in the file system under randomly named folders and running it through symlinks by file name, or something like that. Is that possible?
Another option is not to give root access to the server at all, but to let him launch the scripts via a browser and give him FTP access only to upload the config to a separate folder. But there are complications: the scripts run for up to a week and must be executed as root. How can this be solved?

linux hosting permissions shared server

What is the actual difference between execute and read permissions on a shared Linux server? That is, how exactly do they relate to what a web visitor can do with, for example, a PHP file? On GoDaddy shared hosting, for example, under basic permissions, if a file is not readable by the web user but is executable, the same thing happens as when it is readable but not executable: the PHP file executes. Also, on a shared Linux server, what exactly does making a file writable for the web user (someone who doesn't have server login access but visits the page through a browser) actually do?
The basic answer is: nothing. Visitors to a website aren't directly accessing any of the files, PHP or otherwise. They send an HTTP request to the web server software running on the machine (e.g. Apache), which then loads the page, executes the PHP, etc. So when you're changing permissions, the pertinent permissions to change are those of the Apache account (which, depending on the distro, may be nobody or www-data) on those files. As for what the permissions actually do, the Wikipedia page on file-system permissions describes it quite well.
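If you're unsure which account that is on your host, a quick check (my own sketch, not part of the answer) is to drop a throwaway page like this on the server; it needs PHP's POSIX extension, which most Linux builds include:
<?php
// whoami.php - made-up name; shows which account actually executes your PHP,
// i.e. whose permissions matter when you chmod your files.
$processUser = posix_getpwuid(posix_geteuid());
echo 'PHP runs as: ' . $processUser['name'] . "\n";   // e.g. www-data or nobody
echo 'Script owner: ' . get_current_user() . "\n";    // the account that owns this file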
You can test this yourself if you have a Linux box. Take a directory with files in it and sudo chmod -R 744 it. Then, as a user other than the directory's owner, try to ls -l it. You'll be able to see file names, but not any other information about the files, including their contents: opening any file in that directory with nano will behave as if you were creating a new file.
You have to remember that all this relies on what the web server wants to do, since everything has to go through the web server. It's not like reading a file from a disk. So when you request "index.php" or "index.cgi", you are not reading the contents of the file. The web server will see that the file you're requesting is a program, and it will run the program. Instead of outputting the contents of the file, it will output whatever the program outputs. This is simply a setting, and has nothing to do with permissions. Also, you do not have the ability to change this setting if you're using a shared hosting account.
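For instance (a made-up minimal example): if the server is configured to hand .php files to the PHP engine, requesting the file below returns only what the program prints, something like "2024-05-01", and never the source code itself:
<?php
// today.php - the server runs this as a program and sends back its output,
// so a visitor sees the date, not these lines of code.
echo date('Y-m-d');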
on a shared linux server, what exactly does making a file writable [...] do?
You can't make a file "writable" over HTTP. Again, this is not like accessing a file system on a local drive. You can write a server-side program that handles file uploads, but again, that has nothing to do with the permissions granted to a web visitor.
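To make that concrete, here is a bare-bones sketch of such a server-side upload handler (the field name and target directory are invented). The thing that has to be writable is the target directory, and writable by the Apache account, not by the visitor:
<?php
// upload.php - hypothetical names throughout.
$target = '/var/www/uploads';   // must be writable by the Apache user (www-data/nobody)

if (!is_writable($target)) {
    exit('The server account cannot write here - fix the directory ownership/permissions.');
}
if (isset($_FILES['userfile']) && is_uploaded_file($_FILES['userfile']['tmp_name'])) {
    move_uploaded_file($_FILES['userfile']['tmp_name'], $target . '/' . uniqid('f_', true));
    echo 'Stored.';
}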
I hope this is what you meant. Let me know if you meant something else.

Is there any security with suPHP?

I asked this question a while back and, even though I put up several bounties, I never got much of an answer (see here). More generally, I want to know whether there is any concept of security with suPHP. What's to stop anyone from going to
www.example.com/rm-f-r.php
or
www.example.com/return_some_image.php
Because those scripts get executed with the privileges of the user, it's essentially guaranteed access.
EDIT: To elaborate on the above, my problem is a conceptual one. Assume we have a file at /home/user/test.php. Let this file do anything (rm -f -r /, fetch and return a picture, reboot the computer, ...). If I point my browser at that file (assuming the containing folder is an enabled site under Apache), how do I tell the browser to only let the owner of that file execute it?
EDIT 2: I never explicitly stated this, as I assumed suPHP is only used with Apache (i.e. web browsers), but I am talking about authenticating Linux users with only a browser. If we do not authenticate, then anyone technically has access to any script on the server. (With websites this is not a problem, as the files always have permissions set to 0644, so essentially the whole world can see them. PHP files, on the other hand, generally have permissions set to 0700.)
suPHP has the effect that the PHP runtime executes with the permission of the user that authored the .php file. This means that a PHP program author can only read and write files that he himself owns, or otherwise has access to.
If you put a PHP file on your website you are making it publicly runnable by anyone that comes along to your website - using suPHP does not change this. Without logging in to your site, all web users are effectively anonymous and there is no way to reliably identify an individual. suPHP only controls the local permissions the script will have when it is executed, it does not intend to introduce any form of web user authentication or authorisation.
If you wish to control which users can actually run a script, you need to implement some login functionality and force users to log in to your site. Then add a check to the sensitive PHP script (or to the Apache configuration) which aborts the request if the currently logged-in web user is not one you wish to allow to execute that script.
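A rough sketch of that last point (the session key and the list of allowed accounts are made up for illustration):
<?php
// sensitive.php - abort the request unless a logged-in, authorised user made it.
session_start();

$allowed = ['alice', 'bob'];   // hypothetical list of permitted accounts

if (empty($_SESSION['username']) || !in_array($_SESSION['username'], $allowed, true)) {
    http_response_code(403);
    exit('You are not allowed to run this script.');
}

// ... the sensitive work happens below this check ...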

Coldfusion security issue...how to hide directory of files?

So, I decided to try to break my website... I googled my site by typing in site:mysite.com/whatever and, behold, all of the users' uploaded files were available for viewing under a specific directory.
What kind of script/countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status, but this doesn't seem to be working. I've looked all over for solutions, but I can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory of files when using a URL without a filename (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that it displays that document rather than the list of files. If you use index.cfm, it'll fire your Application.cfm/cfc in your file path and use whatever other security you've built.
(or, better, do both)
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.

LAMP: Recommended Directory and File Permissions

My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownerships relative to other server users. My concern for now is how to set up permissions within my domain so my users have read access to the files and folders they should, while my scripts still retain read/write access where needed.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Files where uploaded files from forms are stored
Files and folders where GD and cache files are being written into
Folders where my server-side scripts are stored (I used mainly PHP)
My WWW root folder (where index.php resides)
This is a perfect example of where you need the Principle of Least Privilege. Allow the web server's user read-only access to read-only content, and allow writing only to the directories/files that absolutely need to be written. Explicitly deny access to things you don't want people to read (config files, .htaccess, anything containing paths/IP addresses/passwords), and don't allow any extra processing if you're not using it (CGI executables, server-side includes).
The best way to do it is to start with denying everything and slowly open things up as you go. First try serving static content and see what the minimal set of Apache directives/modules and filesystem ownerships and permissions is to get it working. Then try some read-only PHP scripts. Then some PHP scripts that write. Then DB connectivity, and so on; you get the idea... It's a very tedious process, and you want to plan ahead the sorts of things you want to test; I tend to write long scripts of wget commands that try to do both good and bad things to the server. Make one change, restart, rerun the script, and see what changed since the last run. Observe-modify-analyze, until you can't stand looking at it anymore ;)
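As a small complementary check (my own sketch, not part of the answer above), you can also drop a throwaway PHP page on the server to see which locations the web server account can actually write to; ideally only your upload and cache directories show up as writable. The paths below are placeholders:
<?php
// writable-check.php - adjust the paths to your own layout, run once, then delete it.
$paths = [
    '/home/user/public_html',           // web root: should NOT be writable
    '/home/user/public_html/uploads',   // form uploads: writable
    '/home/user/public_html/cache',     // GD/cache output: writable
    '/home/user/config',                // config files: should NOT be writable
];

header('Content-Type: text/plain');
foreach ($paths as $path) {
    printf("%-40s %s\n", $path, is_writable($path) ? 'WRITABLE' : 'not writable');
}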
