I have a VPS from GoDaddy with 3 sites on it. One of them is my main concern: I have taken every precaution in its code (100% hand-written PHP/MySQL/JavaScript) to secure it and to have it run smoothly.
Now, for the first time, we are giving one of the other 2 accounts to a third party to build a website. I do not know or trust them, so I want to secure my own site as well as possible, meaning I want to completely cut off their access to the filesystem, PHP, etc. of my account.
I have mod_security and suPHP built into Apache, but when I switched the PHP 5 handler from DSO to suPHP (Apache suEXEC is on), one folder's .htaccess went bananas and stopped working, resulting in "file not found" errors. I read about some modifications I have to make in the .htaccess files, and that I also have to fix file permissions (I tend to give 777 to folders containing user uploads, like photos and docs that are fine for everyone to see, so no problem there; of course I don't want anyone but me deleting them), because 777 folders/files will supposedly trigger an error under suPHP.
So I did not want to get into that process, and instead found the WHM > Security Center > PHP open_basedir Tweak, which states that
'PHP's open_basedir protection prevents users from opening files outside of their home directory with php'
Is this enough? I mean, does it only block opening my files? Will it protect me from someone copying their own malicious PHP files into my directory and then running them to wreak havoc?
I am kind of new at this (first time I have encountered this issue) and would like some feedback from your experience.
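For reference, a minimal sketch of what that tweak roughly amounts to, assuming a hypothetical account called "mainsite" (the exact paths are assumptions, and under suPHP the setting normally lives in the account's php.ini rather than the VirtualHost):

; hypothetical per-account php.ini entry: PHP may only open files under these paths
open_basedir = /home/mainsite:/usr/lib/php:/tmp

# hypothetical VirtualHost equivalent when PHP runs as DSO/mod_php
php_admin_value open_basedir "/home/mainsite:/usr/lib/php:/tmp"

Note that open_basedir only restricts what PHP itself will open; it is not a general filesystem permission.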
I can't be completely sure, but you seem to be concerned about one site you own and want to protect (the main one), while the other site(s) are going to be created by a third party you don't trust.
I think anyone would need far more information to tell you exactly how they could access your "main" site. Basically, if the sites run on the same instance of PHP/Apache, they share the same user, which means that any vulnerability found in any of the "dangerous" sites can affect your "main" site.
But you say it is a VPS, so if the sites live on separate virtual servers they should not be related at all. In that case, there seems to be no reason to worry.
I asked this question a while back, and even though I put up several bounties, I never got much of an answer (see here). More generally, I want to know: is there any concept of security with suPHP? What's to stop anyone from going to
www.example.com/rm-f-r.php
or
www.example.com/return_some_iamge.php
Because those scripts get executed with the privileges of the user, it's essentially guaranteed access.
EDIT: To elaborate on the above, my problem is a conceptual one. Assume we have a file at /home/user/test.php. Let this file do anything (rm -f -r /, fetch and return a picture, reboot the computer...). If I point my browser at that file (assuming the containing folder is an enabled site under Apache), how do I make sure that only the owner of that file can execute it?
EDIT 2: I never explicitly stated this, as I assumed suPHP is only used with Apache (i.e., web browsers), but I am talking about authenticating Linux users with only a browser. If we do not authenticate, then anyone technically has access to any script on the server. (With web sites this is not a problem, as the files always have permissions set to 0644, so essentially the whole world can see them. PHP files, on the other hand, generally have permissions set to 0700.)
suPHP has the effect that the PHP runtime executes with the permissions of the user who owns the .php file. This means that a PHP program author can only read and write files that they themselves own, or otherwise have access to.
If you put a PHP file on your website you are making it publicly runnable by anyone that comes along to your website - using suPHP does not change this. Without logging in to your site, all web users are effectively anonymous and there is no way to reliably identify an individual. suPHP only controls the local permissions the script will have when it is executed, it does not intend to introduce any form of web user authentication or authorisation.
If you wish to control which users can actually run a script, you need to implement some login functionality and force users to log in to your site. Then add a check to the sensitive PHP script (or to the Apache configuration) that aborts the request if the currently logged-in web user is not one you wish to allow to execute that script.
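As a minimal sketch, assuming a session-based login where your own login code has already set $_SESSION['user'], and a hypothetical allow-list, the check at the top of a sensitive script might look like this:

<?php
// Hypothetical guard: only proceed for an allowed, already-authenticated web user.
session_start();

$allowed_users = array('admin');   // assumption: whoever you wish to allow

if (!isset($_SESSION['user']) || !in_array($_SESSION['user'], $allowed_users, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied.');
}

// ...the sensitive work continues here...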
Security question: is it good practice to name folders on the server with names that are difficult to guess (8+ symbols, not a simple "admin" or "services")? I'm asking about folders that contain not just icons, .js files, or .css files, but .php files, and that are protected by an .htaccess file (deny from all).
No. Security through obscurity isn't.
Plus it's really irritating for anybody using the machine via a shell, ftp, etc.
What would it protect against? Regardless of names, folder access should be handled by the machine's and/or network's normal security mechanisms. If they get past that, it doesn't matter what your artifacts are named: Ur PwNeD.
Good practice would be to keep your PHP files outside your web server's document root. E.g., if your doc root is /var/www, then you might keep just a single index.php file there, and all that file does is launch your app:
<?php
// Front controller: the only PHP file inside the document root.
// It pulls the real application code from outside /var/www.
set_include_path('/something/besides/var/www');
require_once 'foo.php';
require_once 'bar.php';
do_something();
This way, your web server doesn't even know that the PHP files exist, and can't serve them even if you have an accidentally misconfigured .htaccess.
This is security through obscurity. While there is no harm in doing it, it doesn't give you anything in terms of security.
My major weakness is securing my sites -- I know, a bad weakness.
I have a site where, when I view the source in Firebug, I see all kinds of scripts with the src http://mylocksmithusa.com/sitebuilder/acura2002.php -- but I can't find them anywhere in my files. How did they get there, how can I find them, how can I remove them, and how do I stop them from happening again?
Similarly, another site of mine keeps having its index.php file rewritten, mostly just having some of the code erased and then a bunch of <iframe>jareqjj93u8q2u35w</iframe> gibberish added to the bottom.
My FTP passwords are very secure -- should I change them and hope it stops? Or is there more to it that I just don't understand about locking down my sites from this kind of dangerous abuse?
Thanks!
Looks like you've been the victim of a SQL injection attack, or of trusting unsanitised input from your query strings.
If files are physically being appended to, there's also the chance that you've got some code on your site that allows file uploads without checking that the user is authenticated, and/or without checking the type of file being uploaded. This would allow a miscreant to upload a script that modifies every file on your site and adds these links.
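As a rough sketch of that kind of check (the form field name, the allow-list, and the destination directory are all assumptions for illustration), an upload handler that at least verifies the user is logged in and restricts file types might look like this:

<?php
// Hypothetical upload handler: require a logged-in user and an allow-listed extension.
session_start();

if (!isset($_SESSION['user'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Login required.');
}

$allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf');          // assumed allow-list
$name    = $_FILES['upload']['name'];                          // assumed form field "upload"
$ext     = strtolower(pathinfo($name, PATHINFO_EXTENSION));

if (!in_array($ext, $allowed, true)) {
    exit('File type not allowed.');
}

// Store under a generated name, in a directory that does not execute PHP.
$dest = '/home/user/uploads/' . uniqid('', true) . '.' . $ext; // assumed path
move_uploaded_file($_FILES['upload']['tmp_name'], $dest);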
A common source of these iframe-addition attacks is infected client computers leaking their FTP passwords. So you need to check both the server and any clients you have used to connect to it for malware before changing the passwords. (And preferably change to SFTP; nobody should still be using FTP in 2009.)
Don't just assume your machines aren't infected because you're running anti-virus software. Today's AVs are pathetically, hopelessly behind the malware writers. Get multiple opinions, and if an AV finds anything, don't trust it to fix the problem, because chances are it will fail, potentially leaving infections behind. Instead, reinstall the OS.
In the first case, with the HTML seemingly added at serve time, the server itself may be infected and need reinstalling. Either way, take the sites off the web until you're sure they're clean.
You are asking us how someone popped your site and the only honest answer is "we don't know".
Check FTP and other services, and check your code everywhere it accepts input (cookies, query string, POST params, etc.), but the most likely explanation is that you installed an old, known-vulnerable version of WordPress or some other web app and a roving worm found it and exploited it.
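On the input side, a minimal sketch of handling a query-string parameter safely (the database credentials and the users table are assumptions) is to use a parameterised query instead of concatenating the value into SQL:

<?php
// Hypothetical example: treat user input as data, never as SQL.
$pdo = new PDO('mysql:host=localhost;dbname=example', 'dbuser', 'dbpass');   // assumed credentials
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$id = isset($_GET['id']) ? $_GET['id'] : '';   // untrusted query-string input

$stmt = $pdo->prepare('SELECT name FROM users WHERE id = ?');   // assumed table/column
$stmt->execute(array($id));
$row = $stmt->fetch(PDO::FETCH_ASSOC);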
My site was attacked the same way too, with <script src=http://mylocksmithusa.com/sitebuilder/acura2002.php></script>. After I checked it, apparently almost all of my .js files had document.write([the script above]) appended on the last line. Additionally, almost every .html file had an iframe inserted just after the <body> tag (and then a few of the last lines in those files were removed), and several .php files on my site had eval([super long code]) prepended at the beginning of the file.
After cleaning up the files, I changed my FTP password yesterday, and today the site isn't getting hacked anymore (before, the files would be changed every few hours). So I suspect some worm or something already has access to your FTP. What you can do is change the FTP password and take your site offline for a while, until you have cleaned up the files.
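As a rough sketch of how to hunt for that kind of injected code (the patterns and the starting directory are assumptions; review every match by hand rather than deleting blindly):

<?php
// Hypothetical scanner: recursively flag files containing strings that
// commonly show up in this kind of injection.
$patterns = array('eval(base64_decode', 'document.write(unescape', '<iframe');   // assumed indicators
$root     = '/home/user/public_html';                                            // assumed web root

$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));
foreach ($it as $file) {
    if (!$file->isFile()) {
        continue;
    }
    $contents = file_get_contents($file->getPathname());
    foreach ($patterns as $p) {
        if (strpos($contents, $p) !== false) {
            echo $file->getPathname() . " contains: " . $p . "\n";
            break;
        }
    }
}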
There are lots of similar questions to this, but they all seem to involve either configuring permissions or installing a plugin.
I'm looking for a solution that is "dumb" - i.e. one that allows the code to be deployed from source control and automatically blocks access to certain paths, without anyone needing to configure the server.
I only need directory & file blocking, none of the other abilities that .htaccess has.
Just to be clear, we are using ColdFusion, not .NET, and whilst CF has assorted ways to handle its own scripts, it doesn't do anything with non-CFML scripts. (It is possible to use, for example, config.xml.cfm, but that is a messy solution that requires updating code, etc.)
(Of course, ideally these directories/files shouldn't even be in the webroot, and if I could switch to Apache or IIS7 I could simply use .htaccess, but those aren't options at the moment.)
My current solution is going to be a readme.deploy.txt that contains instructions on how to manually set the permissions on the relevant files & directories in IIS Manager, but obviously I'd much prefer to avoid human intervention for it - any suggestions?
You could create a script that does this as part of your deployment cycle, say a scheduled task that runs a PowerShell or batch script to set up the environment.
With IIS 6 this is going to require mucking with the metabase, which could solve your problem, but your scripts will need access to the system metabase and the ability to execute system commands, or you are going to have to learn how to use the IIS 6 metabase command-line scripts.
See this article.
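As a rough sketch, assuming the default AdminScripts location, site ID 1, and a hypothetical "protected" folder, a deployment script could flip the relevant metabase flag with adsutil.vbs:

rem Hypothetical example: deny read access on the "protected" folder of site 1
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/ROOT/myapp/protected/AccessRead False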
If they have root access, maybe it would be wise to just create an installation utility that can tweak the settings for them.
Good luck,
mike
Well, for ASP.NET specifically you have the web.config files, which allow you to control some aspects of those web folders.
However, I'm not aware of anything like .htaccess for IIS.
.NET has Routing, which allows you to "rewrite" paths. The MVC framework has it built in... I'm not sure how to configure/use it for "normal" ASP.NET applications.
Update: didn't know you weren't on .NET.
Maybe you're just looking for File/Folder permission settings? Don't know anything about setting those by using a config file...
In the past I have had one of our IT specialists accidentally move the robots.txt from staging to production, blocking Google and others from indexing our customers' site in production. Is there a good way of managing this situation?
Thanks in advance.
Ask your IT guys to change the file permissions on robots.txt to "read-only" for all users (a sample command is sketched after this list), so that overwriting it takes the extra steps of:
becoming Administrator/root
changing the permissions to allow writes
overwriting robots.txt with the new file
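On a Unix-style host, a minimal sketch of that lockdown (the path and the root ownership are assumptions about your setup) would be:

# Hypothetical: make robots.txt owned by root and read-only for everyone
chown root:root /var/www/html/robots.txt
chmod 444 /var/www/html/robots.txt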
As an SEO, I feel your pain.
Forgive me if I'm wrong, but I'm assuming the problem is that there is a robots.txt on your staging server because you need to block the whole staging environment from being found and crawled by search engines.
If this is the case, I would suggest placing your staging environment internally, where this isn't an issue (an intranet-type or internal network configuration for staging). This can save a lot of search engine issues with that content getting crawled; say, for instance, they deleted that robots.txt file from your staging site by accident and a duplicate site got crawled and indexed.
If that isn't an option, I recommend placing staging in a folder on the server, like domain.com/staging/, and using just one robots.txt file in the root folder to block out that /staging/ folder entirely. This way, you don't need two files, and you can sleep at night knowing another robots.txt won't be replacing yours.
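For that single-file setup, the root robots.txt only needs a rule along these lines (/staging/ being the assumed folder name from above):

User-agent: *
Disallow: /staging/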
If THAT isn't an option, maybe ask them to add it to their checklist to NOT move that file? You will just have to verify this: a little less sleep, but a little more precaution.
Create a deployment script to move the various artifacts (web pages, images, supporting files, etc.) and have the IT guy do the move by running your script. Be sure not to include robots.txt in that script.
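A minimal sketch of such a script on a Unix-style host (the paths are assumptions) could simply exclude the file during the copy:

# Hypothetical deploy step: copy the staging tree to production,
# but never touch robots.txt on the production side.
rsync -av --exclude='robots.txt' /srv/staging/ /srv/production/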
I'd set up code on the production server that keeps a copy of the production robots.txt in another location and monitors the one that's in use.
If they're different, I'd immediately overwrite the in-use one with the production version. Then it wouldn't matter if it got overwritten, since the bad version wouldn't exist for long. In a UNIX environment, I'd do this periodically with cron.
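A minimal sketch of that watcher (the file locations and the five-minute interval are assumptions) could be a small PHP script driven by cron:

<?php
// Hypothetical watcher: restore robots.txt if it differs from the known-good copy.
$live   = '/var/www/html/robots.txt';        // assumed live location
$master = '/var/robots-master/robots.txt';   // assumed known-good copy

if (!file_exists($live) || md5_file($live) !== md5_file($master)) {
    copy($master, $live);
}

And the cron entry to run it every five minutes:

*/5 * * * * php /usr/local/bin/check_robots.php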
Why is your staging environment not behind a firewall, rather than publicly exposed?
The problem is not robots.txt... the problem is your network infrastructure.