Write over an .htaccess file? - .htaccess

An .htaccess file is uploaded to a directory via FTP; the owner and group of that file are then generally the FTP user and/or root.
If the directory had its permissions set to 0777, would it be possible at all for a remote script to overwrite the .htaccess file, or would every attempt be blocked because the owner and group of the .htaccess file are the FTP user (and root), while the attacker (depending on which port they were attempting to enter through) will not be logged into the server as the FTP user (and hopefully not as root either)?
The reason I ask is that I need a directory with 0777 permissions, and I am concerned that the .htaccess file (which prevents scripts from running in that directory) could simply be overwritten, leaving the server vulnerable to attack.
Thanks,
John

Personally, I wouldn't set 0777 permissions on a directory containing a .htaccess file. In that situation I would advise moving the files that require 0777 permissions into a subdirectory.
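If an uploads subdirectory does end up world-writable, a .htaccess inside it can at least stop uploaded scripts from executing. A minimal sketch, assuming PHP runs under mod_php and the host's AllowOverride setting permits these directives (details vary by setup):

# Hypothetical .htaccess for a world-writable uploads directory
php_flag engine off
Options -ExecCGI
RemoveHandler .php .phtml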

You're going to be vulnerable to an attack if a script has write access to that folder, regardless. Here's an example from a true story on a friend's server:
An old version of TimThumb allowed files to be uploaded maliciously.
The uploaded file was "Syrian Shell", a script used to decrypt user permissions and begin creating new files.
Access was given to the intruder, and the server was effectively turned into a host for a phishing site.
I highly recommend you take a look at your structure. Move write access to a subdirectory. Hope this helps.

Related

Can a dot file (.htpasswd) be accessed via a browser?

I was reading How secure is .htaccess password protection? and reviewing the fact that one of my sites has its .htpasswd file in a web-facing directory.
But if the permissions on a .htpasswd file are correct (644), can it be accessed at all via a browser or any other means?
I guess the real question is whether or not I should follow the advice and move the .htpasswd file to /home/user rather than /home/user/public_html.
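For what it's worth, stock Apache configurations usually ship with a rule along these lines, which blocks HTTP access to any file beginning with .ht regardless of its filesystem permissions; whether your host includes it is something to verify rather than assume:

<Files ".ht*">
    Require all denied
</Files>

(On Apache 2.2, the equivalent directive is Deny from all.) Note that permissions like 644 control filesystem access, not what the web server chooses to serve, so moving .htpasswd out of public_html remains the safer option either way.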

Deny external access to folder

Is there a way to deny outside access to my upload directory? I don't want users to access my upload directory: www.example.com/uploads
I used a .htaccess file in the root of my upload folder; however, all the links were broken.
In my .htaccess:
deny from all
Any solution?
If you wish to disable directory listing, simply place 'Options -Indexes' in your .htaccess.
You've applied a 'deny from all', which essentially stops ANYONE from accessing files in the directory to which it applies.
Also make sure that 'AllowOverride All' is specified in the vhost definition; otherwise you are unable to override settings via the .htaccess file. That is my understanding, anyway.
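For reference, a minimal sketch of such a vhost stanza; the ServerName and paths are hypothetical, and the exact layout depends on your distribution:

<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example/public_html
    <Directory /var/www/example/public_html>
        # Without this, directives in .htaccess files are ignored.
        AllowOverride All
    </Directory>
</VirtualHost>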
If you wish to disable direct access to the upload directory but control which files specific users can access, I'd recommend going through a script written in a language such as PHP. A user requests a file from the script, and the script looks to see if they're allowed to view the file. If they are, the file is displayed. If they aren't, it is not.
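A minimal sketch of such a gate script, assuming a hypothetical user_can_access() permission check and an uploads directory that direct HTTP access is denied to; basename() is used to stop directory traversal:

<?php
// download.php?file=report.pdf -- serves an upload only to permitted users.
session_start();

// Placeholder authorisation check -- replace with your own database lookup.
function user_can_access($userId, $file) {
    return $userId !== null;
}

$uploadDir = '/var/www/example/uploads';   // hypothetical location
$file = basename($_GET['file'] ?? '');     // basename() strips any path components
$path = $uploadDir . '/' . $file;

if ($file === '' || !is_file($path) || !user_can_access($_SESSION['user_id'] ?? null, $file)) {
    http_response_code(403);
    exit('Forbidden');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);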
References
http://www.thesitewizard.com/apache/prevent-directory-listing-htaccess.shtml
http://mathiasbynens.be/notes/apache-allowoverride-all

File permissions - who are owner, group, and public on a web server?

I have read many tutorials about file permissions, but all they say is, for example, "if you don't want others to write to your files, set it to xxx..."
But on a web host, who is who, really?
There is just a web server (Apache), PHP, MySQL, and other programs; there are no "other users". The tutorials said that Apache is considered "public". But I have a PHP script which takes an uploaded file and puts it in a "downloads" directory. I set that directory's permissions to 744, which means group and public should only be able to read, and the owner has full access.
I expected my uploaded file not to be transferred to that directory because there is no write permission for public. But the file was there. Even more confusing, when I tried to download the file I got a "forbidden" error; I expected to be able to download it because public had read permission.
The user in this case is the web server itself. Apache usually runs as the user "apache" or "www-data" when it reads and writes files on the server filesystem. Static content should be readable by the server; upload locations must be writable by it. Depending on the other users on the system, you may consider the web server to be the "other" user and the webmaster account the actual file owner.
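If you're unsure which user your scripts actually run as, a quick check like this will tell you, which in turn tells you whether "owner", "group", or "other" applies to Apache on your host. A minimal sketch, assuming the POSIX extension is available (with a shell fallback):

<?php
// Print the effective user the PHP process runs under.
if (function_exists('posix_geteuid')) {
    $info = posix_getpwuid(posix_geteuid());
    echo 'PHP runs as: ' . $info['name'] . PHP_EOL;
} else {
    // Fallback; assumes a shell with `whoami` is available.
    echo 'PHP runs as: ' . trim(shell_exec('whoami')) . PHP_EOL;
}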

Need a clever solution to dynamically produce an .htaccess file

Here's what I need to do -- either:
include an external file in my .htaccess file that resides on Server B, or
parse the .htaccess file on Server A using PHP, or
even a more clever solution (which I can't dream up at this time, given my limited experience with httpd.conf and Apache directives)
Background
I have an .htaccess file on Server A. I set its permissions to -rw-rw-rw- (0666) and build it dynamically, based on events throughout the day on Server B, in order to achieve certain objectives of my app on Server A. I have since discovered that my hosting provider sweeps their server (Server A) each night, finds world-writable files, and changes their permissions to 0664. Kudos to them for securing the server. [Please, no comments on my method of making my .htaccess file world-writable -- I truly understand the implications.]
The .htaccess file on Server A exists simply to provide Shibboleth authentication. I state this because the only dynamic aspect of the Apache directives is the Require user stack.
Is it possible to include the "user stack" that resides on Server B in my .htaccess file that resides on Server A?
Or can I parse the .htaccess file on Server A via the PHP engine?
Thanks for helping me solve this problem.
Here's what the .htaccess looks like:
AuthType shibboleth
AuthName "Secure Login"
ShibRequireSession on
Header append Cache-Control "private"
Require user bob jill steve
All I want to do is update the bob jill steve list portion of the file every time I add, change, or delete users in my application, in an effort to keep my Shibboleth Require user list (on Server A) in sync with my MySQL/PHP web app (living on Server B).
(Version 2 of this post missed the Require user point on first reading -- sorry).
My immediate and my second instinct here is that dynamic .htaccess files (especially ones designed to be written from a separate web service) are a disaster waiting to happen in security terms, and your hosting provider is right to do this, so you should regard it as a constraint.
However, there is nothing to stop a process on Server A running under the application UID (or GID, if the mode is 664) from rewriting the .htaccess file. Why not add a script to A which will service an ".htaccess update" request? This can accept the updated Require user dataset as a (JSON-encapsulated, say) parameter, plus some form of shared-secret signature. The script can include any necessary validation and update the .htaccess file locally. Server B can then build the list and initiate the transfer via a web request.
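A minimal sketch of what that update script on Server A might look like, assuming a hypothetical update_htaccess.php endpoint, a shared secret known to both servers, and an example target path; the HMAC check and username filter stand in for whatever validation your setup actually needs:

<?php
// update_htaccess.php (on Server A) -- Server B POSTs a JSON body such as
// {"users": ["bob", "jill", "steve"]} plus an HMAC of that body in a header.
$secret = 'change-me-shared-secret';        // shared with Server B
$target = '/path/to/protected/.htaccess';   // hypothetical location

$body = file_get_contents('php://input');
$sig  = $_SERVER['HTTP_X_SIGNATURE'] ?? '';

// Reject any request whose signature doesn't match the payload.
if (!hash_equals(hash_hmac('sha256', $body, $secret), $sig)) {
    http_response_code(403);
    exit('bad signature');
}

$data  = json_decode($body, true);
// Keep only plausible usernames so nothing unexpected lands in the file.
$users = array_filter((array) ($data['users'] ?? []), function ($u) {
    return is_string($u) && preg_match('/^[A-Za-z0-9._-]+$/', $u);
});
if (!$users) {
    http_response_code(400);
    exit('no valid users');
}

file_put_contents($target,
    "AuthType shibboleth\n" .
    "AuthName \"Secure Login\"\n" .
    "ShibRequireSession on\n" .
    "Header append Cache-Control \"private\"\n" .
    "Require user " . implode(' ', $users) . "\n",
    LOCK_EX);
echo 'ok';

Server B would compute the same HMAC over the JSON body and send it in the X-Signature header when it initiates the update.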
Postscript following reply by Dr DOT
My first comment is that I am really surprised that your ISP runs your scripts as nobody. I assume from this that all accounts are handled the same, and therefore there is no UID/GID access-control separation between files created by separate accounts -- a big no-no in a shared environment. Typically, in suEXEC/suPHP implementations, any interactive scripts run under the UID of the script file -- in your case, I assume, your FTP account -- what you anonymise to myftpuser. All I can assume is that your ISP is running shared accounts using mod_php5 with Apache running as nobody, which is very unusual, IMHO.
However, I run a general-information wiki for a doctor which is also set up this way, and what I do is keep all of the application-writable content in (in my case) directories owned by www-data. There is surely nothing stopping you setting up such a directory with its own .htaccess file in it -- all owned by nobody and therefore updatable by a script.
If you want a simple example of this type of script, see my article Running remote commands on a Webfusion shared service.
Here's how I solved the problem a few days ago.
Given that my HSP sweeps the server every night and changes any world-writable file to 664, I thought about a different approach.
I did this:
During the day, I changed the directory containing my non-writable .htaccess file to 0777
Then I deleted my .htaccess file
Then I re-ran my script -- my fopen() call uses mode "w" (so I thought: if the file doesn't exist right now, why not let my PHP script create it brand new)
Because, as I said somewhere above, my PHP runs as "nobody" -- voila! I now had a file owned by nobody in the directory
Later that night my HSP swept the server and removed the world-writable bit from my directory -- but no big deal ... I have my .htaccess file owned by "nobody", and I can update the Require user directive automatically.
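A minimal sketch of that recreate step, with a hypothetical path and an example user list; mode "w" creates the file if it doesn't exist, owned by whichever user PHP runs as (here, nobody):

<?php
// Recreate the .htaccess from scratch with the current user list.
$users = ['bob', 'jill', 'steve'];                  // example data
$fh = fopen('/path/to/protected/.htaccess', 'w');   // "w" creates the file anew
if ($fh === false) {
    exit('could not open .htaccess for writing');
}
fwrite($fh, "AuthType shibboleth\n");
fwrite($fh, "AuthName \"Secure Login\"\n");
fwrite($fh, "ShibRequireSession on\n");
fwrite($fh, "Header append Cache-Control \"private\"\n");
fwrite($fh, "Require user " . implode(' ', $users) . "\n");
fclose($fh);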
Thanks for everyone's help on this.

Question regarding setting up a new website in htdocs folder

Today I moved my website to a new hosting company (Verio). I've done this lots of times before, and I know that the website should go inside the "htdocs" folder.
Now, usually when I use FileZilla, I can right-click on a filename to get the URL of that file. This is the result for my root default file: ftp://test#test.com/www/htdocs/Research/index.php
However, on the web, the true URL of my default file is: www.test.com/Research/index.php
My index.php file is in the website root folder. Does anyone know why FileZilla would include the server folders "www/htdocs" as part of the URL? These folders should not normally be visible to the user.
OR, does this look correct?
That FTP URL is correct. Your FTP account has access to the two folders (www/htdocs/) above the document root, as most hosting providers allow.
You are also correct to assume that HTTP access is limited to the document root (meaning web visitors cannot see www/htdocs/).
