Here's what I need to do -- either:
include an external file (residing on Server B) in the .htaccess file that resides on Server A, or
parse the .htaccess file on Server A using PHP, or
find an even cleverer solution (which I can't dream up at this time, given my limited experience with httpd.conf and Apache directives)
Background
I have an .htaccess file on Server A. I set its permissions to -rw-rw-rw- (0666) and build it dynamically, based on events throughout the day on Server B, in order to achieve certain objectives of my app on Server A. I have since discovered that my hosting provider sweeps their server (Server A) each night, finds world-writable files, and changes their permissions to 0664. Kudos to them for securing the server. [Please, no comments on my wanting to make my .htaccess file world-writable -- I truly understand the implications.]
The .htaccess file on Server A exists simply to provide Shibboleth authentication. I state this because the only dynamic aspect of the Apache directives is the Require user stack.
Is it possible to include the "user stack" that resides on Server B in my .htaccess file that resides on Server A?
Or can I parse the .htaccess file on Server A via the PHP engine?
Thanks for helping me solve this problem.
Here's what the .htaccess looks like:
AuthType shibboleth
AuthName "Secure Login"
ShibRequireSession on
Header append Cache-Control "private"
Require user bob jill steve
All I want to do is update the bob jill steve portion of the file every time I add/change/delete users in my application, so that my Shibboleth Require user list (on Server A) stays in sync with my MySQL/PHP web app (living on Server B).
(Version 2 of this post missed the Require user point on first reading -- sorry).
My first instinct here (and my second one, too) is that dynamic .htaccess files -- especially ones designed to be written to by a separate web service -- are a security disaster waiting to happen. Your hosting provider is right to sweep for them, so you should regard this as a constraint.
However, there is nothing to stop a process on Server A running under the application UID (or GID, if the file is mode 664) from rewriting the .htaccess file. Why not add a script to Server A which services an "htaccess update" request? It can accept the updated Require user dataset as a (say, JSON-encapsulated) parameter, plus some form of shared-secret signature, perform any necessary validation, and update the .htaccess file locally. Server B can then build the list and initiate the transfer via a web request.
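As a sketch of what that receiving script on Server A could look like (the filename update_htaccess.php, the shared secret, the X-Signature header and the target path are all assumptions for illustration, not a definitive implementation):

<?php
// update_htaccess.php (hypothetical) -- receives the Require user list
// from Server B as a JSON body plus an HMAC signature header.
$secret   = 'shared-secret-known-only-to-A-and-B';
$htaccess = '/home/myftpuser/public_html/protected/.htaccess'; // placeholder path

$body = file_get_contents('php://input');
$sig  = isset($_SERVER['HTTP_X_SIGNATURE']) ? $_SERVER['HTTP_X_SIGNATURE'] : '';

// Reject any request whose signature does not match the shared secret.
if (!hash_equals(hash_hmac('sha256', $body, $secret), $sig)) {
    http_response_code(403);
    exit('bad signature');
}

$users = json_decode($body, true);
// Validate: allow only plain usernames, so nothing can inject extra directives.
if (!is_array($users) || !$users || $users !== preg_grep('/^[A-Za-z0-9._-]+$/', $users)) {
    http_response_code(400);
    exit('bad user list');
}

$content = "AuthType shibboleth\n"
         . "AuthName \"Secure Login\"\n"
         . "ShibRequireSession on\n"
         . "Header append Cache-Control \"private\"\n"
         . "Require user " . implode(' ', $users) . "\n";

file_put_contents($htaccess, $content, LOCK_EX);
echo 'ok';

Server B would POST the JSON-encoded user list with an X-Signature header computed by the same hash_hmac() call, so only a sender holding the secret can trigger a rewrite.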
Postscript following reply by Dr DOT
My first comment is that I am really surprised that your ISP runs your scripts as nobody. I assume from this that all accounts are handled the same way, and that there is therefore no UID/GID access-control separation between files created by separate accounts -- a big no-no in a shared environment. Typically, in suEXEC/suPHP implementations, interactive scripts run under the UID of the script file -- in your case, I assume, your FTP account -- what you anonymise as myftpuser. All I can assume is that your ISP runs shared accounts using mod_php5 with Apache running as nobody, which is very unusual, IMHO.
However, I run a general information wiki for a doctor which is also set up this way, and what I do is keep all of the application-writable content in (in my case) directories owned by www-data. There is surely nothing stopping you from setting up such a directory with its own .htaccess file in it -- all owned by nobody and therefore updateable by a script.
If you want a simple example of this type of script see my article Running remote commands on a Webfusion shared service.
Here's how I solved the problem a few days ago.
Given that my HSP sweeps the server every night and changes any world-writable file to 664, I took a different approach.
I did this:
during the day I changed the directory containing my non-writable .htaccess file to 0777
then I deleted my .htaccess file
then I re-ran my script -- my fopen() call uses mode "w" (so I thought: if the file doesn't exist right now, why not let my PHP script create it brand new?)
because, as I said somewhere above, my PHP runs as "nobody" -- voila! I now had a file owned by nobody in the directory
Later that night my HSP swept the server and changed my directory back from world-writable -- but no big deal ... I've got my .htaccess file owned by "nobody" and I can update the Require user directive automatically.
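For reference, the recreate step boils down to something like this (the path is a placeholder; the point is that fopen() with mode "w" creates the file if it doesn't exist, so the new file is owned by whatever user PHP runs as -- "nobody" here, not the FTP user):

<?php
// Hypothetical path; mode "w" recreates the deleted .htaccess from scratch,
// owned by the PHP process's user rather than the FTP account.
$fh = fopen('/home/myftpuser/public_html/protected/.htaccess', 'w');
fwrite($fh, "Require user bob jill steve\n"); // plus the other static directives
fclose($fh);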
Thanks for everyone's help on this.
Okay, I searched and I can't find the solution.
I have a folder on my website where I keep some scripts. My server has to access this folder all the time via a cron job. But if someone goes to the folder's URL they can see the index page, and I would like to prevent that. I want it to throw a 403 or a 404 or a 500 or any error page; I don't care which.
I tried this in my .htaccess:
deny from all
But this also blocks my own server and breaks the cron job. I tried a few other solutions I found here and there, but they didn't work.
One thing I saw is that you can block everyone but allow your own server access by IP. But I'm on HostGator shared hosting, so my IP isn't static (as far as I know). I don't want to have to worry that my server's IP may change at any time and break my cron job.
Isn't there a nice, elegant, simple, permanent solution for this? Block access to the folder for everyone, but let my own server/cron access it at will.
Thank you :)
It seems you want to call a script which is stored on the same server.
You don't need to use curl for that; you can just run the script by providing its path to the PHP installation.
You can prevent outside access to the script entirely by simply not saving it in your public_html directory.
Your cron job command will look something like this:
/opt/php70/bin/php /home/<USERNAME>/cron.php
The exact directory structure differs depending on your webhost.
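For reference, a complete crontab entry built from that command might look like this (the every-15-minutes schedule is just an example):

# Run the script directly with the PHP binary, entirely outside the web server.
*/15 * * * * /opt/php70/bin/php /home/<USERNAME>/cron.php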
I am running a Raspberry Pi using Raspbian Linux. I have the Apache web server installed, and when I type my IP address into the address bar of a browser it loads the default Apache page saying it all works.
I have another folder, located at /home/Client5 on this device, from which I am trying to load an index.html page, but I am receiving a 404 Not Found error. E.g.
192.304.0.22/home/Client5/index.html
Not Found
The requested URL /home/Client5/task5.html was not found on this server.
I gather that there's something wrong in the above web address -- or is it that I have to place this folder within the Apache folder?
You have two solutions.
1/ If you want to keep the default Apache pages, add a virtual host (there are tons of docs on this, I don't think it needs to be repeated here).
2/ If you don't care about the default Apache pages, edit /etc/apache2/sites-available/default and change DocumentRoot to make it point to /home/Client5/. Add an index.html file in there, hit the Raspberry Pi's IP in your browser, and you should see your page.
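For option 2, the edited site file might end up looking roughly like this (a sketch in Apache 2.4 syntax; the Apache 2.2 shipped with older Raspbian uses "Order allow,deny" and "Allow from all" instead of "Require all granted"):

<VirtualHost *:80>
    DocumentRoot /home/Client5
    <Directory /home/Client5>
        Options -Indexes +FollowSymLinks
        Require all granted
    </Directory>
</VirtualHost>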
You might need to chmod -R ugo+rwX /home/Client5.
I don't know what you have under /home/Client5, but if it's a regular user's home, this setup is highly insecure. There are a number of additional steps to take if you want to host under home directories (first step: don't put pages directly in $HOME, create a subdirectory). It is safer to have a dedicated place with proper permissions outside the home dirs unless you really know what you're doing.
Is it working locally (XXX.XXX.X.XX:80)? Surely yes, so take a look at your router.
If you have apache2, your local IP points to the folder '/var/www/'.
If you want to host pages in '/home/Client5' you have to make a virtual host :)
I've been doing web development for about 6 months now, for fun, so I never really had a reason to be secure. Now I want to change that, but I'm having a hard time understanding Apache file permissions. I created the server and usually just ran /var/www with 777 permissions, because I needed to get by and didn't have information worth stealing.

I researched user permissions, and now I have run into a problem after configuring some things. I added the Apache user "nobody" to a group I created called webserver; I also have an FTP user in this group. I set the /var/www permissions so that "me" and the webserver group have full permissions on the folder and enclosed files, and other users have no rights (can't read). When I attempt to view my sample website on localhost, I get a permission-denied message from Apache. But Apache has full ownership of the file, so why can't it process the file, send the appropriate response to the computer which requested it, and complete the transaction? Does Apache process HTTP requests as a different user? I'm confused.
Usually on Ubuntu, Apache runs as the user www-data.
You can also change this by editing APACHE_RUN_USER and APACHE_RUN_GROUP in the envvars file.
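On Debian/Ubuntu the relevant defaults in /etc/apache2/envvars look like this:

# The account Apache's worker processes run under.
export APACHE_RUN_USER=www-data
export APACHE_RUN_GROUP=www-data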
Let's say there's a website www.example.com/user/john. Accessing that link takes you to www.example.com/user/john/index.html.
There are files like www.example.com/user/john/picture.png and www.example.com/user/john/document.html. These are also accessible to the public, but there's no link to these from index.html.
Is there a systematic way to find these files? I'm asking because I'm going to set up my own website, and I also want to put up a few files that I don't necessarily want everyone to see -- only people I give the link to. So I'm wondering how easy or hard it is to find out that those files exist in my directory.
Most importantly, you have to switch off directory browsing, so that the folder's contents can't simply be listed in a browser. Every server has its own way to switch this off. Then you can use the proposed approach of "security through obscurity".
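With Apache, for example, directory listing can be switched off with a single directive in the folder's .htaccess file (or in the server configuration):

# Disable the automatic directory listing (autoindex)
Options -Indexes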
Another way is to have a specific folder whose access is restricted by HTTP basic authentication. This can be configured in the .htaccess file, which you put in the root of the directory you want to share only with specific people.
Just google ".htaccess" and "basic authentication".
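A minimal example of such an .htaccess (the .htpasswd path is a placeholder; the password file itself is created with the htpasswd utility):

AuthType Basic
AuthName "Restricted"
AuthUserFile /home/youruser/.htpasswd
Require valid-user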
HTTP does not provide a directory-listing method as such. Most web servers can be set up to return an HTML-formatted directory listing if an index file such as index.html does not exist. If you provide an index file so that autoindexing does not happen (or if you disable autoindex in the web server configuration), and if your "hidden" file names are hard to guess, they should be pretty hard to find. I sometimes publish such files in a directory with a random gibberish name.
"Share links" as used by Dropbox, Picasa and other services do the same; they just use long random file/directory names or random parameters in the URL.
To provide real security you'll want to set up HTTPS (SSL/TLS), so that eavesdroppers on the network cannot easily look at the requested URLs, together with authentication such as HTTP Basic Authentication with a username/password. For less sensitive stuff, plain http to a random hidden directory will often do fine.
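If you go the random-hidden-directory route, a suitably unguessable name is easy to generate, e.g. with a PHP one-liner (random_bytes() needs PHP 7+):

<?php
// 16 random bytes -> 32 hex characters, effectively unguessable.
echo bin2hex(random_bytes(16));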
When an .htaccess file is uploaded to a directory via FTP, the owner and group of the file are generally the FTP user and/or root.
If that directory had its permissions set to 0777, would it be possible at all for a remote script to overwrite the .htaccess file? Or would every attempt be blocked, since the owner and group of the .htaccess file are the FTP user (and root), and the attacker (depending on which port they were attempting to enter through) will not be logged into the server as the FTP user (and hopefully not as root either)?
The reason I ask is that I need a directory with permissions 0777, and I am concerned that the .htaccess file (which prevents scripts from running in that directory) could simply be overwritten, leaving the server vulnerable to attack.
Thanks,
John
Personally, I wouldn't set 0777 permissions on a directory containing an .htaccess file. In that situation I would probably advise moving the files requiring 0777 permissions into a subdirectory.
You're going to be vulnerable to attack if a script has write access to that folder, regardless. Here's an example from a true story on a friend's server:
An old version of TimThumb allowed files to be uploaded maliciously
The file uploaded was Syrian Shell, a script used to enumerate user permissions and begin creating new files
Access was given to the intruder and the server was effectively turned into a host for a phishing site
I highly recommend you take a look at your structure and move write access to a subdirectory. Hope this helps.
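As for the .htaccess that is supposed to stop scripts running in the writable directory, it is typically a couple of directives along these lines (a sketch for a mod_php setup; which directives are honoured depends on the host's AllowOverride configuration):

# Switch the PHP engine off for this directory (mod_php only)
php_flag engine off
# Make sure .php files are not handed to any interpreter, just served as text
RemoveHandler .php .phtml
AddType text/plain .php .phtml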