Creating .htpasswd file outside webpage files - .htaccess

I'm working with a webpage that uses a content management system. The webpage belongs to my university, and we want to create a private area protected with .htaccess and .htpasswd.
The problem is that I can't figure out how to place the .htpasswd file outside the webpage files, because I don't have access to the server machine. I need to do that because someone told me it is unsafe to keep it within the webpage files. Any ideas?

If you have a shared hosting account for your website, I suggest you connect to it through FTP. In DirectAdmin, for example, you would normally put the file in your public_html folder (which is the root directory of your site).
If you want to put it outside of the root (as you were asking), place it one level above public_html (in its parent directory, not inside it), and then reference it from your .htaccess, as sketched below.
If you lack the permissions to do so, contact your hosting provider, who should be able to do it for you (and ask for the full path to use).
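For illustration, a minimal sketch of what that .htaccess reference might look like; the /home/youruser paths are assumptions and depend on your hosting account's layout:
# Assumed layout (adjust to your account):
#   /home/youruser/.htpasswd        <- outside the document root, not web-accessible
#   /home/youruser/public_html/     <- document root
# In /home/youruser/public_html/private/.htaccess:
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/youruser/.htpasswd
Require valid-user
The key point is that AuthUserFile takes an absolute filesystem path, so the password file itself never needs to live under public_html.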

Related

How do you allow IIS to access a symlink folder?

I have two web servers. On each server, in C:\inetpub\logs\LogFiles, I have created a symbolic link to the other server's log file folder, so essentially I can be logged in to one of the web servers and see the logs of both servers in one place. This works perfectly.
I am building a page to make the log files available via the web; the code simply goes to the C:\inetpub\logs\LogFiles directory and lists the files in each subfolder, i.e. W3SVC1 (the local folder) and webserver2-w3svc1 (the remote log folder).
For the local folders it works fine, but I get an "access denied" error when calling Directory.GetFiles on the symlinked folder. I suspect this is some sort of permissions error, but I tried giving the symlinked folder full permissions for "everyone" and I still get the same error.
Is this something to do with the fact that when I created the symlink I had to enter the username and password for webserver2, and these credentials cannot be accessed/used by IIS when it tries to access the folder?
Is there anything I can do to allow IIS to access the contents of this symlinked folder?
I don't think you need to use a symlink: you can create a virtual directory in IIS that maps to the target path. In IIS, right-click the website and select Add Virtual Directory.
For more information, please refer to the official IIS documentation on virtual directories.
After much experimentation, the only way to do this is as follows:
Create a new user on the computer.
Run the AppPool in IIS under this new user's identity instead of the default IUSR account.
Give this user permissions on the folder you are sharing AND 'share' the folder with the new user.

How can I move my old sites from a shared hosting server to a VPS?

I'm sure a lot of you guys used to be in the same situation as I am in right now.
Before
I owned shared hosting for about two years.
I kind of got used to it: whenever I created a new site, I just uploaded my entire new folder, including index.html, styles, scripts, and other assets, via FTP into the root directory of my shared host. Then I went to the URL of that folder and saw the site load. That's how I normally did it.
Now
I upgraded the way I host my site: I recently purchased a VPS on DigitalOcean and run a Laravel application on it. The site is now much faster, and I have more control.
Unfortunately, I'm not sure what to do with all the old sites I used to have.
How do I move them onto my new VPS?
How do I get to them? How does that work?
Should I create a public_html folder or something?
How can I achieve something like this?
Any direction on this will be much appreciated!
It depends on your setup (single domain or multi-domain). If you're dealing with a single-domain environment, you'll just move everything over as normal. If you're in a multi-domain environment, you'll need to point all your domains to the new server and set up separate Apache sites (config files) that point to their respective locations on disk.
In my experience with multi-domain environments and Apache 2.4, it's best to make /var/www/ your base, where you can store your .htpasswd and other such files, with a folder named public that holds your outward-facing websites in their own subfolders.
Example:
web1.com would exist in /var/www/public/web1.com/...
web2.com would exist in /var/www/public/web2.com/...
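A rough sketch of what one such Apache site config could look like (the file name, paths, and a2ensite command are assumptions based on a Debian/Ubuntu-style layout):
# /etc/apache2/sites-available/web1.com.conf (hypothetical)
<VirtualHost *:80>
    ServerName web1.com
    DocumentRoot /var/www/public/web1.com

    <Directory /var/www/public/web1.com>
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
# Enable it with: a2ensite web1.com.conf, then reload Apache.
# Repeat with a second config file pointing web2.com at /var/www/public/web2.com.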
You could alternatively have another public folder, but since you're specifically asking about Laravel, you'll want to point the Apache config at Laravel's public directory; if you point it any higher, people can access your .env file.
If you had everything in a single-domain environment (public_html) and you now have a Laravel site at your root, you could alias a specific path to act as your "old site", pointing it at a different folder than your Laravel install, as in the sketch below.
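For example, something along these lines inside the Laravel site's VirtualHost (the /oldsite path and /var/www/oldsite folder are made-up names for illustration):
# Laravel keeps serving everything else from its public/ directory.
DocumentRoot /var/www/laravel/public

# Requests to /oldsite/... are served from a separate folder instead.
Alias /oldsite /var/www/oldsite
<Directory /var/www/oldsite>
    Require all granted
</Directory>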

Preventing users from accessing Node.js application code

Assuming a user has access to all files within the public_html directory, doesn't this mean they could access the Node.js application code within it? Surely this is a massive security risk.
What is the normal way of handling this? Would you use file permissions to restrict the files, or place the Node directory outside of public_html and reference it somehow? If so, how?
Many thanks for any answers given!!
Yes, your server scripts should live outside public_html. Only files that you want to make available to the public should be placed under public_html.
Your Node server script can refer to the "./public_html" or "../public_html" folder if public_html sits in or above the folder containing the script, or it can even refer to "/path/to/public_html" if it is stored elsewhere on your filesystem.

Protecting a folder and its files

I wish to protect the folder containing the CMS core files, along with its subfolders and files, from being accessed via the web, and I tried this in an .htaccess file:
order deny,allow
deny from all
The problem is that I can protect the folder, but some scripts from that folder or its subfolders then no longer work properly.
I also tried this:
order deny,allow
deny from all
allow from 127.0.0.1
allow from 76.xx.xx.xx
In this case 76.xx.xx.xx is the static IP of the site.
Is there any way to prevent access to the files in that folder while keeping everything working?
Another question:
I wish to secure my site further against hackers. Is there any way to prevent malicious files and code from being injected into my scripts/files, and/or to block my site from executing files from other sites or hosts, so that it only works with local files?
I prefer an .htaccess solution, but if necessary I have access to WHM in case other files need editing (though in that case I will need a step-by-step guide). I am running the site on a Linux VPS with CentOS 5.
The usual way to do this is to put the publicly accessible files in an Apache-accessible directory and everything else in a directory out of Apache's reach. For example:
/usr/
    local/
        mycms/
            public/
            lib/
/var/
    www/
        mycms -> softlink to /usr/local/mycms/public
Or better yet, make mycms an alias in Apache config, pointing at the public directory. This way, the files that should be accessible are, those that shouldn't be aren't, and you can still reference all your other files simply by ../lib/ etc.
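A rough sketch of that alias approach in the Apache config, using the example layout above and Apache 2.2 syntax to match CentOS 5 (placement inside your VirtualHost is assumed):
Alias /mycms /usr/local/mycms/public
<Directory /usr/local/mycms/public>
    Order allow,deny
    Allow from all
</Directory>
# /usr/local/mycms/lib is never mapped to any URL, so it cannot be requested
# over the web, yet scripts under public/ can still include files via ../lib/.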
I know this does not really answer your question literally, and if the CMS directory structure is not under your control, this may not be the best way to do it.
Another way is through rewrites - simply rewrite all requests to your CMS directory except for your CMS's entry script into requests for the entry script.
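As an illustration of the rewrite approach, something like this in the CMS directory's .htaccess (index.php is an assumed name for the entry script; adjust it to your CMS):
RewriteEngine On
# Any request in this directory that is not for the entry script itself
# gets rewritten internally to the entry script.
RewriteCond %{REQUEST_URI} !index\.php$
RewriteRule ^ index.php [L]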

ColdFusion security issue... how to hide a directory of files?

So, I decided to try to break my website... I googled my site by typing in site:mysite.com/whatever and, behold, all of the users' uploaded files were available for viewing under a specific directory.
What kind of script/countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status, but it doesn't seem to be working. I've looked all over for solutions but can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory of files when using a URL without a filename (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that it displays that document rather than the list of files. If you use index.cfm, it'll fire your Application.cfm/cfc in your file path and use whatever other security you've built.
(or, better, do both)
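If the site happens to be fronted by Apache (the question doesn't say which web server is in use), disabling the listing can be a one-line sketch like this; on IIS the equivalent is turning off Directory Browsing for that folder:
# In the uploaded-files directory's .htaccess (or the matching <Directory> block):
Options -Indexes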
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.
