Preventing users from accessing Node.js application code - security

Assuming a user has access to all files within the public_html directory, doesn't this mean they could access the Node.js application code within it? Surely this is a massive security risk.
What is the normal way of handling this? Would you use file permissions to restrict the files, or place the Node directory outside of public_html and reference it somehow? If so, how?
Many thanks for any answers given!!

Yes, your server scripts should live outside public_html. Only files that you want to make available to the public should be placed under public_html.
Your Node server script can refer to the folder as "./public_html" if public_html sits inside the script's directory, as "../public_html" if it sits one level up, or by an absolute path such as "/path/to/public_html" if it is stored elsewhere on your filesystem.
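A minimal sketch of that layout (plain Node with no framework; the file names and port are illustrative assumptions): server.js lives next to public_html rather than inside it, and serves only files that resolve under public_html.

// server.js sits outside the web root, alongside the public_html folder
const http = require('http');
const path = require('path');
const fs = require('fs');

// Resolve public_html relative to this script's own location
const PUBLIC_ROOT = path.resolve(__dirname, './public_html');

http.createServer((req, res) => {
  // Strip the query string before touching the filesystem
  const urlPath = req.url.split('?')[0];
  const filePath = path.join(PUBLIC_ROOT, path.normalize(urlPath));

  // Refuse anything that resolves outside public_html (e.g. "../server.js")
  if (!filePath.startsWith(PUBLIC_ROOT)) {
    res.writeHead(403);
    return res.end('Forbidden');
  }

  fs.readFile(filePath, (err, data) => {
    if (err) {
      res.writeHead(404);
      return res.end('Not found');
    }
    res.writeHead(200);
    res.end(data);
  });
}).listen(3000);

Because the application code sits above PUBLIC_ROOT, no request can ever be served the server script itself.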

Related

Creating .htpasswd file outside webpage files

I'm working with a webpage that uses a content management system. The webpage belongs to my university. We want to create a private area protected with .htaccess and .htpasswd.
The problem is I can't figure out how to place the .htpasswd file outside the webpage files, because I don't have access to the server machine. I need to do that because someone told me it is unsafe to place it within the webpage files. Any ideas?
If you have a shared hosting account for your website, I suggest you connect to it through FTP. In DirectAdmin, for example, you have to put the file in your public_html folder (which is the root directory of your site).
In case you want to put it outside of the root (as you were asking), place it one directory level up (so not in public_html but just above it), and then reference it from your .htaccess.
If you lack the permissions to do so, contact your hosting provider, who should be able to do it for you (and ask them for the absolute path).
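For example, assuming your account's home directory is /home/youruser (an illustrative path), the .htaccess inside the protected folder can point to a .htpasswd stored above public_html:

AuthType Basic
AuthName "Restricted Area"
AuthUserFile /home/youruser/.htpasswd
Require valid-user

Because /home/youruser is outside the document root, the password file itself can never be fetched over HTTP.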

File permissions for Django: Gunicorn, Nginx, and Static Files

I have a standard Ubuntu 14.04 machine. I use it daily under the user mh00h. I'm interested in using this machine as a production server. How do I manage file permissions for Django and Nginx?
Nginx is currently configured to run as www-data:www-data. This minimizes the risk of the rest of the machine being compromised. Django/Gunicorn should likewise run under a user other than mh00h. But which user should Gunicorn actually run as? nobody, correct?
Next: I am storing all of my web development files under /home/mh00h/development, owned by mh00h. /home/mh00h/development/project1 (plus all of its directories and files except /media and /static) is likewise owned by mh00h. I follow the Django Two Scoops best practices and create a project directory with static files inside of it. Of course, Nginx is unable to access /home/mh00h/development/project1/project1/static now, because all of those parent directories are owned by mh00h, not www-data (./static itself is owned by www-data).
To complicate the matter, virtualenvwrapper creates my virtual environments under /home/mh00h/.virtualenvs/.
I am hesitant to deviate from Two Scoops' best practices and store /static separately in /var/www, because I want all of these directories to stay nicely packaged together for easy transport to some other server later. Plus, it gets messy if I compare my setup against how Two Scoops did it.
1. Where should my static files be stored?
2. Where should Django-specific files be stored?
3. What users/groups should be able to access which of 1 and 2?
4. Where should virtualenvwrapper environments be stored?
5. What permissions should these locations have?
Thank you.
All files and directories in our production environment are owned by root:root, with 755 permissions on directories and 644 on files, unless otherwise required. Some private files (think private keys etc.) are only readable by the user/process that needs them, while still being writable only by root.
As for the project structure: all our projects have a dedicated directory under /srv/www/vhosts.d/. Virtual environments are stored under /srv/www/virtualenvs. It is perfectly possible to store them in your home folder, but I feel this central approach is more in line with the idea of a production server. With the right settings, all virtual environments are also accessible by all users.
Our main project directory contains several scripts (manage.py and several deployment/update scripts) and is further split into subdirectories: e.g. web contains public files, src contains the source code, and frontend contains the template folder and Sass folder. The whole project directory is contained in a git repository, but deployment-specific files (user-uploaded files, search indexes, encryption keys) are all in .gitignore.
Our nginx process runs as www-data. In general each Django project has its own user, and the gunicorn process runs as this user.
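A sketch of that scheme in shell commands (the project path and the project1 user are assumptions for illustration, not taken from the posts above):

# Project tree owned by root, world-readable: 755 directories, 644 files
sudo chown -R root:root /srv/www/vhosts.d/project1
sudo find /srv/www/vhosts.d/project1 -type d -exec chmod 755 {} +
sudo find /srv/www/vhosts.d/project1 -type f -exec chmod 644 {} +

# A private key: writable only by root, readable only by the project's user
sudo chown root:project1 /srv/www/vhosts.d/project1/encryption.key
sudo chmod 640 /srv/www/vhosts.d/project1/encryption.key

# Writable, deployment-specific areas (e.g. user uploads) belong to the
# dedicated user that the gunicorn process runs as
sudo chown -R project1:project1 /srv/www/vhosts.d/project1/web/media

With world-readable directories, nginx (running as www-data) can serve the static files without owning any of the tree.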

retrieving files from NAS linux network in PHP

I'm working on a PHP project where a particular feature will have to access files stored on an external directory: a Network Attached Storage (Linux) device. Let's say the path is /volume1/accounts, and this is mounted on the Linux server where my site is hosted using Apache. I will have to retrieve files from that directory. Is there a way in PHP to do that? My client says it has already been mounted.
No matter what I do, I can't access it using these test lines:
print "<pre>".print_r(scandir("/volume1/accounts/"), true)."</pre>";
print "<pre>".print_r(scandir("192.168.0.233/volume1/accounts"), true)."</pre>";
print "<pre>".print_r(scandir("192.168.0.233:/volume1/accounts"), true)."</pre>";
How am I supposed to do it? Please help me.
Generally, the PHP engine is executed with the Apache server's privileges, so if the mounted directory grants no permissions or ownership to the Apache user, PHP will not be able to list its files. Could you try checking the permissions on /volume1/accounts/ and changing its ownership? If the Apache server runs under apache:apache, change the ownership of the directory to match.
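A sketch of that check and fix in shell (assuming the web server user really is apache; on Debian/Ubuntu it is typically www-data):

# See who currently owns the mounted directory and what it permits
ls -ld /volume1/accounts

# Give the web server user ownership, or at least read/traverse rights
sudo chown -R apache:apache /volume1/accounts
sudo chmod -R u+rX /volume1/accounts

The capital X sets the execute (traverse) bit on directories only, which is what scandir() needs to descend into them.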

Are folder permissions on a web server adequate security?

I'm working on a project which uses a folder full of flat-file databases. I'd like to make sure these databases are only accessible to scripts running on the server, so I set the folder permissions to 700.
This results in all scripts functioning properly, but a 403 Forbidden whenever I try to access the database folder in my browser. This is good.
However, I'm wondering: am I missing something? Is there any way, short of gaining access to my FTP account, for an outside user to access this folder? Or can I rest easy?
The proper solution is to store them outside the document root. If you cannot do that, but know that Apache will be used, create a .htaccess file in the folder with the following contents:
Order deny,allow
Deny from all
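Note that this is the Apache 2.2 syntax. If the server runs Apache 2.4 or later, the equivalent directive is:

Require all denied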
Using filesystem permissions may or may not work depending on the environment: in a perfect setup the web server would run under the same uid as the system user that owns the files, and then your approach wouldn't work.

LAMP: Recommended Directory and File Permissions

My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownerships relative to other server users. My concern for now is how to set up permissions within my domain so my users have read access to the files and folders they should, while still letting my scripts retain read/write access.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Folders where files uploaded from forms are stored
Folders where GD and cache files are written
Folders where my server-side scripts are stored (I use mainly PHP)
My WWW root folder (where index.php resides)
This is a perfect example of where you need the Principle of Least Privilege. Allow read-only access to the web server's user for read-only content, and allow writing only to the directories/files that absolutely need to be written. Explicitly deny access to things you don't want people to read (config files, .htaccess, anything with paths/IP addresses/passwords), and don't allow any extra processing if you're not using it (CGI executables, server-side includes).
The best way to do it is to start by denying everything and slowly open things up as you go. First try serving static content, and see what the minimal set of Apache directives/modules and filesystem ownerships and permissions is to get it working. Then try some read-only PHP scripts. Then some read/write PHP scripts. Then DB connectivity, and so on; you get the idea. It's a very tedious process, and you want to plan ahead for the sorts of things you want to test; I tend to write long scripts with wget commands trying to do both good and bad things to the server. Make one change, restart, rerun the script, and see what changes from the last time. Observe, modify, analyze, until you can't stand looking at it anymore ;)
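As a rough starting point in shell terms (a sketch only; the exact modes depend on how your host runs PHP, and this assumes PHP executes as your own account user, as is common on shared hosts):

# Public, read-only content: 755 for directories, 644 for files
find ~/public_html -type d -exec chmod 755 {} +
find ~/public_html -type f -exec chmod 644 {} +

# Upload, GD output, and cache folders: the only writable locations
chmod 775 ~/public_html/uploads ~/public_html/cache

# Config files holding credentials: no access for other users
chmod 600 ~/public_html/config.php

The uploads, cache, and config.php names are illustrative; substitute whatever your application actually uses, and widen permissions only where a test shows they are needed.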
