I have a dozen log files from my server that I wanted to share over a web server, keeping things as simple as possible. I set up authentication for the nginx web server and tried to create symlinks to the log files my Python script stores in a dedicated folder in my home directory. First I tried to set this directory as the nginx root, but I learned that's a bad idea, so I decided to create symlinks in the default nginx root directory instead, but that didn't work either.
I thought that maybe synchronizing the files across both folders might work, but that honestly seems like huge overkill for such a simple task.
How else should I approach this?
By default, nginx will try to protect you from an insecure setup where you serve files from outside of your nginx root. See here for how to fix it: https://unix.stackexchange.com/questions/157022/make-nginx-follow-symlinks
Second, you need to ensure that the nginx user can list and read the files (generally via group permissions).
It would actually be much better to store the files under the nginx root and modify your Python script to write them there.
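A minimal sketch of the symlink-plus-permissions setup (the paths /home/me/logs and /var/www/html are assumptions - adjust them to your layout):

# let the nginx user traverse your home directory (execute bit only, no listing)
$ chmod 711 /home/me

# give the nginx group read access to the log folder and its files
$ sudo chgrp -R www-data /home/me/logs
$ chmod 750 /home/me/logs
$ chmod 640 /home/me/logs/*.log

# expose the folder under the nginx root via a symlink
$ sudo ln -s /home/me/logs /var/www/html/logs

If you also want a browsable listing of the files, adding autoindex on; to the matching location block in your nginx config will generate one.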
I installed Material Design Icons with npm install mdi --save and I am trying to access this mdi folder from my view. I am trying to use ../node_modules/mdi/css/materialdesignicons.min.css, but this is not working. I know that if I move the files to public I can get them, but that just makes no sense to me. I think there is a way to reach those modules even from the view file. What am I missing? Thanks
You will need to move the required files to public. Here's why:
You are serving your Laravel app to the world with a web server (maybe Apache, maybe Nginx, maybe something else). Web servers by design serve only certain files - you do not want a web server handing out any old file from your machine to clients all over the internet. So the web server is locked down to prevent access to everything on the server except the files you explicitly want to expose. That's why we put public files in a specific folder (/var/www for Apache, public for Laravel).
Somewhere on your server that CSS file may well exist inside a node_modules directory, but the web server is going to ignore it, because it is not in the directory from which it serves files. To make the file available to your clients, you will need to move (or copy) it into public.
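If it really is just that one stylesheet, a one-off copy into public is enough. A rough sketch, run from the Laravel project root (the vendor/mdi target folder is just an example name):

$ mkdir -p public/vendor/mdi/css
$ cp node_modules/mdi/css/materialdesignicons.min.css public/vendor/mdi/css/
# if the package ships a fonts folder, copy it alongside the CSS so the
# stylesheet's relative ../fonts/ URLs keep resolving
$ cp -r node_modules/mdi/fonts public/vendor/mdi/

In your view you can then pull it in with Laravel's asset() helper, e.g. asset('vendor/mdi/css/materialdesignicons.min.css').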
I have a standard Ubuntu 14.04 machine. I use it daily under the user mh00h. I'm interested in using this machine as a production server. How do I manage file permissions for Django and Nginx?
Nginx is currently configured to run as www-data:www-data. This minimizes the risk of the rest of the machine being compromised. Django/gunicorn should likewise run under a user other than mh00h. But what user should gunicorn actually run as? nobody, correct?
Next: I am storing all of my web development files under /home/mh00h/development, owned by mh00h. /home/mh00h/development/project1 (plus all of its directories and files except /media and /static) is also owned by mh00h. I follow the Django Two Scoops best practices and create a project directory with the static files inside it. Of course, Nginx is unable to access /home/mh00h/development/project1/project1/static now, because all of those parent directories are owned by mh00h, not www-data (./static itself is owned by www-data).
To complicate matters, virtualenvwrapper creates my virtual environments under /home/mh00h/.virtualenvs/.
I am hesitant to deviate from Two Scoops' best practices and store /static separately in /var/www, because I want all of these directories to stay nicely packaged together for easy transport to some other server later. Plus, it gets messy if I compare my setup against how Two Scoops did it.
1. Where should my static files be stored?
2. Where should Django-specific files be stored?
3. What users/groups should be able to access which of 1 and 2?
4. Where should virtualenvwrapper environments be stored?
5. What permissions should these locations have?
Thank you.
All files and directories in our production environment are owned by root:root with 755/644 file permissions, unless otherwise required. Some private files (think private keys etc.) are only readable by the user/process that needs them, while still being writable only by root.
As for the project structure: all our projects have a dedicated directory under /srv/www/vhosts.d/. Virtual environments are stored under /srv/www/virtualenvs. It is perfectly possible to store them in your home folder, but I feel this central approach is more in line with the idea of a production server. With the right settings, all virtual envs are also accessible by all users.
Our main project directory contains several scripts (manage.py and several deployment/update scripts) and is further split into subdirectories: e.g. web contains public files, src contains the source code, and frontend contains the template folder and the Sass folder. The whole project directory is contained in a git repository, but deployment-specific files (user-uploaded files, search indexes, encryption keys) are all in .gitignore.
Our nginx process runs as www-data. In general each Django project has its own user, and the gunicorn process runs as this user.
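As a rough sketch of that structure and ownership scheme (the project name example.com, the user example_app, and myproject.wsgi are placeholders, not a prescription):

# project and virtualenv locations
$ sudo mkdir -p /srv/www/vhosts.d/example.com /srv/www/virtualenvs

# everything owned by root and world-readable: 755 for directories, 644 for files
$ sudo chown -R root:root /srv/www/vhosts.d/example.com
$ sudo find /srv/www/vhosts.d/example.com -type d -exec chmod 755 {} +
$ sudo find /srv/www/vhosts.d/example.com -type f -exec chmod 644 {} +

# a dedicated user per project; only the writable bits (e.g. uploaded media) belong to it
$ sudo useradd --system --shell /usr/sbin/nologin example_app
$ sudo mkdir -p /srv/www/vhosts.d/example.com/web/media
$ sudo chown -R example_app:www-data /srv/www/vhosts.d/example.com/web/media

# gunicorn then runs as that project user, e.g. from systemd or supervisor:
#   gunicorn --user example_app --group www-data myproject.wsgi:application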
I'm very new to Linux, please bear with me.
I have a Linode with a LAMP stack running, and I managed to configure my main site and a couple of subdomains, and it's working great.
However, I want to have a dir called "dev" where I can put projects that I'm still working on. I need to be able to access this folder from my browser's address bar, and I don't want it to go through DNS - I want to reach it directly via my server's IP. For example:
http://218.42.42.42/dev/someproject
Since the document root is set to /var/www, placing the "dev" folder there isn't really an option - I want it to be in my ~ folder, for easier backups.
So what's the best way to make this work? A redirect, or should I move my doc root to the "dev" folder?
Thanks!
First, this would probably be more appropriate for Serverfault. With that in mind...
If I had to keep my dev environment in my home folder, I'd create a symlink in /var/www that points to the dev folder.
As far as securing it goes, I don't know if this is still a recommended or viable way of handling access control, but it seems like http://www.codinglogs.com/blog/server-management/vps-setup-guide/nginx-password-protect-web-directory might be the way to go, as long as you feel secure using a username/password combination. Another valid answer (also on Stack Overflow) would be password protect /backoffice folder in nginx.
If you want something more secure, the next step would probably be firewall rules.
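Since you're on a LAMP stack, a sketch of the symlink-plus-basic-auth approach could look like this (the paths and the username are placeholders; it assumes Apache follows symlinks for /var/www and that AllowOverride permits AuthConfig in .htaccess files):

# symlink the dev folder from your home directory into the document root
$ sudo ln -s /home/you/dev /var/www/dev

# create a password file (htpasswd ships with Apache)
$ sudo htpasswd -c /etc/apache2/.htpasswd_dev you

# protect the directory with basic auth via .htaccess
$ cat > /home/you/dev/.htaccess <<'EOF'
AuthType Basic
AuthName "Dev area"
AuthUserFile /etc/apache2/.htpasswd_dev
Require valid-user
EOF

The nginx links above describe the equivalent setup (auth_basic / auth_basic_user_file) if you're not on Apache.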
My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownerships relative to other server users. My concern for now is how to set up permissions within my domain so that my users have read access to the files and folders they should have, while my scripts still retain read/write access where they need it.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Files where uploaded files from forms are stored
Files and folders where GD and cache files are being written into
Folders where my server-side scripts are stored (I use mainly PHP)
My WWW root folder (where index.php resides)
This is a perfect example of where you need the Principle of Least Privilege. Allow the web server's user read-only access to read-only content, and allow writing only to the directories/files that absolutely need to be written. Explicitly deny access to things you don't want people to read (config files, .htaccess, anything with paths/IP addresses/passwords), and don't allow any extra processing if you're not using it (CGI executables, Server Side Includes).
The best way to do it is to start by denying everything and slowly opening things up as you go. First try serving static content and see what the minimal set of Apache directives/modules and filesystem ownerships and permissions is to get it working. Then try some read-only PHP scripts. Then some read/write PHP scripts. Then DB connectivity, and so on - you get the idea. It's a very tedious process, and you want to plan ahead for the sorts of things you want to test; I tend to write long scripts of wget commands that try to do both good and bad things to the server. Make one change, restart, rerun the script, and see what changed since the last run. Observe-modify-analyze, until you can't stand looking at it anymore ;)
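To make that concrete, here is a hedged sketch on a typical shared-hosting layout (the public_html, uploads, and cache names are assumptions, and whether writable directories need group or world write depends on which user PHP runs as on your host):

# start locked down: directories 755, files 644, everything owned by your account
$ find ~/public_html -type d -exec chmod 755 {} +
$ find ~/public_html -type f -exec chmod 644 {} +

# open up only what genuinely needs to be written by the web server
# (775 works if the server shares your group; on hosts where PHP runs as your own
#  user via suEXEC/suPHP, plain 755 is already enough)
$ chmod 775 ~/public_html/uploads ~/public_html/cache

# keep secrets (DB credentials etc.) outside the document root where possible
$ mkdir -p ~/private && chmod 700 ~/private

# a tiny smoke test in the spirit of the wget scripts mentioned above
$ wget -q -O /dev/null http://example.com/ && echo "site serves OK"
$ wget -q -O /dev/null http://example.com/.htaccess || echo ".htaccess blocked (good)"

example.com stands in for your own domain.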
I am having frequent problems with my web hosting (it's shared).
I am not able to delete or change permissions for a particular directory. The response is:
Cannot delete. Directory may not be empty
I checked the permissions and they look OK. There are hundreds of files in this folder which I don't want.
I contacted support and they solved it, saying it was a permission issue. But it has reappeared. Any suggestions?
The server is Linux.
You can't rmdir a directory with files in it. You must first rm all files and subdirectories. Many times, the easiest solution is:
$ rm -rf old_directory
It's entirely possible that some of the files or subdirectories have permission limitations that might prevent them from being removed. Occasionally, this can be solved with:
$ chmod -R +w old_directory
But I suspect that's what your support people did earlier.
This could also be because your FTP client might not be showing hidden files (like a cache, or other hidden files your application might create), and those hidden files are preventing you from deleting the directory. (Though, in your case, I am not sure if this is the cause... it could be a permission issue with your hosting provider: the web server running as another user (like apache or www), combined with your directories having global write permissions.)
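If you do have some form of shell access (or can tell your FTP client to show dotfiles), it is quick to check whether hidden entries are what is keeping the directory "non-empty":

$ ls -la stubborn_directory    # -a lists dotfiles such as .cache or .htaccess

(The rm -rf from the other answer removes hidden entries as well; stubborn_directory is just a placeholder name.)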
I assume that's a response from an FTP server?
Usually, a message from an FTP server really means it. If it says the directory is not empty, there might be certain files in the directory that you cannot see, which may be one of the following:
Your PHP/JSP/ASP/whatever scripts may run under a different user account thus creating files which you may not be able to see/delete
Is your hosting's web interface run under your FTP account? There might be conflicting permissions there if you manage some files from the web interface and then later via FTP.
Hosting server/operating system files created unintentionally e.g. from the hosting's web interface
If it comes down to using a script, write a one-time throw-away script that deletes the files and then the directory, then upload and execute it.
And just to be sure: some FTP servers don't support deleting a directory directly - you have to delete all the files inside it first. Could that be the case here?
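If you only have FTP access and can't run a server-side script, another route (not the throw-away script I suggested above, just an alternative) is an FTP client that can delete recursively from your own machine. This sketch assumes lftp is installed locally; the host, user, and path are placeholders:

# recursively delete the directory's contents and then the directory itself
# (lftp prompts for the password when only the user name is given)
$ lftp -u myuser ftp.example.com -e "rm -r -f /public_html/old_directory; bye"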