Nginx Configuration Versioning Strategy - linux

A project my team inherited has a complete mess of nginx configuration across 10+ environments. We would like to implement a versioning strategy, but I'm not sure how people "normally" achieve this. Do you make the whole nginx conf folder a Git repo and ignore what you do not want to version? Or do you keep a separate folder with the config repo and deploy the files with a script?

We manage it via a separate Git repository used exclusively for the nginx configuration. Yes, it includes everything inside the /etc/nginx/ directory.
But it is not synced directly on the server; instead, a bash script is used to pull changes, update the configuration, and reload nginx.
Script example:
#!/usr/bin/env bash
set -e
# Pull changes
git pull
# Sync changes, excluding the .git directory
rsync -qauh ./ "/etc/nginx/" --exclude=".git"
# Set proper permissions: files 644, directories 700
chmod -R 644 /etc/nginx
find /etc/nginx -type d -exec chmod 700 {} \;
# If you store SSL certs under /etc/nginx/ssl,
# set stricter permissions for them (keys readable by root only)
chmod 700 /etc/nginx/ssl
chmod -R 400 /etc/nginx/ssl/*
# Reload the nginx config,
# but only if the config test passes
nginx -t && service nginx reload
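To run that sync on each environment without logging in by hand, one option (just a sketch; the /opt/nginx-config checkout path, the deploy.sh file name, and the main branch are assumptions, not part of the answer above) is a small wrapper that only redeploys when the remote branch has moved, triggered from cron:

#!/usr/bin/env bash
# check-and-deploy.sh -- hypothetical wrapper around the script above.
# Assumes the config repo is cloned at /opt/nginx-config and the snippet
# above is saved there as deploy.sh; adjust paths and branch to your setup.
set -euo pipefail
cd /opt/nginx-config
git fetch origin
if [ "$(git rev-parse HEAD)" != "$(git rev-parse origin/main)" ]; then
    ./deploy.sh
fi

# Example cron entry (every 5 minutes):
# */5 * * * * /opt/nginx-config/check-and-deploy.sh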

Related

Can git store or control directory user, group or permissions

I have developed a few git repositories for a web application. As part of the deployment process, a few of the folders need to be writable by Apache for file uploads. Does git have any control over this for either the user or group, or the respective permissions?
git can only set this for the entire repository, not for a subdirectory. Run in the root of your repo:
sudo chgrp -R apache .
sudo chmod -R ug+rwX *
git config core.sharedRepository group
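Since Git itself only tracks file content plus the executable bit, and not owner, group, or full modes, a common workaround (a sketch only; the "uploads" path and the apache group are illustrative assumptions) is to reapply ownership and permissions from a hook after each checkout:

#!/bin/sh
# .git/hooks/post-checkout -- remember to make the hook executable (chmod +x).
# Reapply Apache-writable permissions on the upload folders after every checkout;
# "uploads" and the "apache" group are placeholders for your layout.
chgrp -R apache uploads
chmod -R ug+rwX uploads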

How to set folder permissions for a particular container on Elastic Beanstalk

I am having trouble setting permissions for a web folder on Elastic Beanstalk. I run multiple containers using custom Docker images on one instance: apache-php, mysql, memcached, etc. For the container "apache-php" I map a folder with my Yii2 application to /var/www/html/.
When I manually make a bundle and upload/deploy via the Elastic Beanstalk console, the folder has the right permissions and everything works fine.
Now, when I deploy the app using "eb deploy", it drops all permissions and I get a server error and "The directory is not writable by the Web process: /var/www/html/backend/web/assets" in the logs.
I can connect via SSH and set the necessary permissions manually, but that is not convenient, since it needs to be done every time I re-deploy the app.
So, my question is: what is the best way to automatically set permissions for a particular folder in a particular container on Elastic Beanstalk?
Perhaps I can use .ebextensions, but I didn't find how to run "container_commands" for a particular container.
AWS EB deployment starts your app in /var/app/ondeck
When deploying to Elastic Beanstalk, your app is first unzipped into /var/app/ondeck/.
Most likely, the local folders being deployed do not have the permissions you want on them.
If you need to make adjustments to your app, or to the shell, during deployment, .ebextensions/*.config is the right place to do it.
Container commands should be run against that path.
But keep in mind that these commands will run EVERY time you deploy, whether needed or not, unless you use some method to test for pre-configuration.
container_commands:
  08user_config:
    test: test ! -f /opt/elasticbeanstalk/.preconfig-complete
    command: |
      echo "jail-me" > /home/ec2-user/.userfile
  09writable_dirs:
    command: |
      chmod -R 770 /var/app/ondeck/backend/web/assets
      chmod -R 770 /var/app/ondeck/[path]
  99complete:
    command: |
      touch /opt/elasticbeanstalk/.preconfig-complete
files:
  "/etc/profile.d/myalias.sh":
    mode: "000644"
    owner: root
    group: root
    content: |
      alias webroot='cd /var/www/html/backend/web; ls -al --color;'
      echo " ========== "
      echo " The whole point of Elastic Beanstalk is that you shouldn't need to SSH into the server. "
      echo " ========== "
Yes, you should use ebextensions.
Create a folder in your app source root called .ebextensions. Create a file with a .config extension, say 01-folder-permissions.config. Files are processed in lexicographical order of their names.
The contents of the file can be:
container_commands:
  change_permissions:
    command: chmod 777 /var/www/some-folder
Replace with the appropriate folder and permissions. Read more about container commands in the AWS Elastic Beanstalk documentation.

Server unable to read htaccess file, denying access to be safe

I have created a simple app using AngularJS. When I try to host that project on my website http://demo.gaurabdahal.com/recipefinder, it shows the following error:
Forbidden
You don't have permission to access /recipefinder on this server.
Server unable to read htaccess file, denying access to be safe
But if I go to http://demo.gaurabdahal.com/, it displays the "access denied" message that I have printed, as expected. So why is it unable to open the AngularJS project "recipefinder"? If I put a simple HTML app there, it opens just fine.
The same AngularJS project works fine when I host it on GitHub (http://gaurabdahal.github.io/recipefinder).
I can't understand what's wrong.
I had this problem too. My advice is to look in your server's error log file. For me, the problem was that the top directory of the project was not readable. The error log clearly stated this. A simple
sudo chmod 755 <site_top_folder>
fixed it for me.
Set the group of your public directory to nobody.
This is a common problem with GoDaddy virtual server hosting when you bring up a new website.
Assuming you have SSH access to the server (you have to enable it in cPanel), log in to your account. Upon successful login, you will be placed in the home directory for your account. The DocumentRoot for your website is located in a subdirectory named public_html. GoDaddy defaults the permissions for this directory to 750, but those permissions are inadequate to allow Apache to read the files for the website. You need to change the permissions for this directory to 755 (chmod 755 public_html).
Copy the files for your website into the public_html directory (both scp and rsync work for copying files to a GoDaddy Linux server).
Next, make sure all of the files under public_html are world readable. To do this, use this command:
cd public_html
chmod -R o+r *
If you have other subdirectories (like css, js, and img), make sure they are world accessible by enabling both read and execute for world access:
chmod o+rx css
chmod o+rx img
chmod o+rx js
Last, you will need a .htaccess file in the public_html directory. GoDaddy enforces a rule that prohibits the site from loading if you do not have a .htaccess file in your public_html directory. You can use vi to create this file ("vi .htaccess"). Enter the following lines in the file:
Order allow,deny
Allow from all
Require all granted
This config will work for both Apache 2.2 and Apache 2.4. Save the file (ZZ), and then make sure the file has permissions of 644:
chmod 644 .htaccess
Works like a charm.
You need to run these commands in /var/www/html/ or whichever directory your project is in:
sudo chgrp -R GROUP ./
sudo chown -R USER:GROUP ./
find ./ -type d -exec chmod 755 {} \;
find ./ -type f -exec chmod 644 {} \;
In my case (Apache web server) I use www-data for both USER and GROUP.
Set every public folder's permissions to 755. Problem solved.
GoDaddy shared server solution
I had the same issue when trying to deploy a separate Laravel project at the subdomain level.
File structure
- public_html (where the main web app resides): works fine
- booking.mydomain.com (folder for the separate Laravel project): shows a 403 Forbidden error
Solution
go to the cPanel of your GoDaddy account
open File Manager
browse to the folder that shows the 403 Forbidden error
in the File Manager, right-click on the folder (in my case booking.mydomain.com)
select Change Permissions
select the following checkboxes:
a) user: read, write, execute
b) group: read, execute
c) world: read, execute
The permission code must display as 755.
Click Change Permissions.
In Linux:
find project_directory_name_here -type d -exec chmod 755 {} \;
find project_directory_name_here -type f -exec chmod 644 {} \;
This resets the permissions of project_directory_name_here and everything inside it.
In my case Apache was somehow configured wrong(?), so I had to set permissions on all parent directories too. Just setting permissions on .htaccess (and its parent directory) didn't work.
OK, I recently ran into the same issue while working on a WordPress installation using apache2 on Ubuntu 20.04.
I experienced this issue when I changed the file ownership to another user.
Here's what worked for me:
$ sudo chown -R www-data:www-data /var/www/YOUR-DIRECTORY
Here's a bit more context on the issue:
The above command gives ownership of all the files [in that folder] to the www-data user and group. This is the user that the Apache web server runs as, and Apache needs to be able to read and write WordPress files in order to serve the website and perform automatic updates.
Be sure to point to your server's relevant directory (replace YOUR-DIRECTORY with your actual folder).
There is an insightful article on DigitalOcean that runs through this.
As for Apache running on Ubuntu, the solution was to check the error log, which showed that the error was related to folder and file permissions.
First, check the Apache error log:
nano /var/log/apache2/error.log
Then set the folder permissions so the directory is executable:
sudo chmod 755 /var/www/html/
Also set the file permissions so .htaccess is readable:
sudo chmod 644 /var/www/html/.htaccess
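If the log is long, tailing it while you reload the failing page makes the relevant entry easier to spot (same log path as above; adjust if your distribution differs):

sudo tail -f /var/log/apache2/error.log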
Just my solution: I had extracted a file, made some minor changes, and got the error above. I deleted everything, uploaded and extracted again, and it was back to business as usual.
Important points in my experience:
- every resource accessed by the server must be in an executable and readable directory, hence the xx5 in every chmod in the other answers.
- most of the time the web server (Apache in my case) runs neither as the user nor in the group that owns the directory, so again xx5 or chmod o+rx is necessary.
But the bigger conclusion I reached is to start from little and move to more. For example, if
http://myserver.com/sites/all/resources/assets/css/bootstrap.css
yields a 403 error, see if http://myserver.com/ works, then sites, then sites/all, then sites/all/resources, and so on.
It helps if your server has directory indexes enabled:
In Apache: Options +Indexes
This instruction might also be in the .htaccess of your web server's public_html folder.
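A quick way to apply that "start from little and move to more" check from the shell (a sketch; the path is only an example) is namei, which lists the owner and permissions of every directory on the way down to the file:

namei -l /var/www/html/sites/all/resources/assets/css/bootstrap.css
# Every directory in the output needs at least r-x for the web server,
# and the file itself needs to be readable.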
I had the same problem on Fedora and found that the problem was SELinux.
To test whether that is the problem, run:
sudo setenforce 0
Or change, in the file /etc/sysconfig/selinux,
SELINUX=enforcing
to
SELINUX=disabled
or add rules to SELinux to allow HTTP access.
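If SELinux turns out to be the culprit, an alternative to disabling it (a sketch, assuming the default /var/www/html document root) is to restore or assign the file context that Apache is allowed to read:

# Restore the default SELinux context recursively
sudo restorecon -Rv /var/www/html
# Or label a non-standard document root explicitly (needs the policycoreutils tools)
sudo semanage fcontext -a -t httpd_sys_content_t "/var/www/html(/.*)?"
sudo restorecon -Rv /var/www/html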
I had the same problem on a rackspeed server after changing the PHP version in cPanel. It turned out this also changed the permissions of the folder... I set the folder's permissions to 755 with
chmod 755 folder_name
"Server unable to read htaccess file" means just that. Make sure that the permissions on your .htaccess file are world-readable.

How to set up a Git server with HTTP access on Linux

I need to create a Git repository on a Linux machine and then make it accessible via HTTP. I also need full access for one user and read-only access for anonymous users.
I've created local repositories before, but I don't know how to create this one (e.g. inside /var/www or /opt/git/...).
I tried doing this:
- Clone a GitHub repository into /var/www/repos/repo.git (with sudo)
- cd /var/www/repos/repo.git
- sudo git --bare update-server-info
- sudo mv hooks/post-update.sample hooks/post-update
- sudo service apache2 restart
Then I tried to access this repository from another machine:
- With a browser: http://192.168.1.49/repo.git <-- WORKS
- With the terminal: git clone --bare http://192.168.1.49/repo.git <-- DOESN'T WORK
The terminal says:
Cloning into bare repository repo.git...
fatal: http://192.168.1.49/repo.git/info/refs?service=git-upload-pack not found: did you run git update-server-info on the server?
I think maybe it's a permissions problem. How should I manage permissions inside /var/www?
EDIT: Already fixed. I just needed to:
- put the repository into /var/www/repos/, named repo.git
- change the permissions of the www folder with sudo chown -R www-data:www-data /var/www
- enable WebDAV with sudo a2enmod dav_fs
- create a config file in /etc/apache2/conf.d called git.conf
- create the file with users with sudo htpasswd -c /etc/apache2/passwd.git user
- rename the post-update hook and make it executable with sudo mv /var/www/repos/repo.git/hooks/post-update.sample /var/www/repos/repo.git/hooks/post-update && sudo chmod a+x /var/www/repos/repo.git/hooks/post-update
- update the server info and restart Apache with sudo git update-server-info && sudo service apache2 restart
And, to fix the problem with pushing:
Edit the file .git/config in your repository folder (on the client machine) and put the username and password in the URL:
url = http://user:password@url/repos/repo.git
So now all I need is to set read-only access for anonymous users.
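For reference, the same fix as a single script (a sketch that only replays the steps above on a Debian/Ubuntu-style Apache setup; the repo path, the "user" account, and the passwd.git location are taken from this answer and may differ on your machine):

#!/usr/bin/env bash
set -euo pipefail
REPO=/var/www/repos/repo.git

# Give Apache ownership of the web root
sudo chown -R www-data:www-data /var/www
# Enable WebDAV (used here for pushing over HTTP)
sudo a2enmod dav_fs
# Create the password file with one user (you will be prompted for a password)
sudo htpasswd -c /etc/apache2/passwd.git user
# Enable the post-update hook so info/refs stays current after pushes
sudo mv "$REPO/hooks/post-update.sample" "$REPO/hooks/post-update"
sudo chmod a+x "$REPO/hooks/post-update"
# Generate the info files needed by the dumb HTTP protocol and restart Apache
# (the /etc/apache2/conf.d/git.conf file from the list above still has to be written by hand)
sudo git -C "$REPO" update-server-info
sudo service apache2 restart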

Git save password of remote connection locally, connecting via root@

I would just like to start by saying I am completely new to Git, so I am probably doing things the wrong way, but I am trying to follow posts on here and guides online. I know this is probably a dumb post, but I am just a web designer, so this is all very basic for me; I would appreciate any advice about the way I am doing this, or whether there is a better way.
I installed Git on my CentOS VPS and then set up my repository inside my website, located here:
/var/www/vhosts/server.userfarmer.com/userfarmer/userfarmer.git
The userfarmer folder before the .git folder is my main website's directory; I am trying to upload my website from my local machine via Git to this folder. I set this up over SSH using:
mkdir userfarmer.git
cd userfarmer.git
git --bare init
I have then set up the Git remote connection locally using:
git remote add origin root@serverip:/var/www/vhosts/server.userfarmer.com/userfarmer/userfarmer.git
Now I can connect to this fine, but each time I do it asks for my root password. Is there any way to save it so it is not needed each time I push?
Any advice greatly appreciated; this is all completely new to me.
Thanks,
Simon
On your server, create a .ssh folder in the root user's home directory.
mkdir /root/.ssh/
Give it 700 permissions.
chmod 700 /root/.ssh/
Create a file named "authorized_keys" inside the .ssh folder and give it 600 permissions.
touch /root/.ssh/authorized_keys
chmod 600 /root/.ssh/authorized_keys
Now from your laptop:
Append your public key (i.e. the contents of laptop.pub) to authorized_keys.
cat ~/.ssh/laptop.pub | ssh root@serverip "cat >> ~/.ssh/authorized_keys"
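The same thing can usually be done in one step with ssh-copy-id, which creates ~/.ssh and authorized_keys on the server with the right permissions for you (assuming the key pair is named laptop/laptop.pub as above):

ssh-copy-id -i ~/.ssh/laptop.pub root@serverip
# After this, git push should authenticate with the key instead of asking for the root password.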
You can open the file
.git/config
and look for the line
url = root@serverip:/var/www/vhosts/server.userfarmer.com/userfarmer/userfarmer.git
and add your password in the format:
url = root:password@serverip:/var/www/vhosts/server.userfarmer.com/userfarmer/userfarmer.git
Next time you set up the repo, add it by writing
git remote add origin root:password@serverip:/var/www/vhosts/server.userfarmer.com/userfarmer/userfarmer.git
