Auto Deployment with Git - Linux

Ok, I am trying to auto deploy my node.js application with git.
This is what I have on my server:
/home/git/myproj.git - bare repo
/home/me/public_html - cloned repo of the aforementioned bare repo
Now the problem is that I cannot pull from myproj.git/hooks/post-receive, because the push to the repo is made as the git user, so I have a permissions problem. I have also tried some deployment scripts, but I keep running into permissions issues.
I have heard of tools like Capistrano, Fabric, and gitolite, but they seem too complex for me at the moment; I want something easier (very easy to set up, and easy to replicate across multiple projects).
I hope I made myself clear. I think this problem is related - Auto deployment PHP script using Gitolite - but I am not using gitolite, and the answer there doesn't make sense to me (possibly because of the language barrier).

I first started with git and post-receive hooks myself but didn't really like them. I then switched to a very simple bash script. Even if you don't use the script, it's only about 200 lines in total, so it's great to steal snippets from.
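If it helps as a starting point, here is a stripped-down sketch of that kind of script (the host, path, and pm2 process name are placeholders, not the original script):

#!/usr/bin/env bash
# Minimal push-style deploy, run from the dev machine.
# me@myserver, the path, and the pm2 app name 'myapp' are assumptions.
set -euo pipefail
ssh me@myserver '
  cd /home/me/public_html &&
  git pull --ff-only &&
  npm install --production &&
  pm2 restart myapp
'

Because it runs everything over a single SSH session as your own user, it also sidesteps the bare-repo permissions problem from the question above.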

Related

What workflow should I use to deploy a NodeJS app to a fixed server?

I work at a tiny company where deployment is mainly done by pulling master to a production server and running several scripts. We use PM2 which has some simple deployment features, but I'm not sure how to arrange the moving parts together to make everyone happy.
What I want to accomplish is having one fire-and-forget command to run on my dev machine that results in everything being where it's supposed to be. What my boss wants is to never have to run any lengthy step on the server, to minimize downtime - so no building and no npm install there. To this end he wants builds and node_modules in Git, and I think that's an abomination before the gods, so I'm trying to figure out how to avoid those.
Things that are a no-go for now: building and deploying Docker images, CI/CD. We don't have the infrastructure set up for those, and I'm more interested in using tools we already have and getting rid of human intervention in those. (Their popularity unfortunately also makes it hard to research best practices for more legacy environments.)
I'm not very familiar with PM2 and most of my experience is in environments where deployment was somebody else's job, so what I'm looking for is an outline of what gets done to what where provided by somebody that actually knows what they're talking about.
My current rough idea is:
Switch the project to Yarn 2 and use its zero-install capability to have dependencies in Git but sane.
In a Docker container on the dev machine, do a clean checkout, install, and build. (This is to get rid of "works on my machine" issues and avoid inadvertently checking in macOS binaries.)
Push this to a release branch - these are the only ones where build outputs and such are allowed.
Then use PM2 to pull this specific branch and reload on the target machine.
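Concretely, I picture that last step boiling down to a single command run from the dev machine, something like this (host, path, app, and branch names are placeholders):

# assumption: the server already has a clone and pm2 manages 'myapp'
ssh me@myserver '
  cd /var/www/myapp &&
  git fetch origin &&
  git checkout release &&
  git pull --ff-only &&
  pm2 reload myapp
'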
Is this something that looks workable? Am I missing something? Is it possible to somehow avoid the release branches?

Good practices for pulling from git repo into production server

I have a DigitalOcean VPS with Ubuntu and a few Laravel projects. For a project's initial setup, I do a git clone to create a folder with my application files from my online repository.
I do all development work on my local machine, where I have two branches (master and develop); what I do is merge develop into my local master, then I push master to my remote repository.
Now, back on my production server: when I want to bring the changes into production, I do a git pull from origin. So far this has resulted in git telling me to stash my changes - why is this?
What would be the best approach to pull changes into the production server? Bear in mind that my production server has no working directory per se; all I do on my VPS is either clone or pull upgrades into production.
You can take a look at CI/CD (continuous integration / continuous delivery) systems. GitLab, for example, offers a free-to-use plan for small teams.
You can create a pipeline with a manual deploy step (you have to press a button after the code is merged to the master branch) and use whatever tool you like to deploy your code (scp, rsync, ftp, sftp, etc.).
And the biggest benefit is that you can have multiple intermediate steps (even for the working branches) where you run unit tests, which would prevent you from deploying failing builds whenever you merge non-working code.
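The deploy step itself can be a single command; for example, a rough sketch using rsync (the user, host, and paths are placeholders):

# push the checked-out code to the server, excluding git metadata
rsync -az --delete --exclude='.git' ./ deploy@myserver:/var/www/myapp/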
For the first problem, do a git status on production to see which files git sees as changed or added, and consider adding them to your .gitignore file (which itself should be part of your repo). Laravel generally has good defaults for these, but you might have added things or deviated from them in the process of upgrading Laravel.
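For example (the ignored path below is only an illustration; use whatever git status actually reports):

git status --short                       # list files git sees as modified/untracked
echo 'storage/debugbar/' >> .gitignore   # example: ignore a generated directory
git rm -r --cached storage/debugbar      # stop tracking files that were already committed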
For the deployment, the best practice is to have something that is consistent, reproducible, loggable, and revertable. For this, I would recommend choosing a deployment utility. These usually do pretty much the same thing:
You define deployment parameters in code, which you can commit as a part of your repo (not passwords, of course, but things like the server name, deploy path, and deploy tasks).
You initiate a deploy directly from your local computer.
The script/utility SSHes into your target server and pulls the latest code from the remote git repo (authorized via an SSH key forwarded into the server) into a 'release' folder.
The script runs any additional tasks you define (composer install, npm run prod, systemctl restart php-fpm, soft-linking shared files like .env, etc.).
The script soft-links the document root to your new 'release' folder, which results in an essentially zero-downtime deployment. If any of the previous steps fail, or you find a bug in the latest release, you just soft-link to the previous release folder and your site still works.
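As a rough sketch, the per-release core of what these utilities do looks like this (the paths, repo URL, and build tasks are generic placeholders):

# create a timestamped release folder and build into it
RELEASE=/var/www/app/releases/$(date +%Y%m%d%H%M%S)
git clone --depth 1 git@example.com:me/app.git "$RELEASE"
cd "$RELEASE" && composer install --no-dev && npm run prod
ln -s /var/www/app/shared/.env "$RELEASE/.env"   # soft-link shared files
ln -sfn "$RELEASE" /var/www/app/current          # repoint the document root; repoint back to roll back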
Here are some solutions you can check out that all do this sort of thing:
Laravel Envoyer: A 1st-party (paid) service that allows you to deploy via a web-based GUI.
Laravel Envoy: A 1st-party (free) package that allows you to connect to your prod server and script deployment tasks. It's very bare-bones in that you have to write all of the commands yourself, but some may prefer that.
Capistrano: A (free) tried-and-tested, popular Ruby-based deployment utility.
Deployer: The (free) PHP equivalent of Capistrano. It's easier to use, has a lot of built-in tasks (including a Laravel one), and doesn't require Ruby.
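To give a feel for how little ceremony these involve, with Deployer the whole flow condenses to roughly this (command names per Deployer's docs; treat this as a sketch):

composer require --dev deployer/deployer
vendor/bin/dep init              # interactively generates a deploy.php recipe
vendor/bin/dep deploy production # runs the clone/build/symlink flow over SSH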
Using these utilities is not necessarily exclusive of doing CI/CD if you want to go that route. You can use these tools to define the CD step in your pipeline while still doing other steps beforehand.

import entire GitLab Cloud project to new GitLab instance

I have some projects set up on GitLab Cloud, complete with issues, wiki pages, etc. I've recently set up an internally hosted gitlab instance. I'd like to bring these projects over from GitLab Cloud to the internal GitLab instance.
Bringing over the git repos seems easy enough (change the remote and push), but I don't see how to bring over the wikis and issues.
In general it seems like this isn't possible. (There's a GitLab Feedback for it here.)
However, the project wikis seem to be their own git repos, which you can see on the Git Access tab. While that doesn't solve issues/snippets, it gets you part of the way there.
I don't know how to transfer the issues, as I have not had to do that yet, but moving the wiki over is not that difficult.
On your old GitLab instance you will notice two repositories for your project (let's pretend your project is called oldproject): one will be oldproject.git and the other oldproject.wiki.git.
The general path to the repositories, where you can see the names I am talking about (let's assume the user name is "myaccount"), is:
/home/git/repositories/myaccount/
or (if using the omnibus installer):
/var/opt/gitlab/git-data/repositories/myaccount/
I presume you already know how to transfer over oldproject.git. You do the exact same thing with the wiki, only you create a bundle file out of oldproject.wiki.git:
git clone http://gitlab-instance-ip/user-name/oldproject.wiki.git   # clone the wiki repo from the old instance
cd oldproject.wiki
git bundle create oldproject-wiki.bundle --all   # pack all refs and history into a single file
Now initialize your new project in GitLab... I presume you already know how to do that, as you suggested in your question that you can move your project's files over to the new instance without problems. Now repeat for the wiki:
git clone http://new-gitlab-ip/user-name/newproject.wiki.git   # clone the (empty) wiki repo on the new instance
cd newproject.wiki
git pull /path/to/oldproject-wiki.bundle   # import the history from the bundle
git push -u origin master   # publish the imported wiki
I had a very similar problem to yours, where it didn't look like anything was actually pushed. When I went back to the GitLab project, I noticed that it had in fact been updated with the wiki. See here if you think it will help: Importing Gitlab Wiki to a new Gitlab Instance
Good luck!

How to deploy a node app to Azure if the node app is buried in the repo directory

I am trying to deploy a project to azure, via the "remote git repo" method. But in my repo, the actual node application is a few directories in. Thus, Azure does not do anything when the repo is pushed.
Is there some way to configure the azure website to run from a directory buried in the repo?
There's actually a super easy way: this scenario was anticipated by the Azure team, and there's a built-in mechanism to handle it.
You simply create a text file at the root of your project called .deployment, and in it you add the following text:
[config]
project = mysubfolder
When you either Git deploy or use CI to deploy from source control, the entire repository is deployed, but the .deployment file tells Kudu (that's the engine that handles your website management) where the actual website (or node project) is.
You can find more info here.
Also, check out this post where I mention an alternative strategy for project repos in case that helps.
This isn't so much an Azure question as a Git question. What you want to know is if there is a way to clone only a sub-directory or branch of a project. From doing some research on this just a couple of weeks ago, the best I could find were solutions for how to do a sparse clone, which does allow one to restrict the files cloned (almost there) but does so within the entire project's directory structure (denied).
A couple of related SO questions & answers which you might find helpful:
How do I clone a subdirectory only of a Git repository?
(Short answer 'no')
Checkout subdirectories in Git?
(Answer describes the sparse checkout ability).
I would love to see if a git guru might have a better answer based on updates to git, etc.
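For what it's worth, git 2.25+ added a dedicated sparse-checkout command. Here is a minimal sketch (repository URL and folder name are placeholders); note the subfolder still sits at its original path, which is exactly the limitation described above:

git clone --no-checkout https://example.com/org/repo.git
cd repo
git sparse-checkout init --cone     # restrict checkout to whole directories
git sparse-checkout set mysubfolder
git checkout master                 # populates only mysubfolder (plus root-level files)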
Good luck with it - I ended up just putting my node app in its own Git project as it seemed the most straightforward approach overall, though not ideal.

remote deploy scripts for nodejs?

I am looking for a way to easily deploy a nodejs app via a command line script.
I found one solution:
https://github.com/Skookum/nimbus
I also heard that the whole thing can be done with git and post-receive hooks.
What would people recommend?
Edit: I am deploying it to my own box where I have root.
You have two options on a self-hosted setup.
Do it all yourself
This entails git post-receive hooks. In short, you set up your production box to host a copy of your repository; on your local machine you set up a remote, let's call it production.
Now when you run git push production master on your local machine, the updates are sent and the server executes the post-receive hook, which runs whatever you wish.
Actions you may want are: checking out/writing the data in the repo to files/folders (the git repo on the server is stored as a bare repo); restarting your web server; notifying you that there's been a deployment; etc.
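A minimal hooks/post-receive for that flow might look like this (the work-tree path, branch, and restart command are assumptions; remember to make the hook executable):

#!/bin/sh
# check the pushed code out of the bare repo into the live directory
GIT_WORK_TREE=/var/www/myapp git checkout -f master
# then restart/notify, e.g. (assumption):
# systemctl restart myapp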
I'd suggest reading up on it at http://git-scm.com/book/en/Customizing-Git-Git-Hooks and taking a look at a few tutorials; this one (http://ryanflorence.com/deploying-websites-with-a-tiny-git-hook/) looks pretty legit.
Use a service to manage it for you. http://www.deployhq.com/ is the only one that springs to mind, but I'm sure there are others.
Good Luck and Happy Hacking :)
There is a tool called shipit.js (https://github.com/shipitjs/shipit) which allows you to perform different deployment tasks like:
moving code from the repo to the server
restarting server
installing node_modules
etc.
You create a config file and then run npx shipit deploy; all the tasks you specify are performed. In case of failure, it has a rollback mechanism.
There is a nice screencast about it: https://youtu.be/8PpBySjkWEM.
