One-click deployment solution for my Node.js application (with pm2)

So basically, I have a Node.js application locally (a WebStorm project) and am constantly pushing changes to git. I would like to create a one-click deployment solution, executable from my WebStorm project, that will:
1. Clone/pull the project from git
2. Install dependencies
3. Edit the config (database connection, etc.)
4. Based on the config, make sure the database is ready, and initialize a new one if not
I achieved parts 1 and 2 by using pm2's deployment function. If I run the deploy command, everything is updated from git and dependencies are installed.
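For reference, here is a minimal sketch of an ecosystem file with pm2's deploy section covering parts 1 and 2; the user, host, repo, and path values below are placeholders, not real settings:

// ecosystem.config.js -- placeholder values throughout
module.exports = {
  apps: [{ name: 'my-app', script: './app.js' }],
  deploy: {
    production: {
      user: 'deploy',                        // placeholder SSH user
      host: 'example.com',                   // placeholder server
      ref: 'origin/master',
      repo: 'git@github.com:me/my-app.git',  // placeholder repo
      path: '/var/www/my-app',
      // runs on the server after every pull:
      'post-deploy': 'npm install && pm2 reload ecosystem.config.js --env production'
    }
  }
};

With this in place, pm2 deploy ecosystem.config.js production setup prepares the directory structure once, and pm2 deploy ecosystem.config.js production pulls the latest code and reinstalls dependencies.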
Part 3 is the problem, because I do not want to push the production config into git (even though it is a private repo), so I only have an example config in the repo. I was thinking about a script that copies the example config and then prompts me somehow to edit it, but I am not sure how to achieve this without connecting to the production server manually. A second problem is that I would need to copy over a new example config with every new version, because the config structure might change between versions.
Any suggestions on how to achieve this easily, without needing to log into the production server manually?
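A minimal sketch of the copy-if-missing half of that idea, runnable as a post-deploy step; the config paths are assumptions:

# seed the real config from the example only on the first deploy,
# so edited production values survive later pulls
if [ ! -f config/config.json ]; then
  cp config/config.example.json config/config.json
  echo "config.json was created from the example - edit it before starting the app" >&2
fi

Because the copy only happens when the real config is absent, later deploys leave the edited values alone; structural changes between versions would still have to be merged by hand or with a small migration script.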
Part 4 is a nice-to-have feature, if anyone has suggestions on how to achieve that, too.

Related

Concurrently JS application pipeline install and build hangs (Express js for server, Create-React-App for Client)

Problem: I have a project with a server (an Express server that handles file uploading and deleting) and a client (a front-end Create-React-App). The project structure looks as follows:
Root folder with server
Client folder
Each folder has its own package.json (one for the server, one for the client).
I'm trying to build and deploy to Azure, but the pipeline hangs on "npm install and build".
It seems like the build succeeds, but this phase just hangs. Here are my server.js file (routes not included) and YAML file, just in case.
I'd appreciate any kind of help. Thank you!
Troubleshooting suggestions:
First, make sure the code in GitHub is consistent with your local code; if an exception still occurs, it is recommended to switch to the Linux platform and redeploy.
I also recommend recreating the repository and then checking the Actions status in GitHub.
To sum up:
In general, Linux is a more appropriate choice than Windows in Azure. For example, Linux supports npx and may support other packages and commands as well.
When the code runs normally locally, there is generally no problem deploying via GitHub, unless there are modifications we have overlooked. So make sure the code is consistent.
General correct deployment steps:
First, in the portal, make sure to create a Web App (not a Static Web App) and select the Node environment.
Make sure the server program runs normally locally, then create a new repository in GitHub:
git init
git add .
git commit -m 'init'
git remote add origin https://github.com/{your name}/newAppname.git
git push -u origin master
Connect the repository in the portal's Deployment Center.
Then check the status of the Action in GitHub.

Good practices for pulling from git repo into production server

I have a DigitalOcean VPS running Ubuntu with a few Laravel projects. For each project's initial setup, I do a git clone to create a folder with my application files from my online repository.
I do all development work on my local machine, where I have two branches (master and develop); I merge develop into my local master, then push master to my online repository.
Now, back on my production server, when I want to bring the new changes into production, I do a git pull from origin. So far this has resulted in git telling me to stash my changes. Why is this?
What would be the best approach to pull changes into the production server? Keep in mind that my production server has no working directory per se; all I do on my VPS is clone or pull upgrades into production.
You can take a look at CI/CD (continuous integration / continuous delivery) systems. GitLab, for example, offers a free-to-use plan for small teams.
You can create a pipeline with a manual deploy step (you have to press a button after the code is merged to the master branch) and use whatever tool you like to deploy your code (scp, rsync, ftp, sftp, etc.).
And the biggest benefit is that you can have multiple intermediate steps (even for the working branches) where you run unit tests, which prevents you from uploading failing builds whenever you merge non-working code.
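As an illustration, here is the kind of script such a manual deploy job could run; this is only a sketch, and the DEPLOY_* variables and the build/test scripts are assumptions you would define yourself:

#!/bin/sh
# Hypothetical deploy step for a manual CI job.
# Assumes DEPLOY_USER, DEPLOY_HOST, and DEPLOY_PATH are set as CI variables
# and that the job has an SSH key with access to the server.
set -e
npm ci            # reproducible dependency install
npm test          # failing tests stop the deploy right here
npm run build     # assumes a 'build' script in package.json
rsync -az --delete dist/ "$DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH"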
For the first problem, do a git status on production to see which files git considers changed or added, and consider adding them to your .gitignore file (which itself should be part of your repo). Laravel generally has good defaults for these, but you might have added things or deviated from them while upgrading Laravel.
For the deployment, the best practice is to have something that is consistent, reproducible, loggable, and revertible. For this, I would recommend choosing a deployment utility. These usually all do pretty much the same thing (a shell sketch of the pattern follows this list):
You define deployment parameters in code, which you can commit as a part of your repo (not passwords, of course, but things like the server name, deploy path, and deploy tasks).
You initiate a deploy directly from your local computer.
The script/utility SSHes into your target server and pulls the latest code from the remote git repo (authorized via an SSH key forwarded into the server) into a 'release' folder.
The script runs any additional tasks you define (composer install, npm run prod, systemctl restart php-fpm, soft-linking shared files like .env, etc.).
The script soft-links the document root to your new 'release' folder, which results in an essentially zero-downtime deployment. If any of the previous steps fail, or you find a bug in the latest release, you just soft-link to the previous release folder and your site still works.
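A rough shell sketch of that release/symlink pattern, roughly as these utilities run it on the server; the repo URL and paths are placeholders:

set -e
APP=/var/www/app                              # placeholder deploy path
RELEASE="$APP/releases/$(date +%Y%m%d%H%M%S)"
git clone --depth 1 git@example.com:me/app.git "$RELEASE"
cd "$RELEASE" && composer install --no-dev
ln -sfn "$APP/shared/.env" "$RELEASE/.env"    # shared files survive across releases
ln -sfn "$RELEASE" "$APP/current"             # the switch; rollback = repoint to an old release

The web server's document root stays pointed at the 'current' symlink, so both the switch and any rollback are a single symlink change.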
Here are some solutions you can check out that all do this sort of thing:
Laravel Envoyer: A 1st-party (paid) service that allows you to deploy via a web-based GUI.
Laravel Envoy: A 1st-party (free) package that allows you to connect to your prod server and script deployment tasks. It's very bare-bones in that you have to write all of the commands yourself, but some may prefer that.
Capistrano: A (free) tried-and-tested, popular Ruby-based deployment utility.
Deployer: The (free) PHP equivalent of Capistrano. It's easier to use, has a lot of built-in tasks (including a Laravel one), and doesn't require Ruby.
Using these utilities is not necessarily exclusive of doing CI/CD if you want to go that route. You can use these tools to define the CD step in your pipeline while still doing other steps beforehand.

How to update repository with built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run ’$ grunt build’ locally which built the project and created files in a ‘dist’ folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the ‘dist’ folder in the repository are not updated. If I define an artifact, I get the changed files in the download. However, I would prefer the files in the ‘dist’ folder in the repository to be updated, so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing to your repo from inside a pipeline is a good idea. Version control wouldn't be as clear, and some people have pipelines triggered automatically when the repo is pushed, which would set off a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, at exactly the version you require and with the desired /dist inside, so you can easily run it in multiple places, scale it, manage it, and so on.
If you switched to Docker, you wouldn't actually have to do a thing to keep dist persistent; just push the image to the registry after the build is done.
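For instance, the pipeline's build job could end with something like the following sketch, using GitLab CI's predefined registry variables; the Dockerfile and the tagging scheme are assumptions:

# log in to the project registry, then build and push the image;
# the built image carries /dist, so nothing needs committing back
docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"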
But to actually answer your question:
There has been a feature request open for a very long time for exactly this problem: here. Currently there is no safe and professional way to do it, as GitLab members state. You can, however, push changes back, as one GitLab team member (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your .gitlab-ci.yml file.
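Fleshed out a little, that script section might look like the following sketch; the token variable is an assumption (e.g. a project access token stored as a masked CI variable), and the [skip ci] marker in the commit message is there to avoid the pipeline loop mentioned above:

git config user.email "ci@example.com" && git config user.name "CI"   # placeholder identity
grunt build
git add dist
git commit -m "Update dist [skip ci]"   # [skip ci] keeps this push from retriggering the pipeline
git push "https://ci:${CI_PUSH_TOKEN}@gitlab.com/group/project.git" HEAD:my-branch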
There are more hacky methods presented there, but be sure to acknowledge the risks that come with them: pipelines become more error-prone and, if configured the wrong way, might for example publish confidential information or trigger an infinite loop of pipelines, to name a few.
I hope you found this useful.

How do I package my node project files and deploy them to my server?

I have a non-open-source Node.js web application project that I want to deploy to a production server and/or a staging server. Is there a standard way to do this, or some tool that does it? I want to package all the files needed and exclude the files that are not, like the .git folder, the tests, and other files like the Gruntfile and package.json.
I could of course manually package the files in a tar.gz file and send them to the correct server. But I was hoping to find a more complete and configurable tool that can do it for me.
It might not be exactly what you're asking for, but I like git for automated deployment.
You could have branches like staging and production, which are checked out on the remote server.
You can set up a git hook like post-receive to update those remotely, every time you merge changes into those branches.
Here's a tutorial: http://wekeroad.com/2011/09/17/deploying-a-site-with-git-hooks
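For illustration, a minimal post-receive hook along those lines (saved as hooks/post-receive in the bare repo on the server and made executable); the branch name and work-tree path are assumptions:

#!/bin/sh
# git feeds post-receive one "oldrev newrev refname" line per updated ref
while read oldrev newrev ref; do
  if [ "$ref" = "refs/heads/production" ]; then
    GIT_WORK_TREE=/var/www/myapp git checkout -f production
    # restart or reload the app here if needed
  fi
done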

remote deploy scripts for nodejs?

I am looking for a way to easily deploy a nodejs app via a command line script.
I found one solution:
https://github.com/Skookum/nimbus
I also heard that the whole thing can be done with git and post-commit hooks.
What would people recommend?
Edit: I am deploying it to my own box where I have root.
You have two options on a self-hosted setup.
Do it all yourself
This entails git post-receive hooks. In short, you set up your production box to host a copy of your repository, and on your local machine you set up a remote; let's call the remote production.
Now when you run git push production master on your local machine, the updates are sent and the server executes the post-receive hook, which runs whatever you wish.
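The setup itself is only a few commands; the paths and names below are placeholders:

# on the server: create a bare repository to push to
git init --bare /srv/git/myapp.git
# on your local machine: register it as the 'production' remote and push
git remote add production ssh://user@yourserver/srv/git/myapp.git
git push production master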
Actions you may want include: checking out/writing the data in the repo to files/folders (the git repo on the server is stored as a bare repo); restarting your web server; notifying you that there's been a deployment; etc.
I'd suggest reading up on it at http://git-scm.com/book/en/Customizing-Git-Git-Hooks and taking a look at a few tutorials; this one (http://ryanflorence.com/deploying-websites-with-a-tiny-git-hook/) looks pretty legit.
Use a service to manage it for you. http://www.deployhq.com/ is the only one that springs to mind, but I'm sure there are others.
Good Luck and Happy Hacking :)
There is a tool called shipit.js (https://github.com/shipitjs/shipit) which allows you to perform different deployment tasks like:
moving code from the repo to the server
restarting server
installing node_modules
etc.
You create a config file and then run npx shipit production deploy (the target environment comes first), and all the tasks you specify are performed. In case of failure, it has a rollback mechanism.
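A minimal shipitfile.js sketch along the lines of the shipit-deploy documentation; the server, repository URL, and deploy path are placeholders:

// shipitfile.js
module.exports = shipit => {
  require('shipit-deploy')(shipit);  // adds the deploy and rollback tasks
  shipit.initConfig({
    default: {
      deployTo: '/var/www/my-app',                        // placeholder path
      repositoryUrl: 'https://github.com/me/my-app.git',  // placeholder repo
    },
    production: { servers: 'deploy@example.com' },        // placeholder server
  });
};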
There is a nice screencast about it: https://youtu.be/8PpBySjkWEM.
