How to manage node_modules in continuous deployment via Jenkins? - node.js

I am developing a Node.js application that is still under active development, so it requires a lot of deployments to the production environment. Do we have to move the entire node_modules folder to production with every release?
Note: the production environment is restricted from internet access, so we cannot run npm install there.

You should check out the .npmrc file, which, among other things, decides where all your node modules are fetched from.
Your production server should have such a file in its project root, and it must point to the location from which your npm packages are to be installed.
The registry value inside the .npmrc file is where you point npm at your package source. You can read more about it here.
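As a minimal sketch, assuming an internal registry or mirror that is reachable from the production network (the host name below is purely illustrative), the project-root .npmrc could look like this:

registry=http://npm-mirror.internal.example:4873/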

Related

How do you deploy a continuous running app when using npm-cli

I have a largish Node.js based web app, with both server and client components working together. I currently deploy the app by using git pull to take my latest production branch from the server repository. A git post-commit hook runs an npm install and rebuilds the server's .env file, and PM2 monitors the various processes (3 web servers), using a change in the .env file to restart them.
node_modules is at the highest level of the project, with separate server and client subdirectories. Since this is using HTTP/2 on a fast LAN, I don't bother compressing the client files with webpack or the like, although I do use rollup on lit-element and lit-html to sort out the import statements (they are not relative or absolute) that they have embedded in them.
I've just been reading that I should really have been doing an npm ci for my node dependencies, but the instructions for that say it blows away the node_modules directory and starts again (whereas npm install doesn't). Since this is all running on a Raspberry Pi, it's not instantaneous.
I am not sure a temporary loss of node_modules should affect a running app too much - after all, I believe the modules will all have been cached in memory - but they might not have been, and there is also the possibility that one of the servers falls over and PM2 restarts it, so I am wondering ....
So what is best practice here? Is it possible, for instance, to copy package.json and package-lock.json to a special build subdirectory, build the node_modules directory there, and then move it back into place once built? Or is there a better way?
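A minimal sketch of that build-aside-and-swap idea, assuming PM2 is managing the processes (all paths and names are illustrative, not a recommendation from the thread):

mkdir -p /tmp/app-build
cp package.json package-lock.json /tmp/app-build/
(cd /tmp/app-build && npm ci)        # clean install away from the running app
mv node_modules node_modules.old     # keep the old tree for rollback
mv /tmp/app-build/node_modules .
pm2 reload all                       # restart workers against the new tree
rm -rf node_modules.old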

Best practice for nodejs deployment - Directly moving node_modules to server or run npm install command

What is the best practice for deploying a nodejs application?
1) Directly moving the node_modules folder from the development server to the production server, so that the same local environment is recreated in production. Whatever changes are made to any of the node modules upstream will then not affect our code.
2) Running the npm install command on the production server with the help of package.json. The problem here is that any changes in the node modules can affect our code. I have faced some issues with the loopback module (issue link).
Can anyone help me?
Running npm install on the production server cannot be done in certain scenarios (lack of compile tools, restricted internet access, etc.), and if you have to deploy the same project on multiple machines it can also be a waste of CPU, memory and bandwidth.
You should run npm install --production on a machine with the same libraries and Node version as the production server, compress node_modules and deploy it to the production server. You should also keep the package-lock.json file to pin versions.
This approach also allows you to build/test your code using development packages and then prune node_modules before the actual deploy.
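A minimal sketch of that flow on a build machine matching production's OS, architecture and Node version (host names and paths are illustrative; npm prune --production is the older spelling of --omit=dev):

npm ci                                    # full install for building and testing
npm test
npm prune --production                    # drop devDependencies before packaging
tar czf node_modules.tar.gz node_modules
scp node_modules.tar.gz package.json package-lock.json deploy@prod-host:/srv/app/
ssh deploy@prod-host 'cd /srv/app && tar xzf node_modules.tar.gz'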
Moving the node_modules folder is overkill.
Running npm install might break the version dependencies.
The best approach is npm ci. It uses the package-lock.json file and installs the required dependencies without modifying the versions.
npm ci is meant for continuous integration workflows; see the npm ci documentation.
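For example, a deploy step on a machine that does have registry access might look like this (server.js is an illustrative entry point; --omit=dev is the newer spelling of --production):

npm ci --omit=dev                         # reproducible install strictly from package-lock.json
NODE_ENV=production node server.js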
I am an ASP.NET Core developer, but I recently started working with Node.js apps. For me, moving the node_modules folder to production was one of the challenges you mentioned. Instead of moving the whole folder to production or only running npm install on the production server, I tried bundling my Node.js app with Webpack into a single bundle (or a few bundles), and I got rid of the mess of managing the node_modules folder. Webpack only picks up the node_modules packages that are actually used/referenced in my app and bundles them into a single file along with my app code, and I deploy that single file to production without moving the entire node_modules folder.
I found this approach useful in my case, but please let me know if this is not the correct way with regard to the performance of the app, or if this approach has any other cons.
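A minimal sketch of such a server-side bundle, assuming a webpack.config.js with target: 'node', mode: 'production' and the server entry point (file names are illustrative):

npm install --save-dev webpack webpack-cli
npx webpack                               # emits e.g. dist/server.bundle.js
node dist/server.bundle.js                # ship and run just this file

Note that native add-ons and some dynamic require() patterns cannot be bundled this way and still have to be installed on the target.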
Definitely npm install. But you shouldn't do this by hand when it comes to deploying your app.
Use a tool made for this, like PM2.
As for your concern about changes in packages, the short answer is package-lock.json.
My guess is that by asking this question you don't really understand the point of the package.json file.
The package.json file is explicitly intended for this purpose (that, and uploading to the npm registry), the transfer of a node package without having to transfer the sizeable number of dependencies along with it.
I would go as far as to say that one should never manually move the node_modules directory at all.
Definitely use the npm install command on your production server, this is the proper way of doing it. To avoid any changes to the node_modules directory as compared to your local environment, use the package lock file. That should help with minimising changes to the source code in node_modules.
I mean no bad intent by saying this

Easy install of a Node.js package without publishing

I am looking for a way to deploy a node js app to multiple machines locally.
Is there some way to create a batch file, zip, or installer file that will put my Node.js application and all its dependencies (and ideally Node.js itself) onto multiple machines, by sending just one or a few files to install?
Also, is there some way to provide updates if the code is updated to all these machines?
Basically, I want to be able to install my Node.js package/application in multiple locations locally without having to publish my work to npm. Any ideas? I can't seem to find anything out there except putting Node.js on a web server or publishing to npm.
This is quite a broad question. Without using advanced tools, these two could work:
git pull origin master
npm install
or a solution with rsync
node js application and all its dependencies
Run an npm install where you're developing your application. Then, just tarball the whole thing, including the node_modules directory. When you deploy your tarball to another machine, be sure to run npm rebuild so that any binary dependencies are built for the platform you just deployed to. If you do your initial npm install on the same platform type, you can usually skip the rebuild step.
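A minimal sketch of that tarball flow (host names and paths are illustrative):

npm install --production
tar czf myapp.tar.gz --exclude=.git .
scp myapp.tar.gz user@target:/opt/
ssh user@target 'mkdir -p /opt/myapp && tar xzf /opt/myapp.tar.gz -C /opt/myapp && cd /opt/myapp && npm rebuild'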
Also, is there some way to provide updates if the code is updated to all these machines?
There are an infinite number of ways, and what you pick depends on your needs. You could check your whole project, including node_modules, into version control and just have a Bash script regularly pull from a branch and bounce things as necessary for your specific needs. Beware, though, that node_modules tends to be huge... it's usually left out of version control. Perhaps stick to keeping the tarball on a server and pull that as necessary.
and possibly get node js too
Keep that separate. You don't need to deploy Node.js every time you deploy your application.

NPM errors and control in Azure Websites

I want to build my Node.js application on an Azure Website.
It uses a number of npm packages declared in my package.json file.
My problem is that I often receive error messages which are related to missing NPM files.
Normally I put my files on the server via FTP, or edit them directly there with the Visual Studio 2015 Azure plugin. This may be the reason why npm isn't triggered as Microsoft intended.
I would prefer a way in which I can just run commands with elevated privileges to have full control over NPM by myself.
Which ways are possible to avoid these problems?
If you're publishing your Node.js application 'manually' via FTP, there are a few concerns about that.
First of all, 'manually' means manually.
Git
If you use continuous deployment via Git, the final deployment step is to call npm install in your current application folder; this installs all the packages listed in the package.json file.
The node_modules folder is excluded by default in the .gitignore file, so all packages are downloaded by the server.
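For example, the generated .gitignore typically contains a line like the one below, so the dependency tree never travels through Git:

node_modules/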
Web deployment
If you're using web deployment from Visual Studio or the command line, all the files contained in your solution are copied to the hosting environment, including the node_modules folder; because of this the deployment can take a long time to finish, due to the huge number of dependencies and files that the folder contains.
Even worse: this scenario could lead you to the same situation you're facing right now.
FTP deployment
You're copying everything yourself, so the same thing that happens with web deployment happens with the FTP deployment method.
--
The thing is, when you copy all those node_modules folder contents you're assuming that those dependencies remain the same in the target environment. In most cases that's true, but not always.
Some dependencies are platform dependent, so maybe in your dev environment a dependency works fine on an x86 architecture, but what if your target machine or website (or some mix of them) is x64? (A real case; I have already suffered it.)
Other related issues can happen too. Maybe your direct dependencies don't have the problem, but the dependencies linked to them do.
So it is always strongly recommended to run npm install in your target environment and avoid copying the dependencies directly from your dev environment.
That means you copy the folder structure to your target environment excluding the node_modules folder, and once the files are copied you run npm install on the server.
To achieve that you could go to
yoursitename.scm.azurewebsites.net
There you can go to the "Debug Console" tab, then go to the directory D:\home\site\wwwroot> and run
npm install
After that, the packages and dependencies are downloaded for the server/website architecture.
Hope this helps.
Azure tweaks the Kudu output settings; in local Kudu installations the output appears to be normalized.
A workaround (not perfect) could be this:
npm install --dd
Or, for even more detail:
npm install --ddd
The most relevant answer from Microsoft itself is this:
Using Node.js Modules with Azure applications
Regarding control via a console with elevated privileges, there is the option of using the Kudu console. But the error output is quite weird; it feels like putting commands into the console blindly, without much feedback.
Maybe this is a way to go, but I haven't tried it yet.
Regarding deployment, it looks like Azure wants you to prefer continuous deployment.
The suggested way is described here.

How to package & deploy Node.js + express web application?

I am new to Node.js programming and I have recently created a sample working web application using express, backbone and other complementary view technologies, with MongoDB. Now I am at a point where I want to deploy it to a staging environment, and I am not sure how to package this application and distribute it. [I can take care of MongoDB and set it up separately.]
I am from the Java world, where we create JARs for reusable libs and WAR/EAR packages for web applications, which are deployed in a servlet container. In this case, since Node.js itself acts as the web container as well, how do I package my web app?
Is there any standard format/guidelines for packaging node web apps built using express? (Is there a similar jar/war packaging system for node apps?)
How do I deploy it once packaged? Would it become an exe, since it is also its own container?
PS: As of now I am thinking of just manually copying all the required source files to the staging environment, running npm commands to download all dependencies on that machine, and then using 'forever' or some other mechanism to run my server.js. (Also adding some sort of monitoring, just in case the app crashes and forever fails.) I am not sure if that is the right way; I am sure there must be some standardized way of addressing this problem.
Deploying Node.js applications is very easy. In Maven there is pom.xml; the related concept in Node.js is package.json. You state your dependencies in package.json. You can also do environment-specific setup in package.json. For example, in a dev environment you can say:
I want to run unit tests.
but in production;
I want to skip unit tests.
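A hedged sketch of how that can look in the scripts section of package.json (script names and commands are illustrative):

{
  "scripts": {
    "test": "mocha",
    "start": "node app.js",
    "start:prod": "NODE_ENV=production node app.js"
  }
}

In development you run npm test before npm start; on the production box you simply never invoke the test script, and npm install --production skips devDependencies such as mocha.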
You have local repositories for Maven under the .m2 folder. In Node.js, there is a node_modules folder under your Node.js project, where you can see each module in a folder of its own.
Let's come to the Grunt part of this answer. Grunt is a task runner for your frontend assets: HTML, JavaScript, CSS. For example, before deployment you can minify HTML, CSS, JavaScript and even images. You can also put Grunt task invocations in package.json.
If you want to look at a sample application, you can find an example blog application here. Check folder structure and package.json for reference.
For deployment, I suggest Heroku for startup applications. You can find a howto here. This is a simple git-based deployment.
For running the project, simply set your environment with NODE_ENV=development and run node app.js. Here app.js is in your project.
Here are the related concepts for Java and Node.js:
maven clean install => npm install
.m2 folder => node_modules(Under project folder)
mvn test => npm test(test section on package.json)
junit, powermock, ... => mocha, node-unit, ...
Spring MVC => Express.JS
pom.xml => package.json
import package => require('module_name')
There is no standardized way, but you're on the right track. If your package.json is up to date and well kept, you can just copy/zip/clone your app directory to the production system, excluding the node_modules.
On your production system, run
npm install to install your dependencies, npm test if you have tests and finally NODE_ENV=production node server.js
Some recent slides that I found quite helpful, and that also cover wrappers like forever, can be found here.
Hoping this might be helpful for somebody looking for a solution: packaging of Node.js apps can be done using the npm pack command. It creates a tarball (.tgz) of your application which can then be installed and run in a production/staging environment.
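A minimal sketch of that flow (the tarball name depends on the name and version fields in package.json; the one below is illustrative):

npm pack                                  # produces e.g. myapp-1.0.0.tgz
scp myapp-1.0.0.tgz user@staging:/tmp/
ssh user@staging 'npm install -g /tmp/myapp-1.0.0.tgz'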
Is there any standard format/guidelines of packaging node webapps built using express? (Is there a similar jar/war packaging systems for node apps?)
Yes, the CommonJS Packages specification:
This specification describes the CommonJS package format for distributing CommonJS programs and libraries. A CommonJS package is a cohesive wrapping of a collection of modules, code and other assets into a single form. It provides the basis for convenient delivery, installation and management of CommonJS components.
For your next question:
2. How do I deploy it once packaged? Would it become an exe, since it is also its own container?
I second Hüseyin's suggestion to deploy on Heroku for production. For development and staging I use Node-Appliance with VirtualBox and Amazon EC2, respectively:
This program takes a Debian machine built by build-debian-cloud or Debian-VirtualBox-Appliance and turns it into a Node.js "appliance", capable of running a Node application deployed via git.
Your webapp will not become an exe.
A few ways to approach this:
Push your code into a Git repository, excluding everything that isn't your code (node_modules/**), then pull it in your staging environment and run npm install to restore all dependencies.
Create an npm package out of it and install it via npm in your staging environment (this should also take care of all of the dependencies).
Manually copy/ssh the files to your staging environment (this can be automated with Grunt), then restore your dependencies via npm; see the sketch below.
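A minimal sketch of the copy-then-install variant using rsync over SSH (host, user and path are illustrative; use npm install --production instead of npm ci --omit=dev on older npm versions):

rsync -az --delete --exclude=node_modules --exclude=.git ./ deploy@staging:/srv/myapp/
ssh deploy@staging 'cd /srv/myapp && npm ci --omit=dev'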
I used Zeit's pkg module. It can create cross-platform deliverables for Linux/Windows/macOS. I have actually used it in production and it works fine without any issues.
It takes in all the JS scripts and packages them into a single file.
The reason I used it is that it helps in securing your source code: in production, at the customer's environment, they will have access to the application but not to the source code.
Also, one of the advantages is that in the production environment you do not actually need to have the customer install Node.js, as the Node binaries also get packaged inside the build.
https://www.npmjs.com/package/pkg
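A minimal sketch of producing such binaries (entry file, targets and output name are illustrative; pkg can also read the entry point from the bin field in package.json):

npm install -g pkg
pkg server.js --targets node18-linux-x64,node18-win-x64,node18-macos-x64 --output myapp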
