Easy install of a Node.js package without publishing - node.js

I am looking for a way to deploy a Node.js app to multiple machines locally.
Is there some way to create a batch file, zip, or installer file that will put my Node.js application and all its dependencies (and possibly Node.js itself) onto multiple machines easily, by sending one or more files to install?
Also, is there some way to push updates to all these machines when the code changes?
Basically, I want to be able to install my Node.js package/application in multiple locations locally without having to publish my work to npm. Any ideas? I can't seem to find anything out there except for putting Node.js on a web server or publishing to npm.

This is quite a broad question. Without using advanced tools, these two could work:
git pull origin master
npm install
or a solution with rsync
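For example, a minimal rsync-based deploy could look like this; the host, paths and the use of PM2 at the end are placeholders/assumptions, not part of the question:
# copy the project to the target machine, skipping node_modules
rsync -az --delete --exclude node_modules ./ deploy@target-host:/srv/myapp/
# then, on the target machine, install dependencies and restart your process manager of choice
ssh deploy@target-host 'cd /srv/myapp && npm install --production && pm2 restart myapp'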

my Node.js application and all its dependencies
Run an npm install where you're developing your application. Then, just tarball the whole thing, including the node_modules directory. When you deploy your tarball to another machine, be sure to run npm rebuild so that any binary dependencies are built for the platform you just deployed to. If you do your initial npm install on the same platform type, you can usually skip the rebuild step.
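A rough sketch of that flow (directory names, the target path and the entry file are assumptions):
# on the build machine
cd myapp
npm install --production          # install dependencies, skipping devDependencies
cd .. && tar -czf myapp.tar.gz myapp   # tarball the app including node_modules

# on the target machine
tar -xzf myapp.tar.gz -C /srv
cd /srv/myapp
npm rebuild                       # recompile any native addons for this platform
node server.js                    # entry point name is an assumption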
Also, is there some way to push updates to all these machines when the code changes?
There are an infinite number of ways, and what you pick depends on your needs. You could check-in your whole project including node_modules to version control and just have a Bash script regularly pull from a branch and bounce things as necessary for your specific needs. Beware though that node_modules tends to be huge... it's usually left out of version control. Perhaps stick to the tarball on a server and pull that as necessary.
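For example, a small update script run from cron or by hand could look like this; the branch, paths and the use of PM2 are assumptions:
#!/usr/bin/env bash
set -e
cd /srv/myapp                               # hypothetical install location
git pull origin master                      # or fetch and unpack a tarball from your server instead
npm install --production                    # refresh dependencies if package.json changed
npm rebuild                                 # rebuild native addons just in case
pm2 reload myapp || pm2 start server.js --name myapp   # bounce the process (PM2 is just one option)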
and possibly Node.js itself
Keep that separate. You don't need to deploy Node.js every time you deploy your application.

Related

How do you deploy a continuously running app when using npm-cli

I have a largish Node.js based web app, with both server and client components working together. I am currently deploying the app by using git pull to take my latest production branch from the server repository. A git post-commit hook runs to do an npm install and a rebuild of the server's .env file, and PM2 is monitoring the various processes (3 web servers), using a change in the .env file to restart them.
node_modules is at the highest level of the project, with separate server and client subdirectories. Since this is using HTTP/2 on a fast LAN, I don't bother compressing the client files with webpack or the like, although I do use rollup on lit-element and lit-html to sort out the import statements (they are not relative or absolute) that they have embedded in them.
I've just been reading that I should really have been doing an npm ci for my node dependencies, but reading the instructions for that, it says it blows away the node_modules directory and starts again (whereas npm install doesn't). Since this is all running on a Raspberry Pi, it's not instantaneous.
I am not sure a temporary loss of node_modules should affect a running app too much - after all, I believe the modules will all have been cached in memory - but they might not have been, and there's also a possibility that one of the servers falls over and PM2 restarts it, so I am wondering ....
So what is best practice here? Is it possible, for instance, to copy package.json and package-lock.json to a special build subdirectory, build the node_modules directory there, and then, once built, move it back into place? Or is there a better way?
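For what it's worth, here is one way the "build in a scratch directory, then swap" idea could be scripted; the paths and the PM2 usage are assumptions, and npm ci only needs package.json and package-lock.json present in the build directory:
#!/usr/bin/env bash
set -e
APP=/home/pi/myapp                      # hypothetical app location
BUILD="$APP/.build"
mkdir -p "$BUILD"
cp "$APP/package.json" "$APP/package-lock.json" "$BUILD/"
(cd "$BUILD" && npm ci --production)    # clean install in the scratch directory
# swap the freshly built tree into place as quickly as possible
rm -rf "$APP/node_modules.old"
if [ -d "$APP/node_modules" ]; then mv "$APP/node_modules" "$APP/node_modules.old"; fi
mv "$BUILD/node_modules" "$APP/node_modules"
pm2 reload all                          # or reload each of the three web servers individually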

Best practice for Node.js deployment - Directly moving node_modules to the server or running the npm install command

What is the best practice for deploying a nodejs application?
1) Directly moving the node_modules folder from the development server to the production server, so that our same local environment is recreated in production. Whatever changes are made to any of the node modules remotely will not affect our code.
2) Running the npm install command on the production server with the help of package.json. The problem here is that any changes in the node modules will affect our code. I have faced some issues with the loopback module (issue link).
Can anyone help me?
Running npm install on the production server cannot be done in certain scenarios (lack of compiling tools, restricted internet access, etc.), and if you have to deploy the same project on multiple machines, it can also be a waste of CPU, memory and bandwidth.
You should run npm install --production on a machine with the same libraries and Node version as the production server, compress node_modules and deploy it to the production server. You should also keep the package-lock.json file to pin versions.
This approach also allows you to build/test your code using development packages and then prune node_modules before the actual deploy.
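A minimal sketch of that workflow (the archive name and deploy path are placeholders, and npm test is just a stand-in for your build/test steps):
# on a build machine matching the production platform
npm install                       # full install, including devDependencies, for building/testing
npm test                          # run your build/test steps here (npm test is just a stand-in)
npm prune --production            # drop devDependencies before packaging
tar -czf node_modules.tar.gz node_modules
# copy the archive (plus your source and package-lock.json) to the production server, then there:
tar -xzf node_modules.tar.gz -C /var/www/myapp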
Moving the node_modules folder is overkill.
Running npm install might break version dependencies.
The best approach is npm ci. It uses the package-lock.json file and installs the required dependencies without modifying the versions.
npm ci is meant for continuous integration projects. LINK
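For reference, the basic usage is simply this (it requires an up-to-date package-lock.json in the repository):
npm ci                  # removes node_modules and installs exactly what package-lock.json specifies
npm ci --production     # same, but without devDependencies (newer npm versions use --omit=dev)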
I am an ASP.NET Core developer, but I recently started working with Node.js apps. For me, moving the node_modules folder to production was one of the challenges you mentioned. Instead of moving the whole folder to production or only running the npm install command on the production server, I figured out and tried a way of bundling my Node.js app using Webpack into a single bundle (or multiple bundles), and I just got rid of the mess of managing the node_modules folder. It only picks up the node_modules packages that are actually used/referenced in my app and bundles them up in a single file along with my app code, and I deploy that single file to production without moving the entire node_modules folder.
I found this approach useful in my case, but please let me know if this is not the correct way with regard to the performance of the app, or if this approach has any cons.
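As an illustration only, a single-file server bundle can be produced with the webpack CLI roughly like this; the entry path is a made-up example, and a real setup would normally use a webpack.config.js with externals configured as needed:
npm install --save-dev webpack webpack-cli
npx webpack --mode production --target node --entry ./src/server.js --output-path ./dist
# deploy only ./dist (plus static assets); node_modules stays on the build machine
node ./dist/main.js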
Definitely npm install. But you shouldn't do this by hand when it comes to deploying your app.
Use a tool for this, like PM2.
As for your concern about changes in packages, the short answer is package-lock.json.
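For example, a typical flow with the PM2 mentioned above looks like this (the process name and entry file are placeholders):
npm install -g pm2
pm2 start server.js --name my-app    # start and monitor the app
pm2 reload my-app                    # zero-downtime restart after deploying new code
pm2 save                             # persist the process list across reboots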
My guess is that by asking this question you don't really understand the point of the package.json file.
The package.json file is explicitly intended for this purpose (that, and uploading to the npm registry): the transfer of a Node package without having to transfer the sizeable number of dependencies along with it.
I would go as far as to say that one should never manually move the node_modules directory at all.
Definitely use the npm install command on your production server; this is the proper way of doing it. To avoid any changes to the node_modules directory as compared to your local environment, use the package-lock file. That should help with minimising changes to the source code in node_modules.
I mean no bad intent by saying this

NPM errors and control in Azure Websites

I want to build my Node.js application in an Azure Website.
I use a number of different npm packages via my package.json file.
My problem is that I often receive error messages related to missing npm packages.
Normally I upload my files via FTP, or edit them directly on the server via the VS Studio 15 Azure plugin. This may be the reason why npm isn't triggered as Microsoft intended.
I would prefer a way in which I can just run commands with elevated privileges to have full control over NPM by myself.
Which ways are possible to avoid these problems?
If you're publishing your Node.js application 'manually' via FTP, there are a few concerns about that.
First of all, 'manually' means manually.
Git
If you use continuous deployment via Git, the final deployment step is to call npm install in your current application folder; this installs all the packages listed in the package.json file.
The node_modules folder is excluded by default in the .gitignore file, so all packages are downloaded by the server.
Web deployment
If you're using web deployment from Visual Studio or the command line, all the files contained in your solution are copied to the hosting environment, including the node_modules folder. Because of this, the deployment can take a long time to finish due to the huge number of dependencies and files that the folder contains.
Even worse: this could land you in the same scenario you're facing right now.
FTP deployment
You're copying everything yourself, so the same thing that occurs with web deployment happens with the FTP deployment method.
--
The thing is that when you copy all of that node_modules folder content, you're assuming that those dependencies remain the same in the target environment. In most cases that's true, but not always.
Some dependencies are platform dependent, so maybe in your dev environment a dependency works fine on x86 architectures, but what if your target machine or website (or some mix of them) is x64? (A real case; I have already suffered it.)
Other related issues can happen. Maybe your direct dependencies don't have the problem, but the dependencies linked to them could.
So it is always strongly recommended to run npm install in your target environment, and to avoid copying the dependencies directly from your dev environment.
That way, you copy the folder structure to your target environment excluding the node_modules folder, and then, once the files are copied, you run npm install on the server.
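For instance, one way to ship the project without node_modules is to archive it locally while excluding that folder, upload the archive, and then install on the server (the archive name is a placeholder):
# on your dev machine: archive everything except node_modules (and .git)
zip -r app.zip . -x "node_modules/*" ".git/*"
# upload and extract app.zip into the site folder (via FTP or the Kudu console), then in that folder:
npm install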
To run npm install on an Azure Website, you can go to
yoursitename.scm.azurewebsites.net
There you can go to the "Debug Console" tab, then go to the directory D:\home\site\wwwroot and run
npm install
After that, the packages and dependencies are downloaded and built for the server/website architecture.
Hope this helps.
Azure tweaks the Kudu output settings; in local Kudu implementations the output looks normalized.
A workaround (not perfect) could be this:
npm install --dd
Or even more detailed
npm install --ddd
The most closely related answer from Microsoft itself is this:
Using Node.js Modules with Azure applications
Regarding control via a console with elevated privileges, there is the option of using the Kudu console. But the error output is quite weird; it's kind of like blindly putting commands into the console without much feedback.
Maybe this is a way to go, but I haven't tried it yet.
Regarding deployment, it looks like Azure wants you to prefer continuous deployment.
The suggested way is this here.

What are the main differences between demeteorizer and meteor bundle?

Having gone through and used demeteorizer, I wonder: what are the main differences between setting up meteor vs demeteorizer and running it via node, on my own server?
meteor only
hot swappable code?
problems keeping packages similar between production and dev
same meteor versions running on prod and dev
hardcoded environment settings (i.e. mongo)
demeteorizer
platform independent, as this auto-bundles dependencies and uses pure Node.js
organise and maintain MongoDB how you like (backup scripts, etc.)
I have been using demeteorizer (packaging -> upload -> running via forever), but I wonder if there are any performance or other issues in the long run.
I have seen packages such as "authentication" running okay locally but very slow on the test server (hangs on submit, indicating sync problems?)
thanks in advance.
ref: https://twitter.com/SachaGreif/status/424908644590030848
Demeteorizer builds on top of meteor bundle with one small difference: Demeteorizer builds a package.json for you and deletes the node_modules directories.
Without demeteorizer you would have a bit of trouble deploying your app, particularly if it was on a different platform to the one you built your app on.
If you look at Meteor's own docs, you have to remove fibers and manage your npm modules yourself, manually. With a package.json you can run npm install and have them all installed for you, including the ones from packages.
Why is this useful? For services like Modulus, it means you can upload an app and have it install all your dependencies for you without you having to think about it, as if it were an ordinary Node.js app.
Everything that applies to meteor bundle also applies to demeteorizer, as it is still the same Meteor-bundled app, just with the package.json. So you can use forever, hard-coded/environment-based settings, etc. the same way.
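The rough workflow looks like this; the output directory name and the environment variables follow common Meteor bundle conventions, but treat the exact paths as assumptions for your versions of Meteor and demeteorizer:
cd my-meteor-app
demeteorizer                      # converts the bundle into a plain Node app with a generated package.json
cd .demeteorized                  # output directory name is an assumption; check what the tool prints
npm install                       # installs everything listed in the generated package.json
MONGO_URL=mongodb://localhost:27017/myapp ROOT_URL=http://localhost:3000 PORT=3000 node main.js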

Understanding npm and Node.js install location for modules

I've been using Node.js and npm for a few weeks with great success and have started to question the best practice for installing local modules. I understand the Global vs Local argument; however, my question has more to do with where to place a local install. Let's say that I have a project located at ~/ProjectA/ which is version controlled and worked on by multiple developers. When initially playing with Node.js and npm I wasn't aware of the default local installation paths and simply installed the necessary modules in a default terminal, which resulted in an installation path of ~/node_modules. What this ended up doing is requiring all the other developers working on the project to install the modules on their own machines in order to run the application. Having seen where some of the developers ran npm install, I'm still really surprised that it worked on their machines at all (I guess it relates to how Node.js and require() look for modules), but needless to say, it worked.
Now that the project is getting past the "toying around" stage, I would like to setup the project folder correctly. So, my question is, should the modules be installed at ~/ProjectA/node_modules and therefore be part of the version controlled project files, or should it continue to be located at a developer-machine specific location...or does it not really matter at all?
I'm just looking for a little "best-practice" guidance on this one and what others do when setting up their projects.
I think that the "best practice" here is to keep the dependencies within the project folder.
Almost all Node projects I've seen so far (I've been a Node developer for about 8 months now) do that.
You don't need to version control the dependencies. Here's how I manage my Node projects:
Keep the versions locked in the package.json file, so everyone gets the same working version, or use the npm shrinkwrap command in your project root (see the sketch after this list).
Add the node_modules folder to your VCS ignore file (I use git, so mine is .gitignore)
Be happy, you're done!
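A minimal sketch of that setup (express is just an example dependency):
echo "node_modules/" >> .gitignore    # keep dependencies out of version control
npm install --save express            # --save records the version range in package.json
npm shrinkwrap                        # optional: lock the exact installed versions for the whole team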
