Running multiple Node.js applications

Is it possible to run, for example, 3 or more Node.js apps with one command via a shell script?
The idea is that I have a shell script, navigate into each app directory, and run the npm command.
The npm package concurrently is not an option.
#!/bin/sh
cd ./firstApp && npm start ...
cd ./secondApp && npm run dev ...
cd ./thirdApp && npm run dev ...
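A rough sketch of one way this could look, assuming each app can simply run in the background (directory names taken from above):
#!/bin/sh
# Run each app in its own subshell so the cd calls don't compound,
# and put them in the background so all three run at the same time.
(cd ./firstApp  && npm start)   &
(cd ./secondApp && npm run dev) &
(cd ./thirdApp  && npm run dev) &
wait  # keep the script alive until all apps exit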

I would also have suggested concurrently; you didn't give a reason for not using it.
Another possible solution might be Lerna
https://github.com/lerna/lerna
Lerna is a tool that optimizes the workflow around managing multi-package repositories with git and npm.
Lerna can also reduce the time and space requirements for numerous copies of packages in development and build environments - normally a downside of dividing a project into many separate NPM packages.
One of the things Lerna can do is run a script in all packages at the same time. Read the docs and decide if it fits your use case.
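For example, a Lerna-managed repo could start every package at once with something like this (a rough sketch; the script name dev is an assumption):
# from the repo root, run the "dev" script of every package in parallel
npx lerna run dev --parallel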

Related

Running npm build before/after npm install

I'm not familiar with npm so I might be holding the wrong end of the shovel here...
There is a package on npm that I would like to modify and use in my own project. The package is angular-crumbs. I forked the source repo (https://github.com/emilol/angular-crumbs) into my own account (https://github.com/capesean/angular-crumbs) and then ran npm install capesean/angular-crumbs --force. However, this produces a node_modules folder in my project that hasn't been built (and whatever else - as I understand it) with the commands in the source repo's package.json file:
"build": "npm run clean && npm run transpile && npm run package && npm run minify && npm run copy"
i.e. it doesn't have the types, the correct package.json file, etc.
So my question is, how do I get the properly-built files (including type definitions, etc.) from my own repo to install or build-after-installing in my target project?
I am not sure what you're trying to do: are you trying to work on the angular-crumbs source code, or are you trying to use it in your own project as a dependency?
Anyway, running npm install will install all your dependencies so that you can use them directly in your project; those dependencies don't need to be built after they are installed.
In your case you seem to have an Angular application (which is quite different from a plain Node.js project). Usually, to start an Angular app you run ng serve, which builds your source code and runs a dev server so you can access the app on localhost.
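One thing that might help here (a hedged sketch, not verified against this particular repo): npm runs a package's prepare script when it installs a dependency from git, so the fork's package.json could delegate the existing build to prepare so that the build runs at install time:
"scripts": {
  "build": "npm run clean && npm run transpile && npm run package && npm run minify && npm run copy",
  "prepare": "npm run build"
}
With that in place, npm install capesean/angular-crumbs should at least run the build inside node_modules at install time (assuming the devDependencies needed by the build install correctly).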

Yarn install with multiple git dependencies results in "EINVAL: invalid argument, mkdir ..." error

Node (v14.2.0), Yarn (1.22.4), Windows 10
Context: I have several node projects hosted in a private git repo. I have several cross dependencies between projects, e.g. project C depends on projects A and B, and project D may depend on C and A (perhaps this is my problem?). I generally have my package.json files set up to use the git repos directly, and it works reasonably well for the projects with one or two dependencies.
One of my larger projects has many dependencies on my other projects. Running yarn install on this project gives me this error consistently:
EINVAL: invalid argument, mkdir [some C:\\...Yarn\\Cache\\... directory]
The install ends with that error, and node_modules not being created.
I worked around the issue by removing all (nine) git dependencies from my package.json and then adding them one-by-one and running yarn install each time. No issues, no errors, and in the end I have a fully functioning node project. Great success!
The question here then, is why can I not install (run yarn install) everything at once. I have tried the tricks I found googling - clear the yarn cache, use npm install, run npm adduser or npm login, run as administrator... every combination of those actions resulted in the same EINVAL error.
My guess would be that yarn is trying to do "too many things at once" and it's resulting in filesystem errors (trying to mkdir a dir that is locked)... but why is this not documented, and more importantly, why is there not a way to tell yarn to install "one thing at a time"? If there is, and I missed it, I would love to know about it.
Cheers!
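For what it's worth, Yarn 1 does have flags that limit how much it does at once; I have not verified that they avoid this particular EINVAL on Windows, but they may be worth trying:
# limit concurrent network requests during install (Yarn 1.x)
yarn install --network-concurrency 1

# only let one Yarn process touch the shared cache at a time
yarn install --mutex network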

Is there a good way to run a script only when the user installs my npm package manually? (not installed as a dependency)

I'm developing an npm package and I want to run a specific script only when the user installs my package manually, e.g. with npm install my-package --save-dev or a similar npm command.
I don't want to run the command when the package is installed as a dependency of another package.
For example, my package is a dependency of another package, other-package.
Even if the user installs other-package manually and npm installs my-package as its dependency, I don't want the script to run.
Is there a good way to handle this?
npm has a set of scripts that will automatically run when npm is launched a particular way. The scripts you might be interested in are:
prepublish: Run BEFORE the package is published. (Also run on local npm install without any arguments.)
publish, postpublish: Run AFTER the package is published.
preinstall: Run BEFORE the package is installed
install, postinstall: Run AFTER the package is installed.
There is no event that exactly matches your criteria but you could use one of the install events and then have an intermediate script that detects the npm command line options before your actual script.
Due to the (imo horrible) way prepublish works, a number of people have written modules to do a similar task and these could easily be adapted to your requirements.
iarna/inpublish is a good example. It checks process.env['npm_config_argv'] for a match against /^i(n(s(t(a(ll?)?)?)?)?)?$/
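For illustration, here is a minimal sketch of such an intermediate check as a standalone Node script (the file name, the package-name constant, and the exact shape of npm_config_argv are assumptions; newer npm releases no longer set that variable):
// check-manual-install.js (hypothetical helper name, sketch only)
// Exit 0 only when the user ran something like "npm install my-package" directly.
const PKG_NAME = 'my-package'; // assumption: your package's name

const raw = process.env.npm_config_argv;
if (!raw) process.exit(1); // not launched by an npm version that sets this

let argv;
try {
  argv = JSON.parse(raw); // npm stores { original, cooked, remain } here
} catch (err) {
  process.exit(1);
}

const original = argv.original || [];
const isInstall = /^i(n(s(t(a(ll?)?)?)?)?)?$/.test(original[0] || '');
const namedExplicitly = original.some(arg => arg.startsWith(PKG_NAME));

process.exit(isInstall && namedExplicitly ? 0 : 1);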
Using the following package.json setup:
"scripts": {
"postinstall": "my-install && install-manual-tasks || not-my-install"
}
If my-install exits with process.exit(0) then install-manual-tasks will run; if it exits with process.exit(1), not-my-install will run instead so the npm task doesn't fail.
I think this setup actually has an issue: if your install-manual-tasks fails, the exit status is silenced and the npm task won't fail, but it's a start at least. You could work around this by doing all your checks inside the install-manual-tasks script, so you don't need to use the shell tricks to run multiple scripts.
You can take a look at https://superuser.com/a/105389/627275 to see how to create a shell function that acts as an alias to a command. This way, npm install would act as an alias to whatever you want to run in reality. It could also be an alias to bash something.sh & npm install.
Hope that answers your question!

Speeding up the npm install

I am trying to speed up the npm install during the build process phase. My package.json lists the packages with, for the most part, locked revisions. I've also configured the cache directory using the command
npm config set cache /var/tmp/npm-cache --global
However, on trying to install using npm install -g --cache, I find that this step isn't reducing the time to install by just loading the packages from cache as I would expect. In fact, I doubt if it's even using the local cache to look up packages first.
Proposing two more modern approaches:
1) npm ci
Use npm ci, which is available from npm version 5.7.0 (although I recommend 5.7.1 and upwards because of the broken release) - this requires package-lock.json to be present and it skips building your dependency tree off of your package.json file, respecting the already resolved dependency URLs in your lock file.
A very quick boost for your CI/CD envs (our build time was cut down to a quarter of the original!) and/or a way to make sure all your developers sit on the same versions of dependencies during development (without having to hard-code strict versions in your package.json file).
Note however that npm ci removes the node_modules/ directory before installing, so it won't benefit from any caching strategies.
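For reference, a minimal CI install step with this approach (assuming package-lock.json is committed; the extra flags are the same ones suggested further down):
npm ci --no-audit --progress=false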
2) npm i --prefer-offline
Use the --prefer-offline flag with your regular npm install / npm i. With this approach, you need to make sure you've cached your node_modules/ directory between builds (in a CI/CD environment). If it fails to find packages locally with the specific version, it falls back to the network safely.
You can also add --no-audit --progress=false to reduce pre-install checks and remove the progress bar (the latter is only a very slight improvement).
For a pure npm solution, you may try
npm install --prefer-offline --no-audit --progress=false
--prefer-offline may not be useful for the first run.
As suggested by @Daniel Serodio:
You could also include your node_modules folder inside your repository, but you should probably zip it first before adding it to the repo; when installing, you can unzip it and just run
npm rebuild
(which works cross-platform), which is quite fast.
This would also give you the benefit of full control over all your dependencies.
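A rough sketch of that workflow (the archive name and zip/unzip tooling are illustrative assumptions):
# when updating dependencies: snapshot node_modules as a single archive and commit it
zip -rq node_modules.zip node_modules
git add node_modules.zip

# on the target machine: unpack and rebuild any native modules for that platform
unzip -q node_modules.zip
npm rebuild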
Also you can set the progress flag to false to increase your speed by 2x.
npm set progress=false
Read the source for more info.
Update:
You can also use pnpm for this
npm i -g pnpm
This basically uses locally cached modules (I have heard it's better than Yarn).
It's better to install the pnpm package globally using the following command:
npm i -g pnpm
pnpm uses hard links and symlinks to save one version of a module only ever once on a disk. When using npm or Yarn for example, if you have 100 projects using the same version of lodash, you will have 100 copies of lodash on disk. With pnpm, lodash will be saved in a single place on the disk and a hard link will put it into the node_modules where it should be installed.
For example, whenever you want to install the dependencies from a package.json file, simply run pnpm i and it handles the rest by itself.
UPDATE: The original answer is from 2014. I wouldn't recommend checking in node_modules, as there are definitely better options for speeding up the install, especially for a CI pipeline, e.g. npm ci --only=production
You could also include your node_modules folder inside your repository (you are probably using git) and just run npm rebuild (which works cross-platform) in your build/deploy process; it is pretty fast.
This would also give you the benefit of full control over all your dependencies (I know that's what shrinkwrap is usually meant for).
Edit:
Also you can set the progress flag to false to increase your speed by at least 20%. This only works with npm@v3.x.x, and there will hopefully be fixes for it soon (see the second link below).
npm set progress=false
Tweet about the finding
GitHub issue identifying the cause
As a very modern solution you can start to use Docker.
Docker allows you to virtualize and pre-define, as an image, the current state of your code, including installed npm modules and other goodies.
Once the Docker image for your infrastructure/env is built locally, or retrieved from a remote repository, it is stored on the host machine, and you can spin up a server in seconds.
Another benefit is that you use the same virtualized code infrastructure on any machine where you deploy your code.
Docker speeds up install/deployment processes and is a widely used technology.
To start using Docker it is enough to do the following (all the snippets are just mock/example pre-setup and are by no means the most robust/elegant solution):
1) Install docker and docker-compose using the manuals and get some basic understanding of them at docker.com
2) Write a Dockerfile in the root of your application:
FROM node:6.9.5
RUN mkdir /usr/local/app
WORKDIR /usr/local/app
COPY package.json package.json
RUN npm install
3) Create docker-compose.yml in the root of your project with content like this:
version: "2"
services:
  server:
    hostname: server
    container_name: server
    image: server
    build: .
    command: sh -c 'NODE_ENV=development PORT=8080 node app.js'
    ports:
      - "8080:8080"
    volumes: # list of folders and files to use
      - ${PWD}/server:/usr/local/app/server
      - ${PWD}/app.js:/usr/local/app/app.js
4) To start the server, run docker-compose up -d. To see the logs, run docker-compose logs -f server. If you restart your server, it will come back up in seconds once the image has been built.
Docker caches build layers locally, so the next run will take only a few seconds.
I know this might be a bit of a heavyweight solution, but I am sure it has the most potential/flexibility and is widely used in industry. And while it requires some learning for anyone who has not used Docker before, in my humble opinion it is the best one for your problem.
Nothing helped me more than disabling antivirus (Windows Defender in my case); I went from 2:30 to 1 minute.
With the npm-cache package I got down to ~30 secs.
I tried to use yarn, which is very fast, but it was randomly failing in my case.
We have been trying to solve this problem to speed up our deployments.
We have settled on using pac, which follows the principles in the other answers. It zips the npm modules and includes them in your repo so you don't have a million files in your commits and code reviews, and you can just unzip/rebuild for the target machine.
https://www.npmjs.com/package/pac

Jenkins script quitting prematurely when using npm install on Windows

In my Jenkins job I want to build a JavaScript app using Grunt. The Jenkins build script creates a build directory (if it doesn't already exist), changes to that directory and runs:
npm install grunt
npm install grunt-zip
grunt --gruntfile=[something]
(Of course grunt-cli is installed globally.) When I build the job, the first statement causes Grunt and dependencies to be pulled down as expected. However, the job then terminates successfully:
Archiving artifacts
No emails were triggered.
Finished: SUCCESS
The second npm install is not run. Any idea why the script is terminating after running npm install instead of continuing to the subsequent statements?
So it turns out that npm is a batch file, not an executable, so it needs to be invoked using call from the Jenkins script:
call npm install grunt
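So the full build step would presumably become (adding call before grunt is my assumption, since a globally installed grunt is also a .cmd shim on Windows):
call npm install grunt
call npm install grunt-zip
call grunt --gruntfile=[something]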
I would recommend not using the local grunt / Node.js install, but instead getting Jenkins to do this for you!
It's much easier and means there's less coupling to system-specific installs and variables.
Steps:
a) Use the NodeJS Jenkins plugin and get it to install Node.js on the machine / grunt-cli -> Jenkins integration with Grunt
b) Populate your package.json with any Node.js dependencies required, e.g. grunt / grunt-zip etc. (see the fragment below)
c) When running Grunt, just do an "npm update" before the "grunt" command
That way you're not doing an explicit npm install; it's all configured from your package.json, your build scripts will be less brittle, and your developers can use the same steps as the build server, e.g. "npm update; grunt" locally, the same as on the build server.
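For step b), the package.json fragment might look something like this (the version ranges are placeholders, not taken from the question):
"devDependencies": {
  "grunt": "^1.0.0",
  "grunt-zip": "^0.2.0"
}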
For future googlers:
use command chaining for this.
This works:
npm install && npm install grunt-zip
This won't work:
npm install
npm install grunt-zip
