package.json scripts that work with npm and yarn? - node.js

I am using npm as a build tool, so in my package.json some of my scripts depend on other scripts:
{
  "test": "npm run lint && mocha"
}
This hardcodes the npm package manager into package.json. How can I make this approach to expressing dependencies work with both npm and yarn?

The $npm_execpath environment variable holds the path of the package manager that is currently running the script, so just replace npm with $npm_execpath:
{
  "test": "$npm_execpath run lint && mocha"
}
Both npm test and yarn test will work, and will use the appropriate build tool.
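For example, assuming the script above is in place, both invocations run the same chain through the matching tool:
npm test    # $npm_execpath expands to the npm CLI, so lint runs via npm
yarn test   # $npm_execpath expands to the yarn CLI, so lint runs via yarn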

While mjs' answer is great, there's also a small package that is purported to work on all environments including Windows: https://www.npmjs.com/package/yarpm
To use it in a project, run yarn add yarpm --dev (or npm i -D yarpm) and then just use yarpm in your scripts like this:
{
  "test": "yarpm run lint && mocha"
}
As the package README notes, you just need to make sure your commands would be suitable for passing through to either yarn or npm: you cannot use arguments/flags that only work on one package manager.
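For example (my own illustration, not taken from the yarpm README), script-running subcommands pass through cleanly, while manager-specific flags do not:
yarpm run lint                  # fine: "run <script>" behaves the same under npm and yarn
yarpm install --save-dev mocha  # likely breaks under yarn, which expects "add --dev" instead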

npm install everything except named package

An extract from my Dockerfile which builds my Angular app:
# Do other stuff
RUN npm install
# next line executes ng build --prod
RUN npm run build:prod
The npm install is for build purposes. Some of the packages in devDependencies - particularly cypress - take ages to install and are not needed for the build. However, some packages in devDependencies are needed.
Can I, for example, do an npm install everything except cypress?
npm install --only=prod is not an option because some devDependencies are needed.
Thanks in advance for your thoughts.
Just a suggestion: use install-subset, which can be installed globally with npm install -g install-subset
To use it, you build inclusion lists and exclusion lists for named installation subsets in your package.json like this:
"subsets": {
  "build": {
    "include": [
      "babel-cli",
      "dotenv"
    ]
  },
  "test": {
    "exclude": [
      "eslint",
      "lint-rules",
      "prettier"
    ]
  }
}
Then run install-subset test
This temporarily rewrites your package.json so the excluded packages are not installed, then restores it (very similar to how lerna operates), which depending on the packages can save a lot of time and bandwidth.
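To make the mechanism concrete, here is a rough sketch of that rewrite-install-restore cycle (an assumption about the approach, not install-subset's actual source; the excluded names are the ones from the "test" subset above, and the file name is hypothetical):
// subset-install.js
const fs = require('fs');
const { execSync } = require('child_process');

// Keep a pristine copy so package.json can be restored afterwards
const original = fs.readFileSync('package.json', 'utf8');
const pkg = JSON.parse(original);

// Drop the excluded devDependencies before installing
for (const name of ['eslint', 'lint-rules', 'prettier']) {
  delete pkg.devDependencies[name];
}
fs.writeFileSync('package.json', JSON.stringify(pkg, null, 2));

try {
  execSync('npm install', { stdio: 'inherit' });
} finally {
  fs.writeFileSync('package.json', original); // restore no matter what
}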

Passing arguments to combined npm script

I have the following in my package.json:
"scripts": {
  ...
  "prod": "gulp build --production && webpack --env.config=production"
}
I now want to pass a parameter "theme" to both gulp and webpack to be able to adjust the output of the build process from the command line.
I figured out how to pass it to webpack: npm run prod -- --env.theme=themename, but gulp does not pick it up. I also played around with the yargs package, process.argv and bash string substitution by changing the npm script to "gulp build --production \"$1\" && webpack --env.config=production", but that did not work out either.
How can this be achieved? What am I missing? Any hints highly appreciated!
If you're using Bash you can use a function in your npm-script.
For instance:
"scripts": {
  ...
  "prod": "func() { gulp build --production \"$1\" && webpack --env.config=production \"$1\"; }; func"
}
However, for a cross-platform solution you'll need to consider invoking a Node.js script which execs the commands, in a similar way to Solution 2 of my answer here.
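A minimal sketch of that cross-platform approach (the file name build-prod.js and the argument handling are my assumptions, not part of the linked answer):
// build-prod.js
// Forwards whatever extra arguments follow "npm run prod --" to both tools.
const { execSync } = require('child_process');

const extra = process.argv.slice(2).join(' ');

execSync(`gulp build --production ${extra}`, { stdio: 'inherit' });
execSync(`webpack --env.config=production ${extra}`, { stdio: 'inherit' });
With "prod": "node build-prod.js" in scripts, npm run prod -- --env.theme=themename passes the theme flag to both gulp and webpack.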

Run npm scripts using local deps

Currently I run npm scripts using local deps this way:
package.json:
"scripts": {
  "test": "node ./node_modules/karma/bin/karma start",
  "node-test": "node ./node_modules/jasmine/bin/jasmine",
  "build": "node ./node_modules/gulp/bin/gulp build"
},
I don't want to use global deps, since I could forget to add them to package.json. This way, when a local dep is missing, I get an error message, and I avoid problems caused by some deps (e.g. karma plugins) not being installed globally.
Is there a better (shorter) way to define npm scripts using the local libs? Is this Travis-compatible?
edit:
If it wasn't obvious: I have the same libs installed globally, but I want these projects to use the local installs. As things stand, when I start karma with karma start, the globally installed version starts the karma server, which means that if I don't have all of the karma plugins installed globally, I get an error.
Another problem is that I am on Windows, so the solutions described here: How to use package installed locally in node_modules? do not work. Windows does not recognize the #!/bin/sh and #!/usr/bin/env node shebang lines, and there is no sh command as far as I can tell, at least not in the WebStorm terminal. Git Bash has the sh command, but I want to run these npm scripts from the WebStorm terminal.
One possible solution could be to somehow fix WebStorm so it can use sh from the terminal. After that I could use $(npm bin), I assume, but that's just a guess; I am not sure whether this can be done.
npm automatically prepends the path ./node_modules/.bin to your PATH environment variable before it executes commands run using npm run (including the two "magic" shortcuts npm start and npm test).
npm scripts docs
You can just set this up with:
"scripts": {
  "test": "karma start",
  "node-test": "jasmine",
  "build": "gulp build"
}
This assumes that you have karma, jasmine and gulp-cli listed in either your devDependencies or dependencies (so that they're installed when doing npm install).
And yes, it is Travis-compatible. Here is an example of a package that is tested on Travis using tap, which is installed locally as a module:
https://github.com/scriptoLLC/couchdown/
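A quick way to see the PATH behaviour for yourself (a throwaway script of my own, assuming a POSIX shell, not part of the original answer):
"scripts": {
  "which-karma": "which karma"
}
Running npm run which-karma should print the project-local path under node_modules/.bin rather than a global install.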

How can I invoke npm on heroku command line (to install bower components)?

Bower is for client-side JavaScript what npm is for the server side. It reads a component.json file to recognize dependencies that should be fetched at deploy time, so I'd be happy if Heroku would run it at slug compilation time.
Unfortunately I cannot invoke npm or bower from a Heroku console or one-off command (heroku run "npm help", or heroku run bash then npm help) the way it's possible with Ruby's rake. I've put npm and node (latest/x versions) in my package.json, but in the engines section, not the dependencies.
I think this could be solved by customizing the node buildpack but I consider this a little too heavy task just for activating something so obvious.
You can also set up a postinstall command, something like this in your package.json:
"dependencies": {
  "bower": "0.6.x"
},
"scripts": {
  "postinstall": "./node_modules/bower/bin/bower install"
}
Then npm install will also install bower dependencies.
Pros: one command to rule them all.
Cons: you unnecessarily embed bower as a dependency.
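Note that npm prepends ./node_modules/.bin to PATH when running lifecycle scripts (as described in the local-deps answer above), so the explicit path is probably unnecessary; a simplified sketch of the same idea:
"scripts": {
  "postinstall": "bower install"
}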
You can use heroku run like this:
heroku run npm install git://github.com/webjay/kaiseki
You should declare npm dependencies in the package.json file.
Everything you install from the shell will be deleted when you exit the shell; you are working in a cloned instance.
You can use bower directly like this:
"dependencies": {
  "bower": "^1.7.9"
},
"scripts": {
  "postinstall": "sudo bower install --allow-root"
}

npm script running for preinstall, but not for preupdate

I have a script referenced in package.json for a Node app.
The script runs fine when I do
npm install
but not for
npm update
The excerpt from package.json is:
"scripts": {
  "start": "node app.js",
  "preinstall": "node scripts/install.js",
  "preupdate": "node scripts/install.js"
}
The whole file is at https://github.com/Pike/outreach/blob/master/package.json.
As I understand it they've disabled the scripts for npm update (preupdate/postupdate) -- something about best practice...
It's a complete pain in the ass -- they want you to use node-gyp and .gyp files for building node modules -- and it has a dependency on Python! No thanks!
I'm still banging my head on the keyboard over this, since we can't install Python on our production servers.
Update
Python is available on most Linux distros, so it's not too big of a deal.
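One possible workaround (my own suggestion, not part of this answer) is to skip the disabled preupdate hook and wrap the update in a regular script, so the pre-step always runs explicitly:
"scripts": {
  "start": "node app.js",
  "preinstall": "node scripts/install.js",
  "update-deps": "node scripts/install.js && npm update"
}
Then run npm run update-deps instead of npm update.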
