How to make npm scripts refer to local packages? - node.js

I am working on a project and I want to use JSDoc for documentation. I have listed it in devDependencies and created a script to run it. My package.json looks like:
"scripts": {
"doc": "jsdoc -c ./conf.json"
},
"devDependencies": {
"jsdoc": "^3.4.3"
}
Cloning my project on another machine and typing npm install installs jsdoc to the node_modules folder.
However, when I run npm run doc, I get a long error, the key line of which is: 'jsdoc' is not recognized as an internal or external command
This is because JSDoc is installed under node_modules, not somewhere on my local path.
What is the best way to point npm run scripts at locally installed modules? Also, what is the point of listing dependencies such as JSDoc in devDependencies (and thus installing them into node_modules) if they cannot be used from there?
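For reference, npm run normally prepends ./node_modules/.bin to the PATH while a script runs, so a jsdoc installed as a devDependency should be found. If it is not, the script can invoke the local copy explicitly via npx (bundled with npm 5.2+); a minimal sketch, not part of the original question:
"scripts": {
  "doc": "npx jsdoc -c ./conf.json"
}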

Related

Google App Engine Standard Node JS how to run build script?

Does GAE standard for Node support a way to have build scripts? I tried using postinstall within package.json but that did not work.
My codebase has subdirectories, each with its own package.json. In my root package.json there is:
"scripts": {
  "postinstall": "cd vendor && npm install",
  ....
}
However I'm not seeing any vendor packages installed so I'm inclined to believe the postinstall does not get triggered on GAE Node standard.
Is there any way for me to install subdirectory dependencies without having to copy and paste all my vendor/package.json dependencies to the root?
Note: I've also tried putting an "install" within the package.json scripts but that didn't seem to get triggered either.
In GAE standard, installation of dependencies is managed automatically; you should declare them in your package.json. As the Google documentation says:
When you deploy your app, the Node.js runtime automatically installs all dependencies declared in your package.json file using the npm install command.
{
  "dependencies": {
    "lodash": "^4.0.1"
  }
}
Installation happens during app deployment via:
gcloud app deploy
To add a build step, run the following:
gcloud beta app gen-config --custom
This will generate the default Dockerfile and config that are used. In the generated Dockerfile, add your build step:
RUN npm run build --unsafe-perm || \
    ((if [ -f npm-debug.log ]; then \
        cat npm-debug.log; \
      fi) && false)
"prestart": "if [ ! -d build ]; then npm run build; fi",
" -d build" here is the build process generated folder, replace it to whatever you actually use.
Not sure if this will work for your case, but seems like GAE standard has added the ability to run a custom build step.
However it does state:
After executing your custom build step, App Engine removes and regenerates the node_modules folder by only installing the production dependencies declared in the dependencies field of your package.json file.
Maybe, since your node_modules are in the vendor/ directory, GAE will not detect and remove them, which would accomplish your goal. Note that this is a pre-install step, unlike the postinstall specified in your script; not sure if that matters.
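For illustration, the custom build step on the standard Node.js runtime is declared as a gcp-build script in package.json (the exact hook name here is my reading of the runtime docs, not something stated in the answer above); a minimal sketch:
{
  "scripts": {
    "gcp-build": "cd vendor && npm install"
  }
}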

Package and distribute Bash script with NPM to allow installation globally into developer workstation's paths with npm install

I have some scripts that I want to distribute with npm for developers to be able to install globally on their workstations and then use the commands of the scripts on their computers in their development workflow.
I can't work out how to get npm to actually add the script in its package to the path though.
I see that the firebase tools have this in their package.json:
"preferGlobal": true,
"bin": {
"firebase": "./bin/firebase"
},
...but I can't quite work out how this relates to my project.
The first project I am trying to distribute with npm is for controlling a Belkin WeMo light switch. It includes an executable wemo and an accompanying functions.inc.sh file; it can be seen at https://github.com/agilemation/Belkin-WeMo-Command-Line-Tools.git
If anyone can point me in the right direction it will be really appreciated!!!
Thanks,
James
Any key/value pairs placed in the bin key of package.json will be symlinked into npm's bin directory.
The key is what you want the command to be named and the value is the script in your package that it should run.
So, in your example, when npm install finishes running it'll create a symlink from [package-install-path]/bin/firebase to /usr/local/bin/firebase (or whatever global bin directory npm is using; npm bin -g will tell you where this is).
If you only have one script you can also do:
{
  "name": "my-awesome-package",
  "bin": "./myscript.sh"
}
And npm will symlink myscript.sh to a command named my-awesome-package.
Although you should be wary of including bash scripts since they won't work on Windows.
Here are the docs for this.
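Applied to the WeMo tools from the question, the relevant package.json fields might look like this (a minimal sketch; the package name and script path are assumptions):
{
  "name": "belkin-wemo-command-line-tools",
  "preferGlobal": true,
  "bin": {
    "wemo": "./wemo"
  }
}
After npm install -g, the wemo command would then be on the PATH.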

How to reconcile global webpack install and local loaders

My package.json includes webpack and some loaders:
"devDependencies": {
"babel-core": "^5.2.17",
"babel-loader": "^5.0.0",
"jsx-loader": "^0.13.2",
"node-libs-browser": "^0.5.0",
"webpack": "^1.9.4"
}
When I run webpack it is not on my path, so it is not found. I installed it globally with npm install -g webpack so the binary would appear on my path, but then it can't find the loader modules installed in ./node_modules that it needs to process my dependency tree:
$ webpack --progress --colors --watch
 10% 0/1 build modules
/usr/local/lib/node_modules/webpack/node_modules/webpack-core/lib/NormalModuleMixin.js:206
throw e;
      ^
Error: Cannot find module 'jstransform/simple'
What is the preferred solution here?
I can install my loaders globally, but I don't like that because of cross-project issues
I can try to run webpack out of node_modules (not sure how, to be honest; add it to $PATH for every project?)
Or I can try to give my global webpack access to my node_modules folder, which also seems hacky.
Have I done something wrong, or is there a better community-approved way around this maybe common problem?
I have a blog post called Managing Per-Project Interpreters and the PATH that details my methodology for this. I'll summarize here to address some of your key questions.
What is the preferred solution here?
Never (really, literally never) use npm -g. Just install normally within your project. Use your shell to set the PATH appropriately. I use zsh to do this automatically as detailed in the blog post above so if I cd into a project with ./node_modules/.bin, it gets put on my PATH automatically.
There are other ways that work, including making aliases like alias webpack="./node_modules/.bin/webpack" and so on. But really, just embrace changing your PATH and you'll have the most harmonious long-term experience. The needs of a multi-project developer are not met by the old-school Unix model where everything lives in either /bin or /usr/bin.
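A minimal sketch of the automatic-PATH idea, written as a zsh chpwd hook (my own approximation of the approach described above, not the author's exact setup; it also does not remove stale entries when you leave a project):
# ~/.zshrc
add_node_bin_to_path() {
  # put the project's local binaries first on the PATH when entering it
  if [ -d "$PWD/node_modules/.bin" ]; then
    export PATH="$PWD/node_modules/.bin:$PATH"
  fi
}
autoload -U add-zsh-hook
add-zsh-hook chpwd add_node_bin_to_path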
If you use npm scripts (the "scripts" key in your package.json), npm automatically includes ./node_modules/.bin in your PATH during those scripts so you can just run commands and they will be found. So make use of npm scripts and if you make use of shell scripts that's an easy place to just do export PATH="$PWD/node_modules/.bin:$PATH".
Any binaries that come with packages listed in your dependencies are placed under ./node_modules/.bin, so you should be able to access them there, i.e. ./node_modules/.bin/webpack.
Alternatively, you could define a script inside your package.json, which would allow you to run something like npm run webpack as an alias.
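A minimal sketch of that alias-style script, reusing the flags from the question:
"scripts": {
  "webpack": "webpack --progress --colors --watch"
}
Then npm run webpack picks up the local binary, since npm puts ./node_modules/.bin on the PATH for the duration of the script.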

NPM: Change the `bin` output directory for the node modules

Currently, if you are using a package.json file to manage your project's dependencies (whatever the project is, be it a Ruby, PHP, Python or JS app), by default everything is installed under ./node_modules.
When dependencies ship binaries, those are installed under ./node_modules/.bin.
What I need is a feature that allows me to change the ./node_modules/.bin directory to ./bin.
Simple example:
A PHP/Symfony app has a ./vendor dir for Composer dependencies, and all binaries are saved in ./bin, thanks to the "config": { "bin-dir": "bin" } option in composer.json.
But if I want to use Gulp to manage my assets, I create a package.json file, require all my dependencies and then run npm install.
Then, my wish is to run bin/gulp to execute gulp, but actually I have to run node_modules/.bin/gulp which is not as friendly as bin/gulp.
I've looked at package.json examples/guides on browsenpm.org and docs.npmjs.com, but none of them works, because they are about defining your own project's binaries. I don't have any binaries of my own; I want to use binaries from other libraries.
Is there an option for that with Node.js/npm?
You might consider adding gulp tasks to your package.json.
// package.json
{
  "scripts": {
    "build-templates": "gulp build-templates",
    "minify-js": "gulp minify-js"
  }
}
You can run any scripts specified in package.json by simply running the following:
$ npm run build-templates
$ npm run minify-js
You get the idea. You can use the gulp command inside the string without writing ./node_modules/.bin/gulp, because npm is smart enough to put everything from ./node_modules/.bin/ on the PATH for that script execution.

Installing "global" npm dependencies via package.json [duplicate]

I have a few "global" dependencies (jshint, csslint, buster, etc..) that I'd like to have automatically installed and executable via the command line when my package is installed via npm install. Is this possible?
Currently, I'm doing the following manually:
npm install -g <package_name>
from within my project: npm link <package_name>
Update:
Just came across this feature request for npm. It seems like the scripts config within package.json is the way to go?
Update Again:
Or, after reading the npm docs, maybe I'm supposed to use a .gyp file? I'm confused.
It's not possible to specify dependencies as "global" from a package.json. And, this is by design as Isaac states in that feature request you referenced:
Yeah, we're never going to do this.
But, "binaries" can still be used when a package is installed locally. They'll be in .../node_modules/.bin/. And, you should be able to queue them up with a preinstall script.
Though, if the series of commands is rather lengthy (as "jshint, csslint, buster, etc.." would suggest), you may want to look into using a build tool such as grunt to perform the various tasks:
{
  // ...,
  "scripts": {
    "preinstall": "grunt"
  }
}
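If the list of tools is short, the same idea works without a build tool, by calling the locally installed binaries directly from an npm script; a minimal sketch (the script name and file arguments are illustrative assumptions):
{
  "scripts": {
    "lint": "jshint . && csslint css/"
  }
}
Because npm adds ./node_modules/.bin to the PATH for scripts, jshint and csslint here resolve to the local installs.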
I really like the pattern where you install local dependencies, then use a bash script that sets your PATH to ./node_modules/.bin.
File: env.sh
# Add your local node_modules bin to the path for this command
export PATH="./node_modules/.bin:$PATH"
# execute the rest of the command
exec "$#"
Then, you can use this script before any bash command. If you pair that with a Makefile or npm script:
File: Makefile
lint:
	./env.sh csslint my_styles
File: package.json
"scripts": {
"lint": "./env.sh csslint my_styles"
}
The tasks in these files look like they reference csslint in some global location, but they actually use the version in your node_modules bin.
The really awesome benefit of this is that these dependencies can be versioned easily, just like your other node modules. If you stick with a global install solution, you could be clobbering some specific version on the user's system that is required for one of their other projects.
You should try this: https://github.com/lastboy/package-script
I've been using it to install global npm packages straight from the package.json. It works well for clients who aren't technically literate.
It even checks whether the packages are already installed and, if not, installs them!
