How to automatically link npm packages at installation time? - node.js

I'm working on a larger project that is split into a number of npm packages. There are several dependencies between the packages. The whole code base is stored in a main directory like this:
main/
pkg1/
pkg2/
...
Suppose that pkg2 depends on pkg1, so in main/pkg2/package.json I have:
"dependencies": {
"pkg1": "^0.1.0"
}
I have linked my packages together using npm link. However when I start development on a new machine or for some reason I have to reinstall the packages I can't simply say npm install in pkg2/. It would fail because pkg1 couldn't be found. (It's not published, but anyway, I want the local version because I'm developing both packages).
Of course I can do all the linking manually and then call npm install, but it's a hassle. Is there any way to do it in a single step?
My previous research:
This question proposes writing a preinstall script, but I don't want to keep linking in production, only in the development environment, as another answer points out.
I've also tried npm link in pkg1/ then npm install --link in pkg2/. According to the manual,
The --link argument will cause npm to link global installs into the local space in some cases.
Not in my case though.

You can use zelda for this. Written by Feross, it was designed for exactly this purpose.

I'm not a fan of doing it this way; I'd generally prefer running a local repository or using git URLs for dependencies like this.
That said, if you want to keep using npm link, you can still use the script approach from the preinstall answer, just not under the preinstall key:
"autolink": "cd ../project1 && npm link && cd ../project2 && npm link project1_name",
Then in your CLI you can do $ npm run autolink when you first set up a dev env.
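Spelled out, that script would sit in pkg2's package.json like this (package names taken from the question's layout, so adjust to your own):

```json
{
  "scripts": {
    "autolink": "cd ../pkg1 && npm link && cd ../pkg2 && npm link pkg1"
  }
}
```

Because it is not hooked to preinstall, it only runs when a developer explicitly invokes npm run autolink, so production installs are unaffected.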

Related

How to install extra bin files when `npm install` is given `-g` flag?

Is there some way to install only part of a NodeJS project when run via npm install, but to install additional features when npm install -g is used?
I have a library with a command-line interface, however the CLI is not useful to any downstream projects using my library. So when those projects pull in my library, I don't want them to pull down dependencies like chalk that are only used for the CLI they will never touch.
However if an end user decides to install my library globally on their system with npm install -g then I want the CLI installed, and via the bin section in package.json, placed in their path so they can run it like any other program.
I can't work out how to do this without splitting the CLI into a separate package. The options I have investigated are:
Put the CLI dependencies as devDependencies. This prevents chalk etc. from being installed in downstream projects, but the drawback here is that the user must npm install -g in development mode, which means they get the test framework and linting tools installed as well even though they will never use them.
Put the CLI as a separate NodeJS package/module. The drawback here is that it makes testing difficult (as often the CLI and library are modified at the same time and used for testing new features), and developers wanting to contribute to the library will have to stuff around with linking the two packages so although it would work, it's less than ideal from a workflow perspective.
Put the CLI in a folder inside the main package, and create another package.json in there just for the CLI, pulling in the main project via npm install ... This works until you get to the install point, when you realise that there is no way to install the CLI once the package has been published. npm install @my/library will only install from the package.json in the project root; there's no way to say "oh, also install the package in the cli subdirectory too."
Ideally what I would like is this:
npm install @my/library - run by a developer wanting to use the library in their project. Adds the library only to their project's dependencies, ignores both CLI and any dependencies the CLI needs.
npm install -g @my/library - run by an end-user, installs library and CLI globally on their system, including the CLI dependencies, and adds the CLI to the user's path via the package.json bin section.
npm install --dev - used by developer contributing to the library to install the test framework so they can run the unit tests before submitting their code for inclusion.
Not having to split the CLI into a separate project.
Is this possible?
You can write a postinstall script that uses is-globally-installed (or another similar package) to check if the module is installed globally, and then run whatever is appropriate to install the CLI (perhaps npm install -g for a separate package that just has the CLI).
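A minimal sketch of such a check, using the npm_config_global environment variable that npm exposes to lifecycle scripts instead of a helper package (the echo lines stand in for the real install commands):

```shell
#!/bin/sh
# Hypothetical postinstall guard: npm exposes its config to lifecycle
# scripts as npm_config_* environment variables, so a global install
# runs scripts with npm_config_global=true.
if [ "$npm_config_global" = "true" ]; then
  echo "global install: setting up the CLI"
else
  echo "local install: skipping the CLI"
fi
```

The real commands (e.g. installing a separate CLI package) would replace the echoes; the branch itself is the whole trick.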
Well I came up with a workaround.
I put the CLI back to development-only (with its requirements in devDependencies so it could be used by local devs, but they wouldn't get pulled in to downstream projects using the library). I then created another package for the CLI, but all it was is a package.json file and nothing else.
What this package does is depend on the main library, plus only the dependencies needed by the CLI. Then it uses the bin section to point to the CLI inside the main library, so no code is needed in this package - it is literally just one file. Like so:
{
"name": "@my/library-cli", // new module to install the CLI
"bin": {
"myprog": "./node_modules/@my/library/bin/myprog.js" // the command is inside the dependency
},
"dependencies": {
"@my/library": "*", // require the library itself where the CLI code sits
"command-line-args": "*" // dependency needed by the CLI
}
}
The only drawback with this is that this new package's dependencies need to be kept in sync with the main library's devDependencies (at least the deps required by the CLI) but as they won't change often in my case I can live with that.
I did try the postinstall hook as @Trott suggested but it didn't seem to work:
"scripts": {
"postinstall": "[ $npm_config_global == 'true' ] && (cd cli && npm install -g)"
}
For some reason it was incredibly slow and seemed to get stuck in a loop, trying and failing to install over and over. I'm also not sure it would've respected being launched via a command like npm install -g --prefix /my/install/folder as that non-standard prefix may not get passed through to the child npm process.

Is npm init needed?

I always thought that you should initialize npm first before installing any packages
npm init --yes
However I found out that I could just go straight to installing packages
npm i example-package
Then the package would be installed and package.json would be created at the same time.
Is there any reason I should be doing npm init first? Is it only required if I want to specify project details?
It is not required. You can install packages without it, and everything will work.
npm init can do basically two things:
ask for basic project info to include in package.json
create a specific type of project (for example React) by using npm init typeofproject
If you just want to use packages and don’t care about naming the project or using a template, just install packages.
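For reference, npm init --yes generates a minimal package.json roughly like the following (field values are inferred from the directory, and the exact set varies by npm version):

```json
{
  "name": "my-project",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
```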
npm init is for when you are setting up the project for the very first time;
after that, you don't need npm init to install any package.
Well, kind of a late answer, but as far as I know (correct me if I'm wrong), one of its features is that it sets up package.json, which includes the dependencies list. That way, npm can simply install the packages on the list (e.g. via npm install, if you're in a situation where you want to clone the app onto another machine), rather than you copy-pasting the whole project folder.
This isn't a direct answer to the question, but if it sheds some light at some point, why not.

How to get npm to favor local linked dependency over its published install

I've searched through other questions such as this one, but they all seem to be about a local npm link stopping working for another reason than mine. I assume this is a common use-case issue, so if I'm doing something methodically wrong, I'm more than happy to take suggestions on how I should be doing it.
Principally, I have a private npm module that I'm working on called @organisation/module. When working locally, I'll run npm link on it, and use it within my 'host' project as npm link @organisation/module — this all works great with hot-reloading, etc. I'll also import it as import module from '@organisation/module'.
However, since I also want to publish my local changes to npm (as @organisation/module) from time to time, for build testing and production code, I need to run npm install @organisation/module on the host project.
This then seems to break the implicit npm link I set up earlier... I assume mainly because they are the same name, and npm favors an install over a link?
When I want to make live, local changes again, the only way I can currently get it to work is via npm uninstall @organisation/module and then to re-link it.
Is there a way to keep the published module installed (in order to avoid careless mistakes, like forgetting to reinstall it for build testing), but always favour the local, linked instance?
Have you tried locally installing with the other method npm provides?
npm install /absolute/path/packageName
I believe this will change your entry in package.json to look like this:
"dependencies": {
...
"packageName": "file:../../path/to/packageName",
...
}
Since npm link creates a symlink in the global folder, while npm install is local to the project, npm install takes precedence. You can read about npm link here: https://docs.npmjs.com/cli/link
To avoid this, my suggestion would be to use npm install <path to local>, and when you need to use the production code, use npm install @organization/module. This would update your node_modules accordingly each time. Read about npm install here: https://docs.npmjs.com/cli/install
Hope this helps :)
Go to the directory where your local package is located, open package.json, and change the name from original_name to "original_name_local".
Run npm link in a terminal at that same location.
After this, go to your working directory and run npm install <path to local>.
Now, wherever you're requiring or importing it, update the name to "original_name_local".
For example, if it's require('space-cleaner'), change it to require('space-cleaner_local').
Like this you can have both the local and the production package; just change the name wherever required.
Otherwise, you can remove the package by removing it from package.json and deleting it from node_modules.
If the local one is needed, go to the local package directory, run npm link in a terminal, and then in your working directory run npm install ./path/to/package.
If the production one is needed, delete the package again as described above and run npm install package_name.

Why is npm running prepare script after npm install, and how can I stop it?

Whenever I run npm install <package> it installs the package alright, but then it automatically runs the prepare script.
It's worth mentioning that I've already checked that there is no postinstall script in the package.json.
From https://docs.npmjs.com/misc/scripts:
prepare: Run both BEFORE the package is packed and published, and on local npm install without any arguments (See below). This is run AFTER prepublish, but BEFORE prepublishOnly.
Since npm v5, the prepare script is executed when you run npm install.
The other answers are fine, but for some additional context, this is to support a workflow where you can use devDependencies to build assets or other generated content for your project.
For example, say you want to use node-sass (CSS preprocessor). You add "node-sass" as a devDependency, then you run the sass command in your "prepare" script, which generates your CSS.
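A sketch of that setup in package.json (file names and version range are hypothetical):

```json
{
  "scripts": {
    "prepare": "node-sass src/main.scss dist/main.css"
  },
  "devDependencies": {
    "node-sass": "^7.0.0"
  }
}
```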
So, when you run npm install, the following happens:
dependencies and devDependencies get installed
your "prepare" script generates your CSS
your project is ready to go with all necessary CSS
And when you run npm publish, something similar happens:
your "prepare" script generates your CSS
your code and generated CSS are published to the npm repo
So now when someone comes along and installs your package, they don't need node-sass or any of your devDependencies. They only need the runtime deps.
The prepare script runs on local install and when installing git dependencies:
prepare: Run both BEFORE the package is packed and published, on local npm install without any arguments, and when installing git dependencies (See below). This is run AFTER prepublish, but BEFORE prepublishOnly.
https://docs.npmjs.com/misc/scripts
You can avoid it with the --ignore-scripts flag:
$ npm install <package> --ignore-scripts
source: https://docs.npmjs.com/cli/install
From the doc https://docs.npmjs.com/misc/scripts
prepare: Run both BEFORE the package is packed and published, and on local npm install without any arguments (See below). This is run AFTER prepublish, but BEFORE prepublishOnly.
So the prepare script runs before publishing and after npm install.
Now, if you run npm install and one of the packages has a prepare script (e.g. for building) and that script fails, the whole install will fail.
We have two options:
Ignore scripts
npm install --ignore-scripts
That will skip scripts for all packages, which might not be the desired behavior. Imagine a third-party package that needs to run prepare to build; if you run with --ignore-scripts, that build gets skipped too.
Make the script optional (better option)
Add a package to the optionalDependencies:
{
"optionalDependencies": {
"myPackage": "^1.0.0"
}
}
If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the optionalDependencies object. This is a map of package name to version or url, just like the dependencies object. The difference is that build failures do not cause installation to fail.
Entries in optionalDependencies will override entries of the same name in dependencies, so it's usually best to only put in one place.
Check the doc:
https://docs.npmjs.com/cli/v7/configuring-npm/package-json#optionaldependencies
Note: With this, only the chosen package is affected, and if it fails, the installation will continue. That's usually what you want.
Go with optionalDependencies
As per this answer in this thread:
https://github.com/npm/npm/issues/2817#issuecomment-368661749
the problem with --ignore-scripts is that it ignores all scripts. I just need to be able to ignore the script(s) for a particular package (the one whose build fails to compile on certain platforms). This option usually breaks my code because it ignores ALL scripts in other packages that actually do need to run.
Anyway, to make this work like the OP, I make the offending package optional. Then I do a regular install, then a second install with --ignore-scripts. That way the scripts of the other packages run first, before the second pass ignores them all (including the intended one) and "fetches" the source of that package.
It's generally better to go with optionalDependencies. That will most likely suit your needs.

Project Makefile and conditional NPM linking

We're a small team writing a webapp in node.js and express.js, bundled with a parser that is implemented in Python.
I would like to use my npm link'd forks of some libraries without interrupting my team's workflow: use my forks if they exist; otherwise, install the published packages locally.
I would like to play around with deployment scripts so I was writing a Makefile for the project. Part of the makefile's job is to use npm to get the node dependencies, so I have a target
node_modules:
	@(cd $(dir) && npm install)
Which is all fine until I started hacking on some node libraries. Now, I have a few forks of some dependent libraries that I would like to use but don't want to interfere with the rest of my team's build.
The solutions I've seen are almost there but not quite. The --link related flags and options will install globally if the global package is not there, which is not what I want. I would like it to install locally.
npm link foo then npm install - sort of works, but npm will install foo globally if the link does not exist
devDependencies - would be good except we will all be building npm
Some sort of per-user Makefile that I just keep locally - this seems like an option that would work, but it requires some extra cruft that I'd rather not have to take care of.
I only have a bit of experience with Makefiles, so maybe there is a pattern for this already. Any ideas?
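One pattern that fits inside a Makefile recipe is a plain shell conditional that links the fork only when it exists on disk. A minimal sketch (the fork path is a hypothetical per-developer location, and the npm calls are echoed rather than executed):

```shell
#!/bin/sh
# Link the local fork of "foo" if present; otherwise do a normal install.
# FORK_DIR is a hypothetical per-developer location, overridable per user.
FORK_DIR="${FORK_DIR:-$HOME/forks/foo}"

if [ -d "$FORK_DIR" ]; then
  echo "npm link foo"   # fork found: use the linked copy
else
  echo "npm install"    # no fork: install published packages locally
fi
```

Dropped into the node_modules target, teammates without a fork fall through to the plain npm install and are unaffected.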
