When and why should we use the option --no-bin-links when we install npm packages?
The official docs say:
The --no-bin-links argument will prevent npm from creating symlinks for any binaries the package might contain.
But it is still unclear to me for which scenarios we have to specify this.
What will be the impact of specifying this option on the functionality of the package? Will the package error out when used?
One scenario I can think of is working with a virtual machine (i.e. vagrant with VirtualBox or VMware) on a Windows host.
Symlinks can't be created in a synced folder backed by a Windows share, so you need this option to work around it.
Use it for any filesystem that doesn’t support symbolic links.
So far, one scenario I have come across where --no-bin-links may help is deploying packages in environments that may not have access to npm Enterprise, where you want to install all the packages and deploy node_modules directly. In that case symlinks may cause problems (as you can't deploy them), and using this flag solves that.
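To make concrete what those bin links are, here is a minimal sketch that recreates one by hand, using a hypothetical package called hello-cli (the package name and paths are illustrative, not from npm itself). The `ln -s` step is exactly what `--no-bin-links` skips, so with that flag `node_modules/.bin/hello` would not exist and any npm script invoking `hello` would fail; you would have to call the package's entry file directly instead.

```shell
#!/bin/sh
# A bin link is just a symlink in node_modules/.bin pointing at the
# package's executable. Hypothetical package "hello-cli":
mkdir -p node_modules/hello-cli/bin node_modules/.bin
printf '#!/bin/sh\necho hello\n' > node_modules/hello-cli/bin/hello.sh
chmod +x node_modules/hello-cli/bin/hello.sh
# This is the step --no-bin-links skips:
ln -s ../hello-cli/bin/hello.sh node_modules/.bin/hello
node_modules/.bin/hello   # prints "hello"
```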
Related
I'm working on a Javascript project, and as it so happens one of my dependencies pulls in puppeteer, which in turn downloads a whole copy of Chromium into my node_modules. My larger project is split into multiple Javascript packages, so I end up with multiple identical copies of Chromium among other stuff.
Is there a way to deduplicate these packages system wide? Note, npm dedupe seems to do something completely different to what I want.
I imagine there would be a module repository in my home directory which contains every package I need (in every version needed), and then in the local node_modules directories would contain only symlinks to the repository. This seems like an incredibly obvious optimisation, but I can't find any way to do it in npm. If not in npm, is it maybe possible in yarn?
As an added complication, this should also work on Windows (where symbolic link support has historically been not so good).
It seems the following command does what I want:
npm config set link -g
Then delete node_modules, and do npm install again. It should be much smaller now.
The documentation says:
If true, then local installs will link if there is a suitable globally installed package.
Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions are met:
The package is not already installed globally, or
the globally installed version is identical to the version that is being installed locally.
I am not sure if this has any negative side-effects - for example clobbering the global namespace with commands I don't want. For now, it seems to work fine.
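As a sketch of the layout this `link` config produces, the entries under a project's local node_modules become symlinks into the global package tree. The directory names below are illustrative, not npm's exact global paths:

```shell
#!/bin/sh
# Illustrative layout: a "global" tree plus a project whose local
# dependency is just a symlink into it (left-pad chosen as an example).
mkdir -p global/lib/node_modules/left-pad project/node_modules
echo '{"name":"left-pad","version":"1.3.0"}' > global/lib/node_modules/left-pad/package.json
ln -s ../../global/lib/node_modules/left-pad project/node_modules/left-pad
# require() resolution follows the symlink transparently:
cat project/node_modules/left-pad/package.json
```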
I have a pretty common (I guess) problem. Many of my projects use Node.js, some for business logic, others only for build tasks.
I need different runtimes in different projects: one of my Electron apps requires Node 7.10.0, while a typical build suite requires Node 8.x.
Now I know I can use sudo n 7.10.0 or sudo n latest to switch the runtime globally on my computer (for those who don't know it, have a look at "n").
Anyway, IMO this is not very convenient (sometimes I need to rebuild all the modules after switching versions, and I often forget to switch). Is there a way of telling Node which interpreter to use? Can I use a .npmrc file in a project directory to force a specific Node.js version within that subdirectory?
I searched for exactly this (npmrc node version) but was not lucky enough to find anything.
Okay, I found a similar question:
Automatically switch to correct version of Node based on project
it seems you can install "avn" and use a .node-version file to do exactly that.
sudo npm install -g avn avn-n
avn setup
Then you can create a .node-version file in your project and enter the desired version:
echo 7.10.0 > .node-version
avn will then detect that and activate the correct version.
Unfortunately I get an additional permissions error, so to make this work you need to install/configure "n" to work without sudo/root.
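One way to run "n" without sudo is to point it at a user-writable prefix via its N_PREFIX environment variable (a documented setting of "n"); the actual `n` call is commented out below since it requires "n" to be installed:

```shell
#!/bin/sh
# Install node versions under ~/.n instead of /usr/local, so no root needed:
export N_PREFIX="$HOME/.n"
export PATH="$N_PREFIX/bin:$PATH"
# n 7.10.0   # would now install under ~/.n without sudo
echo "$N_PREFIX"
```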
If you're fine with using another tool you could use nvshim.
pip install nvshim # this is all you need to do
It does not slow down your shell startup or directory switching; instead it defers the lookup of which Node version to use until you call node, npm or npx, by shimming those binaries. More details in the docs.
Source: I wrote the tool.
NVM (Node Version Manager) allows you to use different versions of Node quite easily on a single machine. You can have a look here at how to configure and use it.
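For per-project pinning, nvm reads a .nvmrc file in the project directory. A minimal sketch (the nvm calls are shown as comments, since nvm is a shell function that must already be installed):

```shell
#!/bin/sh
# Pin this project to Node 7.10.0:
echo "7.10.0" > .nvmrc
# nvm install   # with no version argument, reads .nvmrc and installs 7.10.0
# nvm use       # switches the current shell to the version in .nvmrc
cat .nvmrc
```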
Volta can be used to manage multiple Node.js, npm or yarn versions across projects on the same machine. It's cross-platform.
For example, you can run volta pin node@14 in the project directory, and this will pin Node to v14 if it is already installed; otherwise it will download it first and then pin it.
More information here: https://docs.volta.sh/guide/
I'm trying to implement a developer workflow with docker, with the ability to develop offline (as in, not having to run npm install when you switch between branches that have differing dependencies)
The most intuitive way to do that is to store dependencies in source control. This has its own issues especially when using modules that compile dependencies. I have tried nearly everything I could think of and find:
npm packing my project's dependencies and storing them in source control, but this doesn't store my dependencies' dependencies
storing node_modules in source control, copying it to the container and running npm rebuild, but it doesn't actually trigger a rebuild
running npm install --no-registry so it triggers a rebuild without calling out, but it actually calls out to the public registry anyway
other solutions I've seen like Node-PAC seem abandoned
npmbox looks the most promising, but it requires being installed globally on the target, which would work in a container I can build but not in production, unless we start deploying containers in production.
Is this a fruitless effort? Lack of network access is rare and would only really be needed when installing a new module or moving between revisions that have differing dependencies
Another option is to set up a private npm registry and configure it to cache the public registry. There are several ways to implement this; I would recommend trying Nexus: https://www.sonatype.com/nexus-repository-oss
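Once the caching proxy is running, clients only need their registry setting changed to point at it. A sketch using a project-level .npmrc (the URL is a placeholder for your internal Nexus instance, not a real host):

```shell
#!/bin/sh
# Point npm at the internal caching proxy for this project only:
echo "registry=http://nexus.internal:8081/repository/npm-proxy/" > .npmrc
cat .npmrc
```

The first `npm install` against the proxy fetches from the public registry and caches; later installs are served from the cache even with no internet access.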
And if so, I'm interested in knowing if there is any way to run them from the /usr/local/lib folder instead of installing those modules in each project folder I'm working on.
The Node documentation says that global installs should be reserved for modules that need command-line access (such as nodemon, etc.):
https://nodejs.org/en/blog/npm/npm-1-0-global-vs-local-installation/
Which to choose
Just like how global variables are kind of gross, but also necessary
in some cases, global packages are important, but best avoided if not
needed.
In general, the rule of thumb is:
If you’re installing something that you want to use in your program,
using require('whatever'), then install it locally, at the root of
your project.
If you’re installing something that you want to use in
your shell, on the command line or something, install it globally, so
that its binaries end up in your PATH environment variable.
The only other related point I can think of is that you can install express-generator globally, as it allows you to scaffold an Express app:
npm install express-generator -g
But that is different and not the same thing as installing express itself globally.
So as far as I can tell, the answer is no: there is no benefit to this.
Well, personally, no. I ended up messing up my repos trying to globally install express and other modules. I really can't see the benefit of sparing a few extra seconds when you know a local install can't mess up your system :D And just to add: don't try globally installing MongoDB; that REALLY ended up badly.
I have a very special requirement from my client. We have been using npm to install karma and phantomjs for quite a while. Everything worked fine until we had to move everything off the cloud to internal infrastructure. Now things get complicated. The internal infrastructure doesn't have internet access, so we cannot use npm to resolve dependencies anymore. We tried to move the node_modules folder from a dev machine to the internal machine. That didn't work because the dev machines are OSX and Windows while the server is CentOS, and phantomjs is OS-specific (npm normally works out the right binary for the platform). What options do we have to resolve dependencies? I just learned that the node_modules folder name cannot be changed. I was thinking of checking in OS-specific node_modules folders, but that wouldn't work since npm only looks for a folder named node_modules.
I got the same error as this thread PhantomJS Crash - Exit Code 126 when I was trying to use node_modules from OSX in Centos.
Install all dependencies on first OS (i.e. OSX), assuming that you have package.json with all dependencies.
npm install
Rename the created node_modules to node_modules_mac.
Repeat the steps above on the other OS (i.e. Windows), renaming node_modules to something like node_modules_windows.
On the target OS, move the folders created above into your app folder and create a symbolic link named node_modules pointing at the appropriate folder (node_modules -> node_modules_mac on OSX).
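The steps above can be sketched as follows (directory names illustrative):

```shell
#!/bin/sh
# Per-OS dependency folders living side by side in the app directory:
mkdir -p app/node_modules_mac app/node_modules_windows
# On OSX, point node_modules at the mac folder; on Windows, at the other.
# npm and require() only ever look for "node_modules", so the symlink
# selects which per-OS tree is active:
ln -s node_modules_mac app/node_modules
ls -l app
```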
Why don't you just host your own private registry? You can store the registry on the internal infrastructure.
The de facto registry is isaacs' own npmjs.org. It can be found here:
https://github.com/isaacs/npmjs.org
It does require CouchDB as the database, however, and that can be daunting. There are alternatives that let you avoid this, for example reggie:
https://github.com/mbrevoort/node-reggie