How do I prevent npm install from removing packages? - node.js

I'm trying to set up a development environment with several packages, and as a result I need to install some dependencies manually. More specifically, I have local changes in several packages which I need to test before I can push them to GitHub, so I can't just npm install the top level because it won't pick up those changes. So I run the first npm install manually on the packages that are missing, then run my node code to see which package it still can't find, and then npm install whatever it says is missing.
However, when I go to install the second package, it ends up with this message:
added 3 packages from 4 contributors, removed 799 packages and audited 3 packages in 4.197s
The second install removed practically every package that was already installed! I didn't notice this until about the third time, when I realized that I seemed to be installing the same thing over and over.
How can I prevent this particularly naughty behavior and force npm to install only what I tell it to and leave everything else alone?

Have a look at npm link if you need to test against modified packages.
From npm link:
This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild.
Say b is a dependency of a. You made changes to b and want to check if a still works with those changes. Instead of using b in node_modules installed from npm, use your local, modified version:
cd ~/projects/b # go into the package directory
npm link # creates global link
cd ~/projects/a # go into some other package directory.
npm link b # link-install the package
Now, any changes to ~/projects/b will be reflected in ~/projects/a/node_modules/b/.
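If you later want to go back to the registry version of b, a common follow-up (assuming the same layout as above, and noting that npm unlink is an alias of npm uninstall) is:
cd ~/projects/a
npm unlink b # remove the symlinked copy from node_modules
npm install b # reinstall the published version from the registry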

If your development flow involves updating, in parallel, packages which depend on one another, you might consider switching your project's package manager from npm to yarn to take advantage of yarn's workspaces feature.
Yarn's workspaces let you easily set up a single monorepo containing all your interconnected packages, and let yarn figure out how to link them together in your dev environment.
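A minimal sketch of such a setup (folder names are illustrative) is a root package.json like this, with each interdependent package living under packages/ and a single yarn install run at the root:
{
  "private": true,
  "workspaces": ["packages/*"]
}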

I had a similar problem today and thought this might help someone in the future. I found out that if you install the packages simultaneously, like
npm install --save package1 package2 package3 ...
it works. In my case I had
npm install xlsx angular-oauth2-oidc
but if you install them separately you will have issues.
Edit 2: More info by @Michael:
Installing multiple packages in the same command also prevents hooks from being installed multiple times.

Remove "package-lock.json" file befor installing the new package.

Are you saving the dependencies to package.json?
To save: npm install --save {package_name}. This records the package in package.json as well as installing it.
You can't fully control the dependency tree. The dependencies you have installed may have dependencies of their own, so when you remove a package, npm deletes the package along with its dependencies.
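For example, running npm install --save express (the package name and version here are purely illustrative) both installs the package and records it in package.json along these lines:
"dependencies": {
  "express": "^4.17.1"
}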

Related

Using npm to only install the packages needed in current project

I am just starting with node/npm and I have a lot of trouble with
the path to install the package
loading the package in node
I would like to have a package folder (no matter its path) with only the packages needed for my current project (I don't use a package.json, just the normal npm install...). So instead of installing the packages in the folder given by npm root, I thought I would install all the packages in a local folder with npm install --prefix ./node_modules pck_name.
If I install the packages globally, I am able to load them in Node with require('pck-name'), but when I install in the local folder, I am unable to load the package in Node, even by adding the folder path to NODE_PATH or by using the full path of the package in require:
const pck = require('C:/Users/Me/myproject/my_modules/node_modules/pck-name');
The error is Cannot find module 'pck-name'
Because I was stuck on this for a long time without finding a solution, I thought of renaming the folder given by npm root and then doing a global install: because the folder will be recreated from scratch, I would then have just the packages for my project. But after the install, I did npm list, and all my previous packages are listed, including the ones for the current project.
I have read many questions/answers and many tutorials, but I am still unable to use npm/node the way I would like (I am used to Python, where I regularly use import for global/local modules, so I may be thinking too much in a Python way).
I can at least partially answer my own question, as I now understand why the previous packages were still there after I renamed the folder. Although I didn't install from a package.json, npm install does create a package.json, or in my case a package-lock.json. And apparently, when running npm install package-name, it checks package-lock.json and re-installs all the missing packages.
So it's not enough to rename the folder indicated by npm root; I also had to rename the package-lock.json. Now it is clear. I still think I haven't found the best way to go, but at least I have what I need.
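For reference, the conventional per-project setup, without --prefix and keeping pck-name as the placeholder, would be something like:
cd C:/Users/Me/myproject
npm install pck-name # creates ./node_modules/pck-name (plus a package-lock.json)
# from any file inside myproject, require('pck-name') now resolves,
# because Node walks up the parent directories looking for a node_modules folder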

Is npm init needed?

I always thought that you should initialize npm first before installing any packages
npm init --yes
However, I found out that I could just go straight to installing packages
npm i example-package
Then the package would be installed and package.json would be created at the same time.
Is there any reason I should be doing npm init first? Is it only required if I want to specify project details?
It is not required. You can install packages without it, and everything will work.
npm init can do basically two things:
ask for basic project info to include in package.json
create a specific type of project (for example React) by using npm init typeofproject
If you just want to use packages and don’t care about naming the project or using a template, just install packages.
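For illustration, npm init --yes generates a package.json roughly like this (the name is taken from the folder, and the exact fields can vary with the npm version):
{
  "name": "myproject",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}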
npm init is for when you are setting up the project for the very first time;
otherwise you don't need to use npm init before installing any package.
Well, kind of a late answer, but as far as I know (correct me if I'm wrong), one of the benefits is that the project gets set up with a package.json which includes the dependencies list. That way, npm can simply install the packages on that list (via npm install, in the situation where you want to clone the app onto another machine) rather than copy-pasting the whole project folder.
This isn't a direct answer to the question, but if it sheds some light at some point, why not.
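As a concrete sketch of that scenario (the repository URL is illustrative):
git clone https://github.com/me/my-app.git
cd my-app
npm install # reads package.json and installs everything listed under dependencies and devDependencies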

Doing npm install for each project takes too much space on the drive

Is there any way to route npm install to a specific part of the hard drive, so that when I do npm install it makes the node_modules folder in that part of the drive, and when I run any project it looks for dependencies in that part of the drive,
just like a single pool for every project?
Then if I have two projects with similar dependencies, I only need to npm install in one project so the dependencies become available in the pool, and there is no need to do npm install in the other project, just npm start.
Thank you,
Inzamam Malik
You can achieve something close to what you are describing with the link option.
From https://docs.npmjs.com/misc/config#link:
If true, then local installs will link if there is a suitable globally installed package.
Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions are met:
The package is not already installed globally, or
the globally installed version is identical to the version that is being installed locally.
So you will still have some files in each project's node_modules, but you shouldn't have as large a folder.
To turn this behavior on, run:
npm config set link true -g
Edit: There is no way you can avoid running npm install and having a node_modules folder. Node.js always looks in node_modules for dependencies (this behavior pre-dates npm itself). The link option will make npm create symlinks in node_modules, pointing to a common pool. That will reduce disk usage, but you cannot do away with node_modules.
You can use the pnpm package manager; it uses a global pool for dependencies.
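A minimal sketch of trying pnpm on an existing project (assuming you already have Node and npm):
npm install -g pnpm # install pnpm itself
cd my-project
pnpm install # dependencies are linked into node_modules from pnpm's shared on-disk store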

Doesn't npm install check for a global version first?

I just set up a test and tried to npm install express even though express already exists globally on my system. To my surprise, instead of using the global version, it ended up re-installing a version locally!? Isn't it supposed to use the global version... or am I supposed to use -g every time, even when I only want to use the existing global version? Otherwise, what's the point of installing anything locally!?
The answer is "NO". It isn't supposed to use your global version.
If you want to use your global version, then you don't need to execute npm install at all because it is already installed.
If you do run it then, obviously, you are saying "I want to install it locally in my project". And more than that: "I want to install its latest version unless some other version is explicitly declared in my package.json".
In fact, the actual question is: Why in the hell would you want to not install a dependency of your project locally? To have more version mismatch issues?
As @anshuman_singh says, best practice is to always do an npm install --save.
You are able to use globally installed packages, of course. It can be handy for quickly testing code that you will throw away after a few hours or so.
But, anyway: unless you have real hard disk or network bandwidth constraints, installing all dependencies locally will save you a lot of trouble in the future.
On the other hand, committing those modules to your code repository is also a bad idea (maybe that is what you were trying to avoid) because, across different versions of node, most native modules won't work unless rebuilt. But most VCSs support ignoring files and/or directories that must not be uploaded.
For example, in git (.gitignore file):
**/node_modules
In summary:
npm init (if you didn't do it already).
npm install --save for all your project dependencies.
npm install --save-dev for dependencies not needed in production (testing stuff).
Don't upload node_modules to your VCS.
After new checkout: npm install or npm install --production (to not install dev-dependencies).
npm install -g only for tools you will use in console.
This way, you are sure that you will have in production (or other dev environments) the exact same version of each package.
And, finally, if you ever want to upgrade some package to its latest version, simply run:
npm install --save <package_name>@latest
If you’re installing something that you want to use in your program, using require('whatever'), then install it locally, at the root of your project.
If you’re installing something that you want to use in your shell, on the command line or something, install it globally, so that its binaries end up in your PATH environment variable.
The first option is the best in my opinion. Simple, clear, explicit. The second is really handy if you are going to re-use the same library in a bunch of different projects.
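For example, for a command-line tool you want available on your PATH (package name illustrative):
npm install -g http-server # puts the http-server binary on your PATH
http-server . # serve the current directory from any shell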
Install locally:
npm install moduleName
Install locally and save in package.json:
npm install moduleName --save
Install globally:
npm install moduleName -g

How to prevent npm install <package> --save-dev from reordering devDependencies

Background
We're having issues with a Windows build system hitting the "file path too long" error when the node_modules folder contains items whose paths are over 260 characters.
We've discovered that adding a deeply nested dependency to the top of the devDependencies section fixes this issue. The assumption is that when npm sees a nested dependency C.1 require package A, which is already declared and available in devDependencies, npm will not add dependency A to dependency C.1's node_modules directory.
Issue
The problem I'm seeing on my local machine is that running npm install <package> --save-dev reorders the packages in devDependencies alphabetically, but the order in which npm processes packages and their dependencies matters. If I check this in, then the build system will hit the same "file path too long" error.
I.e. if package A comes after package C and dependency C.1 requires package A, then npm will add package A to the node_modules folder of dependency C.1.
I'm not sure if this reordering is only on my machine since I haven't seen npm reorder dependencies on my home machine before.
Has anyone seen this before or know how to stop this behavior?
Versions
Node: v0.10.32
NPM: v1.4.28
Side note: I've read that npm 2.0 or future versions will analyze the dependency hierarchy, find duplicated packages, and only reference them once on the file system, but the upgrade to npm 2.0 is not in the picture at this time.
The only way I see this working is to have some sort of preinstall script which [hopefully] will run after the dependencies file has been updated but before the package is installed. From the npm site:
In the current version of node, the standard way to do this is using a
.gyp file. If you have a file with a .gyp extension in the root of
your package, then npm will run the appropriate node-gyp commands
automatically at install time
If that doesn't work, you will need to use a Makefile and rewrite the package.json file. This is not too out of the ordinary, as some projects require some sort of pre-compilation; you would just instruct your team to run a separate command for installing npm packages.
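A minimal sketch of wiring that up in package.json, where scripts/sort-deps.js is a hypothetical script you would write to rewrite devDependencies back into the order your build needs:
"scripts": {
  "preinstall": "node scripts/sort-deps.js"
}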
