I'm new to npm (obviously) and I have forked an npm module to test out some custom modifications to it - let's call that project DependsOnMe. Now I want to use DependsOnMe in my CustomApp project. I can't figure out how to do this.
In my DependsOnMe project I ran npm install -g . (whatever that does; it's now unclear to me), but when I go to my CustomApp project and add a dependency for DependsOnMe to my package.json, npm tells me it can't find DependsOnMe in the npm remote registry (no kidding...).
Since I installed DependsOnMe globally I thought perhaps I don't need to declare a dependency in CustomApp, but when I try to require DependsOnMe in one of my project's .js files I get the error that DependsOnMe cannot be found.
This must be simple, what am I missing?
Related
I have installed the tar-module using
npm install -g tar
When I type
npm list -g --depth=0
I can see the entry tar@6.1.0 in the module tree. However, when I try to require it in a JS file with const tar = require("tar"), I get the error message
Uncaught Exception:
Error: Cannot find module 'tar'
What am I missing?
The issue here is that you're trying to use something installed globally in a local project. You should be able to use your libraries if you install them inside the project with npm i tar.
The reason we install something globally is for use during development across many projects. This way, we don't have to install a tool for every project. With something you want to use inside a project's code, however, you should install it at the project level. This way, everything that the project needs to work lives inside the project itself. You should see all dependencies listed inside your package.json file.
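For example, a minimal sketch (my-project is just a placeholder folder name):

cd my-project      # the project root, where package.json lives
npm install tar    # installs into ./node_modules and records it in package.json (saving is the default on npm 5+)

After that, const tar = require("tar") resolves from the project's own node_modules folder instead of the global one.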
Not gonna advocate you do this, but if you really want to include the globally installed library, you can do something like this:
require('./../../.npm-global/lib/node_modules/tar'); // Relative path to library
Here you go up the directory tree to your $HOME directory and into the default global install location for Node, and bring the module in from there. This is poor practice, please don't do it, but here's the info nonetheless.
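If you really must go down this road, a slightly less brittle sketch is to ask npm where its global node_modules actually lives instead of hard-coding the relative path (the example output below is an assumption for illustration):

npm root -g
# e.g. /home/you/.npm-global/lib/node_modules

Then require the module by that absolute path, e.g. require('/home/you/.npm-global/lib/node_modules/tar'). It's the same bad practice, just with one fewer guess about the directory layout.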
I've searched through other questions such as this one, but they all seem to be about a local npm link that stopped working for a different reason than mine. I assume this is a common use case, so if I'm doing something fundamentally wrong, I'm more than happy to take suggestions on how I should be doing it.
Principally, I have a private npm module that I'm working on called @organisation/module. When working locally, I'll run npm link on it, and use it within my 'host' project as npm link @organisation/module — this all works great with hot-reloading, etc. I'll also import it as import module from '@organisation/module'.
However, since I also want to publish my local changes to npm (as @organisation/module) from time to time, for build testing and production code, I need to run npm install @organisation/module on the host project.
This then seems to break the implicit npm link I set up earlier... I assume mainly because they are the same name, and npm favors an install over a link?
When I want to make live, local changes again, the only way I can currently get it to work is via npm uninstall @organisation/module and then to re-link it.
Is there a way to keep the published module installed (in order to avoid careless mistakes, like forgetting to reinstall it for build testing), but always favour the local, linked instance?
Have you tried installing locally with the other method npm provides?
npm install /absolute/path/packageName
I believe this will change your entry in package.json to look like this:
"dependencies" {
...
"packageName": "file:../../path/to/packageName",
...
}
Since npm link creates a symlink in the global folder while npm install installs into the project itself, the local install takes precedence. You can read about npm link here: https://docs.npmjs.com/cli/link
To avoid this, my suggestion would be to use npm install <path to local> and, when you need to use the production code, use npm install @organisation/module. This would update your node_modules accordingly in each case. Read about npm install here: https://docs.npmjs.com/cli/install
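In practice, switching back and forth could look something like this (the ../module path is an assumption; point it at wherever your local clone lives):

# work against the local clone
npm install ../module                     # records "file:../module" in package.json
# switch back to the published version for build testing
npm install @organisation/module@latest   # replaces the file: entry with a registry version

Whichever was installed last is what node_modules and package.json point at.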
Hope this helps :)
Go to the directory where your local package is located, open package.json, and change the name from "original_name" to "original_name_local".
Run npm link in the terminal at that location.
After this, go to your working directory and run npm install <path to local>.
Now, wherever you're requiring or importing it, update the name to "original_name_local".
For example, if it's require('space-cleaner'), change it to require('space-cleaner_local').
This way you can have both the local and the production package; just change the name wherever required.
Otherwise, you can remove the package by removing it from package.json and deleting it from node_modules.
If the local version is needed, go to the local package directory, run npm link in the terminal, and then in your working directory run npm install ./path/to/package.
If the production version is needed, delete the package again as described above and run npm install package_name.
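Putting those steps together as a sketch (the paths are placeholders, and space-cleaner is the example package from above):

cd /path/to/space-cleaner    # local clone, with "name" changed to "space-cleaner_local" in its package.json
npm link
cd /path/to/your-app
npm install /path/to/space-cleaner

Then require('space-cleaner_local') in your app resolves to the local copy, while require('space-cleaner') keeps resolving to the published package.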
I have a simple question that I can't seem to find the answer to. I cloned a Git repo to my local machine.
When attempting to start Node, I receive an error because I don't have the required npm dependencies installed. However, they are listed in the package.json file that was cloned.
I was wondering if there is a simple way to install the dependencies listed in that file, without having to run npm install for every individual package.
Within the directory of the package.json file, just run npm install. It will read package.json and install all dependencies. If you want to limit it to only non-dev dependencies, use npm install --only=production.
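For example, assuming the clone landed in a folder called my-app:

cd my-app                        # the directory containing package.json
npm install                      # installs everything under dependencies and devDependencies
npm install --only=production    # alternative: skip devDependencies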
I have an issue with a project where we are using Node and Brunch. The issue is currently specific to Brunch, but my guess is that it could occur for any module.
The easiest way to reproduce this currently is to do the following in a new folder:
npm init
npm install --save-dev brunch
The issue here is that brunch depends on loggy, which in turn depends on ansi-color, which no longer has an entry in the npm registry:
https://registry.npmjs.org/ansi-color
I think this might be the github project: https://github.com/loopj/commonjs-ansi-color
In any case, I am unable to proceed, and all our builds fail because they are not able to fetch the given dependency.
I could perhaps use npm shrinkwrap in some way, but that depends on the modules already existing in node_modules, which I am currently missing.
So how can I force npm to use ansi-color from a different location, or ignore the dependency?
I'm not sure about npm 2, but you can fix this with the npm 3 beta. npm 3 has a flat node_modules directory, so sub-modules can sit at the top level. Read the changelog.
The missing module can be installed directly from its GitHub repo as a top-level dependency in your project. If npm finds a module with the same version in the node_modules directory, it won't look for it in the registry anymore.
Install npm 3:
npm install -g npm@3-latest
Then install the dependencies:
# install missing module from other location
npm install https://github.com/loopj/commonjs-ansi-color.git --save-dev
npm install --save-dev brunch
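After that, the devDependencies section of package.json should end up with something like the entry below (the exact spec string is an assumption and can differ between npm versions):

"devDependencies": {
  ...
  "ansi-color": "git+https://github.com/loopj/commonjs-ansi-color.git",
  ...
}

Because the module is now present at the top level of node_modules under the name brunch's dependency chain expects, npm won't try to fetch ansi-color from the registry again.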
It looks like ansi-color is back on the npm registry ("https://registry.npmjs.org/ansi-color" is back online)
I'm debating how I should set up certain Node modules.
Let's say I have a folder called "Projects". This will hold the various Node projects that I'll create going forward.
Now I can install stuff like cucumber, lodash, mocha, etc. - stuff that I know I'll probably use across most or all of my projects:
1) npm install -g
- here, any package.json can find it on my PC I think
2) npm install [whatever] in the root of my "Projects" folder, so that I have a node_modules folder sitting at the root and any created project's package.json will be able to find those modules at the root of my Projects folder
- here, I'd have to npm install once at the root of my Projects folder, if it's not already installed globally and I didn't go with option #1
3) npm install into each project under Projects. But this doesn't seem efficient. If I have to make people install stuff like cucumber every time they clone down a project, that means when they run npm install it'll have to install cucumber again and again, for each project, which seems stupid to me if it's really a global package I plan on using across many projects
-- so here, for example, I might have several projects I create or clone: Projects\MyProject1, Projects\MyProject2, and so on. Each of those projects of course has its own package.json looking for dependencies like cucumber, mocha, etc. If I do it this way I'll have to wait for npm to install those into each project's own node_modules folder, for example Projects\MyProject1\node_modules\cucumber, Projects\MyProject2\node_modules\cucumber, and so on. That seems stupid and full of duplication...?
Suggestions on which option is best, and why, based on your experience managing Node projects?
npm install -g - here, any package.json can find it on my PC I think
This won't work because global modules cannot be picked up by require in your node scripts.
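A quick way to see this for yourself, as a sketch (the error text is roughly what Node prints, not verbatim output):

npm install -g cucumber
node -e "require('cucumber')"
# Error: Cannot find module 'cucumber'

The global copy sits outside the node_modules lookup path that require walks for your project, so it's never found.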
npm install [whatever] in the root of my "Projects" folder, so that I have a node_modules folder sitting at the root and any created project's package.json will be able to find those modules at the root of my Projects folder
This will work for sure as long as the projects in your "Projects" folder will always be there. If you publish a project then the dependencies for that project will have to go with it.
npm install into each project under Projects. But this doesn't seem efficient. If I have to make people install stuff like cucumber every time they clone down a project, that means when they run npm install it'll have to install cucumber again and again, for each project, which seems stupid to me if it's really a global package I plan on using across many projects
Why is this stupid? As long as you run npm install cucumber --save, your dependency on cucumber will be saved to your project's package.json file. All anyone who clones your project should have to do is this:
$ git clone project.git
$ cd project && npm install
npm install without any additional arguments will install all the dependencies listed in the package.json file for the project. It only has to do this once. After that all the dependencies are downloaded and installed within the node_modules directory for your project. The only time they'd need to run npm install again from the root of the project directory would be if they deleted the node_modules folder or you made a change and added a new dependency to package.json.
Installing modules in your "Projects" directory will make them available to any scripts requiring the module from within any subdirectory. Keep in mind that if I clone your repository, I won't have your "Projects" directory. I'll just have the directory for your project, wherever I cloned it to. I need to get those dependencies somehow, and the easiest way is for me to cd into the project and run npm install, where you should have a package.json file that lists all the required dependencies.
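As a sketch, after npm install cucumber --save inside MyProject1, its package.json would contain something like the following (the version range is a placeholder, not a real release):

"dependencies": {
  "cucumber": "^x.y.z"
}

Anyone who clones MyProject1 then just runs npm install in it and gets that module locally, regardless of what is installed globally or in a parent folder.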
PS - npm install [module-name] --save only saves the dependency version if you already have a package.json file in the root of your project. If you don't have one yet, then initialize one first.
$ npm init