npm link seems cool, but what are the differences between npm link and requiring the module by giving its path? Could you please elaborate on the advantages of each?
When you use npm link you can require it like:
var foo = require("foo");
but if you use the path, you require it like:
var foo = require("./lib/foo");
Thanks
npm link is useful if you are developing a node module that depends on another standalone node module you are also developing simultaneously (which you may then publish to npm when it is ready / releasable). With this setup you will always get the freshest version of the other module without needing to push releases to npm.
It is better than using relative dependencies because relative paths can differ per developer, whereas npm link makes the module resolve exactly as if it had been installed from npm (i.e. located in the node_modules folder).
Conclusion: I usually use relative paths inside a module itself to require its other files, and npm link to specify dependencies between standalone modules being developed simultaneously.
Related
My code uses core Node modules such as fs and path. Is there any reason to include them in package.json (npm i fs path)? The README for the npm path package says "This is an exact copy of the NodeJS 'path' module published to the NPM registry." Why do these packages have around a million downloads a week?
Is there any reason to include them in package.json (npm i fs path)?
No. These packages have been bundled as part of every major Node release since its inception. They do not need to be installed separately or included in your package.json file.
Why do these packages have around a million downloads a week?
I suppose you could really only speculate, but it's likely that a nonzero number of newcomers aren't aware these modules are part of Node's core and are running npm install fs, etc., following the same pattern as the documentation or tutorials they're reviewing.
Publishing them is also prudent: it not only ensures that these modules still work as intended if someone does include them, it also prevents unscrupulous actors from squatting on the names and enabling dependency-confusion vulnerabilities. The npm page for the fs module even states explicitly why it has been published (emphasis mine):
This package name is not currently in use, but was formerly occupied by another package. To avoid malicious use, npm is hanging on to the package name, but loosely, and we'll probably give it to you if you want it.
As mentioned in the similar question, you don't need to install them, so you don't need to specify them in package.json. fs and path are Node.js core modules.
I want to create an npm package that would be installed globally, with a binary to call. Calling that binary would set up the initial files for a new project: your standard folders, a standard license and layout, and a package.json with your common dependencies. Then you could call npm install to actually set up the project.
Like the Express application generator, or like rails new does in Ruby. The usage would be something like:
mkdir new_project
cd new_project
myCoolGenerator new
npm install
But I'm confused about how I'd implement this. The simple approach, which I'm doing now, is to create the standard vanilla folder, ship it with the generator package, and then just have the main binary use ncp to copy that folder into wherever the caller currently is.
My problem is that I don't know how to access the folder included in the globally installed package in order to copy it over. Additionally, npm lets you specify a files array in your package.json to control which files are included with the package, but it is apparently hardwired to ignore package.json in that list.
What's the recommended procedure for this sort of thing?
I have been searching every node_modules folder in my project but I cannot find where the http module is located.
I'm using Windows and cannot find it in this path either: C:\Users\userx\AppData\Roaming\npm\node_modules.
Is there a way or a command in npm to know where a particular module is located given the current path?
In this particular case http is built-in so it's not part of node_modules/.
If it was a non-core library that you'd installed you'd find it somewhere in there, though it could be a sub-dependency so you may have to dig a little.
To see everything in your project, including dependencies:
npm list
That structure strongly mirrors how it's organised in the various subdirectories.
You can use npm list to see the installed libraries for your current location or npm list -g to see where global libraries are installed.
1) As @tadman said, http is a built-in module.
2) To find the location of a module that is not built in, try the function require.resolve(). For example:
console.log( require.resolve('express') );
Is it possible to force an external npm dependency to use a different node.js package that offers the same API but a different implementation?
If you're willing to do that and the module is open source, you could fork it on GitHub, change its package.json to include the module you want, and use the GitHub URL in your own package.json like this:
"modulename": "git+https://git#github.com/user/repo.git"
You should be able to download the source of whatever module you would prefer and put that folder within your node_modules folder. From that point you simply require it within your Node.js app like any other NPM module.
I recommend downloading the code for the API you want, creating a src/assets folder and placing it in there, changing the package name in its package.json to something not used on npm, then using require('newPackageName') within your code.
If you use some of package.json's capabilities to point to a specific version (like "1.4.7" as opposed to "^1.4.7"), or if you point to a GitHub address, be careful when you run npm update: it will replace your URL with the latest version on npmjs.org under that specific name. I don't know if newer versions of npm still do this, but in the version that works with Node.js 0.12, this is the default behavior.
I can tell you that npm shrinkwrap will work, but it will prevent any other packages from being updated as well. No, you cannot shrinkwrap just one dependency; it has to be all of them, or npm update won't work.
I know that doing something like this in package.json :
....
...
"dependencies" : {
"some-node-module" : "*"
}
is a bad idea since you're basically telling node to always update this module to its latest version, even though your code might not be able to handle any other version other than the current one for this particular module.
So I should instead do something like this :
....
...
"dependencies" : {
"some-node-module" : "3.4.1"
}
Which basically tells node to always use the version of the module that my code was built around.
Question
I have an app which I first tested locally. The app has now been built, and using the package.json dependencies, npm has installed all of the appropriate node modules locally under my app's root folder (as opposed to globally, in some obscure folder I don't have immediate access to and which is irrelevant to this app; I simply don't like global installations of node modules and find them too "abstract").
Given that all of the node modules are now installed locally, isn't the dependencies section in my package.json now redundant?
I mean, what if something happens and npm is not available or the specific version of a module can't be found?
Isn't it best to be independent of dynamic node module installations and just have everything installed locally the first time without having to use the package.json dependencies ?
npm install & update
"you're basically telling node to always update this module to its latest version"
Packages won't be automatically updated. The only time the "*" will be an issue is when you are installing the project for the first time via npm install or when you manually run an update via npm update.
I personally prefer to pick a specific version of a module rather than use any wildcards, but even then there are some gotchas...which is why npm shrinkwrap exists.
npm shrinkwrap
Next gotcha:
basically tells node to always use the version of the module that my
code was built around
Sorta true. Let's say you use version 1.2.3 of your favorite module and package.json reflects that, but inside that module's own package.json there is a dependency on another module declared as "*"... so when you install, that internal dependency's wildcard can wind up breaking the module you thought was 'locked down'.
See the gotcha? Hard-coding a version pins the top-level dependency but does not enforce anything beneath it... and if a module author you depend upon (or a module they depend upon) uses wildcards, you can't be 100% sure things will be copacetic.
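Concretely, the gotcha looks like this (module names invented for illustration):

```javascript
// Your app pins an exact version, so it looks locked down:
const appPackageJson = {
  dependencies: {
    "favorite-module": "1.2.3"
  }
};

// ...but favorite-module's own package.json declares a wildcard:
const favoriteModulePackageJson = {
  dependencies: {
    "some-sub-dep": "*"  // whatever version npm finds at install time
  }
};

// Two installs of the same app, weeks apart, can therefore pull different
// versions of some-sub-dep. That is the hole npm shrinkwrap closes.
console.log(appPackageJson.dependencies["favorite-module"]); // "1.2.3"
```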
To strictly enforce a version, you'll want to use npm shrinkwrap. (The link there to the docs provides more background, which is good to understand if your project uses more than a few very simple modules.)
And now...your question.
You say:
I mean, what if something happens and npm is not available or the
specific version of a module can't be found?
Based on the first two parts of this answer, it should now be clear that it doesn't hurt to have the dependencies explicitly listed in package.json, because node isn't checking them every time the app runs. npm only consults package.json when specific actions (install, update, etc.) are invoked, and even then it is a manual trigger.
While situations vary, there are very few that I can imagine where omitting dependencies in package.json is a good idea. If you ever wind up having to rebuild the project, you'll be in trouble. If the project is so good you want to share it, you'll be in trouble. Heck, if this is something for work and you want to go on vacation and need to install it on another machine...you'll be in trouble.
So given the fact that after the initial install, dependencies have no negative impact...use --save or add the dependencies to your package.json. Your future self will thank you. :)