I have a project that consists of one root Node package containing subpackages linked together by npm link. These subpackages depend on each other (listed in package.json dependencies), and the structure basically looks like this:
-rootpackage
--subpackageA
--subpackageB
Let's say subpackageA depends on subpackageB, so I link them to avoid publishing/reinstalling subpackageB in subpackageA after every change to the source of subpackageB.
The link works just fine until I run npm update in subpackageA, which causes subpackageB to be unlinked.
Now, I see two options:
I can theoretically run the npm link operation after each npm install or npm update to ensure the links are always present. A postinstall script covers the installation case, but postinstall is not called after an update, and I don't know of any postupdate hook in npm that runs after an update.
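For reference, the postinstall variant would look something like this in subpackageA's package.json (a sketch; it assumes npm link has already been run once inside subpackageB):
"scripts" : {
"postinstall" : "npm link subpackageB"
}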
Maybe there is a cleverer way to do this, perhaps with Yarn (which I am also using), that either prevents the unlinking or excludes my subpackages from the update, so I don't lose the links between my subpackages, but right now I am not aware of such a way.
Is there any way to make one of those options work, or any other way to solve this problem? I need to keep this and other links so we don't have to run npm link after every installation/update. I can't really find information about this issue anywhere. Btw I am using Node 6.4.0 and npm 3.10.3.
So the solution is to use Yarn Workspaces, or maybe a project like Lerna.
Yarn Workspaces is a utility that expects a structure similar to the one described in the question and maintains the links between subpackages and the root automatically. It is very easy to set up (just two lines in the root package.json and running yarn once), and after that you don't have to worry about update or install at all; the links stay in place unless you delete them manually.
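Those two lines look roughly like this in the root package.json (a sketch using the folder names from the question):
{
"private" : true,
"workspaces" : ["subpackageA", "subpackageB"]
}
The private flag is required because a workspace root is not meant to be published.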
Lerna expands on that and provides additional tooling for managing multi-package projects. It can use Yarn Workspaces internally for the linking if you use Yarn, but that is not a requirement, and it works fine with npm. Just make sure you have Git, because last time I checked Lerna didn't work with SVN or other VCSs.
I'm working on a JavaScript project, and as it happens, one of my dependencies pulls in puppeteer, which in turn downloads a whole copy of Chromium into my node_modules. My larger project is split into multiple JavaScript packages, so I end up with multiple identical copies of Chromium, among other things.
Is there a way to deduplicate these packages system-wide? Note: npm dedupe seems to do something completely different from what I want.
I imagine a module repository in my home directory that contains every package I need (in every version needed); the local node_modules directories would then contain only symlinks into that repository. This seems like an incredibly obvious optimisation, but I can't find any way to do it in npm. If not in npm, is it maybe possible in yarn?
As an added complication, this should also work on Windows (where symbolic link support has historically been not so good).
It seems the following command does what I want:
npm config set link -g
Then delete node_modules and run npm install again. It should be much smaller now.
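Put together, the whole sequence is (the delete step assumes a Unix-like shell; on Windows remove the folder by other means):
npm config set link -g
rm -rf node_modules
npm install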
The documentation says:
If true, then local installs will link if there is a suitable globally installed package.
Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions are met:
The package is not already installed globally, or
the globally installed version is identical to the version that is being installed locally.
I am not sure if this has any negative side-effects - for example clobbering the global namespace with commands I don't want. For now, it seems to work fine.
I want to try and make some changes to a package published on npm. (I've suggested some changes as an issue, but I think they are simple enough for me to attempt them.)
https://www.npmjs.com/package/bt-presence#contributing--modifying
The author supplies some information on how to modify the package, but not really enough for someone doing it for the first time.
Where should I clone the GitHub repo to? The folder where the package is installed? I tried it in my home folder and it would not build (even unmodified).
The command npm run build - where is this run from? The root folder of the package where the package.json is?
Will I need to modify the package.json?
In general, what is the best way to develop something like this for npm? I've worked on packages before, but they were plain JavaScript.
If you want to work on the bt-presence package in isolation, you can put the cloned repository anywhere. If you want to use your modified version of bt-presence in combination with an application, my recommended approach is to register bt-presence as a dependency in the application's package.json file with the version set to a relative path to your bt-presence repository; then running npm install in the application will make a symlink from node_modules/bt-presence in the application to your bt-presence repository.
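Such a relative-path dependency would look something like this in the application's package.json (a sketch; the path assumes the bt-presence clone sits next to the application folder):
"dependencies" : {
"bt-presence" : "file:../bt-presence"
}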
npm run build should indeed be run from the root folder that contains the package.json of bt-presence.
If you just want to change the code of bt-presence, you won't need to modify its package.json. You would only modify the package.json if you need to change any of the settings in there, e.g., if you need to add additional dependencies to your version of bt-presence.
None of the above is really specific to TypeScript. (Some JavaScript packages have build processes too if they need to transform or package the JavaScript files in some way.)
I'm trying to write an npm package that will be published and used as a framework in other projects. The problem is -- I can't figure out a solid workflow for working on it at the same time as working on projects that depend on it.
I know this seems super basic and that npm link solves the issue, but this is a bigger one than just being able to import one local package from another.
I have my framework package scaffolded out; let's call it gumby. It exports a function that does console.log('hello from gumby'). That's all that matters for right now.
Now I'm ready to create a project that will use gumby. Let's call this one client. I set that up too and npm link gumby so client can import from it, etc. OK cool, it's working as expected.
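For reference, that linking step is roughly (a sketch assuming gumby and client are sibling folders):
cd gumby
npm link            # register the local gumby as a global link target
cd ../client
npm link gumby      # symlink node_modules/gumby to the local copy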
So now it's time to publish gumby. I run npm publish and it goes out to npm as version 0.0.1.
At this point, how do I get the published, npm-hosted version of gumby into the package.json for client? I mean, I could just delete the symlinked copy from my node_modules and then yarn add gumby, but what if I want to go back and work on it locally again? And then run it against the npm version again? And then work on it some more? And then...
You get the point, I imagine. There's no obvious way to switch between the npm copy of a package that you're working on and the local one. There's the additional problem of how to do that without messing with your package.json too much, e.g. what if I accidentally commit it to version control with some weird file:// dependency path. Any suggestions would be much appreciated.
For local development, having the package symlinked is definitely the way to go; the idea of constantly publishing / re-installing the package sounds like a total pain.
The real issue sounds more like you're concerned about committing a dev configuration to prod - you could address that with something as simple as a pre-commit hook in your VCS, e.g. one that blocks the commit if it detects any local file references in the package.json.
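A minimal sketch of such a hook for Git, saved as .git/hooks/pre-commit and made executable (the file:/link: patterns are an assumption about how local paths appear in package.json):
#!/bin/sh
# Block the commit if package.json still references local paths.
if grep -E '"(file|link):' package.json >/dev/null 2>&1; then
  echo "package.json contains local file/link dependencies; aborting commit." >&2
  exit 1
fi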
I know that doing something like this in package.json:
....
...
"dependencies" : {
"some-node-module" : "*"
}
is a bad idea, since you're basically telling node to always update this module to its latest version, even though your code might not be able to handle any version other than the current one for this particular module.
So I should instead do something like this:
....
...
"dependencies" : {
"some-node-module" : "3.4.1"
}
Which basically tells node to always use the version of the module that my code was built around.
Question
I have an app which I first tested locally. The app has now been built, and using the package.json dependencies, npm has installed all of the appropriate node modules locally under my app's root folder (as opposed to globally, in some obscure folder I don't have immediate access to and which is irrelevant to this app - I simply don't like global installations of node modules; I find them too... "abstract").
Given that all of the node modules are now installed locally, isn't the dependencies part of my package.json now redundant?
I mean, what if something happens and npm is not available or the specific version of a module can't be found?
Isn't it best to be independent of dynamic node module installations and just have everything installed locally the first time, without having to use the package.json dependencies?
npm install & update
"you're basically telling node to always update this module to its latest version"
Packages won't be automatically updated. The only time the "*" will be an issue is when you are installing the project for the first time via npm install or when you manually run an update via npm update.
I personally prefer to pick a specific version of a module rather than use any wildcards, but even then there are some gotchas...which is why npm shrinkwrap exists.
npm shrinkwrap
Next gotcha:
basically tells node to always use the version of the module that my code was built around
Sorta true. Let's say you use version 1.2.3 of your favorite module and package.json reflects that, but inside that module is its own package.json with a dependency on another module that uses "*"...so when you install, that internal dependency and its wildcard can wind up breaking the module you thought was 'locked down'.
See the gotcha? Hard-coding a version pins the top-level version but does not enforce anything beneath it...and if a module author you depend upon (or a module they depend upon) uses wildcards, you can't be 100% sure things will be copacetic.
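To illustrate with hypothetical module names:
Your package.json:
"dependencies" : {
"favorite-module" : "1.2.3"
}
favorite-module's own package.json:
"dependencies" : {
"some-util" : "*"
}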
To strictly enforce a version, you'll want to use npm shrinkwrap. (The link there to the docs provides more background, which is good to understand if your project uses more than a few very simple modules.)
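Using it is a single command, run in the project root after an install (sketch):
npm shrinkwrap    # writes npm-shrinkwrap.json, pinning the entire installed tree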
And now...your question.
You say:
I mean, what if something happens and npm is not available or the specific version of a module can't be found?
Based on the first two parts of this answer, it should now be clear that it doesn't hurt to have the dependencies explicitly listed in package.json, because node isn't checking things every time the app runs. npm uses package.json only when specific actions (install, update, etc.) are called, and even then it is a manual trigger.
While situations vary, there are very few that I can imagine where omitting dependencies in package.json is a good idea. If you ever wind up having to rebuild the project, you'll be in trouble. If the project is so good you want to share it, you'll be in trouble. Heck, if this is something for work and you want to go on vacation and need to install it on another machine...you'll be in trouble.
So given the fact that after the initial install, dependencies have no negative impact...use --save or add the dependencies to your package.json. Your future self will thank you. :)
npm allows us to specify bundledDependencies, but what are the advantages of doing so? I guess if we want to make absolutely sure we get the right version even if the module we reference gets deleted, or perhaps there is a speed benefit with bundling?
Anyone know the advantages of bundledDependencies over normal dependencies?
For the quick reader: this Q&A is about the package.json bundledDependencies field, not about a package of the same name.
What bundledDependencies do
"bundledDependencies" are exactly what their name implies. Dependencies that should be inside your project. So the functionality is basically the same as normal dependencies. They will also be packed when running npm pack.
When to use them
Normal dependencies are usually installed from the npm registry.
Thus bundled dependencies are useful when:
you want to re-use a third party library that doesn't come from the npm registry or that was modified
you want to re-use your own projects as modules
you want to distribute some files with your module
This way, you don't have to create (and maintain) your own npm repository, but get the same benefits that you get from npm packages.
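In package.json the field is just an array of package names, which should also be listed under dependencies (the module name here is hypothetical):
"dependencies" : {
"my-internal-lib" : "1.0.0"
},
"bundledDependencies" : ["my-internal-lib"]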
When not to use bundled dependencies
When developing, I don't think that the main point is to prevent accidental updates. We have better tools for that, namely code repositories (git, mercurial, svn...) or, nowadays, lock files.
To pin your package versions, you can use:
Option 1: Use the newer npm version 5 that comes with Node 8. It uses a package-lock.json file (see the Node blog and the Node 8 release).
Option 2: Use yarn instead of npm.
It is a package manager from Facebook, faster than npm, and it uses a yarn.lock file. Otherwise it uses the same package.json.
This is comparable to lockfiles in other package managers like Bundler or Cargo. It's similar to npm's npm-shrinkwrap.json, however it's not lossy and it creates reproducible results.
npm actually copied that feature from yarn, amongst other things.
Option 3: This was the previously recommended approach, which I do not recommend anymore. The idea was to use npm shrinkwrap most of the time, and sometimes put the whole thing, including the node_modules folder, into your code repository. Or possibly use shrinkpack. The best practices at the time were discussed on the Node.js blog and on the Joyent developer websites.
See also
This is a bit outside the scope of the question, but I'd like to mention the last kind of dependencies (that I know of): peer dependencies. Also see this related SO question and possibly the docs of yarn on bundledDependencies.
One of the biggest problems right now with Node is how fast it is changing. This means that production systems can be very fragile and an npm update can easily break things.
Using bundledDependencies is a way to get round this issue by ensuring, as you correctly surmise, that you will always deliver the correct dependencies no matter what else may be changing.
You can also use this to bundle up your own private packages and deliver them with the install.
Another advantage is that you can put your internal dependencies (application components) there and then just require them in your app as if they were independent modules, instead of cluttering your lib/ and publishing them to npm.
If/when they are matured to the point they could live as separate modules, you can put them on npm easily, without modifying your code.
I'm surprised I didn't see this here already, but when carefully selected, bundledDependencies can be used to produce a distributable package from npm pack that will run on a system where npm is not configured. This is helpful if you have e.g. a system that's not networked / not on the internet: bring your package over on a thumb drive (or whatever) and unpack the tarball, then npm run or node index.js and it Just Works.
Maybe there's a better way to bundle up your application to run "offline", but if there is I haven't found it.
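A rough sketch of that offline flow (the package and entry-point names are hypothetical):
npm pack                      # on the networked machine; bundled deps land inside the tarball
# copy my-package-1.0.0.tgz over, then on the offline machine:
tar -xzf my-package-1.0.0.tgz
cd package                    # npm pack tarballs always unpack into "package"
node index.js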
Operationally, I look at bundledDependencies as a module's private module store, where dependencies is more public, resolved among your module and its dependencies (and sub-dependencies). Your module may rely on an older version of, say, react, but a dependency requires latest-and-greatest. Your package/install will result in your pinned version in node_modules/$yourmodule/node_modules/react, while your dependency will get their version in node_modules/react (or node_modules/$dependency/node_modules/react if they're so inclined).
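The resulting layout looks roughly like this (using react as the shared dependency, as above):
node_modules/
  react/                  <- the version resolved for your dependency
  $yourmodule/
    node_modules/
      react/              <- your pinned, bundled version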
A caveat: I recently ran into a dependency that did not properly configure its dependency on react, and having react in bundledDependencies caused that dependent module to fail at runtime.