I work on two repositories at once. One depends on the other (listed in package.json dependencies).
So I am using npm link ..\theOne in the other module to work on both at once. As a result I can test modifications to one module from the other. The problem is when doing npm shrinkwrap on this other module: it generates errors like:
npm ERR! extraneous C:\other\node_modules\theOne\node_modules\{xxxx}
{xxxx} is a dev dependency that npm reports as extraneous.
Has anyone succeeded in shrinkwrapping a module that has a symlink to another module?
NB:
npm 3.10.3
node 6.3.0
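For context, a minimal sketch of the link setup that produces this situation (the paths match the error above; your exact commands may differ):

:: run inside C:\other - creates node_modules\theOne as a symlink to the sibling folder
cd C:\other
npm link ..\theOne

:: shrinkwrapping then walks into the symlinked folder, where theOne's devDependencies
:: are installed, and npm flags them as extraneous
npm shrinkwrap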
I am posting a solution here. It doesn't explain how to shrinkwrap symlinks, but I found a better workaround for developing multiple modules at once when they depend on each other.
The solution is to use an alternative to npm link to handle the relation between modules during development: instead of having a folder A linked to another folder B (i.e. a symlink), have the modified files from B copied into folder A. This approach is very effective because it avoids the shrinkwrap errors caused by unwanted modules coming from B. How to do it:
On Mac
You can use wml: wml listens for changes in a folder (using Watchman) and copies the changed files into another folder. I have never used it myself, but my teammate on Mac uses it every day.
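From its README, the workflow is roughly the following (the paths are hypothetical; check wml's documentation for the exact commands):

npm install -g wml
wml add ~/dev/js-common ~/dev/my-app/node_modules/js-common
wml start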
On Windows
I use DSynchronize (after clicking this link, scroll down to find the executable). DSynchronize is a stand-alone utility that lets you periodically synchronize two or more folders. You can exclude some folders from the copy (like node_modules) or include others (like lib). The configuration file DSynchronize.ini can be edited with a text editor. For example:
Source0000=-C:\DEV\workspace\js-common
Destination0000=-C:\DEV\workspace\connexme\node_modules\js-common
Filter0000=
VolumeSerialOri0000=407325536
VolumeSerialDest0000=407325536
ExcludeFilter0000=0
NoSubDirectory0000=0
NoFilterDirectory0000=1
DateBeforeToExcludeFiles0000=00:00:00
FilterFolder0000=\.git|\.vscode|\node_modules
This configuration copies files from C:\DEV\workspace\js-common to C:\DEV\workspace\connexme\node_modules\js-common. This way, when I run npm shrinkwrap on my project connexme, I don't get extraneous folders coming from js-common.
Three last things about editing DSynchronize.ini:
Using or closing DSynchronize saves the current configuration to DSynchronize.ini, so be careful when editing DSynchronize.ini and then closing the program: it will overwrite your edits.
Make sure each variable name is unique, e.g. Source0000, Source0001, ...
Make sure all the variables are present for each Source/Destination pair (see the example above).
I have a project, which consists of one root node package containing subpackages linked together by npm link - these subpackages depend on each other (listed in package.json dependencies) and the structure basically looks like this:
-rootpackage
--subpackageA
--subpackageB
Let's say subpackageA has a dependency on subpackageB, so I link them to avoid publishing/reinstalling subpackageB in subpackageA after every change in the source of subpackageB.
The link works just fine until I run npm update in subpackageA, which causes subpackageB to be unlinked.
Now, I see two options:
I can theoretically run the npm link operation after each npm install or npm update to ensure the links are always present. This works with postinstall in the case of installation, but in the case of an update the postinstall is not called, and I don't know of any postupdate command for npm that would be called after an update (a rough sketch of this option follows the question).
Maybe there is a cleverer way to do this, perhaps with Yarn (which I am also using), that prevents the unlinking or excludes my subpackages from the update, so that I don't lose the links between them, but right now I am not aware of such a way.
Is there any way to make one of those options work, or any other way to solve this problem? I need to keep this and other links so we don't have to run npm link after every installation/update. I can't really find information about this issue anywhere. Btw, I am using Node 6.4.0 and NPM 3.10.3.
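For reference, a rough sketch of what option 1 could look like in subpackageA's package.json (the version and script body are guesses at the approach, not a tested setup):

{
  "name": "subpackageA",
  "dependencies": {
    "subpackageB": "^1.0.0"
  },
  "scripts": {
    "postinstall": "npm link ../subpackageB"
  }
}

As the question notes, this only covers installation; npm update does not trigger it.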
So the solution is to use Yarn Workspaces or perhaps a project like Lerna.
Yarn Workspaces is a utility that expects a structure similar to the one described in the question and maintains the links between the subpackages and the root automatically. It is very easy to set up (just two lines in the root package.json and running yarn for the first time), and after that you don't have to worry about upgrade or install at all; the links stay in place unless you delete them manually.
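The two lines in question are the private flag and the workspaces list; a minimal sketch of the root package.json, using the folder names from the question:

{
  "name": "rootpackage",
  "private": true,
  "workspaces": ["subpackageA", "subpackageB"]
}

Running yarn once at the root then installs everything and links the subpackages so that subpackageA resolves the local subpackageB.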
Lerna expands on that and provides additional tooling for managing multi-package projects. It can use Yarn Workspaces internally for the linking if you use Yarn, but that is not a requirement and it works fine with npm. Just make sure you have Git, because last time I checked Lerna didn't work with SVN or other VCSs.
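If you go the Lerna route, the minimal setup is roughly a lerna.json next to the root package.json; a sketch, assuming the subpackages sit directly under the root as in the question:

{
  "lerna": "2.0.0",
  "version": "independent",
  "packages": ["subpackageA", "subpackageB"]
}

Then lerna bootstrap installs the external dependencies and links the subpackages to each other.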
I always feel stupid asking here because people are always confused by my questions, or I have a dumb problem, but: I'm working on a program in node.js, and the text editor I'm using (NP++) doesn't seem to like saving files in the system32 directory (the directory where my modules are), which is where my script is as well. (So I have .../.../node_modules/(modules) and .../.../node_modules/script.js.) This becomes a pain when I want to edit the script: I have to clone the script to my desktop, edit it, then overwrite the one in the node_modules directory. I tried saving the script to my desktop and running it, but it just gives me a "module not found" error. (In my script I load the modules with var example = require('example.js').) Is there any way I can get it to load the modules from the node_modules directory while keeping the script file somewhere easily accessible and editable (i.e. the desktop)? (Sorry if this is confusing, I'm not the best at these kinds of things.)
I'm not 100% sure that this is what's happening because I haven't used npm on Windows, but it sounds to me like you're installing your dependencies globally using npm -g. The more proper way to use Node is to install your dependencies locally, using npm without the -g flag. That way your dependencies get installed in your current working directory.
For example, let's say you've saved your project in a directory on your Desktop, and your script uses require("lodash"). If you cd to your directory and run npm install lodash, then the lodash module will be available to your script.
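A minimal sketch of that setup (the my-project folder name is hypothetical):

cd Desktop\my-project
npm install lodash

// my-project/script.js
var _ = require("lodash");           // resolved from my-project/node_modules/lodash
console.log(_.capitalize("hello"));  // -> "Hello"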
Recently started working with Gulp and I can't figure out whether it is really necessary to have a copy of node_modules directly in the folder of the current project.
E.g. I have this structure:
mysite
└─builder
└──node_modules
└─work
└─work2
How can I access the node_modules in the folder 'builder' from the folder 'work' or 'work2' without copying it? It is quite large, about 100 MB, and it seems to me that it makes no sense to have a copy of it for every new project.
I tried the line export NODE_PATH='D:\OpenServer\domains\mysite\build' in the package.json file and then tried the gulp command, but it replied:
[10:24:27] Local gulp not found in d:\OpenServer\domains\mysite\work
[10:24:27] Try running: npm install gulp
Short answer
Don't do it. Let NPM work the way it's designed to. However, to save space, you can delete the node_modules folder on projects that are currently dormant, and recreate it with a single shot of npm install when you switch back to them.
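Concretely, for a dormant project that is just (Windows syntax shown; use rm -rf on Mac/Linux):

cd D:\OpenServer\domains\mysite\work
rd /s /q node_modules
:: ...later, when you come back to this project:
npm install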
Justification
Even if you share your node_modules, you'll probably have redundancies in it anyway. What will you do about them next?
It is the essence of NPM to replicate modules per project. If you dig into the node_modules folder tree, you may notice that it can even contain several copies of the same library within a single dependency tree. Say you requested two modules explicitly, and both these modules themselves pulled a dependency that takes care of a lot of things, and is therefore called lib_DADDYMUMMY:
node_modules
+ a_module_you_use v0.5
  + lib_DADDYMUMMY v0.1 (pulled as a dependency of this module)
+ another_module_that_you_requested v0.3
  + lib_DADDYMUMMY v0.1 (again! pulled as a dependency of this other module)
This comes in handy when your two modules start needing different versions of lib_DADDYMUMMY. This comes in handy when you maintain long-lived projects! And hell knows that in the JavaScript world, with fast-changing APIs, you can consider almost any decent project long-lived. :)
One could imagine having all dependencies shared by everyone, living in a flat structure, with several versions of a library sitting next to each other and everyone finding what they need there. That repository could be called, say, .m2. But that's just not the way NPM works, unfortunately.
NPM considers that storage space is cheap. That's its price for helping you manage versions in dependencies, dependencies of dependencies, and dependencies of dependencies of dependencies. I consider that it's an affordable price for taking care of the dirty jobs the day when work and work2, as their lives go on, take diverging maintenance paths. I wouldn't try getting in its way by forcing a half-Maven-like folder model.
Maybe you should put your package.json into your root directory (mysite/package.json), then install node_modules at the root.
In addition, put your gulpfile in the same directory, e.g.:
mysite
|- package.json
|- node_modules
|- gulpfile.js
└─builder
└─work
└─work2
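A sketch of what that root package.json could contain so that a single npm install at mysite/ makes gulp available to every project (the version is only an example):

{
  "name": "mysite",
  "private": true,
  "devDependencies": {
    "gulp": "^3.9.1"
  }
}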
However, I recommend that you write one single gulpfile for each project.
One reason you shouldn't do this is versioning. If your projects require different versions of the same package, you're going to run into problems: one version is going to win, and it might break another project.
Further, you get into the problem of having to merge the dependency lists in some way - meaning, you'll have to get the dependencies from work/package.json, work2/package.json, etc. and then install all of them at once.
Merging node_modules/ won't solve your problem, either - believe me, don't try.
Paste the node_modules folder inside your mysite directory.
All npm packages such as gulp will then work in your work or work2 directory.
But with your current folder structure, the work folders can't find node_modules, because it is not in any of their parent directories.
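This works because of how Node resolves modules: require() walks up through the parent directories looking for a node_modules folder. For example, require('gulp') from a gulpfile in work looks in roughly these places, in order:

D:\OpenServer\domains\mysite\work\node_modules
D:\OpenServer\domains\mysite\node_modules      <- found here once node_modules sits in mysite
D:\OpenServer\domains\node_modules
D:\OpenServer\node_modules
D:\node_modules

builder\node_modules is never on that list, which is why the original layout fails.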
I would like to download the source code of node module packages (listed in a package.json file in the present working directory) to a node_modules subdirectory of the present working directory, without compiling or installing those modules. Now I have seen the related question "download source from npm without npm install xxx", but that question dealt with downloading the source code of individual modules specified to NPM directly (i.e., without using a package.json file). The reason I want to do this is that I am working on developing an Atom package for the Open Build Service (OBS) of openSUSE, and this seems like one of the necessary steps to achieve that.
The source code is not shipped with the npm-distributed code. The best you could do is read each package's package.json, look for its repository field ({ "repository": { "url": ... } }) if it exists, and, if it is a git repo (which most of them will be), clone it.
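A rough sketch of that approach as a Node script, assuming npm and git are available on the PATH and that every dependency exposes a usable repository URL (which is not guaranteed):

// clone-sources.js - a sketch, not a robust tool
var execSync = require("child_process").execSync;
var deps = Object.keys(require("./package.json").dependencies || {});

deps.forEach(function (name) {
  // ask the registry for the package's repository URL
  var url = execSync("npm view " + name + " repository.url").toString().trim();
  if (url) {
    // strip the "git+" prefix npm often stores, then clone into node_modules/<name>
    execSync("git clone " + url.replace(/^git\+/, "") + " node_modules/" + name,
             { stdio: "inherit" });
  }
});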
However, be aware that the source code often requires a build step before it can be used, such as an npm prepublish step defined in the source code. In modern JavaScript projects a common example of this is transpiling ES6 code to ES5 for use in Node.js and the browser.
I have not made an Atom package but I'm fairly certain you don't need to do any of this.
I want to use gulp on my Windows machine and it actually works pretty well, until I try to use the created files (like pushing to GitHub or deleting them). Then it breaks, because the file paths are too long, and this seems to be a fairly common problem: https://github.com/joyent/node/issues/6960#issuecomment-45569604
I understand that the problem arises from npm's nested directories, which exceed the maximum path length on Windows, but as far as I understand there is no solution yet.
As I see it right now I have three options:
Try to reduce the characters in npm's directory paths by changing the default from 'node_modules' to 'n_m' and hope that the problem is postponed, as suggested here:
https://github.com/joyent/node/issues/6960#issuecomment-45569604
My question then is: how exactly do I change the default 'node_modules' directory name?
Change my development environment to Ubuntu, which is frankly a solution I dislike, because I've never used Ubuntu.
Stop using gulp altogether.
So, how do I change the default 'node_modules' directory created by npm, or what solution would you actually suggest?
There is one more tricky option.
The main problem is that gulp has a lot of nested dependencies, which creates very long nested file paths.
But if you install some of the npm modules that gulp requires in your main node_modules directory, npm will not download them as nested dependencies of gulp.
Currently you have something similar to this (it may not be the exact path you have, but the idea is the same):
\node_modules\gulp\node_modules\lodash.bind\node_modules\lodash._createwrapper...
If you add the "lodash.bind" module to your project's package.json as a dependency, it will be installed at the same level as gulp and not as gulp's nested dependency:
\node_modules\gulp
\node_modules\lodash.bind\node_modules\lodash._createwrapper
And this will shorten all the paths. You will only need to fix the one or two packages with the longest paths and it will work.
In my project it was enough to add these dependencies to package.json to fix everything: "lodash.createcallback" and "lodash.bind".
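In other words, the fix is just declaring those deeply nested packages as direct dependencies; a sketch of the relevant part of package.json (the versions here are only placeholders):

"dependencies": {
  "gulp": "^3.9.0",
  "lodash.bind": "^3.1.0",
  "lodash.createcallback": "^3.3.0"
}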
Keep in mind that before doing this you will probably need to clear the current node_modules folder. If you are not able to do that because of too-long paths, you can create a symbolic link to a temporary short file path and delete it.