I'm working on a web application (JavaScript/C#, version-controlled with TFS), and our team wants to start using Visual Studio 2015. Microsoft is moving developers toward existing popular tools like Gulp for automated tasks, so I've written a few Gulp tasks that will run on the server.
My problem is that our automated builds generate new project folders on the build server, so I can't run gulp myBuildTask without first running npm install. The npm install adds over two minutes to the build process, and it seems very inefficient to download the same dependencies for every build (since they rarely change).
Is there any way I can run a Gulp task on a new project folder without first running npm install?
Options I've considered:
1. Include node_modules in TFS. I couldn't add the node_modules folder to TFS (which would make it exist in each new build folder) because Bower's nested dependencies have file paths that are too long for Windows. I could go this route without Bower, but I'm not certain I want all those files in my solution (much of which is not needed, like readmes and test files).
2. Run npm install after each automated build. As already mentioned, I don't want to do this because it adds several minutes to the build process.
3. Install npm modules globally. I'm not sure this is even possible, but I'm wondering whether I can install all project dependencies globally on the build server (avoiding having to install at the project level). My concern with this approach is that I don't want to manually update the build server's globally installed npm modules every time we add a Gulp plugin.
Ideally, the solution would be something like #3. The modules would install globally, but every build could run an npm install that verifies every module is installed. If a new npm module were added to package.json, it would be downloaded. This npm install would be pretty fast since, in most cases, all modules would already exist (globally installed on the build server).
There are a few things you might do:
Make npm install run faster. For this purpose, use the newest npm (if possible) or use npm dedupe. Running dedupe may result in fewer dependencies than a plain npm install. Then run npm shrinkwrap, which creates an npm-shrinkwrap.json file containing 'frozen' information about what exactly gets installed (and in which version) during npm install.
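As a sketch, the sequence on a development machine might look like this (run wherever your package.json lives; commit the generated npm-shrinkwrap.json afterwards):
npm install      # install dependencies as usual
npm dedupe       # flatten duplicated dependencies where possible
npm shrinkwrap   # write npm-shrinkwrap.json with exact pinned versions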
Remember, node_modules is just a directory. If you can copy or rsync it to your installation, you can skip the npm install phase altogether.
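For instance, a build step might restore a previously populated node_modules into the fresh build folder before the Gulp task runs (the cache paths here are hypothetical). On Windows:
robocopy C:\build-cache\node_modules node_modules /E /NFL /NDL
or on a Unix-like agent:
rsync -a /var/cache/node_modules/ node_modules/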
Node's package resolution approach is to first try the local node_modules directory and, if that is not successful (node_modules is not there, or the dependency is missing from it), to check the node_modules of the parent directory, then the grandparent directory, and so on. This means you don't have to install packages globally; a semi-global installation is quite sufficient:
my_project/
    node_modules/
        dependency1
        dependency2
    build_001/    <- no node_modules here
    build_002/    <- no deps here either
    build_00x/
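Concretely, the semi-global setup above could be used like this (a sketch; myBuildTask is the task from the question):
cd my_project
npm install        # populates my_project/node_modules once
cd build_001
gulp myBuildTask   # Node resolves the Gulp plugins from ../node_modules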
Note, however, that this naturally works only if your dependencies are really not changing. Since in real life you install something new from time to time, a slightly enhanced approach might be helpful: organize your directories as follows:
my_project/
    ver_af729b/
        node_modules/
        build_001/
        build_002/
    ver_82b5f3/
        node_modules/
        build_003/
        build_004/
Here af729b and 82b5f3 are (prefixes of) SHA hashes of your npm-shrinkwrap.json file. If you then add a new dependency, the shrinkwrap file gets updated, the build script creates a new ver_something directory, and npm install is executed in it. Doing all this naturally requires extra work, but it should work great.
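A minimal sketch of that build step, assuming a Unix-like shell on the build agent (directory names as above):
HASH=$(sha1sum npm-shrinkwrap.json | cut -c1-6)
if [ ! -d "ver_$HASH/node_modules" ]; then
    mkdir -p "ver_$HASH"
    cp package.json npm-shrinkwrap.json "ver_$HASH/"
    (cd "ver_$HASH" && npm install)
fi
# create the new build folder under ver_$HASH so it resolves from that node_modules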
------------------ EDIT -------------------
If you are not trying to avoid npm install completely (you just want it to be quick), you can stick to the typical scenario: you always check out the sources into the same directory and let npm install reuse the old node_modules as much as possible.
If you always want to create a new directory for your build, you may still create a node_modules symlink to the older version of node_modules; in this scenario too, npm will reuse as much as possible from the symlinked folder.
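For example (the old build path is hypothetical):
ln -s ../build_001/node_modules node_modules        (Linux/macOS)
mklink /D node_modules ..\build_001\node_modules    (Windows, elevated prompt)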
Related
I am a total JavaScript newbie aiming to configure my Mac nicely for development.
NPM is installed.
I notice that a node_modules folder exists in my Users/MyName directory.
I think this is a result of having either installed Node/NPM or specifically run npm install airtable the other day, which I did at the time in Users/MyName.
When I npm uninstall airtable, it removes airtable and its dependency folders from node_modules, leaving the following: @types and package-lock.json (hidden).
If I cd to a new project-specific directory, Users/MyName/Projects/Code/myusername/airtable-test, and run npm install airtable from there, I expected the packages to get installed in that folder. However, again, they get installed up at Users/MyName/node_modules.
In both cases, .package-lock.json (non-hidden) and package.json are in Users/MyName, which seems messy to me. (I haven't done anything non-standard in the install.)
Is this the way things should be?
Attempts to solve:
I seem to read, including from questions on Stack Overflow, that storing modules at Users/MyName/node_modules is effectively storing them globally, accessible to any app, so that projects don't have to be committed to a server with all dependencies in tow; the idea being that, after you deploy your app, you then run npm install while in its folder, prompting it to install all dependencies.
Is this right? Should I be looking at storing all dependency modules in a project folder, or above and outside of it?
(If the answer to this question is opinion-based, I wasn't aware of that).
Here is what I believe is happening. You have your package.json in the folder Users/MyName, and you are running npm install in Users/MyName/Projects/Code/myusername/airtable-test. The problem is that you do not have a package.json file in Users/MyName/Projects/Code/myusername/airtable-test, so npm goes up the directory tree to find one, finds it in Users/MyName, and installs the package there.
This happens because the way npm identifies a project is by looking for package.json. If it does not find one in the current directory, it assumes that you must be inside some subdirectory of the project and starts searching upward in the folder hierarchy for the package.json.
Solution
Run npm init in the folder Users/MyName/Projects/Code/myusername/airtable-test. This will initialize the folder as an npm package (by creating package.json).
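A sketch of that fix:
cd ~/Projects/Code/myusername/airtable-test
npm init -y            # creates package.json here, marking this folder as the project root
npm install airtable   # now installs into ./node_modules in this folder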
I am learning Angular, and each time it takes about 5 minutes to create a new project because of the 100 MB node_modules folder the CLI creates. And the files in this folder are always the same (unless you add some dependencies, which I never do). Is there a way to use one node_modules folder for every project?
Have a look into https://yarnpkg.com/blog/2017/08/02/introducing-workspaces/
Yarn Workspaces is a feature that allows users to install dependencies from multiple package.json files in subfolders of a single root package.json file, all in one go.
Making Workspaces native to Yarn enables faster, lighter installation by preventing package duplication across Workspaces. Yarn can also create symlinks between Workspaces that depend on each other, and will ensure the consistency and correctness of all directories.
npm install -g yarn
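As a rough sketch, a workspaces layout might look like this (the folder names are hypothetical):
# root package.json:
#   { "private": true, "workspaces": ["packages/*"] }
# packages/app-a/ and packages/app-b/ each contain an ordinary package.json
yarn install   # installs everything into a single node_modules at the root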
You can install all dependencies globally or create a symlink from one place into every project.
BUT this is bad practice; the correct way is to use a separate node_modules for each project, even if you are using the same packages. Sooner or later you will need different versions of the same package in different projects, and a shared node_modules will cause a lot of headaches.
Try using the npm cache and npm install --prefer-offline if you just want to install packages faster and don't care too much about exact versions. I haven't used it myself, but I believe it should work.
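For example (--prefer-offline is available in npm 5 and later):
npm install --prefer-offline         # use the local package cache whenever possible
npm config set prefer-offline true   # or make that behavior the default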
Only packages installed by npm (npm install) should be in the node_modules folder. This is because, if someone wants to install your project, then instead of downloading the whole project with node_modules included, they just type npm install.
Based on the package.json, the dependencies will then be downloaded into the node_modules folder.
So Angular can go in the node_modules folder, since it is an npm package, but you should not put your own files in this folder.
So you can just copy your package.json to every project and run npm install. Then all the node_modules folders will be the same.
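A sketch of that workflow (the paths are hypothetical):
cp first-project/package.json new-project/
cd new-project
npm install   # recreates an equivalent node_modules from the copied manifest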
Is there any way to route npm install to a specific part of the hard drive, so that when I do npm install it makes the node_modules folder in that part of the drive, and when I run any project it looks for dependencies in that location, just like a single pool for every project?
Then, if I have two projects with similar dependencies, I would only need to run npm install in one project; the dependencies would become available in the pool, and there would be no need to do npm install in the other project, just npm start.
Thank you,
Inzamam Malik
You can achieve something close to what you are describing with the link option.
From https://docs.npmjs.com/misc/config#link:
If true, then local installs will link if there is a suitable globally installed package.
Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions are met:
The package is not already installed globally, or
the globally installed version is identical to the version that is being installed locally.
So you will still have some files in each project's node_modules, but you shouldn't have as large a folder.
To turn this behavior on, run:
npm config set link -g
Edit: There is no way you can avoid running npm install and having a node_modules folder. Node.js always looks in node_modules for dependencies (this behavior pre-dates npm itself). The link option will make npm create symlinks in node_modules pointing to a common pool, which reduces disk usage, but you cannot do away with node_modules.
You can also use the pnpm package manager, which uses a global store for dependencies.
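A quick sketch of trying it out:
npm install -g pnpm
pnpm install   # links dependencies into node_modules from pnpm's single shared store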
I'm debating how I should setup certain node modules.
Let's say I have a folder called "Projects". This will hold the various Node code projects I'll create going forward.
Now I can install stuff like cucumber, lodash, mocha, etc., stuff that I know I'll probably use across most of my projects:
1) npm install -g. Here, any package.json can find it on my PC, I think.
2) npm install [whatever] in the root of my "Projects" folder, so that I have a node_modules folder sitting at the root; any project's package.json will be able to find those modules at the root of my Projects folder. Here, I'd have to npm install once at the root of my Projects folder if it's not already installed globally and I didn't go with option #1.
3) npm install into each project under Projects. But this seems inefficient. If I make people install stuff like cucumber every time they clone down a project, then when they run npm install it'll have to install cucumber again and again, for each project, which seems stupid for a package I plan on using across many projects. So here, for example, I might have several projects I create or clone: Projects\MyProject1, Projects\MyProject2, and so on. Each of those projects has its own package.json looking for dependencies like cucumber, mocha, etc. If I do it this way, I'll have to wait for npm to install those into each project's own node_modules folder, e.g. Projects\MyProject1\node_modules\cucumber, Projects\MyProject2\node_modules\cucumber, and so on. That seems like needless duplication...?
Suggestions on which option is best, and why, based on your experience managing Node projects?
npm install -g. Here, any package.json can find it on my PC, I think.
This won't work because global modules cannot be picked up by require in your node scripts.
npm install [whatever] in the root of my "Projects" folder, so that I have a node_modules folder sitting at the root; any project's package.json will be able to find those modules at the root of my Projects folder.
This will work for sure as long as the projects in your "Projects" folder will always be there. If you publish a project then the dependencies for that project will have to go with it.
npm install into each project under Projects. But this seems inefficient. If I make people install stuff like cucumber every time they clone down a project, then when they run npm install it'll have to install cucumber again and again, for each project, which seems stupid for a package I plan on using across many projects.
Why is this stupid? As long as you do npm install cucumber --save, your dependency on cucumber will be saved to your project's package.json file. All anyone who clones your project should have to do is this:
$ git clone project.git
$ cd project && npm install
npm install without any additional arguments will install all the dependencies listed in the package.json file for the project. It only has to do this once; after that, all the dependencies are downloaded and installed within the node_modules directory for your project. The only time they'd need to run npm install again from the root of the project directory would be if they deleted the node_modules folder or you made a change and added a new dependency to package.json.
Installing modules in your "Projects" directory will make them available to any scripts requiring the module from within any subdirectory. Keep in mind that if I were to clone your repository, I won't have your "Projects" directory; I'll just have the directory for your project, wherever I cloned it to. I need to get those dependencies somehow, and the easiest way is for me to cd into the project and run npm install, where you should have a package.json file that lists all the required dependencies.
PS: npm install [module-name] --save only saves the dependency version if you already have a package.json file in the root of your project. If you don't have one yet, initialize one first:
$ npm init
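For example (cucumber here is just the package discussed above):
$ npm init -y
$ npm install cucumber --save   # records cucumber under "dependencies" in package.json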
If I want to copy a Node project: does it make any difference if I just copy node_modules or install all the modules again from scratch via npm?
2017-05-12
I've updated this answer to reflect changes since the release of npm 3.x and new tools that are available.
npm v3 dependency installation is now non-deterministic, meaning you may get different packages depending on the order in which packages have been installed over time. This isn't necessarily a bad thing, just something to be aware of.
Given this change I personally don't copy my node_modules directory around too much (it's still possible though!) and instead opt for a clean install most of the time.
There are new tools like Yarn Package Manager which can speed up the installation process if you are doing that a lot (but as of 2017-05-12 it's unclear how well it handles private npm organisations and private scoped packages).
So the takeaway is still pretty much the same: it won't hurt, but maybe err on the side of a clean install. If something weird does happen and you run into problems then you can just delete node_modules and run npm install.
Original answer from 2014-06-08:
In general it should be fine - I copy the node_modules directory sometimes from my other projects to speed up the setup process.
You can always copy node_modules and then run npm install or npm update in the new project to make sure you've got up-to-date versions. npm will use the files in node_modules as a cache and should only bring down newer content if required.
In short: it won't hurt. If something weird does happen and you run into problems then you can just delete node_modules and run npm install.
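As a sketch (the source path is hypothetical):
cp -R ../existing-project/node_modules .
npm install   # checks the copied tree against package.json and fills in anything missing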
Linux command
To copy a project without the node_modules directory:
In the directory of your projects run this:
rsync -av --progress --exclude="node_modules" <source directory> <target directory>
for example:
rsync -av --progress --exclude="node_modules" . /mnt/d/omer/new-directory