I'm not a regular node user, so my apologies if this is a stupid newbie question, but I haven't been able to find any clear documentation on this, and my feeble newbie node skills don't let me dig into it myself.
I'm following along with these instructions for installing the Ghost blogging system (a system built with NodeJS).
After telling me to open a terminal window in the just-downloaded package folder, the instructions include the following line:
In the new terminal tab type npm install --production
This confuses me. My understanding of npm is that it's a package manager that, like Perl's CPAN:
Fetches packages from The Internet
Installs them into my local node system
That's clearly not what's happening above, but I don't know what is happening when I run that command, and since I don't run with a NodeJS crowd I don't know who to ask.
I'd like to know what NPM is doing. Specific questions:
When I run npm install, it looks like it's downloading a number of packages (lots of npm http GET in the console). How does NPM know what to download?
Where is it downloading these module files to? How does npm know where to download the files?
What effect does the --production flag have on NPM's behavior?
Happy to have specific answers, or a meta-answer that points out where I can learn how npm works with (what appears to be) an application install (vs. a system install, which is how I normally think of it).
npm has a few different installation modes. From within a module (a folder with a package.json file), npm install installs the dependencies listed in the dependencies and devDependencies fields of the package.json file. Installation means that the module files are downloaded and placed in the node_modules folder, and then npm installs each of those modules in turn (but only their dependencies), placing their modules in their own node_modules folders. This continues until everything needed is installed. Use npm ls to see the tree of installed packages.
Most of the time this is what you want, because running npm install from within a module is what you would do when developing on it, and you'll want to run tests etc. (which is what devDependencies is for).
Occasionally though, you'll be coding a service that consumes modules but should not necessarily be treated like one (it isn't intended to be require'd). Ghost is such a case. In these cases, you need npm install --production, which installs only the dependencies and leaves out the devDependencies.
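As an illustration (the package names and versions below are placeholders, not Ghost's actual dependencies), suppose the package.json contains:

{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.17.0"
  },
  "devDependencies": {
    "mocha": "^10.0.0"
  }
}

Running npm install in that folder installs both express and mocha into node_modules, while npm install --production installs only express.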
When I run npm install, it looks like it's downloading a number of packages (lots of npm http GET in the console). How does NPM know what to download?
It reads the package.json configuration file in the current directory.
Where is it downloading these module files to? How does npm know where to download the files?
It will create and populate a node_modules directory within the current directory. The file structure is designed into npm/node and is (mostly) intentionally not configurable.
What effect does the --production flag have on NPM's behavior?
It installs just the dependencies, without the devDependencies, from package.json, meaning "give me what I need to run this app, but I don't intend to do development on this app so I don't need dev-only stuff".
npmjs.org has some docs, FAQ, and man pages, which are pretty good although they are mostly lacking basic introductory material.
This whole "nvm"/"npm" fiasco is a disgusting mess. For one thing, they should have a big flashing red banner at the top of the npm website that says, "If you intend to do ANYTHING with Node.js, you'd better decide right now to get rid of the spaces in your folder names." I never saw any warnings to sanitize my pathnames. And it didn't help that I tried to go back later and delete those blank spaces from the path. As far as npm is concerned, I committed a capital offense with those blank spaces, and I have paid dearly for that error.
For all I know, maybe I am still paying. Why is it that when you install "nodemon" as a development dependency (npm install --save-dev nodemon, as opposed to just installing it globally in a totally separate folder with npm install -g nodemon), npm itself gets deleted? Oh, the npm files are still sitting in the same place they have always been, but when you go into the Command Prompt and type npm -v, the Terminal acts like it has never heard of npm ... like you must be speaking Russian.
OK, so let's use the Node.js installation executable to "re-install" the npm that has gone on sabbatical (actually, the *.msi file calls it "repairing" the installation). Now you get your npm back. Great. But now you have a new "npm" folder with hundreds of "node_modules" sitting in a sub-directory of npm. This is what they call repairing?
But your troubles aren't over yet. Let's install the very popular npm module "dotenv" as a development dependency. This time, the "nodemon" folder (that you just installed a minute ago) has been deleted, and yes, yet again, npm has gone fishin'. That is to say, npm -v no longer works, even though the npm files haven't gone anywhere.
I would love to know. How many times am I going to have to re-install npm before I finish writing my first childishly simple Node.js Module?
It's a good thing the npm trash is free. They couldn't afford to give us our money back.
I will probably be crucified for answering my own question, but it is working for me.
Here is the bottom line: everyone should be able to reasonably expect that when they install a software package, it is not going to cannibalize other software that they have installed, nor is it going to chew up their operating system. But that is what was happening to me with this npm trash.
And if you want to know if you are having the same problem, just run this test:
Install npm and Node.js. Then install about five more npm modules of your choice.
After the installation is complete, you should immediately uninstall all five npm packages, plus Node and npm.
After the installation and uninstallation are finished, you should be left with an empty folder. If the folder has ANYTHING left in it, then your npm modules have been chewing on each other and spitting out the pieces for you to clean up.
That is what was happening to me. And I suspect that whenever you hear someone on YouTube suggesting that you should install Nodemon 'globally', they are having the same problem that I had, but they don't know how to fix it, so they just push Nodemon off into a different folder where it can't destroy the other modules that they have already installed.
But their "solution" is actually not such a bad idea. Why not put EVERY npm module into its own separate folder? You could sandbox each package, to protect it from other packages that you install later.
To make my solution clear, I will diagram my proposed folder structure. We start by installing npm and Node:
Node
|
|__node_modules
Now, npm wants you to continue installing all other npm packages into the node_modules folder. But you can force npm to put each npm package into a folder that you get to pick, so that the new structure looks like this:
Node-14
|
|
|_____myExpress
| |
| |__node_modules
|
|_____Joi-13
| |
| |__node_modules
|
|
|__node_modules   ( should be NODE ONLY ... nothing else. )
Here is the command that will allow you to achieve the outcome above. In the Command Prompt, you must first Navigate your way into the Node-14 folder shown above. Once you are in that folder, type the following command:
npm install --prefix ./myFolder npm-package
npm will happily create the folder called "myFolder", and then will install the latest version of the "npm-package" that you requested.
And here are a few examples:
npm install --prefix ./myExpress express
npm install --prefix ./NodeMon-dev nodemon
npm install --prefix ./Joi-13 joi@13.1.0
In the examples above, I added the -dev suffix onto the NodeMon folder name to signify that this package is for development purposes.
Also, the 'joi' example shows you how to intentionally install an old version of an npm package, for whatever reason.
There is one caveat to the instructions that I have given. I have said that you can choose whatever folder name you want. That is not 100% true. If you choose a folder name which is the exact same as the package name, then you run into a problem. For example,
npm install --prefix ./passport passport
If you type the command above, then npm will tell you that 'passport' is already installed, and is up to date.
That, of course, is a lie.
Just another example of npm talking trash.
But just to be painfully clear on this issue, in the command below:
npm install --prefix ./Passport passport
The folder name, 'Passport' is NOT exactly the same as the npm Package 'passport' ( because of the Capital Letter ), so that command will work fine.
In the interest of full disclosure, I must admit that there is a price to be paid for using the installation strategy above: when you "require" an npm package in the Node.js software that you are writing, it will no longer be short and sweet:
const express = require('express');
Instead, you are going to have to 'search' for each module individually, because each one is hiding in the sandbox that you placed it in:
const express = require ( '../Node-14/myExpress/node_modules/express' );
Depending on how deep your folder structure goes, it could easily be even worse than what I have shown. In some cases, you might even have to go up two or three levels from where you are sitting:
const express = require ( '../../../Node-14/myExpress/node_modules/express' );
But in my case, after installing and uninstalling hundreds of npm modules, this was a very small price to pay for the comfort of knowing that you are not going to be sabotaged by yet another poorly written pile of trash that acts like a playground bully.
I guess I should also admit that deleting Modules might be a tiny bit more complicated than you are used to.
If you go into the Command Prompt, Navigate to the Node-14 folder (as you did when you installed the various npm Modules), and then type the customary npm uninstall <npm-package>, you can watch while npm furiously fills up your screen with reams of paths and commands, looking like it is actually doing something.
That, of course, is another lie from npm. In truth, no action was taken.
To uninstall the package, you must Navigate to the Sandbox Folder that you selected, and THEN type the same command. A couple of examples :
Navigate to the "myExpress" folder, and type npm uninstall express
or Navigate to the "Joi-13" folder, and type npm uninstall joi
You will be delighted to find that when npm finishes with the un-install process, the folder will be empty, except for an occasional JSON file. This was a huge change from what I was used to seeing.
Now, you might think that I have gone on and on about npm in this posting, and you might be right. But the truth is, I have left a LOT out of this answer that there is no room to talk about.
For example, I have another mysterious issue going on with my computer that I have posted in a question here:
Running the contributed command: 'code-runner.stop' failed
I seriously doubt that anyone is going to attempt to answer that question, so I will probably have to answer it myself someday. And I predict that when I finally find out what caused the bizarre issue I described in that question, it is going to have something to do with npm. That is my prediction, because I have nothing good to say about npm.
I always thought that you should initialize npm first, before installing any packages:
npm init --yes
However, I found out that I could just go straight to installing packages:
npm i example-package
Then the package would be installed and package.json would be created at the same time.
Is there any reason I should be doing npm init first? Is it only required if I want to specify project details?
It is not required. You can install packages without, and everything will work.
npm init can do basically two things:
ask for basic project info to include in package.json
create a specific type of project (for example React) by using npm init typeofproject
If you just want to use packages and don’t care about naming the project or using a template, just install packages.
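For reference, npm init --yes does little more than write a default package.json, roughly like the following (the exact fields and defaults depend on your npm version and folder name):

{
  "name": "my-project",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}

Installing a package afterwards simply adds a dependencies entry to this same file.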
npm init is for when you are setting up a project for the very first time; otherwise, you don't need to run npm init before installing a package.
Well, kind of a late answer, but as far as I know (correct me if I'm wrong), one of the features is that the project gets set up with a package.json which includes the dependencies list. That way, npm can simply install the packages on that list (by running npm install, for example when you want to clone the app onto another machine), rather than you copy-pasting the whole project folder.
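For example (the repository URL is just an illustration):

git clone https://example.com/my-app.git
cd my-app
npm install

The last command recreates node_modules from the dependencies listed in package.json, so node_modules itself never needs to be copied between machines.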
This isn't a direct answer to the question, but if it sheds some light at some point, why not.
I'm just getting started with npm to manage the JavaScript in my project.
Looking at stackoverflow I saw this documentation: https://docs.npmjs.com/using-npm-packages-in-your-projects
but honestly I understood little to nothing...
I would like to install this package: npm install lightgallery lg-thumbnail lg-autoplay lg-video lg-fullscreen lg-pager lg-zoom lg-hash lg-share
Running it will put the packages in node_modules.
Still looking at the documentation, I found npm install <folder>, so I tried adding the path to a directory at the end of npm install, but it still installed everything inside the node_modules folder...
I'm using Laravel, and I want to install something from npm into public/inc/plugins; what is the correct procedure? Is it possible to indicate this in the package.json file? If it is recommended to install into the default node_modules folder, how can I then use the JS? By rebuilding it with Webpack Mix?
You shouldn't do that: installing into node_modules is the standard, and other developers may be confused if you change this behavior.
In the documentation, you can find:
Install the package in the directory as a symlink in the current project. Its dependencies will be installed before it’s linked. If folder sits inside the root of your project, its dependencies may be hoisted to the toplevel node_modules as they would for other types of dependencies.
Here you can find an answer on StackOverflow
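To illustrate what that quoted passage means (the folder name is hypothetical), with a recent npm:

npm install ./packages/my-plugin

adds an entry like "my-plugin": "file:packages/my-plugin" to package.json and creates a symlink at node_modules/my-plugin pointing at that folder; it does not copy files into public/inc/plugins or any other custom location.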
Is there any way to route npm install to a specific part of the hard drive, so that when I do npm install it creates the node_modules folder in that part of the drive, and when I run any project it looks for dependencies in that part of the drive, just like a single pool for every project?
Then, if I have two projects with similar dependencies, I would only need to run npm install in one project so the dependencies become available in the pool, and there would be no need to run npm install in the other project, just npm start.
Thank you,
Inzamam Malik
You can achieve something close to what you are describing with the link option.
From https://docs.npmjs.com/misc/config#link:
If true, then local installs will link if there is a suitable globally installed package.
Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions are met:
The package is not already installed globally, or
the globally installed version is identical to the version that is being installed locally.
So you will still have some files in each project's node_modules, but you shouldn't have as large a folder.
To turn this behavior on, run:
npm config set link -g
Edit: There is no way to avoid running npm install and having a node_modules folder. Node.js always looks in node_modules for dependencies (this behavior pre-dates npm itself). The link option will make npm create symlinks in node_modules pointing to a common pool. That will reduce disk usage, but you cannot do away with node_modules.
You can use the pnpm package manager; it uses a global pool (a single on-disk store) for dependencies.
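A minimal way to try it (run the second command inside each project):

npm install -g pnpm
pnpm install

pnpm keeps one copy of each package version in a single on-disk store and links it into each project's node_modules, so two projects with similar dependencies share most of their files.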
I'm working in a web application (JavaScript/C#, version controlled by TFS) and our team wants to start using Visual Studio 2015. Microsoft is moving developers to use existing popular tools like Gulp for automated tasks, so I've written a few Gulp tasks that will run on the server.
My problem is that our automated builds generate new project folders on the build server, so I can't run gulp myBuildTask without first running npm install. The npm install adds over 2 minutes to the build process, and it seems very inefficient to download the same dependencies for every build (since they rarely change).
Is there any way I can run a Gulp task on a new project folder without first running npm install?
Options I've considered:
Include node_modules in TFS. I couldn't add the node_modules folder to TFS (which would cause it to exist in each new build folder) because bower's nested dependencies have file paths that are too long for Windows. I could go this route without bower, but I'm not certain I want all those files in my solution (much of which is not needed, like readmes and test files).
Run npm install after each automated build.
As already mentioned, I don't want to do this because it adds several minutes to the build process.
Install NPM modules globally.
I'm not sure if this is even possible, but I'm wondering if I can install all project dependencies globally on the build server (avoiding having to install at the project level). My concern with an approach like this is that I don't want to have to manually update the build server's globally installed NPM modules every time we add a gulp plugin.
Ideally, the solution would be something like #3. The modules would install globally, but every build could run an npm install which would verify every module is installed. If a new npm module was added to the package.json, it would be downloaded. This npm install would be pretty fast since in most cases, all modules would already exist (globally installed on the build server).
There are a few things you might do:
Make npm install run faster. For this purpose, use the newest npm (if possible) or use npm dedupe. Running dedupe may result in fewer dependencies than a plain npm install. Then run npm shrinkwrap, which creates an npm-shrinkwrap.json file containing 'frozen' information about what exactly gets installed (and in which version) during npm install.
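For example, run these in the project directory (both are standard npm commands):

npm dedupe
npm shrinkwrap

dedupe moves shared dependencies up the tree and removes duplicate copies where the version ranges allow it; shrinkwrap then records the exact resolved versions in npm-shrinkwrap.json so every build installs the same thing.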
Remember, node_modules is just a directory: if you can copy or rsync it into your build folder, you can skip the npm install phase altogether.
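For instance, something along these lines on the build server (the cache path is illustrative):

rsync -a /build-cache/node_modules/ ./node_modules/

With the dependencies already in place, the gulp task can run without an npm install step.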
Node's package resolution approach is to first try the local node_modules directory and, if that is not successful (node_modules isn't there, or the dependency is missing from it), to check the node_modules of the parent directory, then the grandparent directory, and so on. This means you don't have to install packages globally; a semi-global installation is quite sufficient:
my_project
    node_modules/
        dependency1
        dependency2
    build_001/
    build_002/
    build_00x/        <- no node_modules here, no deps here
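Used with the layout above, a build could look roughly like this (build_001 and myBuildTask come from the question and the diagram; the exact command syntax depends on your shell and build system):

cd my_project
npm install                              # populates my_project/node_modules once
cd build_001                             # fresh build folder, no node_modules of its own
../node_modules/.bin/gulp myBuildTask    # require() walks up and finds ../node_modules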
Note, however, that this naturally works only if your dependencies are really not changing. Since in real life you install something new from time to time, a slightly enhanced approach might be helpful: organize your directories as follows:
my_project
    ver_af729b
        node_modules
        build_001
        build_002
    ver_82b5f3
        node_modules
        build_003
        build_004
af729b and 82b5f3 are (prefixes of) SHA hashes of your npm-shrinkwrap.json file. If you then add a new dependency, the shrinkwrap file gets updated, the build script creates a new ver_something directory and executes npm install in it. Doing all this naturally requires extra work, but it should work great.
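A rough sketch of that build step, assuming a POSIX shell (the hashing command and directory names are only illustrative):

HASH=$(sha1sum npm-shrinkwrap.json | cut -c1-6)
VER_DIR="ver_$HASH"
if [ ! -d "$VER_DIR/node_modules" ]; then
    # first build with this dependency set: install once into the ver_ directory
    mkdir -p "$VER_DIR"
    cp package.json npm-shrinkwrap.json "$VER_DIR/"
    (cd "$VER_DIR" && npm install)
fi
# new build folders then go under $VER_DIR and resolve their dependencies
# from $VER_DIR/node_modules by walking up the directory tree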
------------------ EDIT -------------------
If you are not trying to avoid npm install completely (you just want it to be quick), you can stick to the typical scenario: check out the sources to the same directory every time, and let npm install re-use the old node_modules as much as possible.
If you always want to create a new directory for your build, you may still create a node_modules symlink to an older version of node_modules; in this scenario too, npm will reuse as much as possible from the symlinked folder.
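For example (the path to the previous build is illustrative):

ln -s /builds/previous/node_modules node_modules
npm install

Because node_modules is now a symlink, npm install reads what is already there and only downloads packages that are missing or out of date, writing its updates through the link.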