Is npm init needed? - node.js

I always thought that you should initialize an npm project first before installing any packages:
npm init --yes
However, I found out that I could just go straight to installing packages:
npm i example-package
Then the package would be installed and package.json would be created at the same time.
Is there any reason I should be doing npm init first? Is it only required if I want to specify project details?

It is not required. You can install packages without it, and everything will work.
npm init can do basically two things:
ask for basic project info to include in package.json
create a specific type of project (for example React) by using npm init typeofproject
If you just want to use packages and don’t care about naming the project or using a template, just install packages.
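For a rough idea of the end result, here is a sketch of what package.json might look like after npm init --yes followed by npm i example-package (the project name, the version range, and example-package itself are placeholders; exact defaults depend on your npm version):
{
  "name": "my-project",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "license": "ISC",
  "dependencies": {
    "example-package": "^1.0.0"
  }
}
Skipping npm init, as in the question, just means the metadata fields take default values; the dependencies entry ends up the same either way.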

npm init is there for when you are setting up a project for the very first time; otherwise, you don't need to run npm init before installing any package.

Well, kind of a late answer, but as far as I know (correct me if I'm wrong), one of the benefits is that the project gets set up with a package.json which includes the dependencies list. That way, npm can simply install the packages on that list (via npm install, for example when you want to clone the app onto another machine), rather than copy-pasting the whole project folder.
This isn't a direct answer to the question, but if it sheds some light at some point, why not.
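As a small sketch of that workflow (the repository URL is hypothetical), on another machine you only need the project with its package.json, and npm restores the rest from the dependencies list:
git clone https://example.com/my-app.git
cd my-app
npm install    # reads package.json and recreates node_modules
So node_modules itself never needs to be copied around.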

Related

How to get npm to favor local linked dependency over its published install

I've searched through other questions such as this one, but they all seem to be about a local npm link stopping working for another reason than mine. I assume this is a common use-case issue, so if I'm doing something methodically wrong, I'm more than happy to take suggestions on how I should be doing it.
Principally, I have a private npm module that I'm working on called @organisation/module. When working locally, I'll run npm link on it, and use it within my 'host' project as npm link @organisation/module; this all works great with hot-reloading, etc. I'll also import it as import module from '@organisation/module'.
However, since I also want to publish my local changes to npm (as @organisation/module) from time to time, for build testing and production code, I need to run npm install @organisation/module on the host project.
This then seems to break the implicit npm link I set up earlier... I assume mainly because they have the same name, and npm favors an install over a link?
When I want to make live, local changes again, the only way I can currently get it to work is via npm uninstall @organisation/module and then to re-link it.
Is there a way to keep the published module installed (in order to avoid careless mistakes, like forgetting to reinstall it for build testing), but always favour the local, linked instance?
Have you tried installing locally with the other method npm provides?
npm install /absolute/path/packageName
I believe this will change your entry in package.json to look like this:
"dependencies": {
  ...
  "packageName": "file:../../path/to/packageName",
  ...
}
Since npm link creates a symlink in the global folder while npm install is local to the project, npm install takes precedence. You can read about npm link here: https://docs.npmjs.com/cli/link
To avoid this, my suggestion would be to use npm install <path to local>, and when you need to use the production code, use npm install @organization/module. This would update your node_modules on a per-codebase basis. Read about npm install here: https://docs.npmjs.com/cli/install
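A minimal sketch of switching between the two, assuming the local checkout lives at ../module (the path and the scope name are placeholders):
npm install ../module            # local development: saves a "file:../module" dependency
npm install @organization/module # build testing / production: back to the published package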
Hope this helps :)
Go to the directory where your local package is located, open package.json, and change the name from original_name to "original_name_local".
Run npm link in the terminal at that location.
After this, go to your working directory and run npm install <path to local>.
Now, wherever you're requiring or importing it, update the name to "original_name_local".
For example, if it's require('space-cleaner'), change it to require('space-cleaner_local').
Like this you can have both the local and the production package; just change the name wherever required.
Otherwise you can remove the package by removing it from package.json and deleting it from node_modules.
If the local one is needed, go to the local package directory and run npm link in the terminal, then in your working directory run npm install ./path/to/package.
If the production one is needed, then again delete the package as described above and run npm install package_name.

How do I prevent npm install from removing packages?

I'm trying to set up a development environment with several packages, and as a result I need to manually install some dependencies. More specifically, I have local changes in several packages which I need to test before I can push them to GitHub, so I can't just npm install the top level because it won't pick up those changes. So I run the first npm install manually on the packages which are missing, then try to run my node code to see which package it is still missing, and npm install whatever it says is missing.
However, when I go to install the second package, it ends up with this message:
added 3 packages from 4 contributors, removed 799 packages and audited 3 packages in 4.197s
The second install removed practically every package that was already installed! I didn't notice this until about the third time, when I realized that I seemed to be installing the same thing over and over.
How can I prevent this particularly naughty behavior and force npm to only install what I tell it to and leave everything else alone?
Have a look at npm link if you need to test against modified packages.
From npm link:
This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild.
Say b is a dependency of a. You made changes to b and want to check if a still works with those changes. Instead of using b in node_modules installed from npm, use your local, modified version:
cd ~/projects/b # go into the package directory
npm link # creates global link
cd ~/projects/a # go into some other package directory.
npm link b # link-install the package
Now, any changes to ~/projects/b will be reflected in ~/projects/a/node_modules/b/.
If your development flow involves updating, in parallel, packages which depend on one another, you might consider switching your project's package manager from npm to yarn to take advantage of yarn's workspaces feature.
Yarn's workspaces let you easily set up a single monorepo containing all your interconnected dependencies, and let yarn figure out how to link them together in your dev environment.
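As a minimal sketch, assuming a root package.json in the directory that contains the individual packages (the glob is a placeholder for however your folders are laid out):
{
  "private": true,
  "workspaces": [
    "packages/*"
  ]
}
Running yarn install at the root then resolves the listed packages against each other locally instead of pulling them from the registry.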
I had a similar problem today and thought this might help someone in the future. I found out that if you install the packages simultaneously:
npm install --save package1 package2 package3 ...
it works, as it did when I ran
npm install xlsx angular-oauth2-oidc
but if you install them separately it will have issues.
Edit 2: more info by @Michael:
Installing multiple packages in the same command also prevents hooks from being installed multiple times.
Remove the "package-lock.json" file before installing the new package.
Are you saving the dependencies to package.json?
To save: npm install --save {package_name}. This will save the package to package.json and install it using npm install.
You can't fully control the dependencies: the dependencies which you have installed might use dependencies themselves, so when you remove a package, npm deletes all of that package's dependencies along with the package.

How to automatically link npm packages at installation time?

I'm working on a larger project that is split into a number of npm packages. There are several dependencies between the packages. The whole code base is stored in a main directory like this:
main/
pkg1/
pkg2/
...
Suppose that pkg2 depends on pkg1, so in main/pkg2/package.json I have:
"dependencies": {
"pkg1": "^0.1.0"
}
I have linked my packages together using npm link. However, when I start development on a new machine, or when for some reason I have to reinstall the packages, I can't simply run npm install in pkg2/. It would fail because pkg1 couldn't be found. (It's not published, but anyway, I want the local version because I'm developing both packages.)
Of course I could do all the linking manually and then call npm install, but it's a hassle. Is there any way to do it in a single step?
My previous research:
This question proposes writing a preinstall script, but I don't want to keep linking in production, only in the development environment, as another answer points it out.
I've also tried npm link in pkg1/ then npm install --link in pkg2/. According to the manual,
The --link argument will cause npm to link global installs into the local space in some cases.
Not in my case though.
You can use zelda for this. Written by Feross, it was designed for exactly this purpose.
I'm not a fan of doing it this way; I'd generally prefer running a local repository or using git URLs for dependencies like this.
That said, if you want to keep using npm link, you can always use the preinstall script approach but without the preinstall key:
"autolink": "cd ../project1 && npm link && cd ../project2 && npm link project1_name",
Then in your CLI you can run $ npm run autolink when you first set up a dev env.
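For context, that line would live in the scripts section of the consuming project's package.json, roughly like this (project1, project2, and project1_name are the placeholder names from above):
{
  "scripts": {
    "autolink": "cd ../project1 && npm link && cd ../project2 && npm link project1_name"
  }
}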

Confused in starting a project in node.js with npm install

Hello, I am just a noob and still learning. I have already downloaded and tried the chat tutorial from the get-started part of socket.io. Now I am learning from another source. What's confusing me is: do I always have to run npm install at the beginning of every project after writing the dependencies in package.json? Or is there any other way? I would be very glad if you could help me understand my confusion. Thank you!
Yes, before running, all dependencies must be installed. So you must run npm install.
When developing, you can use npm install --save <package_name> to install a dependency and automatically add it to package.json.
NPM means Node Package Manager. It is used to manage your dependencies on other node modules dynamically, thanks to a configuration file called package.json. This way you can easily define the exact versions you need, or a mask in order to always retrieve the stable ones, for instance.
The command npm install interprets your configuration file and then downloads the right versions (and does this recursively).
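As a tiny illustration (the package names and versions are only examples), the dependencies section of package.json is where those exact versions or masks live:
{
  "dependencies": {
    "express": "4.17.1",
    "socket.io": "^2.3.0"
  }
}
Running npm install in the project directory reads this file and downloads matching versions (and, recursively, their own dependencies) into node_modules.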

NPM basics and Local Installs?

I'm not a regular node user, so my apologies if this is a stupid newbie question, but I haven't been able to find any clear documentation on this, and my feeble newbie node skills don't let me dig into it myself.
I'm following along with these instructions for installing the Ghost blogging system, (a system built with NodeJS).
After telling me to open a terminal window in the just-downloaded package folder, the instructions include the following line:
In the new terminal tab type npm install --production
This confuses me. My understanding of npm is that it's a package manager that, like Perl's CPAN:
Fetches packages from the Internet
Installs them into my local node system
That's clearly not what's happening above, but I don't know what is happening when I run that command, and since I don't run with a NodeJS crowd I don't know who to ask.
I'd like to know what NPM is doing. Specific questions:
When I run npm install, it looks like it's downloading a number of packages (lots of npm http GET in the console). How does NPM know what to download?
Where is it downloading these module files to? How does npm know where to download the files?
What effect does the --production flag have on NPM's behavior?
Happy to have specific answers, or a meta-answer that points out where I can learn how npm works with (what appears to be) application installs (vs. a system install, which is how I normally think of it).
npm has a few different installation modes. From within a module (with a package.json file), npm install installs the dependencies listed in the dependencies and devDependencies fields of the package.json file. Installation means that the modules' files are downloaded and placed in the node_modules folder, and then those modules are npm installed themselves (but only their dependencies), placing modules in their own node_modules folders. This continues until everything needed is installed. Use npm ls to see the tree of installed packages.
Most of the time this is what you want, because running npm install from within a module is what you would do when developing on it, and you'll want to run tests etc. (which is what devDependencies is for).
Occasionally though, you'll be coding a service that consumes modules, but should not necessarily be treated like one (not intended to be require'd). Ghost is such a case. In these cases, you need npm install --production, which installs only the dependencies and leaves out the devDependencies.
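A short sketch of the difference, with made-up entries:
{
  "dependencies": {
    "express": "^4.16.0"
  },
  "devDependencies": {
    "mocha": "^5.2.0"
  }
}
npm install              # installs express and mocha
npm install --production # installs only express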
When I run npm install, it looks like it's downloading a number of packages (lots of npm http GET in the console). How does NPM know what to download?
It reads the package.json configuration file in the current directory.
Where is it downloading these module files to? How does npm know where to download the files?
It will create and populate a node_modules directory within the current directory. The file structure is designed into npm/node and is (mostly) intentionally not configurable.
What effect does the --production flag have on NPM's behavior?
It installs just the dependencies, without the devDependencies from package.json, meaning "give me what I need to run this app, but I don't intend to do development on this app, so I don't need dev-only stuff".
npmjs.org has some docs, FAQ, and man pages, which are pretty good, although they are mostly lacking basic introductory material.
