Is it best practice to use package.json in an app that is not a module but an end product? - node.js

Is it normal, or hopefully best practice, to use package.json in my project even though my project isn't a module/package? I'm currently using package.json for project info, versioning, and dependency management, and I include "private": true.

Many people are using package.json in applications these days. npm is still a great way to manage your dependencies while developing. As mentioned, there are a few ways to ensure that your app doesn't get pushed to the public registry, and you still get the benefits of all the npm utilities.
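For reference, a minimal package.json for an app that is an end product rather than a published module might look like the sketch below; the name, script, and dependency are placeholders, and the important part is "private": true, which makes npm refuse to publish it:

{
  "name": "my-app",
  "version": "1.0.0",
  "private": true,
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "express": "^4.17.1"
  }
}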
If by "best practice" you're asking if there is a good reason not to do this, the answer is no. You should go for it.

Related

Should react and react-dom be devDependencies?

I use webpack to bundle all files for production on the web. Since no code runs on Node in production, and thus no packages are required at runtime, should all dependencies be development-only?
I don't want an answer based on opinions or best practices or intended usage, I simply want what makes sense.
It seems the only answer you will get to this question will be "based on opinions or best practices or intended usage". For those who were not looking for such answers, the answer is:
If there is no backend service in the same project where you are building the frontend app with a bundler, then put everything in devDependencies; otherwise, only put the packages you will require in some Node.js-related code in dependencies.
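As a rough illustration of that rule, a frontend-only project bundled with webpack could keep everything, react and react-dom included, in devDependencies; the versions below are only examples:

{
  "private": true,
  "devDependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "webpack": "^5.90.0",
    "webpack-cli": "^5.1.0"
  }
}

If the same project also had, say, an Express server running in production, that package would go in dependencies instead.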

Packaging a Node app from development to production

Apologies for the broad scope of the question and if there is a better sub for this question please let me know.
Our requirement is to make sure that the Node app developed by the dev team goes through testing, QA, and production without any changes, completely deterministically.
The strategy is simple:
Package the Node app and digitally sign it. This package will go through testing, QA, and production.
One of the requirements of this strategy is that when we go to the production server, we do not want to do npm install.
Pkg seemed to be the silver bullet until I found out it has trouble packaging the BigNumber library.
The other alternative seemed to be npm pack. However, I found out that when I run npm install package.tgz it still downloads the dependencies from the internet rather than packaging everything inside the tarball.
So I would like to ask for advice on how we can move a Node app from dev to production guaranteeing that we deploy the exact same package/binary that was built originally.
I understand Docker is the obvious choice, but we are trying to find other ways than going down the Docker route.
Thank you very much.
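One avenue that may be worth checking before giving up on npm pack: the bundleDependencies field tells npm to copy the listed packages from node_modules into the tarball that npm pack produces, so installing the .tgz on the production box does not have to reach the registry for them. This is only a sketch; the package names are placeholders and you would need to verify it covers your whole dependency tree:

{
  "name": "my-node-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.17.1",
    "bignumber.js": "^9.0.0"
  },
  "bundleDependencies": [
    "express",
    "bignumber.js"
  ]
}

After npm install && npm pack, the resulting .tgz contains the bundled packages and can be signed and promoted through testing, QA, and production as a single artifact.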

How can I run a script before installing a new nodejs dependency

I tried using preinstall npm scripts, but they only run when I check out the project into a new workspace and run "npm i" on its own.
I need a solution that runs a script before the new dependency is written into package.json. It shouldn't matter whether the dependency is dev or prod: all of them need to be checked.
For example, when a new developer joins the team and wants to add a new dependency with a known vulnerability, this script should stop the action before package.json is changed and show a warning message to the developer.
There isn't a way to do that with npm scripts. So, unless you feel like implementing one you're going to have to adjust your process. Start by identifying all the problems you're trying to address with an on-dependency-install hook.
You give the example of preventing the installation of a dependency or dependency version. That's not a problem: it's a solution you've identified for the problem. Figure out what the actual problem is, and then reevaluate your solution to see if it's really the most appropriate measure to take.
Possibly (probably) you are afraid of vulnerable code making it into production. That's a problem definition you can work with. What possible solutions exist? You've already identified the blacklist. But not only is that not supported by your tooling; even if it were, the onus would be on you to keep the blacklist up to date. Given just how quickly the Node world moves, that's enough work to keep several people employed full time. And that's not even getting into deploying it to your developers.
The good news is that that's not the only solution: you could establish procedural safeguards against integrating vulnerable code. If you're using a distributed VCS like Git, pull requests are right there: disable pushing commits to the master or development branch, have developers work in feature branches and submit pull requests, then review those pull requests and screen any new dependencies for vulnerabilities when they show up. If you're using something like SVN, you can use feature branches with code reviews to similar effect. Your developers get extra eyes on their code looking for vulnerabilities, optimizations, edge cases, and so forth; you don't waste time screening dependencies that nobody ever tries to integrate. And nobody has to worry about getting the latest copy of the blacklist. For this particular scenario, everybody wins with a process solution over a technical solution.
If you have other reasons for wanting to fire scripts when dependencies are installed, try working back to the root of the problem the same way. The way Node dependency management and module interactions work, you'll probably discover it's preferable to develop better process habits.
If you are using git, you can use pre-commit/pre-push hooks; the result is pretty much the same: no vulnerabilities in the code base.
For example, with husky and nsp you could do something like this:
{
  "scripts": {
    "prepush": "nsp check"
  }
}
Riffing off Gabriel's suggestions: since you are concerned about devs wasting time when the lib they add fails an nsp check, you can use an editor extension to run the nsp check as they code, and then have husky do a pre-commit nsp check as well.
I would also recommend Greenkeeper.io to keep dependencies up to date, so known vulnerabilities get patched before they bite you.
If the main concern is that these vulnerable packages are being run within your network (since there's no way to prevent those devs from using those packages in general), you could mirror a subset of the npm registry that you consider safe, or manually add known safe dependencies to that mirror, and block access to the main registry https://registry.npmjs.org/ at the network level. This would mean your developers are stuck waiting for the mirror to be updated, but would require somebody to at least stop and think before they're able to install a problematic module.
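A rough sketch of what that could look like, assuming a hypothetical internal mirror at registry.internal.example (for instance one run with a tool such as Verdaccio) and the public registry blocked at the network edge:

# point developer machines and CI agents at the vetted mirror
npm config set registry https://registry.internal.example/

# or commit it per project so fresh checkouts pick it up
echo "registry=https://registry.internal.example/" >> .npmrc

Access to https://registry.npmjs.org/ is then denied at the firewall or proxy, so anything not present in the mirror simply fails to install.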

When to add a dependency? Are there cases where I should rather copy the functionality?

I recently helped out on a project where I added a really small dependency - in fact, it only contains a regular expression (https://www.npmjs.com/package/is-unc-path).
The feedback I got from the project's developer was that he tries to minimize third-party dependencies when they can be implemented easily - in other words, if I understand him correctly, he is asking me to just copy the code instead of adding another dependency.
To me, adding a new dependency looks just like putting some lines of code into an extra file in the repo, with the added benefit that we get notified through updates if the code ever needs a change.
Is it just a religious thought that drives a developer to do this? Are there maybe any costs (performance- or space-wise, etc) when adding a dependency?
I also once had disputes with my manager concerning third-party libraries; the problem was even greater, since he had come to believe that you should version the node_modules folder.
The source of such conflicts is usually ignorance.
His arguments were:
you should deliver a working product to the client without requiring them to run any extra steps like npm install
if GitHub or npm is down at the moment you run npm install on the server, what will you do?
if a library that you install has a bug, who will be responsible?
My arguments were:
versioning node_modules is not going to work because of how package dependencies work: each library downloads its own node_modules dependencies, and your git repository will quickly grow to hundreds of MB. Deploys become slower and slower, because downloading half a GB of code each time takes time. npm uses a module caching mechanism, so if nothing has changed it will not download code needlessly.
the problem with left-pad was painful, but after that npm implemented a locking system (package-lock.json), and now every package is locked to an exact version and integrity hash (see the snippet after this list).
and GitHub and npm are not single-instance services; they run in the cloud.
When installing a dependency you always have some criteria in mind, and there are community best practices; they usually come down to: 1. Does the repo have unit tests? 2. How many downloads does it have? 3. When was it last updated?
The Node.js ecosystem is built on modularity; Node is not popular because of luck, but because of how it was designed for creating and reusing modules. Sometimes working in a Node.js environment feels like putting Lego pieces together to build your toy. This is the main reason development in Node.js is so fast: people just reuse stuff.
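To make the lock-file point concrete: package-lock.json records an exact version, resolved URL, and integrity hash for every package, and npm ci (npm 5.7+) installs exactly that tree, failing if it disagrees with package.json. The entry below is abbreviated and the hash is truncated:

"node_modules/left-pad": {
  "version": "1.3.0",
  "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
  "integrity": "sha512-..."
}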
In the end he stuck to his own ideas, and I left the project :D.

How to deal with local package dependencies in nodejs with npm

How should we deal with local packages that are a dependency in other local packages?
For simplicity's sake, say we have the following packages:
api - express application
people - a package to deal with people
data-access - a package that deals with data access
And then the dependencies are
api depends on people
people depends on data-access
Currently we have these dependencies setup as file dependencies.
I.e. api package.json would have
"dependencies": {
"people": "file:../people"
}
Trouble with this is that we're finding it a PITA when we make updates to one package and want those changes in the other packages that depend on it.
The options we have thought of are:
npm install - but this won't overwrite previously installed packages if changes are made, so we have to delete the old one from the node_modules directory and re-run npm install (see the commands after this list)... which can be niggly if the package dependency is deep.
npm link - we're not sold on the idea because it doesn't survive version control... Just thinking about it now, maybe we have some kind of local build script that would run the npm link commands for us... this way it could survive version control. Would that be a grunt job?
grunt - we haven't dived too deep into this one yet, but it feels like a good direction. A little bit of googling and we came across this: https://github.com/ahutchings/grunt-install-dependencies
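For context, the manual refresh described in the first option amounts to something like this, run from the api package (paths follow the layout above):

rm -rf node_modules/people
npm install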
So, what option would work best for our situation?
Are there other options that we haven't thought of yet?
Ps. we're a .NET shop doing a PoC in node, so assume we know nothing!
Pps. if you strongly believe we're setting up our project incorrectly and we shouldn't have smaller individual packages, let me know in the comments with a link to some reading on the subject.
So, I agree that going with 'many small packages' is usually a good idea. Check out 12factor.net if you haven't already.
That said, in specific answer to your question I'd say your best bet is to consider mainly how you want to maintain them.
If the 'subcomponents' are all just parts of your app (as, for example, data-access implies), then I'd keep them in the same folder structure, not map them in package.json at all, and just require them where you need them. In this case, everything versions together and is part of the same git repository.
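In that setup the sub-packages are just folders in the repository, so the wiring is plain relative requires; a minimal sketch, assuming each folder exposes an index.js or a package.json main:

// somewhere inside api
const people = require('../people');

// somewhere inside people
const dataAccess = require('../data-access');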
If you really want to or need to keep them all in separate git repositories, then you can do npm link, but to be honest I've found it more useful to just use the URL syntax in package.json:
"dependencies": {
  "people": "git://path.to.git:repo#version.number"
}
Then, when you want to explicitly update one of your dependencies, you just have to bump the version number in your package.json and run npm install again.
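npm accepts a few variants of that git URL, so you can pin to a tag, a commit, or (on newer npm versions) a semver range; the org and repo names below are placeholders:

"dependencies": {
  "people": "git+ssh://git@github.com/your-org/people.git#v1.2.0",
  "data-access": "git+https://github.com/your-org/data-access.git#semver:^2.0.0"
}

When you push a new tag to the people repo, you bump the ref here and run npm install again, exactly as described above.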

Resources