NodeJS sub-projects installation

I have a large project that contains multiple sub-modules, many of which are optional. Each sub-module sits in its own folder with a local package.json.
I'm trying to figure out the best way to create an integral package.json that can run npm install on a specific subset of those modules.
Question
Is there a more civilized approach these days than just running npm install from every sub-module folder? I want to be able to halt the integral installation when it fails for one module, and hopefully do it declaratively within package.json rather than implementing such logic myself. Plus, I want each package's postinstall run (if present), and then the one for the integration package.
Other than that, maybe there is a better way today to manage such projects? I'm open to any suggestions!
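For concreteness, the kind of integration package.json I have in mind might look something like this (the folder names and scripts/integrate.js are placeholders; the && chain halts the sequence at the first failing module, each sub-module's own postinstall runs as part of its npm install, and the final step stands in for the integration package's own post-install work):

{
  "name": "integration",
  "private": true,
  "scripts": {
    "install:core": "cd modules/core && npm install",
    "install:extras": "cd modules/extras && npm install",
    "postinstall": "npm run install:core && npm run install:extras && node scripts/integrate.js"
  }
}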

Related

When to add a dependency? Are there cases where I should rather copy the functionality?

I recently helped out on a project where I added a really small dependency; in fact, it only contained a regular expression (https://www.npmjs.com/package/is-unc-path).
The feedback I got from the project's developer was that he tries to minimize third-party dependencies when they can be implemented easily; if I understand him correctly, he was asking me to just copy the code instead of adding another dependency.
To me, adding a new dependency looks just like putting some lines of code into an extra file in the repo. In addition, the developers get informed by an update if the code needs a change.
Is it just a religious conviction that drives a developer to do this? Are there any costs (performance- or space-wise, etc.) to adding a dependency?
I also had some disputes with my managers once concerning third-party libraries; the problem was even greater, as he had come to believe that you should version the node_modules folder.
The source of any conflict is usually ignorance.
His arguments were:
you should deliver a working product to the client, without requiring him to run any extra jobs like npm install
if GitHub or npm is down at the moment you run npm install on the server, what will you do?
if the library you install has a bug, who will be responsible?
My arguments were:
versioning node_modules is not going to work, due to how package dependencies work: each library downloads its own node_modules dependencies, and your git repository will quickly grow to hundreds of MB. Deploys become slower and slower; downloading half a GB of code every time takes time. npm uses a module caching mechanism, so if nothing has changed it will not download code needlessly.
the problem with left-pad was painful, but after that npm implemented a locking system, and now each package is locked to a specific version and integrity hash (a minimal example of what the lockfile pins is shown after this list).
And GitHub and npm are not single-instance services; they run in the cloud.
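For example, a package-lock.json entry pins roughly this much for the is-unc-path package mentioned earlier (the version and hash here are illustrative placeholders, not real values):

"dependencies": {
  "is-unc-path": {
    "version": "1.0.2",
    "resolved": "https://registry.npmjs.org/is-unc-path/-/is-unc-path-1.0.2.tgz",
    "integrity": "sha512-..."
  }
}

Because the integrity field is a cryptographic hash of the tarball, a re-published package with the same version number would fail to install.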
When installing a dependency you always have some ideas in mind, and there are community best practices; they usually come down to: 1. Does the repo have unit tests? 2. The download count. 3. When was the latest update?
The Node.js ecosystem is built on modularity; node is not popular because of luck, but because of how it was designed for creating modules and reusing them. Sometimes working in a node.js environment feels like putting Lego pieces together and building your toy. This is the main cause of the super-fast pace of development in node.js. People just reuse stuff.
In the end he stuck to his own ideas, and I left the project :D.

Preventing global installations of an NPM package

I have a library and want to encourage/force users to use the locally installed version only. I could do this the hard way or the easy way.
The easy way would be if NPM had a mechanism to prevent using the --global switch with the npm install command, for any library.
The hard way would be to add code to my NPM package that returns early if it determines it was installed globally rather than locally.
Does anyone know if you can prevent global installation of an NPM package? What might be the most user-friendly way to approach this?
The best way to prevent users from installing your module globally would be to describe your preference in the documentation.
There is nothing you can do to force your users to never install it globally if they can install it locally. They will always be able to move the files manually if they want.
In the npm community the assumption is that the user has control over the modules he/she uses, not the other way around. Forcing people to use your module in certain ways will only make them unhappy.
So the only good answer to your question is to document the way your code should be used. You can ask them to use your module a certain way - but they are the ones who can choose to listen to you or not. You can state that using your module installed globally is unsupported, unwise, discouraged, dangerous, but you will not be able to force users to use the module as you want, and that's a good thing.
Now, for some bad answers: you can always test whether the parent of your module's root directory is named node_modules and fail if it isn't, but I'm sure that can cause trouble if someone installs your module locally, as you want, but under a different directory. You could also check whether your module is run from one of the default paths node uses to search for modules, but those paths are not always the same, and you'd have to take the NODE_PATH environment variable into account as well.
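For illustration, the parent-directory check might look something like this (the package name in the error message is a placeholder):

const path = require('path');

// Fail unless this package sits directly inside a node_modules folder.
// Caveat: a global install also lives under a node_modules directory,
// and a deliberately vendored local copy does not, so this is a fragile
// heuristic, not a true global-install detector.
const parent = path.basename(path.resolve(__dirname, '..'));
if (parent !== 'node_modules') {
  throw new Error('my-module must be installed locally with npm install');
}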
You can do a few tricks like that, but they will only annoy users who know what they are doing, because they will have to change your module's source code to do what they want, and they will always be able to do that, no matter how hard you try to make their lives harder.
In summary, my recommendation would be to document your module well, respect your users and their needs, and trust them to know what they're doing.
Update
For a working example of a Bash function that prevents global npm installation of a certain module, see this answer - section Working example of preventing global install.

import common dependencies to package.json?

At our company we currently have 5 web applications that are built using Gulp. For Gulp, we have a common build file that all applications use (and override certain parts of if needed).
This makes it very easy to add features or fix bugs in all projects at the same time. However, I still need to edit the package.json file in each project separately if I want to add a new npm dependency or bump the version of an existing one.
What I would like to accomplish is a "base file" where all the common dependencies are configured, which I would then import into the "local" package.json of each project. It would also be nice if each project could add more dependencies beyond the common ones.
Is it possible to do this?
No, and it's a good thing that it isn't. You need to declare your dependencies explicitly in each project.
What you can do, if your build process is a shared API, is extract your build script into an npm package of its own, include that package in the package.json of all the other projects, and use it in them (coding it in a way that allows for overrides).
Then, when your common build needs a new dependency, you only have to change it in one place. (Note that with this approach you'd still need to keep the build package's version up to date in all the other applications.)
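For illustration, a consuming project's package.json might then look something like this (the package name our-shared-build and the version numbers are made up):

{
  "name": "web-app-1",
  "devDependencies": {
    "gulp": "^3.9.0",
    "our-shared-build": "^2.1.0"
  },
  "scripts": {
    "build": "gulp --gulpfile node_modules/our-shared-build/gulpfile.js"
  }
}

The shared package carries the common build logic and its dependencies, so bumping our-shared-build is the only per-project edit left.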

Including local dependencies in deployment to lambda

I have a repo which consists of several "micro-services" which I upload to AWS's Lambda. In addition I have a few shared libraries that I'd like to package up when sending to AWS.
Therefore my directory structure looks like:
/micro-service-1
    /dist
    package.json
    index.js
/micro-service-2
    /dist
    package.json
    index.js
/shared-component-1
    /dist
    package.json
    component-name-1.js
/shared-component-2
    /dist
    package.json
    component-name-2.js
The basic deployment leverages the handy node-lambda npm module, but when I reference a local shared component with a statement like:
var sharedService = require('../../shared-component-1/dist/index');
This works just fine with the node-lambda run command, but node-lambda deploy drops this local dependency. That probably makes sense, because the require reaches above the "root" directory of the service, so I thought maybe I'd leverage gulp to make this work; I'm pretty darn new to it, though, so I may be doing something dumb. My strategy was to:
Have gulp deploy depend on a local-deps task
The local-deps task would:
run npm build --production into a directory
then pipe that directory over to the micro-service under a /local directory
clean up the install in the shared component
I would then refer to all shared components like so:
var sharedService = require('local/component-name-1');
Hopefully this makes clear what I'm trying to achieve. Does this strategy make sense? Is there a simpler way I should be considering? Does anyone have any examples of anything like this in "gulp speak"?
I have an answer to this! :D
TL;DR - Use npm link to create a symbolic link between your common component and the dependent component.
So, I have a project with only two modules:
- main-module
- referenced-module
Each of these is a node module. If I cd into referenced-module and run npm link, then cd into main-module and run npm link referenced-module, npm will 'install' my referenced-module into main-module, storing it in main-module's node_modules folder. NOTE: when running the second npm link, the name to use is the one in the package's package.json, not the name of the directory (see the npm link documentation, previously linked).
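In shell form, those steps look like this:

cd referenced-module
npm link                     # registers a global symlink for this package
cd ../main-module
npm link referenced-module   # symlinks it into main-module/node_modules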
Now, in my main-module all I need to do is var test = require('referenced-module') and I can use it to my heart's content. Be sure to module.exports your code from your referenced-module!
Now, when you zip up main-module to deploy it to AWS Lambda, the links are resolved and the real modules are put in their place! I've tested this and it works, though not with node-lambda yet; I don't see why that should be a problem, unless it does something different with the package restores.
What's also nice about this approach is that any changes I make to my referenced-module are automatically picked up by my main-module during development, so I don't have to run any gulp tasks or anything to keep them in sync.
I find this quite a nice, clean solution, and I was able to get it working within a few minutes. If anything I've described above doesn't make sense (I've only just discovered this solution myself!), please leave a comment and I'll try to clarify.
UPDATE FEB 2016
Depending on your requirements and how large your application is, there may be an interesting alternative that solves this problem even more elegantly than using symlinking. Take a look at Serverless. It's quite a neat way of structuring serverless applications and includes useful features like being able to assign API Gateway endpoints that trigger the Lambda function you are writing. It even allows you to script CloudFormation configurations, so if you have other resources to deploy then you could do so here. Need a 'beta' or 'prod' stage? This can do it for you too. I've been using it for just over a week and while there is a bit of setup to do and things aren't always as clear as you'd like, it is quite flexible and the support community is good!
While using Serverless we faced a similar issue with the need to share code between AWS Lambdas. Initially we duplicated the code across each microservice, but as always, that became difficult to manage.
Since development was done in a Windows environment, using symbolic links was not an option for us.
We then came up with a solution: keep the local dependencies in a shared folder, and use a custom-written gulp task to copy those dependencies into each of the microservice endpoints, so that each dependency can be required like an npm package.
One decision we made was not to keep two places that define a microservice's dependencies, so we used the same package.json to define the local shared dependencies; the gulp task parses this file and copies the shared dependencies accordingly, also installing the npm dependencies with a single command.
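For illustration, a stripped-down version of such a gulp task might look like this (the localDependencies key is an assumed convention for this sketch, not necessarily what our published modules use):

const gulp = require('gulp');
const pkg = require('./package.json');

// Assumed convention: shared folder names listed under a custom key,
// e.g. "localDependencies": ["shared-component-1", "shared-component-2"]
const locals = pkg.localDependencies || [];

gulp.task('install-local-deps', () => {
  // Copy each shared component into this service's node_modules so it
  // can be required like a regular npm package.
  const globs = locals.map(name => '../' + name + '/**/*');
  return gulp.src(globs, { base: '..' })
    .pipe(gulp.dest('./node_modules'));
});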
Later we made the code open source as npm modules serverless-dependency-install and gulp-dependency-install.

How to deal with local package dependencies in nodejs with npm

How should we deal with local packages that are a dependency in other local packages?
For simplicity's sake, say we have the following packages:
api - express application
people - a package to deal with people
data-access - a package that deals with data access
And then the dependencies are
api depends on people
people depends on data-access
Currently we have these dependencies set up as file dependencies.
I.e. the api package.json would have:
"dependencies": {
  "people": "file:../people"
}
Trouble with this is that we're finding it a PITA when we make updates to one package and want those changes in the other packages that depend on it.
The options we have thought of are:
npm install - but this won't overwrite previously installed packages when changes are made, so we have to delete the old one from the node_modules directory and re-run npm install... which can be niggly if the package dependency is deep.
npm link - we're not sold on the idea because it doesn't survive version control... Just thinking about it now, maybe we could have some kind of local build script that runs the npm link commands for us (a sketch of one is shown after this list); that way it could survive version control. Would that be a grunt job?
grunt - we haven't dived too deep into this one yet, but it feels like a good direction. A little bit of googling turned up this: https://github.com/ahutchings/grunt-install-dependencies
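For example, such a script could be as simple as this (the file itself is hypothetical; the link order follows our dependency chain, innermost first):

#!/bin/sh
# Recreate the npm links after a fresh clone.
set -e
(cd data-access && npm link)
(cd people && npm link data-access && npm link)
(cd api && npm link people)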
So, what option would work best for our situation?
Are there other options that we haven't thought of yet?
Ps. we're a .NET shop doing a PoC in node, so assume we know nothing!
Pps. if you strongly believe we're setting up our project incorrectly and we shouldn't have smaller individual packages, let me know in the comments with a link to some reading on the subject.
So, I agree that going with 'many small packages' is usually a good idea. Check out 12factor.net if you haven't already.
That said, in specific answer to your question I'd say your best bet is to consider mainly how you want to maintain them.
If the 'subcomponents' are all just parts of your app (as, for example, data-access implies), then I'd keep them in the same folder structure, not map them in package.json at all, and just require them where you need them. In this case, everything versions together and is part of the same git repository.
If you really want to or need to keep them all in separate git repositories, then you can do npm link, but to be honest I've found it more useful to just use the URL syntax in package.json:
"dependencies": {
  "people": "git://path.to.git:repo#version.number"
}
Then, when you want to explicitly update one of your dependencies, you just have to bump the version number in your package.json and run npm install again.
