Dockerizing a Node.js application with external dependencies - node.js

We are building Node.js microservices. For some reusable components we have created a utils folder. This folder lives outside the individual microservice packages. When we run the microservices, we can refer to that code using require('../../utils/logger') and it works like a charm.
However, when trying to create the Docker image for my microservices project, the container gives me an error saying:
Error: Cannot find module '../../Utils/logger'
which makes a lot of sense, as we are building the Docker image inside the microservice project.
There are a few architectural decisions that need to be made here:
1. Move the utils code into each microservice as required.
Pro: the microservice remains completely self-contained, with no code-level dependency on any other package.
Con: maintaining cross-cutting concerns would be cumbersome, since every change has to be propagated to each copy.
2. Create a private npm module and declare it as a dependency in each microservice's package.json file. Not sure if that would work.
Any suggestions on this are highly appreciated.
Best,
- Vaibhav

Don't use require('../../utils/logger'), use npm packages
You should avoid sharing the same files across microservices via symlinks or by requiring from a common folder, because it destroys loose coupling.
Loose coupling is a design goal that seeks to reduce the inter-dependencies between components of a system, with the goal of reducing the risk that changes in one component will require changes in any other component. Loose coupling is a much more generic concept intended to increase the flexibility of a system, make it more maintainable, and make the entire framework more "stable".
Simply put, you can't have different versions of your logger file, but you can have different versions of your logger npm package.
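For instance, two services can pin different versions of the same shared package in their package.json files (the scope and version numbers here are illustrative):
In service A's package.json:
"dependencies": { "@vaibhav/logger": "^1.2.0" }
In service B's package.json:
"dependencies": { "@vaibhav/logger": "^2.0.0" }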
Implementation details for using npm modules as reusable components for Node.js microservices:
Choose a naming convention for packages. My advice is scoped packages. Example: @vaibhav/logger
Choose an npm registry. There are these options:
2.1. npmjs.com with public packages. It's free, but your packages must contain only universal code without any business-valuable details.
2.2. npmjs.com with private packages. It's fast, but not free.
2.3. Verdaccio, your own npm registry server. It's a free, simple Node.js solution that you install as a server in your infrastructure.
2.4. Nexus, a universal private registry with npm and Docker support.
If you use solution 2.3 or 2.4, then you need to choose an IP address or a domain name for your server. My advice is to use a domain name. Example: https://your-registry.com
If you use solution 2.3 or 2.4, then you need to choose an install approach in the .npmrc file inside your microservice. There are two options:
install all required packages from your registry. The .npmrc file will look like registry=https://your-registry.com. Your registry should be able to cache public packages.
install only your packages from your registry, and install everything else from the public registry. The .npmrc file will look like @vaibhav:registry=https://mycustomregistry.example.org
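For example, a minimal .npmrc for each option (the registry URLs and the scope are the illustrative ones from above):
# option 1: install everything from your registry (it must cache public packages)
registry=https://your-registry.com
# option 2: only your scoped packages come from your registry
@vaibhav:registry=https://mycustomregistry.example.org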
Define processes for package development, publishing, and updating the package version in each microservice's package-lock.json file. In our projects we define these processes this way:
We use GitHub flow for package development. There is only a master branch for publishing and feature branches for development. The master branch can be updated only via pull requests from developers or commits from the CI server.
We use Jenkins as our continuous integration server for automatically bumping the version and publishing after a pull request is merged. Jenkins runs the npm version command to update the version, then pushes the new commit to the master branch, and then publishes to the npm registry. Jenkins checks some of our rules and runs npm version with the patch or minor parameter. Updating the major version is a breaking change, which we do manually.
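A minimal sketch of that Jenkins publish step in shell (the commit message format is an assumption):
# bump the patch version; npm version also creates a git commit and a tag
npm version patch -m "ci: release %s"
# push the version commit and its tag back to master
git push origin master --follow-tags
# publish the new version to the configured registry
npm publish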
We don't have a 100% automated process for updating the package version in microservices. We automate only opening pull requests with the new package version in the package-lock.json file. Developers should check the build status and press the merge button manually.

It's not part of your question to elaborate how to deal with shared libraries in microservice ecosystems and what to avoid there, but if you like, you should read up on this to get at least a list of pros and cons of "sharing".
Besides that, you can create a library container which only offers this library to be mounted.
version: "2"
services:
shared:
image: me/mysharelib
m1:
volume_from:
- shared:ro
m2:
volume_from:
- shared:ro
while your mysharedlib image looks more or less like this
FROM busybox
# keep-alive script so the container stays running and the volume stays mountable
COPY bin/busyscript.sh /usr/local/bin/busyscript
WORKDIR /your/lib/folder
VOLUME /your/lib/folder
CMD ["busyscript"]
and your busyscript is just a dummy like this
#!/bin/sh
#set -x

pid=0

# SIGTERM-handler
term_handler() {
  if [ $pid -ne 0 ]; then
    kill -SIGTERM "$pid"
    wait "$pid"
  fi
  exit 143; # 128 + 15 -- SIGTERM
}

# setup handlers
# on callback, kill the last background process, which is `tail -f /dev/null`,
# and execute the specified handler
trap 'kill ${!}; term_handler' SIGTERM

echo "Started shared library container"

# wait forever
while true
do
  tail -f /dev/null & wait ${!}
done
As you can see, m1/m2 ... m10 mount the library, and it is truly shared across all microservices.
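With the volume mounted, each microservice can load the library straight from the shared path; a minimal sketch, assuming the library folder from the Dockerfile above contains a logger module exposing a log function:
// inside m1's application code
const logger = require('/your/lib/folder/logger');
logger.log('m1 started'); // assumes the shared module exposes a log function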
Alternatives:
You can of course use private NPM packages, or simply package the shared lib into the microservices m1..m10 during image build time.
What is described above suits you especially well when you want to replace the shared library in the stack with very little overhead, and want to ensure the library is in sync across all container instances.

Related

Is copying /node_modules inside a Docker image not a good idea?

Most/all examples I see online usually copy package.json into the image and then run npm install within the image. Is there a deal-breaker reason for not running npm install on the build server and then just copying everything, including the node_modules/ folder?
My main motivation for doing this is that we are using a private npm registry with security, and to run npm from within an image we would need to figure out how to securely embed credentials. Also, we are using yarn, and we could just leverage the yarn cache across projects if yarn runs on the build server. I suppose there are workarounds for these, but running yarn/npm on the build server, where everything is already set up, seems very convenient.
thanks
Public Dockerfiles out there are trying to provide a generalized solution.
Having dependencies declared in package.json makes it possible to share a single Dockerfile and not depend on anything that isn't publicly available.
But at runtime, Docker does not care how the files got into the container. So it is up to you how you push all the needed files into your container.
P.S. Consider layering. If you copy stuff under node_modules/, do it in one step, so that only one layer is used.
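A minimal sketch of such a Dockerfile, assuming node_modules/ was already built on the build server (the base image and file names are assumptions):
FROM node:18-alpine
WORKDIR /app
# one COPY for the prebuilt dependencies, so they occupy a single layer
COPY node_modules/ ./node_modules/
# application code in a separate layer, so code changes don't invalidate the deps layer
COPY package.json server.js ./
CMD ["node", "server.js"]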

Node.js development offline in Docker

I'm trying to implement a developer workflow with Docker, with the ability to develop offline (as in, not having to run npm install when you switch between branches that have differing dependencies).
The most intuitive way to do that is to store dependencies in source control. This has its own issues, especially when using modules with compiled native dependencies. I have tried nearly everything I could think of and find:
npm packing my project's dependencies and storing them in source control, but this doesn't store my dependencies' dependencies
storing node_modules in source control, copying it to the container and running npm rebuild, but it doesn't actually trigger a rebuild
running npm install --no-registry so it triggers a rebuild without trying to call out, but it actually calls out to the public registry anyway
other solutions I've seen, like Node-PAC, seem abandoned
npmbox looks the most promising, but it requires being installed globally on the target, which would work in a container I can build, but not in production, unless we start deploying containers in production.
Is this a fruitless effort? Lack of network access is rare, and offline installs would only really be needed when installing a new module or moving between revisions that have differing dependencies.
Another option is to set up a private npm registry and configure it to cache the public registry. There are several options to implement this; I would recommend trying Nexus: https://www.sonatype.com/nexus-repository-oss
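Once such a caching registry is running, pointing npm at it is a one-line .npmrc change (the URL and repository name are assumptions):
registry=https://your-nexus.example.com/repository/npm-group/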

Easy install of a Node.js package without publishing

I am looking for a way to deploy a Node.js app to multiple machines locally.
Is there some way to create a batch file or installer that will put my Node.js application and all its dependencies on multiple machines, and possibly install Node.js too, by sending one or more files to install?
Also, is there some way to push updates to all these machines when the code changes?
Basically, I want to be able to install my Node.js package/application in multiple locations locally without having to publish my work to npm. Any ideas? I can't seem to find anything out there except putting Node.js on a web server or publishing to npm.
This is quite a vast topic. Without using advanced tools, these two could work:
git pull origin master
npm install
or a solution with rsync
node js application and all its dependencies
Run an npm install where you're developing your application. Then, just tarball the whole thing, including the node_modules directory. When you deploy your tarball to another machine, be sure to run npm rebuild so that any binary dependencies are built for the platform you just deployed to. If you do your initial npm install on the same platform type, you can usually skip the rebuild step.
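A minimal sketch of that flow in shell (the application name and target path are assumptions):
# on the build machine
npm install
tar czf myapp.tar.gz --exclude=.git .
# on the target machine
mkdir -p /opt/myapp && tar xzf myapp.tar.gz -C /opt/myapp
cd /opt/myapp && npm rebuild   # rebuilds binary dependencies for this platform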
Also, is there some way to push updates to all these machines when the code changes?
There are an infinite number of ways, and what you pick depends on your needs. You could check in your whole project, including node_modules, to version control and just have a Bash script regularly pull from a branch and bounce things as necessary for your specific needs, as sketched below. Beware though that node_modules tends to be huge... it's usually left out of version control. Perhaps stick to keeping the tarball on a server and pulling that as necessary.
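A sketch of such a pull-and-bounce script (the path and the use of forever are assumptions):
#!/bin/sh
# run periodically, e.g. from cron
cd /opt/myapp || exit 1
git pull origin master
npm rebuild                 # in case binary dependencies changed
forever restart server.js   # bounce the app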
and possibly install Node.js too
Keep that separate. You don't need to deploy Node.js every time you deploy your application.

Deploying node app with self-maintained NPM modules

I am developing a very complex app that is using internally developed, open source NPM modules.
I often need to change one of those modules (extra features, bug fixing, etc.) in order for the main application to work.
At the moment, I have:
A directory called my_modules, containing one git repository for each module. For example module1, module2.
A directory called my_apps, where for example there is app1 which has module1 as a dependency
Under my_apps/app1/node_modules I have module1 and module2, installed via NPM
On the server, I deploy by pulling the git repository, running npm install and npm dedupe, and running the server with forever.
At this stage, if I have to fix something in one of the modules, I:
Fix it within my_apps/app1/node_modules/module1 (not git)
When it's all working, COPY the files over to my_modules/module1 and do a git push and npm publish
The server will pull the latest modules on the next deploy thanks to npm install
This is way, way less than ideal. It's just too error-prone. However:
Having a symbolic link my_apps/app1/node_modules/module1 => my_modules/module1 means that module1 will look for dependencies in its own path, which often causes problems (for example, I need to make sure that EVERY module uses the same copy of module1, which is imperative)
Having a git repo under my_apps/app1/node_modules/module1 feels dangerous, in case I accidentally overwrite changes by running NPM on the module. Also, once the change is fixed in the local git repo, I would still need to pull the changes into my_modules/module1. Yes, a step forward from copying files over...
What's the "recommended" way of dealing with this? Any best practices?

How to package & deploy Node.js + express web application?

I am new to Node.js programming and I have recently created a sample working web application using express, backbone and other complementary view technologies, with MongoDB. Now I am at a point where I want to deploy the same on a staging environment and I am not sure how to package this application and distribute it. [I can take care of MongoDB and set it up separately]
I am from the Java world, where we create jars for reusable libs and war/ear packages for web applications, which are deployed in a servlet container. Now, since Node.js itself acts as the web container here, how do I package my webapp?
Is there any standard format/guidelines for packaging node webapps built using express? (Is there a similar jar/war packaging system for node apps?)
How do I deploy it once packaged? Would it become an exe, since it is also its own container?
PS: As of now I am thinking of just manually copying all the required source files into the staging environment, running npm commands to download all dependencies on that machine, and then using 'forever' or some other mechanism to run my server.js. (Also, adding some sort of monitoring, in case the app crashes and forever fails.) I am not sure if that is the right way; I am sure there must be some standardized way of addressing this problem.
Deploying Node.js applications is very easy. In Maven, there is pom.xml; the related concept in Node.js is package.json. You can state your dependencies in package.json. You can also do environment-specific setup in package.json. For example, in the dev environment you can say:
I want to run unit tests.
but in production:
I want to skip unit tests.
You have local repositories for Maven under the .m2 folder. In Node.js, there is a node_modules folder under your project, with a folder for each module.
Let's come to the Grunt part of this answer. Grunt is a task runner for your frontend assets: HTML, JavaScript, CSS. For example, before deployment you can minify HTML, CSS, JavaScript, even images. You can also hook Grunt task runs into package.json, as sketched below.
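A sketch of how such tasks can hang off package.json scripts (the script commands are assumptions):
{
  "scripts": {
    "test": "mocha",
    "build": "grunt build",
    "start": "node app.js"
  }
}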
If you want to look at a sample application, you can find an example blog application here. Check folder structure and package.json for reference.
For deployment, I suggest Heroku for startup applications. You can find a howto here. This is simple git-based deployment.
To run the project, simply set your environment with NODE_ENV=development and run node app.js. Here app.js is in your project.
Here are the related concepts for Java and Node.js:
maven clean install => npm install
.m2 folder => node_modules(Under project folder)
mvn test => npm test(test section on package.json)
junit, powermock, ... => mocha, node-unit, ...
Spring MVC => Express.JS
pom.xml => package.json
import package => require('module_name')
There is no standardized way, but you're on the right track. If your package.json is up to date and well kept, you can just copy/zip/clone your app directory to the production system, excluding the node_modules.
On your production system, run npm install to install your dependencies, npm test if you have tests, and finally NODE_ENV=production node server.js.
Some recent slides that I found quite helpful, and that also cover wrappers like forever, can be found here.
Hope this might be helpful for somebody looking for a solution: packaging of Node.js apps can be done using the "npm pack" command. It creates a tarball of your application which can be installed and run in a production/staging environment.
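A minimal sketch of that flow (the tarball name depends on your package's name and version fields):
npm pack                        # produces <name>-<version>.tgz in the current directory
npm install ./myapp-1.0.0.tgz   # on the target machine, install straight from the tarball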
Is there any standard format/guidelines for packaging node webapps built using express? (Is there a similar jar/war packaging system for node apps?)
Yes, the CommonJS Packages specification:
This specification describes the CommonJS package format for distributing CommonJS programs and libraries. A CommonJS package is a cohesive wrapping of a collection of modules, code and other assets into a single form. It provides the basis for convenient delivery, installation and management of CommonJS components.
For your next question:
2. How do I deploy it once packaged? Would it become an exe, since it is also its own container?
I second Hüseyin's suggestion to deploy on Heroku for production. For development and staging I use Node-Appliance with VirtualBox and Amazon EC2, respectively:
This program takes a Debian machine built by build-debian-cloud or Debian-VirtualBox-Appliance and turns it into a Node.js "appliance", capable of running a Node application deployed via git.
Your webapp will not become an exe.
A few ways to approach this:
Push your code into a Git repository, excluding everything that isn't your code (node_modules/**), then pull it in your staging environment and run npm install to restore all dependencies
Create an NPM package out of it and install it via npm in your staging environment (this should also take care of all of the dependencies)
Manually copy/ssh the files to your staging environment (this can be automated with Grunt), then restore your dependencies via npm
I used zeit's pkg module. It can create cross-platform deliverables for Linux/Windows/macOS. I actually used it in production and it works fine without any issues.
It takes in all the JS scripts and packages them into a single file.
The reason I used it is that it helps in securing your source code. That way, in production at the customer's environment, they will have access to the application but not the source code.
Also, one of the advantages is that in the production environment you do not actually need to have the customer install Node.js, as the Node binary also gets packaged inside the build.
https://www.npmjs.com/package/pkg
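A minimal usage sketch (the Node version and target list are illustrative; pkg reads the entry point from your package.json):
# build self-contained binaries for three platforms in one run
npx pkg . --targets node14-linux-x64,node14-win-x64,node14-macos-x64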
