Are there any workable approaches to watch/reload within docker?
The use case here is development, where switching between branches may change one or more of backend, frontend or database provisioning files.
Example: I have a Node.js application. If server JS code changes, I want the backend server to restart. If package.json changes, I want the "install" container (which runs npm install, saving node_modules to a shared volume) to run again. If SQL files change, I want the provisioning container to run its psql commands again.
Basically, I want to watch certain files and restart the relevant process when they change (the container itself is not technically restarted). Supervisord isn't cut out for watching, but process managers like PM2 or Forever would normally be the slam-dunk choice if it weren't for the Docker consideration.
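For concreteness, the kind of docker-compose layout I'm working with looks roughly like this (just a sketch; service names, image tags and paths are placeholders):

# docker-compose.yml (sketch; names and paths are hypothetical)
version: "3"
services:
  backend:
    image: node:16
    working_dir: /app
    volumes:
      - ./:/app
      - node_modules:/app/node_modules
    command: node server.js      # should restart when server JS changes
  install:
    image: node:16
    working_dir: /app
    volumes:
      - ./:/app
      - node_modules:/app/node_modules
    command: npm install         # should re-run when package.json changes
  provision:
    image: postgres:13
    volumes:
      - ./sql:/sql
    command: psql -f /sql/init.sql   # should re-run when SQL files change (connection details omitted)
volumes:
  node_modules: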
Related
I have an EC2 instance that is running a Node application. I am thinking of doing a container implementation using Docker. PM2 is running two applications: one is the actual Node application (Express and Pug) and a cron job using Agenda. Is it a good idea to put my applications in one container?
I am not yet familiar with the pros and cons of this, and I read that Docker is already a process manager. How will PM2 fit into all of this once I implement it? Or should I just ditch Docker and run the applications on the native Linux of my EC2 instance?
You have a couple of questions; I'll try to answer them below:
1. Is it a good idea to put my applications in one container?
It depends. There are many cases where you would want the same container to do multiple things, but it really comes down to the CPU/RAM/memory usage of the job, and how often it runs.
Anyway, from experience I can say that if I run a cron job from the same container, I always use a worker approach for it, using either Node.js's core worker_threads or cluster module, because you do not want the cron job to impact the behavior of the main thread. I have an example of running two applications on multiple threads in the following repo.
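For illustration only (a hedged sketch, not the code from that repo; the file names are assumptions), the worker_threads approach boils down to keeping the job off the main thread:

// index.js - main thread runs the web app, the job runs in a worker thread
const { Worker, isMainThread } = require('worker_threads');

if (isMainThread) {
  // spawn the background job in its own thread so it cannot block the server
  new Worker(__filename);
  require('./server');   // hypothetical Express app
} else {
  require('./cron-job'); // hypothetical Agenda job runner
}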
2. should I just ditch docker and run the applications in the native linux of my ec2
Docker and PM2 are two really different things. Docker basically containerizes your entire Node app so it is much easier to ship. PM2 is a process manager for Node: it makes sure your app stays up and comes with a nice metrics and logs UI (PM2 metrics). You can definitely use the two together, as PM2 also makes sure your app starts up again after it crashes.
However, if you use PM2 inside Docker, you have to use pm2-runtime. Example Dockerfile:
FROM node:16.9.0
WORKDIR /home/usr/app
COPY . .
RUN npm ci && npm run build
# default command is starting the server
CMD ["npx", "pm2-runtime", "npm", "--", "start"]
I would like to use nodemon to restart my project when its files are changed. I think nodemon works by listening for inotify events to trigger reloading a node.js project.
The project runs in a docker container, and the project files are in a mounted volume.
When the project files are edited from inside the docker container, for example
docker-compose exec dev vim server.js
nodemon works correctly and restarts the server.
However, when an editor running on the host machine is used, nodemon does not pick up the changes and restart the program.
The contents of the files in the docker container do in fact change, so I suspect editing files this way just doesn't trigger an FS event.
Is it possible to set this up so that editing files on the host machine causes file system events to occur in the Docker container? Why does this not happen already?
Platform Info:
Docker for Windows (Hyper-V)
node docker container
WebStorm -- Host based editor
It looks like file system events just don't work when Docker is running in Hyper-V and the changes happen on the host. But it's possible to work around that limitation by enabling polling in nodemon:
nodemon -L server.js
In WebStorm the full command that ends up getting used is
docker-compose run dev node node_modules/nodemon/bin/nodemon.js -L server.js
More info:
https://github.com/remy/nodemon#application-isnt-restarting
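If you drive this from docker-compose directly rather than through WebStorm, the same flag can live in the compose file; a rough sketch, with the service name and paths assumed:

# docker-compose.yml (sketch)
services:
  dev:
    build: .
    working_dir: /usr/src/app
    volumes:
      - ./:/usr/src/app
    # -L forces polling, since the Hyper-V share doesn't forward inotify events
    command: npx nodemon -L server.js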
My goal is to set up a Docker container that automatically restarts a NodeJS server when file changes are detected from the host machine.
I have chosen nodemon to watch the files for changes.
On Linux and Mac environments, nodemon and docker are working flawlessly.
However, when I am in a Windows environment, nodemon doesn't restart the server.
The files are updated on the host machine, and are linked using the volumes parameter in my docker-compose.yml file.
I can see the files have changed when I run docker exec <container-name> cat /path/to/fileChanged.js. This way I know the files are being linked correctly and have been modified in the container.
Is there any reason why nodemon doesn't restart the server for Windows?
Use nodemon --legacy-watch to poll for file changes instead of listening to file system events.
VirtualBox doesn't pass file system events over the vboxfs share to your Linux VM. If you're using Docker for Windows, it would appear Hyper-V doesn't propagate file system events either.
As a 2021 side note, Docker for Mac/Windows' new gRPC FUSE file system for mounting local files into the VM should send file system events across now.
2022 note: it looks like Windows/WSL Docker doesn't share FS events with the Linux VM (see the comments from @Mohamed Mirghani and @Ryan Wheale, and the GitHub issue).
It is simple: according to the docs, you must change
nodemon server.js
to:
nodemon --legacy-watch server.js
As mentioned by others, using nodemon --legacy-watch will work; however, the default polling rate is quite taxing on your CPU. In my case, it was consuming 30% of my CPU just by looping through all the files in my project. I would advise you to specify the polling interval, as mentioned by @Sandokan El Cojo.
You can do so by either adding "pollingInterval": 4000 (4 seconds in this example) to your nodemon.json file or specifying it with the -P or --polling-interval flag in the command.
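A hedged example of what that might look like in nodemon.json (the 4-second interval is just the value from this example):

{
  "legacyWatch": true,
  "pollingInterval": 4000
}

The equivalent on the command line would be nodemon --legacy-watch --polling-interval 4000 server.js.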
This was an issue in Docker for Windows. Now it's fixed:
https://www.docker.com/blog/new-filesharing-implementation-in-docker-desktop-windows/
I'm going to deploy a Node.js mobile web application on two remote servers (Linux OS).
I'm using SVN server to manage my project source code.
To simply and clearly manage the app, I decided to use Jenkins.
I'm new to Jenkins, so installing and configuring it was quite a difficult task.
But I couldn't find out how to set up Jenkins to deploy to both remote servers simultaneously.
Could you help me?
You should look into supervisor. It's language- and application-type-agnostic; it just takes care of (re)starting applications.
So in your jenkins build:
You update your code from SVN
You run your unit tests (definitely a good idea)
You either launch an svn update on each host or copy the current content to them (I'd recommend the latter, because there are many ways for SVN to fail, and this also lets you include SVN_REVISION in some .js file, for instance)
You execute on each host: fuser -k -n tcp $DAEMON_PORT; this will kill the currently running application listening on port $DAEMON_PORT (the one you use in your Node.js app)
And the best part is obviously that supervisor will automatically start your Node.js app at system startup (provided supervisor is correctly installed, e.g. apt-get install supervisor on Debian) and restart it in case of failure.
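Put together, a hedged sketch of the per-host Jenkins "Execute shell" step might look like this (the host name, port and paths are placeholders; it relies on supervisor's automatic restart to bring the app back up after fuser kills it):

# sketch of a per-host deploy step (placeholders throughout)
APP_HOST=app1.example.com
DAEMON_PORT=3000
ssh deploy@$APP_HOST "cd /usr/local/share/dir_app && svn update"
# kill the running app; supervisor notices the exit and restarts it with the new code
ssh deploy@$APP_HOST "fuser -k -n tcp $DAEMON_PORT"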
A supervisord sub-config for a Node.js app looks like this:
# /etc/supervisor/conf.d/my-node-app.conf
[program:my-node-app]
user = running-user
environment = NODE_ENV=production
directory = /usr/local/share/dir_app
command = node app.js
stderr_logfile = /var/log/supervisor/my-node-app-stderr.log
stdout_logfile = /var/log/supervisor/my-node-app-stdout.log
There are many configuration parameters.
Note: there is also a Node.js package called supervisor; that is not the one I'm talking about, and I haven't tested it.
Since the hosts run Linux, you need to SSH into them and run commands to get the application updated:
Work out the application-update workflow in a shell script. In particular, you need to daemonize your Node app so that a completed Jenkins job execution will not kill your app when its shell exits. Here's a nice article on how to do this: Running node.js Apps With Upstart, or you can use a pure Node.js tool like forever. Assume you worked out a script under /etc/init.d/myNodeApp.
SSH to your Linux hosts from Jenkins. You need to make sure the SSH private key file has been copied to /var/lib/jenkins/.ssh/id_rsa and is owned by the jenkins user.
Here's an example shell step in jenkins job configuration:
ssh <your application ip> "service myNodeApp stop; cd /ur/app/dir; svn update; service myNodeApp restart"
Since node is basically a single process, when something goes terribly wrong, the whole application dies.
I now have a couple of apps built on Express, and I am using some manual methods to prevent extended downtime (process.on('uncaughtException') and a custom heartbeat monitor).
Any suggestions from the community?
Best-practices? Frameworks?
Thanks!
Use something like forever
or use supervisor.
Just npm link and then sudo supervisor server.js.
These types of libraries also support hot reloading. There are some which you use from the command line, and they run Node services as subprocesses for you. There are others which expect you to write your code to reload itself.
Ideally, what you want to move towards is a full-blown load balancer that is failure-safe. If a single Node process behind your load balancer crashes, you want it to be quietly restarted and all the connections and data rescued.
Personally, I would recommend supervisor for development (it's written by isaacs!) and a full-blown load balancer (either nginx or Node) for your real production server.
Of course, you're already running multiple Node server processes in parallel because you care about scaling across multiple cores, right? ;)
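As a minimal sketch of that multi-core setup (not tied to any particular framework; './app' is a placeholder for a server module that calls listen()):

// cluster.js - fork one worker per CPU core and replace any worker that dies
const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  os.cpus().forEach(() => cluster.fork());
  // restart a worker whenever one crashes, so a single failure doesn't take the app down
  cluster.on('exit', () => cluster.fork());
} else {
  require('./app'); // hypothetical server module
}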
Use forever.
"A simple CLI tool for ensuring that a given script runs continuously (i.e. forever)"
Just install it with npm
npm install forever
and type
forever start app.js
Take a look at the forever module.
If you're using Ubuntu you can use upstart (which is installed by default).
$ cat /etc/init/my_app.conf
description "my_app"
author "me"
start on (local-filesystems and net-device-up IFACE=eth0)
stop on shutdown
respawn
exec sh -c "env NODE_ENV=production node /path/myapp/app.js >> /var/log/node.log 2>&1"
"respawn" mean that the app will be restarted if it dies.
To start the app
start my_app
For other commands
man initctl
I'd strongly recommend forever too. I used it yesterday and it's a breeze:
Install it: npm install forever
Start your app: forever start myapp.js
Check that it's working: forever list
Try killing your app:
ps
Get your myapp.js PID and run kill <pid>
Run forever list and you'll see it's running again.
You can try using Fugue, a library for node.js similar to Spark or Unicorn:
https://github.com/pgte/fugue
Fugue can manage any node.js server type, not just web servers, and it's set up and configured as a node.js script, not a CLI command, so normal node.js build & deployment toolchains can use it.