Node.js Koa app and CouchDB in single container - couchdb

I have a Koa.js app which I want to run in a Docker container. This Koa app requires CouchDB, which I want to ship in the same container. I know this is not best practice, but it is indeed the easiest way for my users to get started.
Dockerfile:
# DOCKER-VERSION 1.2.0
FROM centos:centos6
# Enable EPEL for CouchDB and Node.js
RUN yum -y update; yum clean all
RUN yum -y install epel-release; yum clean all
RUN yum -y install tar
# Install Node.js and CouchDB
RUN yum -y install couchdb; yum clean all
RUN service couchdb start
RUN yum install -y nodejs
RUN yum install -y npm
# Bundle app source
ADD . .
# Install app dependencies
RUN npm install
RUN npm install -g n
RUN n 0.11.12
# Expose port and run
EXPOSE 8080
CMD ["npm", "start"]
which works fine: the app gets launched, but it can't connect to CouchDB. Throwing in a
RUN service couchdb start
responds with OK, so it seems to work, but
curl -X GET 127.0.0.1:5984
responds with
curl: (7) couldn't connect to host
The same happens for the Koa.js app:
error: error stack=Error: connect ECONNREFUSED
at exports._errnoException (util.js:745:11)
at Object.afterConnect [as oncomplete] (net.js:995:19), code=ECONNREFUSED, errno=ECONNREFUSED, syscall=connect
Does anyone know what I am missing or what I am doing wrong?

The only command that runs when you start this image is the one on the CMD line. Every line before that creates a read-only, non-running image layer. Thus, the line RUN service couchdb start will start the service for an instant, until the step is marked as successful; then Docker stops that intermediate container, saves it as a layer, and moves on to the next line. The "running" state of the service doesn't persist.
It is a common misconception, and one I fell into when I started.
There are three options, the first being the fastest yet most hacky, and the last being the most work but most proper:
Put service couchdb start && npm start as your CMD line.
Create a startup.sh script, do all the "running" that you need to do in there, and call that as your CMD line.
Use a process manager designed to do this for you; supervisord is often recommended.
This is a common issue, so if you read through the search results for "start service in docker" you'll find more information on the subject.
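To illustrate the second option: a minimal startup.sh (a sketch; it assumes CouchDB is installed as a SysV service in the image and that npm start runs the app in the foreground) could look like this:

```shell
#!/bin/bash
# Start CouchDB in the background. Unlike in a RUN step, this runs at
# container start, so the service persists for the container's lifetime.
service couchdb start

# Wait until CouchDB answers on 5984 before starting the app
until curl -s http://127.0.0.1:5984/ > /dev/null; do
  sleep 1
done

# exec replaces the shell so the app runs in the foreground and receives signals
exec npm start
```

You would then ADD the script into the image and set CMD ["sh", "/startup.sh"] so it becomes the container's single startup command.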

Related

Nodejs or node returns nothing on Ubuntu

I've built a JavaScript app running on Node within my macOS environment, and everything works great. Now I've created an Azure Ubuntu server and rsync'd the source from my machine.
I've duplicated the app requirements by installing npm, node, and all the packages required. I SSH into the server and when I run the app from the Ubuntu server via
$node app.js
All that is returned is
$
Reading that Ubuntu uses nodejs-legacy, I've also tried
$nodejs app.js
Same result
$node -v
v4.7.2
I've also built a package.json file and when executing with
npm start
it immediately returns back to $.
The reason it wasn't working is that the default APT repository used when installing nodejs on Ubuntu is outdated. I ran the following code to fix the problem. It automatically uninstalls the incorrect packages, sets up the correct repository, and re-installs.
# Sets up the correct APT repository hosted by NodeSource, and adds the PGP key to the system's APT keychain
$ curl -sL https://deb.nodesource.com/setup_6.x | sudo -E bash -
# Installs Node.js
$ sudo apt-get install -y nodejs
# Updates NPM
$ sudo npm install npm --global
All apps work as intended now!

How to restart a docker service using Shell Script

I have a Node service which is running in a Docker container.
Due to the following exception, the service stops after some time.
events.js:141 throw er; // Unhandled 'error' event
Error: read ECONNRESET
at exports._errnoException (util.js:870:11)
at TLSWrap.onread (net.js:544:26)
I am not aware of why this exception comes up.
I am looking for a workaround that restarts the service once it stops.
I am using a shell file to run these services, so is there something I can add to the shell file to restart the stopped service?
Here is a sample of my shell file:
#!/bin/bash
ORGANISATION="$1"
SERVICE_NAME="$2"
VERSION="$3"
ENVIRONMENT="$4"
INTERNAL_PORT_NUMBER="$5"
EXTERNAL_PORT_NUMBER="$6"
NETWORK="$7"
docker build -t ${ORGANISATION}/${SERVICE_NAME}:${VERSION} --build-arg PORT=${INTERNAL_PORT_NUMBER} --build-arg ENVIRONMENT=${ENVIRONMENT} --no-cache .
docker stop ${SERVICE_NAME}
docker rm ${SERVICE_NAME}
sudo npm install
sudo npm install -g express
docker run -p ${EXTERNAL_PORT_NUMBER}:${INTERNAL_PORT_NUMBER} --network ${NETWORK} --restart always --name ${SERVICE_NAME} -itd ${ORGANISATION}/${SERVICE_NAME}:${VERSION}
Here is my Dockerfile
FROM ubuntu
ARG ENVIRONMENT
ARG PORT
ENV PORT $PORT
ENV ENVIRONMENT $ENVIRONMENT
RUN apt-get update -qq
RUN apt-get install -y build-essential nodejs npm nodejs-legacy vim
RUN mkdir /database_service
ADD . /database_service
WORKDIR /database_service
RUN npm install -g path
RUN npm cache clean
EXPOSE $PORT
ENTRYPOINT [ "node", "server.js" ]
CMD [ $PORT, $ENVIRONMENT ]
Thanks in advance.
You can use docker run --restart always .... Then Docker will restart the container every time it is stopped.
The error comes from a TCP connection that was abruptly closed, maybe by a database or a websocket.
I don't know why you use npm in your script, because there it runs outside of the container. If you want the packages installed inside the container, add the npm install to a RUN instruction in your Dockerfile.
Maybe take a look at docker-compose. With it you can write your configuration in a docker-compose.yml file, and simply use docker-compose up --build to get the same functionality as this script.
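As a sketch of what that could look like for the script above (the service name, ports, and build-arg values here are assumptions, not taken from your setup):

```yaml
version: "2"
services:
  database_service:
    build:
      context: .
      args:
        PORT: "8080"
        ENVIRONMENT: "development"
    ports:
      - "8080:8080"
    restart: always
```

With this file in place, docker-compose up --build -d rebuilds the image and restarts the container in one step, and restart: always replaces the --restart always flag from the docker run command.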

How can you get Grunt livereload to work inside Docker?

I'm trying to use Docker as a dev environment in Windows.
The app I'm developing uses Node, NPM and Bower for setting up the dev tools, and Grunt for its task running, and includes a live reload so the app updates when the code changes. Pretty standard. It works fine outside of Docker but I keep running into the Grunt error Fatal error: Unable to find local grunt. no matter how I try to do it inside Docker.
My latest effort involves installing all the npm and Bower dependencies into an app directory in the image at build time, as well as copying the app's Gruntfile.js to that directory.
Then in docker-compose I create a volume linked to the host app, and ask Grunt to watch that volume using Grunt's --base option. It still won't work; I still get the fatal error.
Here are the Docker files in question:
Dockerfile:
# Pull base image.
FROM node:5.1
# Setup environment
ENV NODE_ENV development
# Setup build folder
RUN mkdir /app
WORKDIR /app
# Build apps
#globals
RUN npm install -g bower
RUN echo '{ "allow_root": true }' > /root/.bowerrc
RUN npm install -g grunt
RUN npm install -g grunt-cli
RUN apt-get update
RUN apt-get install ruby-compass -y
#locals
ADD package.json /app/
ADD Gruntfile.js /app/
RUN npm install
ADD bower.json /app/
RUN bower install
docker-compose.yml:
angular:
  build: .
  command: sh /host_app/startup.sh
  volumes:
    - .:/host_app
  net: "host"
startup.sh:
#!/bin/bash
grunt --base /host_app serve
The only way I can actually get the app to run at all in Docker is to copy all the files over to the image at build time, create the dev dependencies there and then, and run Grunt against the copied files. But then I have to run a new build every time I change anything in my app.
There must be a way? My Django app is able to do a live reload in Docker no problems, as per Docker's own Django quick startup instructions. So I know live reload can work with Docker.
PS: I have tried leaving the Gruntfile on the Volume and using Grunt's --gruntfile option but it still crashes. I have also tried creating the dependencies at Docker-Compose time, in the shared Volume, but I run into npm errors to do with unpacking tars. I get the impression that the VM can't cope with the amount of data running over the shared file system and chokes, or maybe that the Windows file system can't store the Linux files properly. Or something.
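One common workaround (a sketch, untested against this exact setup; the /app paths mirror the Dockerfile above) is to mount the host app over the image's build directory but shadow node_modules with an anonymous container-side volume, so the dependencies installed at build time, including the local grunt, remain visible:

```yaml
angular:
  build: .
  command: grunt --base /app serve
  volumes:
    - .:/app
    - /app/node_modules
  net: "host"
```

Here the second volume entry tells Docker to keep the container's own /app/node_modules rather than shadowing it with the (empty or incompatible) directory from the Windows host.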

Docker - Properly Mounting Host Directory in Docker Container (Windows)

I am having some trouble mounting a directory on my machine into my Docker container. I would like to mount a directory containing files necessary to run a node server. So far, I have successfully been able to run and access my server in browser using the Dockerfile below:
# Use an ubuntu base image
FROM ubuntu:14.04
# Install Node.js and npm (this will install the latest version for ubuntu)
RUN apt-get update
RUN apt-get -y install curl
RUN curl -sL https://deb.nodesource.com/setup_0.12 | sudo bash -
RUN apt-get -y install nodejs
RUN apt-get -y install git
# Bundle app source (note: all of my configuration files/folders are in the current directory along with the Dockerfile)
COPY . /src
# Install app dependencies
#WORKDIR /src
RUN npm install
RUN npm install -g bower
RUN bower install --allow-root
RUN npm install -g grunt
RUN npm install -g grunt-cli
#What port to expose?
EXPOSE 1234
#Run grunt on container start
CMD grunt
And these commands:
docker build -t test/test .
docker run -p 1234:1234 -d test/test
However, I figured that I would like the configuration files and whatnot to persist, and thought to do this by mounting the directory (with the files and Dockerfile) as a volume. I used other solutions on StackOverflow to get this command:
docker run -p 1234:1234 -v //c/Users/username/directory:/src -d test/test
My node server seems to start up fine (no errors in the log), but it takes significantly longer to do so, and when I try to access my webpage in browser I just get a blank page.
Am I doing something incorrectly?
EDIT: I have gotten my server to run--seems to have been a weird error in my configuration. However, it still takes a long time (around a minute or two) for my server to start when I mount a volume from my host machine. Does anyone have some insight as to why this is?

Deploy MEAN.JS on Google Cloud Platform

I tried to deploy MEAN.JS on Google Cloud Platform (also the KeystoneJS CMS), but it doesn't work. I use the online command-line tool.
In order, I installed Node.js, MongoDB, Bower and Grunt, then I tried to deploy MEAN.JS:
Install Node.js :
sudo apt-get install curl
curl -sL https://deb.nodesource.com/setup | sudo bash -
sudo apt-get install -y nodejs nodejs-legacy
Install MongoDB :
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv 7F0CEB10
echo 'deb http://downloads-distro.mongodb.org/repo/debian-sysvinit dist 10gen' | sudo tee /etc/apt/sources.list.d/mongodb.list
sudo apt-get update
sudo apt-get install -y mongodb-org
Install bower et Grunt :
sudo npm install -g bower
sudo npm install -g grunt-cli
Install MEAN.js :
sudo npm install -g generator-meanjs
mkdir mean
cd mean
yo meanjs
grunt
Here is the result on the command-line board :
Running "jshint:all" (jshint) task
53 files lint free.
Running "csslint:all" (csslint) task
2 files lint free.
Running "concurrent:default" (concurrent) task
Running "watch" task
Running "nodemon:dev" (nodemon) task
Waiting...
[nodemon] v1.2.1
[nodemon] to restart at any time, enter rs
[nodemon] watching: app/views/**/*.* gruntfile.js server.js config/**/*.js app/*/*.js
[nodemon] starting node --debug server.js
debugger listening on port 5858
NODE_ENV is not defined! Using default development environment
js-bson: Failed to load c++ bson extension, using pure JS version
Failed to load c++ bson extension, using pure JS version
MEAN.JS application started on port 3000
I can define the NODE_ENV variable (test, development, all...) but the problem remains.
The problem is: it should work, but when I try to access my IP:port (in this case 146.148.113.68:3000), I get "This webpage is not available".
Is it a problem with the VM, the packages, or MEAN.JS? I have the same problem with the KeystoneJS CMS.
Thanks !
Are you sure the ports on your machine are open for access? When you deploy an app on Compute Engine, you have to edit the network settings to allow custom ports. There are easy checkbox options for allowing HTTP and HTTPS traffic, but for custom ports you will have to add the port in the settings.
This documentation might be helpful but you can always find these things in the Cloud Console.
Also, now Google Cloud Launcher also supports MEAN stack deployment with both MEAN.io and MEAN.js flavors which simplifies the whole process.
Everything looks fine. The only thing to change is the firewall settings, which are blocking your web application; there is no problem with the VM or the KeystoneJS CMS. You need to change the firewall settings for incoming and outgoing traffic:
You need to specify the IP and port number. I have given access to all ports temporarily for testing purposes.
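On the command line, opening the port might look like the following (a sketch; the rule name is made up, and 0.0.0.0/0 opens the port to everyone, which is only acceptable for a quick test):

```shell
# Allow inbound TCP traffic on port 3000 from any source (testing only)
gcloud compute firewall-rules create allow-node-3000 \
    --allow tcp:3000 \
    --source-ranges 0.0.0.0/0
```

After the rule is created, 146.148.113.68:3000 should be reachable, assuming the app is listening on 0.0.0.0 rather than only on localhost.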
Try other services to host your MEAN.JS web application:
Heroku: https://www.heroku.com/
Nodejitsu: https://www.nodejitsu.com/
If it works on these platforms, then the problem is not with your cloud.
