Deploy Node.js application with Docker - node.js

Is it possible to develop a Node.js application on Windows (or another platform, e.g. Raspbian) and deploy it on Linux with Docker?

Yes, if you dockerize your Node.js application. nodejs.org explains:
You create a new directory where all the files will live
(package.json, server.js, ...).
You create a Dockerfile (in that same folder) using FROM node:argon, which is Node 4.6.1 (or another version: see hub.docker.com/_/node/; latest is 7.0.0).
That is:
FROM node:argon
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
# Bundle app source
COPY . /usr/src/app
EXPOSE 8080
CMD [ "npm", "start" ]
You build and run:
docker build -t <your username>/node-web-app .
docker run -p 49160:8080 -d <your username>/node-web-app
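Since the CMD above runs npm start, the package.json in that directory needs a start script. A minimal sketch (the name and version are placeholders):
{
  "name": "node-web-app",
  "version": "1.0.0",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  }
}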

Related

Create Node.js app using Docker without installing Node on host machine

New to Docker. I wanted to create a simple Node.js app using Docker on my Ubuntu OS. But all the tutorials/YouTube videos first install Node on the host machine, then run and test the app, and then dockerize it. In other words, all the tutorials simply teach "how to dockerize an existing app".
If anyone can provide a little guidance on this, it would be really helpful.
First create a directory for your source code and create something like the starter app from here in app.js. You have to make one change though: hostname has to be 0.0.0.0 rather than 127.0.0.1.
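For reference, a minimal app.js along those lines (the Node.js "hello world" starter with that one change applied) might look like this:
const http = require('http');

// 0.0.0.0 rather than 127.0.0.1, so the server accepts connections coming
// through Docker's port mapping instead of only from inside the container
const hostname = '0.0.0.0';
const port = 3000;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello World\n');
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});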
Then run an interactive node container using this command
docker run -it --rm -v $(pwd):/app -p 3000:3000 node /bin/bash
This container has your host directory mapped to /app, so if you do
cd /app
ls -al
you should see your app.js file.
Now you can run it using
node app.js
If you then switch back to the host, open a browser, and go to http://localhost:3000/, you should get a 'Hello world' response.
To dockerize it, follow this Dockerfile:
# Base image (any current Node version works)
FROM node:14
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "node", "server.js" ]
docker build . -t <your username>/node-web-app
docker run -p 49160:8080 -d <your username>/node-web-app
You can create the package.json (and package-lock.json) files manually if you don't want to install node/npm on your local machine.
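Alternatively, you can generate those files with npm running inside a throwaway container, so nothing gets installed on the host (a sketch; express is just an example dependency):
docker run -it --rm -v $(pwd):/app -w /app node npm init -y
# --package-lock-only records the dependency in package-lock.json without
# creating a node_modules folder on the host
docker run -it --rm -v $(pwd):/app -w /app node npm install --package-lock-only express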

Docker Build CLI Doesn’t update the code (TS, node)

When I run docker build on my codebase, it doesn’t apply the file modifications. I’m using TypeScript with Node as the language/framework. Here is my Dockerfile:
#Imports the node runtime/os base image
FROM node:14
#like cd into the working directory
WORKDIR /usr/src/app
ENV PORT 8080
#copies the package.json from the local dev machine and pastes it in the ./ directory on google cloud run
COPY package*.json ./
#runs on the cloud run instance terminal
RUN npm install --only=production
#copy the actual code from the . directory (root) and places it in the cloud run root.
COPY . .
EXPOSE 8080
RUN rm -rf src
#Start the service on instance startup
CMD ["npm", "start"]
The issue was that the TS code was not getting compiled into JS. After explicitly running the compiler and checking the TS config, the problem was resolved.
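One way to do that compilation inside the image (a sketch, assuming package.json has a "build" script that runs tsc and that npm start executes the emitted JS):
FROM node:14
WORKDIR /usr/src/app
ENV PORT 8080
COPY package*.json ./
# Install dev dependencies as well: typescript is needed for the build step
RUN npm install
COPY . .
# Compile TypeScript to JavaScript (assumes "build": "tsc" in package.json)
RUN npm run build
# Drop the TS sources after compiling, keeping only the emitted JS
RUN rm -rf src
EXPOSE 8080
CMD ["npm", "start"]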

Need to use wkhtmltopdf in docker node:alpine image

I have the Docker image node:alpine and need to use wkhtmltopdf in some of my API services. Here is my Dockerfile:
FROM node:alpine
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json .
# For npm@5 or later, copy package-lock.json as well
# COPY package.json package-lock.json .
RUN npm install
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
Remember, I already have a wkhtmltopdf container in my Docker setup.
You can download it from its repo on GitHub and call the binary within your container:
RUN curl -L https://github.com/wkhtmltopdf/wkhtmltopdf/releases/download/0.12.4/wkhtmltox-0.12.4_linux-generic-amd64.tar.xz | tar -xJ
The above downloads and extracts it into the active working directory of your Docker image. (Note: on node:alpine you may need to install curl and xz first, e.g. RUN apk add --no-cache curl xz, and the generic build is linked against glibc, so it may not run on Alpine's musl libc without a compatibility layer.)
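To invoke the binary from Node once it is in place, something like this could work (a sketch: the path wkhtmltox/bin/wkhtmltopdf follows the archive's layout, and the URL and output file are placeholders):
const { execFile } = require('child_process');
const path = require('path');

// Path to the extracted binary; adjust if the archive was unpacked elsewhere
const wkhtmltopdfBin = path.join(__dirname, 'wkhtmltox', 'bin', 'wkhtmltopdf');

// Usage: wkhtmltopdf <input-url> <output-file>
execFile(wkhtmltopdfBin, ['https://example.com', '/tmp/out.pdf'], (err, stdout, stderr) => {
  if (err) {
    console.error('wkhtmltopdf failed:', stderr || err.message);
    return;
  }
  console.log('PDF written to /tmp/out.pdf');
});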

Using SemanticUI with Docker

I set up a Docker container running an Express app. Here is the Dockerfile:
FROM node:latest
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
RUN npm install -g nodemon
RUN npm install --global gulp
RUN npm i gulp
# Bundle app source
COPY . /usr/src/app
EXPOSE 3000
CMD [ "nodemon", "-L", "./bin/www" ]
As you can see, it uses the Node.js image and creates an app folder in my container, itself containing my application. When built, the container runs npm install and installs my modules in the container (thanks to that, I don't have a node_modules folder in my local folder). I want to integrate Semantic UI, which uses gulp. So I installed gulp in my container, but the files created by Semantic are only present in my container; they are not in my local folder. How can I make the files created in my container also appear locally, so I can modify them?
I thought that one of Docker's strengths was not having node_modules installed in your local app folder... maybe I am wrong; if so, please correct me.
You can use a data volume to share files between the container and the host machine.
Something like this:
docker run ... -v /path/on/host:/path/in/container ...
or, if you are using docker-compose, see the volume configuration reference:
...
volumes:
- /path/on/host:/path/in/container
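A fuller sketch of such a docker-compose.yml for this app (the service name and container path are assumptions):
version: "3"
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      # Map the host project directory into the container, so files that
      # gulp/Semantic UI generate inside the container also appear locally
      - ./:/usr/src/app
      # Anonymous volume so the mount above doesn't shadow the node_modules
      # installed in the image
      - /usr/src/app/node_modules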

How should I run thumbd as a service inside a Docker container?

I'd like to run thumbd as a service inside a node Docker image! At the moment I'm just running it before I start my app, which is no use to me! Is there a way I could set up my Dockerfile to run it as an init.d service on startup without blocking any of my other Docker commands?
My Dockerfile goes as follows:
FROM node:6.2.0
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
# Thumbd
RUN npm install -g thumbd
RUN mkdir -p /var/log/
RUN echo "" > /var/log/thumbd.log
RUN thumbd server --aws_key=<KEY> --aws_secret=<SECRET> --sqs_queue=<QUEUE> --bucket=<BUCKET> --aws_region=us-west-1 --s3_acl=public-read
# Bundle app source
COPY . /usr/src/app
EXPOSE 8080
CMD npm run build && npm start
It's probably easiest to run thumbd in its own container, since it works without any direct link to your application. Docker also pushes the idea of a single process per container.
FROM node:6.2.0
# Thumbd
RUN set -uex; \
npm install -g thumbd; \
mkdir -p /var/log/; \
touch /var/log/thumbd.log
CMD thumbd server --aws_key=<KEY> --aws_secret=<SECRET> --sqs_queue=<QUEUE> --bucket=<BUCKET> --aws_region=us-west-1 --s3_acl=public-read
You can use Docker Compose to orchestrate running multiple containers in your project.
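A minimal docker-compose.yml sketch for that (service names and the location of the thumbd Dockerfile are assumptions):
version: "3"
services:
  app:
    # The application image, built from the original Dockerfile
    build: .
    ports:
      - "8080:8080"
  thumbd:
    # The thumbd-only image above, assumed to live in ./thumbd/Dockerfile
    build: ./thumbd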
If you really want to run multiple processes in a container, use an init system like s6 or possibly supervisord.
