Docker EXPOSE. Can't get it - node.js

For the past two days I've been having trouble with Docker and I can't get it. According to the Docker docs, you can expose the ports on which a container will listen for connections with EXPOSE. So far, so good!
If my app listens on port 8080, I should expose my Docker container with EXPOSE 8080 and bind it to port 80 of the main host with docker run -p 80:8080.
Here is my Dockerfile:
# DOCKER-VERSION 0.0.1
FROM ubuntu:14.10
# make sure apt is up to date
RUN apt-get update
# install nodejs and npm
RUN apt-get install -y nodejs-legacy npm git git-core
ADD package.json /root/
ADD server.js /root/
# start script
ADD start.sh /root/
RUN chmod +x /root/start.sh
EXPOSE 8080
CMD ./root/start.sh
And my start.sh just runs cd /root/ && npm install && node server.js.
I got a simple express nodejs app:
var express = require('express');
// Constants
var PORT = 8080;
// App
var app = express();
app.get('/', function (req, res) {
  res.send('Hello world\n');
});
app.listen(PORT);
console.log('Running on http://localhost:' + PORT);
Here is how I build my Docker image: docker build -t app1 .
And how I launch it: docker run -it -p 80:8080 --name app1 app1
What is really weird is that this is not working. To make it work I have to change EXPOSE 8080 to EXPOSE 80. I don't get it.
Any explanation?
Thanks for reading,
Tom

In your Node.js app you have the instruction app.listen(PORT);, which tells Node.js to start a server listening for connections on the loopback interface on port PORT.
As a result your app will only be able to see connections originating from localhost (the container itself).
You need to tell your app to listen on all interfaces on port PORT:
app.listen(PORT, "0.0.0.0");
This way it will see the connections originating from outside your Docker container.

Related

node js docker is not running on heroku

Node js project in Docker container is not running on Heroku.
Here is the source code.
Docker file
FROM node:14
WORKDIR /home/tor/Desktop/work/docker/speech-analysis/build
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8080
CMD [ "node", "server.js" ]
server.js
'use strict';
const express = require('express');
const PORT = process.env.port||8080;
const app = express();
app.get('/', (req, res) => {
  res.send('Hello World');
});
app.listen(PORT);
console.log(`Running on http://:${PORT}`);
You don't need to expose anything when building a container for Heroku; it takes care of that automatically. If you are running the same Docker image locally, you can do:
docker build -t myapp:latest .
docker run -e PORT=8080 -p 8080:8080 -t myapp:latest
I think that the environment variables are case-sensitive on Linux systems - so you need to change the
const PORT = process.env.port||8080;
... to:
const PORT = process.env.PORT||8080;
... as Heroku sets an environment variable PORT (and not port).
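A quick sketch of why the case matters: on Linux, process.env lookups are case-sensitive, so the lowercase name misses the variable and the fallback silently kicks in (the value '5000' here is just an arbitrary stand-in for whatever Heroku assigns):

```javascript
// Simulate what Heroku does: it sets PORT (uppercase) in the environment.
process.env.PORT = '5000';

// On Linux the lowercase lookup finds nothing, so the fallback is used.
const wrong = process.env.port || 8080;
const right = process.env.PORT || 8080;

console.log(wrong);  // 8080 (process.env.port is undefined)
console.log(right);  // 5000 (the simulated Heroku value, as a string)
```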
According to this answer, you just need to use port 80 in your EXPOSE or inside Node.js:
app.listen(80)
At run time, Heroku will generate a random port and bind it to 80:
docker run ... -p 46574:80 ...
So if your Node.js app is running on port 80 inside the container, everything will be fine.

Creating a Docker Container to deploy to a prod env

I'm having some problems building my application through Jenkins and running the container on an external Tomcat.
Dockerfile:
FROM node:10.16.3
RUN ls -al
WORKDIR /app
COPY /package/repo/package.json /app/package.json
RUN npm install
COPY /package/repo /app
RUN npm run build
EXPOSE 8080
CMD ["npm", "start"]
npm start calls node server.js
server.js:
const express = require('express');
const app = express();
const port = 8080;
app.get('/', (req, res) => {
  res.send('Hello World!');
});
app.listen(port, () => {
  console.log(`Example app listening on port ${port}!`);
  console.log(__dirname + '/client/build/index.html');
});
docker build -t reacttest .
docker run reacttest
I'm trying to access the container using localhost:8080; however, whenever I access that port I get a not-found error. Is there a step I'm missing? Sorry, I'm very new to Docker.
You need to map a port from your machine to the container. Use the -p flag for this (it must come before the image name):
docker run -p 8080:8080 reacttest
In general the syntax is:
docker run -p <host port>:<container port> <image>
You can read more in the documentation
EXPOSE does not actually publish the port. You should run your container with the -p flag to map ports from the container to your host system. See the documentation.

Nodejs port with docker is unreachable

I'm trying to run Node inside a Docker container and expose port 8585 externally.
I just want to test the accessibility of the exposed port.
For now, in order to simplify the problem, I excluded Nginx from this setup, I just want to test the nodejs+docker port.
The setup:
Nodejs:
const PORT = 8585;
const HOST = '127.0.0.1';
app.listen(PORT, HOST);
Dockerfile:
FROM node:11.10.1
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY . .
RUN npm install --production --silent
EXPOSE 8585
CMD node index.js
docker-compose.yml
version: '2.1'
services:
  mws-client:
    image: mws-client
    build: .
    environment:
      NODE_ENV: production
    ports:
      - '8585'
Running the docker image:
docker run -p 8585:8585 --expose 8585 1085d876c882
Running output:
$ docker run -p 8585:8585 --expose 8585 1085d876c882
About 2 run on http://127.0.0.1:8585
Running on http://127.0.0.1:8585
Netstat output:
$ netstat -a | grep 8585
tcp46 0 0 *.8585 *.* LISTEN
Docker ps:
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
99d032ce1ad2 1085d876c882 "/bin/sh -c 'node in…" 7 minutes ago Up 7 minutes 0.0.0.0:8585->8585/tcp inspiring_hypatia
Still - no connection:
$ curl http://127.0.0.1:8585
curl: (52) Empty reply from server
$ curl http://127.0.0.1:80
<html><body><h1>It works!</h1></body></html>
Please help; clearly I am missing something fundamental (believe me, I have tested so many options...).
The issue is not with Docker or the host; the issue is in the Node application.
You should not bind the server to 127.0.0.1.
Change your code to bind to 0.0.0.0 and everything will work fine:
const PORT = 8585;
const HOST = '0.0.0.0';
app.listen(PORT, HOST);
Then you are good to test:
docker run -p 8585:8585 -it 1085d876c882
And update the compose file to publish a fixed host port:
ports:
  - "8585:8585"

Can we use http package in nodejs with docker

I am researching Docker and I have coded a Node.js demo with Docker. I used the HTTP package in Node.js instead of express; the app builds with Docker, but when I go to localhost:80 the response is
ERR_EMPTY_RESPONSE
I coded a demo with Node.js and express and it runs, but I cannot find any example using the HTTP package.
I am not clear what the EXPOSE port in Docker is for: is it the port the browser calls, or the port for the app?
Docker file
FROM node:8
RUN mkdir -p /home/node/app && chown -R node:node /home/node/app
WORKDIR /home/node/app
COPY package*.json ./
USER node
RUN npm install
COPY --chown=node:node . .
EXPOSE 80
CMD ["npm", "start"]
index.js
const http = require('http');
const hostname = '127.0.0.1';
const port = 3000;
const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end("Hello world \n");
});
server.listen(port, hostname, () => {
  console.log(`server is running at abcxyz http://${hostname}:${port}/`);
});
Have you published the port in your docker run command?
docker run -p 80:3000 ...
If you run it without Docker, your hostname is localhost (127.0.0.1).
But if you run it in Docker, it has to be:
const hostname = '0.0.0.0';
In your code the server listens on port 3000, but you have exposed port 80 to the host, which means port 80 has nothing corresponding to it running inside the Docker container; you actually have to expose port 3000 from the container and use that.
You can use this command to map the container port to a usable host port, where the number before the : is the port exposed on the host and the number after the : is the port exposed from the Docker container:
docker run -p 80:3000 ...

How can I run Ghost in Docker with the google/node-runtime image?

I'm very new to Docker, Ghost and node really, so excuse any blatant ignorance here.
I'm trying to set up a Docker image/container for Ghost based on the google/nodejs-runtime image, but can't connect to the server when I run via Docker.
A few details: I'm on OS X, so I'm using boot2docker. I'm running Ghost as an npm module, configured to use port 8080 because that's what google/nodejs-runtime expects. This configuration runs fine outside of Docker when I use npm start. I also tried a simple "Hello, World" Express app on port 8080, which works from within Docker.
My directory structure looks like this:
my_app
  content/
  Dockerfile
  ghost_config.js
  package.json
  server.js
package.json
{
  "name": "my_app",
  "private": true,
  "dependencies": {
    "ghost": "0.5.2",
    "express": "3.x"
  }
}
Dockerfile
FROM google/nodejs-runtime
ghost_config.js
I changed all occurrences of port 2368 to 8080.
server.js
// This Ghost server works with npm start, but not with Docker
var ghost = require('ghost');
var path = require('path');
ghost({
  config: path.join(__dirname, 'ghost_config.js')
}).then(function (ghostServer) {
  ghostServer.start();
});
// This "Hello World" app works in Docker
// var express = require('express');
// var app = express();
// app.get('/', function(req, res) {
// res.send('Hello World');
// });
// var server = app.listen(8080, function() {
// console.log('Listening on port %d', server.address().port);
// });
I build my Docker image with docker build -t my_app ., then run it with docker run -p 8080 my_app, which prints this to the console:
> my_app# start /app
> node server.js
Migrations: Up to date at version 003
Ghost is running in development...
Listening on 127.0.0.1:8080
Url configured as: http://localhost:8080
Ctrl+C to shut down
docker ps outputs:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
4f4c7027f62f my_app:latest "/nodejs/bin/npm sta 23 hours ago Up About a minute 0.0.0.0:49165->8080/tcp pensive_lovelace
And boot2docker ip outputs:
The VM's Host only interface IP address is: 192.168.59.103
So I point my browser at 192.168.59.103:49165 and get nothing, and no output in the Docker logs. Like I said above, running the "Hello World" app in the same server.js works fine.
Everything looks correct to me. The only odd thing I see is that sqlite3 fails during npm install in docker build:
[sqlite3] Command failed:
module.js:356
Module._extensions[extension](this, filename);
^
Error: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.14' not found
...
node-pre-gyp ERR! Testing pre-built binary failed, attempting to source compile
but the source compile appears to succeed just fine.
I hope I'm just doing something silly here.
In your Ghost config, change the server host to 0.0.0.0 instead of 127.0.0.1:
server: {
  host: '0.0.0.0',
  ...
}
PS: for the SQLite error, try this Dockerfile:
FROM phusion/baseimage:latest
# Set correct environment variables.
ENV HOME /root
# Regenerate SSH host keys. baseimage-docker does not contain any, so you
# have to do that yourself. You may also comment out this instruction; the
# init system will auto-generate one during boot.
RUN /etc/my_init.d/00_regen_ssh_host_keys.sh
# Use baseimage-docker's init system.
CMD ["/sbin/my_init"]
# ...put your own build instructions here...
# Install Node.js and npm
ENV DEBIAN_FRONTEND noninteractive
RUN curl -sL https://deb.nodesource.com/setup | sudo bash -
RUN apt-get install -y nodejs
# Copy Project Files
RUN mkdir /root/webapp
WORKDIR /root/webapp
COPY app /root/webapp/app
COPY package.json /root/webapp/
RUN npm install
# Add runit service for Node.js app
RUN mkdir /etc/service/webapp
ADD deploy/runit/webapp.sh /etc/service/webapp/run
RUN chmod +x /etc/service/webapp/run
# Add syslog-ng Logentries config file
ADD deploy/syslog-ng/logentries.conf /etc/syslog-ng/conf.d/logentries.conf
# Expose Ghost port
EXPOSE 2368
# Clean up APT when done.
RUN apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
Note I used phusion/baseimage instead of google/nodejs-runtime and installed node.js & npm with:
ENV DEBIAN_FRONTEND noninteractive
RUN curl -sL https://deb.nodesource.com/setup | sudo bash -
RUN apt-get install -y nodejs
In your Dockerfile you need the command EXPOSE 8080.
But that only makes the port accessible outside the Docker container. When you run a container from that image you still need to 'map' (publish) that port. For example:
$ docker run -d -t -p 80:8080 <imagename>
The -p 80:8080 directs port '8080' in the container to port '80' on the host while it is running.
The syntax always confuses me (I think it is backwards).
