connect two docker containers with socket io - node.js

I am currently working on dockerizing a tested Socket.IO app for a simple chat application, using socket.io on the server and mocha for testing. The server opens a socket listening on port 3000, and the test client uses that socket to emit messages or receive emissions.
I am using version 3 of docker compose files.
nodeserver dockerfile:
FROM node:10
WORKDIR /usr/src/appserver
COPY package*.json ./
COPY public public
COPY main.js main.js
RUN npm install
RUN npm install express
RUN npm install socket.io
CMD ["npm", "start"]
test dockerfile:
FROM nodeserver
COPY test test
RUN npm update && \
npm install -g mocha && \
npm install -g socket.io-client
CMD ["npm", "test"]
docker-compose:
version: "3"
services:
nodeserver:
build: .
expose:
- "3000"
image: ws
test:
depends_on:
- nodeserver
links:
- nodeserver
build: ./test
image: test_image
My node server listens on port 3000 and, on connection, sends a hi message to all clients.
let express = require('express');
let app = express();
let http = require('http').createServer(app);
let io = require('socket.io')(http);
http.listen(3000, function ()
{
console.log('listening on *:3000');
});
io.on('connection', function(socket)
{
console.log('a user connected');
io.emit('hi', 'hi');
});
And my mocha test looks like this, which in essence attaches itself as a client and waits for the hi message to arrive.
const io = require('socket.io-client');
require('should');
const url = 'ws://nodeserver:3000';
const options = {}; // client options (none were shown in the post)
describe("Chat Server", function()
{
it("Should broadcast hi!", function(done)
{
let client1 = io.connect(url, options);
client1.on('connect', function()
{
client1.on('hi', function(msg)
{
msg.should.equal("hi");
client1.disconnect();
done();
});
});
});
});
Running docker-compose starts the nodeserver, but the test client fails with a timeout, which tells me the client cannot see the server over the Compose network.
Running the containers separately, that is, exposing nodeserver to the host and connecting to localhost instead, works perfectly and the test passes. This tells me that my socket code and the way I communicate with nodeserver are correct, which would mean the problem is in how I set up the Compose network. Can somebody tell me what I am doing wrong here?

I think your configuration looks good; it's a matter of readiness of your nodeserver. Even with depends_on, there is no guarantee that nodeserver is ready to accept connections when test starts. (Also, links is unnecessary and deprecated.)
To verify my hypothesis, try the following sequence:
docker-compose up -d nodeserver
wait a few seconds
docker-compose up -d test
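If that sequence passes, you can bake the wait into the test itself instead of sleeping by hand. Below is a minimal sketch (the helper name waitForServer and the retry counts are mine, not from the post) that retries the socket.io connection in a mocha before hook until nodeserver accepts it:
const io = require('socket.io-client');
// Retry the connection until the server accepts it or we run out of attempts.
function waitForServer(url, retries = 10, delayMs = 1000)
{
    return new Promise((resolve, reject) =>
    {
        const attempt = (left) =>
        {
            const socket = io.connect(url, { reconnection: false });
            socket.on('connect', () => { socket.disconnect(); resolve(); });
            socket.on('connect_error', () =>
            {
                socket.close();
                if (left <= 0) return reject(new Error('server never became ready'));
                setTimeout(() => attempt(left - 1), delayMs);
            });
        };
        attempt(retries);
    });
}
// In the suite, before the it() block:
// before(function () { this.timeout(15000); return waitForServer('ws://nodeserver:3000'); });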

Related

Failed to start and then listen on the port

I have a NodeJS app that I want to deploy on Google Cloud Run.
I have Google Cloud Build configured to build the container from the dockerfile whenever something is pushed to the master branch, and after the build Cloud Run runs the new revision.
My problem is that every time I want to deploy my app I get the following error:
Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable.
Cloud Run is configured with containerPort: 8080
In my dockerfile I'm exposing port 8080, and in nodejs I have set up a simple http server using
const http = require('http');
const server = http.createServer((req, res) => {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.end('Just for testing purposes\n');
});
const port = parseInt(process.env.PORT, 10) || 8080;
server.listen(port, '0.0.0.0', () => {
console.log('Hello world listening on port', port);
});
my Dockerfile
FROM node:12-alpine
# Install app dependencies.
COPY package.json /src/package.json
WORKDIR /src
RUN npm install
# Cloud Run requirement
EXPOSE 8080
COPY index.js /src/index.js
ENTRYPOINT "node index.js"
Have I missed something? This is my first time working with google cloud so I'm sure there is something I need to configure that I don't know about yet.
The problem was in my Dockerfile.
I had to change ENTRYPOINT "node index.js" to CMD ["node", "index.js"].
But the reason behind it is still unknown to me.
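The likely reason, for what it's worth (my explanation, not part of the original answer): ENTRYPOINT "node index.js" is Docker's shell form, so the string is passed to /bin/sh -c, and the surrounding quotes make the shell look for a single executable literally named node index.js. That command is never found, the container exits immediately, and Cloud Run reports that it failed to listen on $PORT. The exec form runs the node binary directly:
# shell form: /bin/sh -c looks for one executable literally named "node index.js" -> exits
ENTRYPOINT "node index.js"
# exec form: runs the node binary with index.js as its argument -> works
CMD ["node", "index.js"]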

node js docker is not running on heroku

Node js project in Docker container is not running on Heroku.
Here is the source code.
Docker file
FROM node:14
WORKDIR /home/tor/Desktop/work/docker/speech-analysis/build
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8080
CMD [ "node", "server.js" ]
server.js
'use strict';
const express = require('express');
const PORT = process.env.port||8080;
const app = express();
app.get('/', (req, res) => {
res.send('Hello World');
});
app.listen(PORT);
console.log("Running on http://:${PORT}");
You don't need to expose anything when running a container on Heroku; it takes care of that automatically. If you are running the same Docker image locally, you can do:
docker build -t myapp:latest .
docker run -e PORT=8080 -p 8080:8080 -t myapp:latest
I think that the environment variables are case-sensitive on Linux systems - so you need to change the
const PORT = process.env.port||8080;
... to:
const PORT = process.env.PORT||8080;
... as Heroku sets an environment variable PORT (and not port).
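A throwaway check makes the case-sensitivity easy to see for yourself (illustrative, not part of the app):
// run as: PORT=8080 node check.js
console.log(process.env.PORT); // "8080"
console.log(process.env.port); // undefined: env keys are case-sensitive on Linux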
According to this answer you just need to use port 80 in your EXPOSE or inside of nodejs:
app.listen(80)
At run time, Heroku generates a random host port and binds it to 80:
docker run ... -p 46574:80 ...
So if your nodejs app is running on port 80 inside the container, everything will be fine.

Can't access Hapi server inside Docker Container

I built a simple NodeJS server with Hapi and tried to run it inside a Docker container.
It runs nicely inside Docker, but I can't get access to it (even though I have done port mapping).
const hapi = require("#hapi/hapi");
const startServer = async () => {
const server = hapi.Server({
host: "localhost",
port: 5000,
});
server.route({
method: 'GET',
path: '/sample',
handler: (request, h) => {
return 'Hello World!';
}
});
await server.start();
console.log(`Server running on port ${server.settings.port}`);
};
startServer();
Docker file is as follows:
FROM node:alpine
WORKDIR /usr/app
COPY ./package.json ./
RUN npm install
COPY ./ ./
CMD [ "npm","run","dev" ]
To run docker, I first build with:
docker build .
I then run the image I get from above command to do port mapping:
docker run -p 5000:5000 <image-name>
When I try to access it via Postman at http://localhost:5000/sample (or just localhost:5000/sample), it keeps saying Couldn't connect to server, and when I open it in Chrome it says the same: Can't display page.
PS. When I run the code as usual without a Docker container, simply with npm run dev from my terminal, the code runs just fine.
So I am confident the API code is fine.
Any suggestions?
As mentioned by @pzaenger, in your Hapi server configuration change localhost to 0.0.0.0:
host: 'localhost' to host: '0.0.0.0'
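For completeness, a minimal sketch of the corrected server (the question's own code with only the host changed): binding to 0.0.0.0 accepts connections arriving through Docker's port mapping, whereas localhost binds only to the container's internal loopback interface.
const hapi = require('@hapi/hapi');
const startServer = async () => {
    const server = hapi.Server({
        host: '0.0.0.0', // listen on all interfaces inside the container
        port: 5000,
    });
    server.route({
        method: 'GET',
        path: '/sample',
        handler: (request, h) => 'Hello World!',
    });
    await server.start();
    console.log(`Server running on port ${server.settings.port}`);
};
startServer();
With that change, docker run -p 5000:5000 <image-name> makes http://localhost:5000/sample reachable from the host.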

ERR_CONNECTION_REFUSED when attempting to connect to NodeJS server running on a Docker Container in Windows 10

I have an app that contains a NodeJS server and a ReactJS client. The project's structure is as follows:
client
  Dockerfile
  package.json
  ...
server
  Dockerfile
  package.json
  ...
docker-compose.yml
.gitignore
To run both of these, I am using a docker-compose.yml:
version: "3"
services:
server:
build: ./server
expose:
- 8000
environment:
API_HOST: "http://localhost:3000/"
APP_SERVER_PORT: 8000
MYSQL_HOST_IP: mysql
ports:
- 8000:8000
volumes:
- ./server:/app
command: yarn start
client:
build: ./client
environment:
REACT_APP_PORT: 3000
NODE_PATH: src
expose:
- 3000
ports:
- 3000:3000
volumes:
- ./client/src:/app/src
- ./client/public:/app/public
links:
- server
command: yarn start
And inside of each of both the client and server folders I have a Dockerfile (same for each):
FROM node:10-alpine
RUN mkdir -p /app
WORKDIR /app
COPY package.json /app
COPY yarn.lock /app
COPY . /app
RUN yarn install
CMD ["yarn", "start"]
EXPOSE 80
Where the client's start script is simply react-scripts start, and the server's is nodemon index.js. The client's package.json has a proxy entry that is supposed to allow it to communicate with the server:
"proxy": "http://server:8000",
The react app would call a component that looks like this:
import React from 'react';
import axios from 'axios';
function callServer() {
axios.get('http://localhost:8000/test', {
params: {
table: 'sample',
},
}).then((response) => {
console.log(response.data);
});
}
export function SampleComponent() {
return (
<div>
This is a sample component
{callServer()}
</div>
);
}
Which would call the /test path in the node server, as defined in the index.js file:
const cors = require('cors');
const express = require('express');
const app = express();
app.use(cors());
app.listen(process.env.APP_SERVER_PORT, () => {
console.log(`App server now listening on port ${process.env.APP_SERVER_PORT}`);
});
app.get('/test', (req, res) => {
res.send('hi');
});
Now, this code works like a charm when I run it on my Linux Mint machine, but when I run it on Windows 10 I get the following error (I run both on Chrome):
GET http://localhost:8000/test?table=sample net::ERR_CONNECTION_REFUSED
Is it that I would need to run these on a different port or IP address? I read somewhere that the connections may be going over a Windows network instead of the network created by docker-compose, but I'm not even sure how to start diagnosing this. Please let me know if you have any ideas that could help.
EDIT: Here are the results of docker ps:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1824c61bbe99 react-node-mysql-docker-boilerplate_client "docker-entrypoint.s…" 3 minutes ago Up 3 minutes 80/tcp, 0.0.0.0:3000->3000/tcp react-node-mysql-docker-boilerplate_client_1
and here is docker ps -a. For some reason, the server container exits by itself as soon as it starts:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1824c61bbe99 react-node-mysql-docker-boilerplate_client "docker-entrypoint.s…" 3 minutes ago Up 3 minutes 80/tcp, 0.0.0.0:3000->3000/tcp react-node-mysql-docker-boilerplate_client_1
5c26276e37d1 react-node-mysql-docker-boilerplate_server "docker-entrypoint.s…" 3 minutes ago Exited (127) 3 minutes ago react-node-mysql-docker-boilerplate_server_1

Can we use http package in nodejs with docker

I am researching Docker and have coded a Node.js demo with it. I use the http package in nodejs instead of express; the app builds with Docker, but when I go to localhost:80 the browser returns
ERR_EMPTY_RESPONSE
I have coded a demo with nodejs using express and it runs, but I cannot find any example using the http package.
I am also not clear on what the EXPOSE port in Docker is for: is it the port the browser calls, or the port for the app?
Docker file
FROM node:8
RUN mkdir -p /home/node/app && chown -R node:node /home/node/app
WORKDIR /home/node/app
COPY package*.json ./
USER node
RUN npm install
COPY --chown=node:node . .
EXPOSE 80
CMD ["npm", "start"]
index.js
const http = require('http');
const hostname = '127.0.0.1';
const port = 3000;
const server = http.createServer((req, res) => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end("Hello world \n");
});
server.listen(port, hostname, () => {
console.log(`server is running at abcxyz http://${hostname}:${port}/`);
});
Have you published the port in your docker run command?
docker run -p 80:3000 ...
Your hostname, if you run it without Docker, is localhost (127.0.0.1).
But if you run it in Docker it has to be:
const hostname = '0.0.0.0';
In your code, the server listens on port 3000 but you have exposed port 80 to the host, which means port 80 has nothing corresponding to it running inside the docker container; you actually have to EXPOSE port 3000 from the docker container and use that.
You can use this command to map it to a usable host port, where the number before : is the port exposed on the host and the number after : is the exposed port inside the docker container, AFAIK:
docker run -p 80:80
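Putting the two answers together, a minimal sketch of a working setup, assuming the app stays on port 3000 (the fix is the bind address plus a matching -p mapping):
const http = require('http');
const hostname = '0.0.0.0'; // all interfaces, not just the container's loopback
const port = 3000;          // must match the container side of -p host:container
const server = http.createServer((req, res) => {
    res.statusCode = 200;
    res.setHeader('Content-Type', 'text/plain');
    res.end("Hello world \n");
});
server.listen(port, hostname, () => {
    console.log(`server is running at http://${hostname}:${port}/`);
});
Build the image, run it with docker run -p 80:3000 <image>, and http://localhost:80 will reach the app.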
