This question already has answers here:
Containerized Node server inaccessible with server.listen(port, '127.0.0.1')
(2 answers)
Closed 9 months ago.
I'm just trying to learn Node.js and Docker at the same time. I have a very simple Node.js app that listens on a port and returns a string. The Node app itself runs fine when running locally. I'm now trying to get it running in a Docker container but I can't seem to reach it.
Here's my Node app:
const http = require('http');
const hostname = '127.0.0.1';
const port = 3000;
var count = 0;
var server = http.createServer(function(req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end("Here's the current value: " + count);
  console.log('Got a request: ', req.url);
  count++;
});
server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});
My Dockerfile:
FROM node:latest
MAINTAINER Jason
ENV PORT=3000
COPY . /var/www
WORKDIR /var/www
EXPOSE $PORT
ENTRYPOINT ["node", "app.js"]
My build command:
docker build -t jason/node .
And my run command:
docker run -p 3000:3000 jason/node
The app.js file and Dockerfile live in the same directory where I'm running the commands. Doing a docker ps shows the app running, but I just get a "site cannot be reached" error when navigating to 127.0.0.1:3000 in the browser. I've also confirmed that app.js was properly added to the image, and I get the message "Server running at http://127.0.0.1:3000/" after running.
I think I'm missing something really simple, any ideas?
Omit the hostname or use '0.0.0.0' in the listen call. Make it server.listen(port, '0.0.0.0', () => { console.log('Server running...'); });
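Applied to the app.js above, that is a one-line change (a minimal sketch; everything else stays as in the question):

const http = require('http');

// '0.0.0.0' (or omitting the host entirely) binds to all interfaces,
// so the port published with docker run -p 3000:3000 is reachable from the host
const hostname = '0.0.0.0';
const port = 3000;

var count = 0;
var server = http.createServer(function(req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end("Here's the current value: " + count);
  console.log('Got a request: ', req.url);
  count++;
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});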
If you use Docker on Windows 7/8 you most likely have a docker-machine running, and then you would need to access it at something like 192.168.99.100, or whatever IP your docker-machine has.
To see if you are running a docker-machine, just issue the command
docker-machine ls
Related
I have a Node.js server that houses two servers in one process.
//This function immediately starts the express server
async startExpressServer() {
  const app = express();
  const port = 3000;
  app.listen(port, async function () {
    console.log(`Express Service running on port ${port} in ${process.env.NODE_ENV} mode`);
  });
}

//This function immediately starts the gRPC server
async startGRPCServer() {
  let port = 4000;
  grpcServer.bindAsync(`0.0.0.0:${port}`, grpc.ServerCredentials.createInsecure(), () => {
    grpcServer.start();
    console.log(`GRPC Server running on ${port}`);
  });
}
Since Cloud Run exposes just a single port per service instance, I attempted to proxy the ports of the above servers using nginx, with the following configuration in my nginx.conf file, which sits at the root of my project:
upstream grpc-server {
  server 0.0.0.0:4000;
}

server {
  listen $PORT;
  server_name customer-service;
  location /customer { # Send requests on this route to the grpc server
    grpc_pass grpcs://grpc-server;
  }
  location /v1 { # Send requests on this route to the express server
    proxy_pass http://0.0.0.0:3000;
  }
}
And below is what I have in my Dockerfile to start the app as a whole:
FROM node:18-slim
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY . ./
# server environment
FROM nginx:alpine
COPY nginx.conf /etc/nginx/conf.d/configfile.template
CMD sh -c "envsubst '\$PORT' < /etc/nginx/conf.d/configfile.template > /etc/nginx/conf.d/default.conf && nginx -g 'daemon off;' && node server.js"
When I deployed the app to Cloud Run, the app refused to run. I have a very strong feeling that certain configurations above are not correct; I would appreciate any corrections and/or suggestions.
Thank you.
I've searched for an answer on the Stack Overflow community and none of them worked, so I'm asking here.
I have a pretty simple Node.js app with a server.js file containing the following:
'use strict'
require('dotenv').config();
const app = require('./app/app');
const main = async () => {
  try {
    const server = await app.build({
      logger: true,
      shopify: './Shopify',
      shopifyToken: process.env.SHOPIFY_TOKEN,
      shopifyUrl: process.env.SHOPIFY_URL
    });
    await server.listen(process.env.PORT || 3000);
  } catch (err) {
    console.log(err)
    process.exit(1)
  }
}

main();
If I boot the server locally it works perfectly and I'm able to see JSON in the web browser.
Log of the working server when running locally:
{"level":30,"time":1648676240097,"pid":40331,"hostname":"Erick-Macbook-Air.local","msg":"Server listening at http://127.0.0.1:3000"}
When I run my container and go to localhost:3000, I see a blank page with the error message:
This page isn’t working
localhost didn’t send any data.
ERR_EMPTY_RESPONSE
I have my Dockerfile like this:
FROM node:16
WORKDIR /app
COPY package.json .
RUN npm install
COPY . ./
EXPOSE 3000
CMD ["node", "server.js"]
This is how I run my container:
docker run -d -it --name proxyservice -p 3000:3000 proxyserver:1.0
And when I run it I see the container log working:
{"level":30,"time":1648758470430,"pid":1,"hostname":"03f5d00d762b","msg":"Server listening at http://127.0.0.1:3000"}
As you can see it boots up fine, but when going to localhost:3000 I see that error message. Any idea what I'm missing/doing wrong?
Thanks!
Can you add 0.0.0.0 as the host in your service, something like this?
server.listen(3000, '0.0.0.0');
Give it a try.
Since you want your service to be accessible from outside the container, you should give the address as 0.0.0.0.
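In the question's server.js, app.build() appears to return a Fastify-style instance, so the host would be passed into the listen call as well. A sketch under that assumption (older Fastify versions accept (port, host); newer ones expect an options object):

// inside main(), bind to all interfaces instead of the loopback default
await server.listen(process.env.PORT || 3000, '0.0.0.0');

// or, on versions that take an options object:
await server.listen({ port: process.env.PORT || 3000, host: '0.0.0.0' });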
Hello. I've spent some time without luck trying to understand the problem here.
I've looked through each question on Stack Overflow that seems to deal with the same problem, though nothing has worked so far.
I have a simple chat app built using Create React App and Socket.io (which runs fine on localhost), but when deployed to my Node server I'm receiving ERR_CONNECTION_TIMED_OUT errors and no response. The website itself runs fine, but when I make a call to my Socket.io server, it errors.
I'm guessing this is down to my lack of knowledge with how Node and Socket.io want to work.
Some info:
server.js
const path = require("path");
const express = require("express");
const app = express();
const http = require("http").createServer(app);
const port = 8080;
http.listen(port, () => console.log(`http: Listening on port ${port}`));
const io = require("socket.io")(http, { cookie: false });
app.use(express.static(path.join(__dirname, "build")));
app.get("/*", function (req, res) {
  res.sendFile(path.join(__dirname, "build", "index.html"));
});
io.on("connection", (socket) => {
  console.log("New client connected");
  // Emitting a new message. Will be consumed by the client
  socket.on("messages", (data) => {
    socket.broadcast.emit("messages", data);
  });
  // A special namespace "disconnect" for when a client disconnects
  socket.on("disconnect", () => console.log("Client disconnected"));
});
client.js
....
const socket =
  process.env.NODE_ENV === "development"
    ? io("http://localhost:4001")
    : io("https://my-test-site:8080");
socket.on("messages", (msgs: string[]) => {
  setMessages(msgs);
});
....
docker-compose.yml
version: "X.X"
services:
  app:
    image: "my-docker-image"
    build:
      context: .
      dockerfile: Dockerfile
      args:
        DEPENDENCY: "my-deps"
    ports:
      - 8080:8080
Dockerfile
...
RUN yarn build
CMD node server.js // run my server.js
...
UPDATE: I got around this problem by making sure my main port was only used to run Express (with socket.io) - in my setup that was port 8080. When running in the same Docker container, I don't think I needed to create and use the https version of the express 'createServer'.
It looks like you forgot to map the port of your Docker container. The EXPOSE statement in your Dockerfile only advertises to other containers that share a Docker network with your container that they can connect to port 4001 of your container.
The port mapping can be configured with the -p flag for docker run commands. In your case the full command would look something like this:
docker run -p 4001:4001 your_image_name
Also, do you have a signed certificate? Browsers will likely block the connection if they do not trust your server's certificate.
I got around this problem by keeping just one port available (in my case :8080). This port is what Express/socket.io is using (originally I had two different ports, one for my site, one for Express). Also, in my case, when running in the same Docker container, I didn't need the require("https").createServer(app) (https) version of the server, as http was sufficient.
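A minimal sketch of that single-port arrangement, assuming the build folder and port 8080 from the question (server and client shown together for brevity):

// server.js - one plain http server carries Express, the static build and Socket.io
const express = require("express");
const path = require("path");
const app = express();
const http = require("http").createServer(app);
const io = require("socket.io")(http);

app.use(express.static(path.join(__dirname, "build")));
io.on("connection", () => console.log("New client connected"));

http.listen(8080, () => console.log("Listening on port 8080"));

// client - served by the same server, so it can simply connect back to its own origin:
// const socket = io();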
I have a simple application composed of Express.js as the backend API and React.js as the frontend client.
I created a single container image with the frontend and backend.
Application repo: https://github.com/vitorvr/list-users-kubernetes
Dockerfile:
FROM node:13
WORKDIR /usr/app/listusers
COPY . .
RUN yarn
RUN yarn client-install
RUN yarn client-build
EXPOSE 8080
CMD ["node", "server.js"]
server.js
const express = require('express');
const cors = require('cors');
const path = require('path');
const app = express();
const ip = process.env.IP || '0.0.0.0';
const port = process.env.PORT || 8080;
app.use(express.json());
app.use(cors());
app.use(express.static(path.join(__dirname, 'public')));
app.get('/users', (req, res) => {
  res.json([
    { name: 'Jhon', id: 1 },
    { name: 'Ashe', id: 2 }
  ]);
});

app.listen(port, ip, () =>
  console.log(`Server is running at http://${ip}:${port}`)
);
React call:
const api = axios.create({
  baseURL: 'http://0.0.0.0:8080'
});

useEffect(() => {
  async function loadUsers() {
    const response = await api.get('/users');
    if (response.data) {
      setUsers(response.data);
    }
  }
  loadUsers();
}, []);
To deploy and run this image in minikube I use the following commands:
kubectl run list-users-kubernetes --image=list-users-kubernetes:1.0 --image-pull-policy=Never
kubectl expose pod list-users-kubernetes --type=LoadBalancer --port=8080
minikube service list-users-kubernetes
The issue occurs when the front end tries to access localhost.
I don't know where I need to fix this: whether I have to fix something in React, change some settings in Kubernetes, or whether this is even the best practice for deploying small applications as a container image on Kubernetes.
Thanks in advance.
Your Kubernetes node, assuming it is running as a virtual machine on your local development machine, has an IP address assigned to it. Similarly, an IP address is assigned to the pod where the "list-users-kubernetes" service is running. You can view the IP address by running the following command: kubectl get pod list-users-kubernetes; to view more information, add -o wide at the end of the command, e.g. kubectl get pod list-users-kubernetes -o wide.
Alternatively, you can do port forwarding to your localhost using kubectl port-forward pod/POD_NAME LOCAL_PORT:POD_PORT. Example below:
kubectl port-forward pod/list-users-kubernetes 8080:8080
Note: You should run this as a background service or in a different tab in your terminal, as the port forwarding would be available as long as the command is running.
I would recommend using the second approach, as your external IP for the pod can change during deployments, but mapping it to localhost would allow you to run your app without making code changes.
Link to port forwarding documentation
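With that port-forward running, the React client from the question can target the forwarded address instead of 0.0.0.0 (a sketch of that one change):

// api client pointed at the port-forwarded pod
const api = axios.create({
  baseURL: 'http://localhost:8080'
});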
I have a problem connecting to a MongoDB database in Cloud9.
Please help me resolve this issue!
var MongoClient = require("mongodb").MongoClient;
var port = process.env.PORT;
var ip = process.env.IP;
MongoClient.connect("mongodb://"+ip+":"+port+"/test", function(error, db){
  if(!error){
    console.log("We are connected");
  }
  else{
    console.dir(error); //failed to connect to [127.4.68.129:8080]
  }
});
Output:
Running Node Process
Your code is running at 'http://demo-project.alfared1991.c9.io'.
Important: use 'process.env.PORT' as the port and 'process.env.IP' as the host in your scripts!
[Error: failed to connect to [127.4.68.129:8080]]
If you follow this link, https://docs.c9.io/setting_up_mongodb.html, you will set up and run your MongoDB daemon under your workspace.
And if you take a look at the output of ./mongod, you'll see output like this:
2015-08-22T12:46:47.120+0000 [initandlisten] MongoDB starting : pid=7699 port=27017 dbpath=data 64-bit host=velvetdeth-express-example-1804858
Just copy the host and port values into your MongoDB config to set up the database URL, which in this case is:
mongodb://velvetdeth-express-example-1804858:27017
process.env.PORT and process.env.IP are the port and IP address for your application, not your database. You'll want to pull your Mongo connection string from your MongoDB provider.
Below is the hello world example from the Node.js homepage modified to use the two environment variables.
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(process.env.PORT || 1337, process.env.IP || '127.0.0.1');
For anyone else who runs into this issue, the solution is here: https://docs.c9.io/setting_up_mongodb.html
MongoDB is preinstalled in the Cloud9 workspace. Run this:
$ mkdir data
$ echo 'mongod --bind_ip=$IP --dbpath=data --nojournal --rest "$@"' > mongod
$ chmod a+x mongod
To start the Mongodb process, run:
$ ./mongod
Then 'run' your node.js app script and you're off to the races.
Here's what the parameters mean:
--dbpath=data (because it defaults to /var/db which isn't accessible)
--nojournal because MongoDB usually pre-allocates a 2 GB journal file (which exceeds the Cloud9 disk space quota)
--bind_ip=$IP (because you can't bind to 0.0.0.0)
--rest enables the REST interface, which runs on its default port, 28017
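Started this way, mongod binds to $IP but still listens on MongoDB's default port 27017, so the question's connection code only needs the right port; process.env.PORT (8080) is the app's port, not the database's. A sketch under that assumption:

var MongoClient = require("mongodb").MongoClient;
var ip = process.env.IP; // the same $IP the mongod script binds to

// 27017 is mongod's default port
MongoClient.connect("mongodb://" + ip + ":27017/test", function(error, db){
  if(!error){
    console.log("We are connected");
  } else {
    console.dir(error);
  }
});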