I've searched for this on Stack Overflow and none of the existing answers solved it, so I'm asking here.
I have a pretty simple nodejs app that has a server.js file, having the following.
'use strict'

require('dotenv').config();

const app = require('./app/app');

const main = async () => {
  try {
    const server = await app.build({
      logger: true,
      shopify: './Shopify',
      shopifyToken: process.env.SHOPIFY_TOKEN,
      shopifyUrl: process.env.SHOPIFY_URL
    });
    await server.listen(process.env.PORT || 3000);
  } catch (err) {
    console.log(err);
    process.exit(1);
  }
};

main();
If I boot the server locally it works perfectly and I'm able to see the JSON in the web browser.
Log of the working server when running locally:
{"level":30,"time":1648676240097,"pid":40331,"hostname":"Erick-Macbook-Air.local","msg":"Server listening at http://127.0.0.1:3000"}
When I run my container, and I go to localhost:3000 I see a blank page with the error message:
This page isn’t working
localhost didn’t send any data.
ERR_EMPTY_RESPONSE
I have my Dockerfile like this:
FROM node:16
WORKDIR /app
COPY package.json .
RUN npm install
COPY . ./
EXPOSE 3000
CMD ["node", "server.js"]
This is how I run my container:
docker run -d -it --name proxyservice -p 3000:3000 proxyserver:1.0
And when I run it I see the container log working:
{"level":30,"time":1648758470430,"pid":1,"hostname":"03f5d00d762b","msg":"Server listening at http://127.0.0.1:3000"}
As you can see it boots up fine, but when going to localhost:3000 I get that error message. Any idea what I'm missing/doing wrong?
Thanks!
Can you add 0.0.0.0 as the host argument in your listen call, something like this?

server.listen(3000, '0.0.0.0');

Give it a try. Since you want your service to be accessible from outside the container, you should bind to the address 0.0.0.0 rather than the default loopback interface.
Related
I'm trying to deploy my API to Cloud Run but I'm stuck with this error
ERROR: (gcloud.run.deploy) The user-provided container failed to start and listen on the port defined provided by the PORT=8080 environment variable. Logs for this revision might contain more information.
This is my Dockerfile
FROM node:lts
WORKDIR /src
COPY package.json package*.json ./
RUN npm install --omit=dev
COPY . .
CMD [ "npm", "execute" ]
These are my package.json scripts:
"scripts": {
  "test": "echo \"Error: no test specified\" && exit 1",
  "start": "env-cmd -x -f ./src/config/env/.env.local nodemon ./src/index.js",
  "deploy:dev": "env-cmd -x -f ./src/config/env/.env.dev ./deploy.sh",
  "execute": "env-cmd -x -f ./src/config/env/.env.dev node ./src/index.js"
},
This is my index.js
const api = require("../src/config/config");
const port = process.env.PORT || 8080;

console.log("Puerto => ", port);

api.listen(port, () => {
  console.log(`Rest API started successfully`);
});
This is my config file (I'm working with firebase)
const express = require("express");
// Config
const api = express();
api.use(express.json());
// Routes
api.use(require("../routes/start.routes.js"));
module.exports = api;
And I have a .env file with the PORT variable
PORT=8080
And these are the commands I execute
gcloud builds submit --tag gcr.io/$GOOGLE_PROJECT_ID/api --project=$GOOGLE_PROJECT_ID
gcloud run deploy api --image gcr.io/$GOOGLE_PROJECT_ID/api --port 8080 --platform managed --region us-central1 --allow-unauthenticated --project=$GOOGLE_PROJECT_ID
I have followed every tip from similar questions, but none has worked for me.
I checked the logs and they only show the error message I quoted at the beginning.
I tried to run the project locally with the Cloud Run Emulator; it does not work either, and I don't get enough info to figure out what's wrong. I don't even understand why the Docker container shows several ports except 8080, the one the app should listen on, before the deploy process fails.
I'm using windows 11
My API works fine locally if I run npm run start
The error indicates that the container is failing to listen on the expected port for incoming HTTP requests. The official Cloud Run container runtime contract describes the requirements a container must meet in order to operate properly.

In Node.js, your code should define the port as below:

const port = process.env.PORT || 8080;
app.listen(port, () => {
  console.log('Hello listening port', port);
});

You may check whether your container is listening on all network interfaces by binding to the host 0.0.0.0 and see if that works. You may also want to confirm that your container image is compiled for 64-bit Linux, as expected by the container runtime contract.
To troubleshoot the issue:

1. Check the logs of your Cloud Run service, using this command in Cloud Shell:

gcloud logging read "resource.type=cloud_run_revision AND resource.labels.service_name=api" --project $GOOGLE_PROJECT_ID --limit 100

// You can adjust the --limit flag to show more or fewer log entries.

2. Check the logs of your container. If there is any problem with the container itself, you will see it with this command:

docker run -p 8080:8080 gcr.io/$GOOGLE_PROJECT_ID/api

That command starts your container and maps port 8080 inside the container to port 8080 on your local machine. Access your app at http://localhost:8080 and check the console output for any errors from the container.

3. Check your Cloud Run configuration.

4. Check your application code. In your index.js file, make sure the api.listen() call uses a port variable that is set to the value of the PORT environment variable:

const port = process.env.PORT || 8080;
api.listen(port, () => {
  console.log(`Rest API started successfully`);
});

5. Check your firewall settings. Sometimes a firewall blocks traffic to port 8080; you can check the firewall rules in the Google Cloud Console.

Using these steps you can find the issue; if you still can't, reach out to the Cloud Run support team for further assistance.
I am very new to cloud run. I created a very simple express server as shown below with no Dockerfile as I decided to deploy from source.
import dotenv from 'dotenv';
dotenv.config();
import express from 'express';
const app = express();
const port = process.env.PORT || 8000;
app.get('/test', (req, res) => {
  return res.json({ message: 'test' });
});

app.listen(port, async function () {
  console.log(`Sample Service running on port ${port} in ${process.env.NODE_ENV} mode`);
});
Please note that I am deploying from source, hence no Dockerfile in my directory.
Here is the command I used to deploy
gcloud run deploy --source .
And then the error I keep getting back is:
The user-provided container failed to start and listen on the port defined provided by the PORT=8080 environment variable. Logs for this revision might contain more information.
I have no idea where PORT 8080 is coming from as I am listening on PORT 8000 and not 8080.
How can this be resolved?
Thanks
The issue most likely is not to do with the port but with some other problem that is causing the container to fail at startup. I suggest the following:
Visit Cloud Run in the Google Cloud console and, for this specific service, go to the logs from the Cloud Run service detail page. It should tell you the exact reason why the container startup is failing. At times it could be a dependency, a missing command, etc.
As for port 8080 being used instead of 8000: Cloud Run injects a default port, which is 8080. Check out the container contract documentation. You can override it by specifying the --port parameter in the gcloud command, but that may not be necessary at this point.
Hello. I've spent some time without luck trying to understand the problem here.
I've looked through each Question on StackOverflow which seems to deal with the same problem, though nothing has worked so far.
I have a simple chat app built using Create React App and Socket.io (which runs fine on localhost), but when it's deployed to my Node server I'm receiving ERR_CONNECTION_TIMED_OUT errors and no response. The website itself runs fine, but when I make a call to my Socket.io server, it errors.
I'm guessing this is down to my lack of knowledge with how Node and Socket.io want to work.
Some info:
server.js
const path = require("path");
const express = require("express");

const app = express();
const http = require("http").createServer(app);
const port = 8080;

http.listen(port, () => console.log(`http: Listening on port ${port}`));

const io = require("socket.io")(http, { cookie: false });

app.use(express.static(path.join(__dirname, "build")));

app.get("/*", function (req, res) {
  res.sendFile(path.join(__dirname, "build", "index.html"));
});

io.on("connection", (socket) => {
  console.log("New client connected");

  // Emitting a new message. Will be consumed by the client
  socket.on("messages", (data) => {
    socket.broadcast.emit("messages", data);
  });

  // A special namespace "disconnect" for when a client disconnects
  socket.on("disconnect", () => console.log("Client disconnected"));
});
client.js
....
const socket =
  process.env.NODE_ENV === "development"
    ? io("http://localhost:4001")
    : io("https://my-test-site:8080");

socket.on("messages", (msgs: string[]) => {
  setMessages(msgs);
});
....
docker-compose.yml
version: "X.X"
services:
  app:
    image: "my-docker-image"
    build:
      context: .
      dockerfile: Dockerfile
      args:
        DEPENDENCY: "my-deps"
    ports:
      - 8080:8080
Dockerfile
...
RUN yarn build
# run my server.js
CMD node server.js
...
UPDATE: I got around this problem by making sure my main port was only used to run Express (with Socket.io); in my setup that was port 8080. When running in the same Docker container, I don't think I needed to create and use the https version of Express's createServer.
It looks like you forgot to map the port of your Docker container. The EXPOSE statement in your Dockerfile only advertises to other containers sharing a Docker network with yours that they can connect to port 4001 of your container.
The port mapping is configured with the -p flag of the docker run command. In your case the full command would look something like this:
docker run -p 4001:4001 your_image_name
Also, do you have a signed certificate? Browsers will likely block the connection if they do not trust your server's certificate.
I got around this problem by keeping just one port available (in my case :8080). This port is what express/socket.io is using (originally I had two different ports, one for my site, one for express). Also, in my case, when running in the same Docker container, I didn't require the require("https").createServer(app) (https) version of the server, as http was sufficient.
So,
I am using NUXT
I am deploying to google cloud run
I am using the dotenv package with a .env file in development and it works fine.
I use process.env.VARIABLE_NAME within my dev server on Nuxt and it works great. I make sure the .env is in .gitignore so that it doesn't get uploaded.
However, I then deploy my application using Google Cloud Run. I make sure I go to the Environment tab and add exactly the same variables that are in the .env file.
However, the variables come back as undefined.
I have tried all sorts of ways of fixing this, but the only one that works is uploading my .env with the project, which I do not wish to do, as Nuxt exposes this file in the client-side JS.
Anyone come across this issue and know how to sort it out?
DOCKERFILE:
# base node image
FROM node:10
WORKDIR /user/src/app
ENV PORT 8080
ENV HOST 0.0.0.0
COPY package*.json ./
RUN npm install
# Copy local nuxt code to the container
COPY . .
# Build production app
RUN npm run build
# Start the service
CMD npm start
Kind Regards,
Josh
Finally I found a solution.
I was using Nuxt v2.11.x.
From version 2.13 onwards, Nuxt comes with Runtime Config, and this is what you need.
in your nuxt.config.js:
export default {
  publicRuntimeConfig: {
    BASE_URL: 'some'
  },
  privateRuntimeConfig: {
    TOKEN: 'some'
  }
}
then you can access them like:
this.$config.BASE_URL or context.$config.TOKEN
More details here
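To feed the runtime config from the variables you set in the Cloud Run Environment tab instead of hard-coding them, the config can read process.env; this is a sketch and the variable names BASE_URL and TOKEN are just the ones from the example above:

```javascript
// nuxt.config.js (sketch): wire runtime config to environment variables
// provided at deploy time, e.g. via Cloud Run's Environment tab or
// `gcloud run deploy --update-env-vars BASE_URL=...,TOKEN=...`.
export default {
  publicRuntimeConfig: {
    // safe to expose to the client bundle
    BASE_URL: process.env.BASE_URL || 'http://localhost:3000'
  },
  privateRuntimeConfig: {
    // server-side only; never shipped to the browser
    TOKEN: process.env.TOKEN
  }
}
```

This keeps secrets out of the repository while still making them available at runtime.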
Setting environment variables does not have to be done in the Dockerfile. You can do it from the command line at deployment time.
For example here is the Dockerfile that I used.
FROM node:10
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm","start"]
this is the app.js file
const express = require('express')
const app = express()
const port = 8080

app.get('/', (req, res) => {
  const envtest = process.env.ENV_TEST;
  res.json({ message: 'Hello world', envtest });
});

app.listen(port, () => console.log(`Example app listening on port ${port}`))
To deploy use a script like this:
gcloud run deploy [SERVICE] --image gcr.io/[PROJECT-ID]/[IMAGE] --update-env-vars ENV_TEST=TESTVARIABLE
And the output will be like the following:
{"message":"Hello world","envtest":"TESTVARIABLE"}
You can check more detail on the official documentation:
https://cloud.google.com/run/docs/configuring/environment-variables#command-line
This question already has answers here:
Containerized Node server inaccessible with server.listen(port, '127.0.0.1')
(2 answers)
Closed 9 months ago.
I'm just trying to learn Node.js and Docker at the same time. I have a very simple Node.js app that listens on a port and returns a string. The Node app itself runs fine when running locally. I'm now trying to get it running in a Docker container but I can't seem to reach it.
Here's my Node app:
const http = require('http');

const hostname = '127.0.0.1';
const port = 3000;
var count = 0;

var server = http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end("Here's the current value: " + count);
  console.log('Got a request: ', req.url);
  count++;
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});
My Dockerfile:
FROM node:latest
MAINTAINER Jason
ENV PORT=3000
COPY . /var/www
WORKDIR /var/www
EXPOSE $PORT
ENTRYPOINT ["node", "app.js"]
My build command:
docker build -t jason/node .
And my run command:
docker run -p 3000:3000 jason/node
The app.js file and Dockerfile live in the same directory where I'm running the commands. Doing a docker ps shows the app running but I just get a site cannot be reached error when navigating to 127.0.0.1:3000 in the browser. I've also confirmed that app.js was properly added to the image and I get the message "Server running at http://127.0.0.1:3000/" after running.
I think I'm missing something really simple, any ideas?
Omit the hostname or use '0.0.0.0' in the listen call. Make it:

server.listen(port, '0.0.0.0', () => {
  console.log(`Server running at http://0.0.0.0:${port}/`);
});

Binding to 127.0.0.1 inside the container means the server only accepts connections from the container's own loopback interface, so connections forwarded by the published port never reach it.
If you use Docker on Windows 7/8 you most probably have a docker-machine running; then you would need to access it at something like 192.168.99.100, or whatever IP your docker-machine has.
To see whether you are running a docker-machine, just issue the command
docker-machine ls