Can't open site with docker-compose and Node.js

Hi everybody. I have a docker-compose setup with Node.js.
docker-compose.yml:
web:
  ...
  ports:
    - "3030:3000"
  ...
app.js:
...
server.listen(8000, function(){
    console.log('test');
});
As you can see, the app listens on port 8000.
I run docker-compose up
and the terminal prints 'test',
but http://MY_IP:8000 shows "Can not access site".
I also tried starting it with docker-compose run web npm start; the server starts, but I get
that error too.
How do I start Node.js with docker-compose?
P.S.:
If I run npm install and start the app WITHOUT docker-compose, I don't have this problem.

In your docker-compose.yml you are binding port 3030 on your host to port 3000 in your container.
The config in your docker-compose.yml should be:
web:
  ...
  ports:
    - "8000:8000"
  ...
Moreover, I think you should also bind your app inside the container to 0.0.0.0 to ease development; additional setup can be done once you move to production. app.js:
server.listen(8000, '0.0.0.0', function(){
    console.log('test');
});

You couldn't access http://YOUR_IP:8000 because you didn't publish the container port (8000) to a host port (8000). Add this to your docker-compose.yml:
web:
  ...
  ports:
    - "3030:3000"
    - "8000:8000"
  ...

Just change - "3030:3000" to - "8000:8000" in your docker-compose.yml file
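For reference, a minimal sketch of how the corrected mapping might look (assuming, as in the question's app.js, that the app keeps listening on 8000 inside the container; the web service name is taken from the question):
web:
  ...
  ports:
    # "HOST:CONTAINER": the right side must match the port the app listens on,
    # the left side is the port you open in the browser on the host
    - "8000:8000"
  ...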

Try to add this flag: --host=0.0.0.0
version: '2'
services:
  app:
    container_name: sdp
    build: .
    ports:
      - 8000:8000
      - 3000:3000
    volumes:
      - .:/usr/src/app
      - /usr/src/app/node_modules
    command: sls offline --host=0.0.0.0

Related

WSL2, Docker & Node: Unable to request Node

I created a JS app with Docker Compose, with a frontend, a backend and a common component using Yarn Workspaces. It works on Linux, but I am out of ideas to make it work on WSL.
The Docker Compose file:
# Use postgres/example user/password credentials
version: '3.1'
services:
  postgres:
    image: postgres:latest
    # restart: always
    environment:
      POSTGRES_PASSWORD: password
      POSTGRES_DB: caddie_app
    ports:
      - '5432:5432'
  backend:
    image: node:16
    volumes:
      - '.:/app'
    ports:
      - '3001:3001' # Nest
    depends_on:
      - postgres
    working_dir: /app
    command: ["yarn", "workspace", "@caddie/backend", "start:dev"]
    environment:
      # inside docker we connect to the postgres service over the compose network,
      # but it is reachable at localhost on our host
      DATABASE_URL: postgresql://postgres:password@postgres:5432/caddie_app?schema=public
  frontend:
    image: node:16
    volumes:
      - '.:/app'
    ports:
      - '3000:3000' # React
    depends_on:
      - backend
    working_dir: /app
    command: ["yarn", "workspace", "@caddie/frontend", "start"]
I can reach the database with DBeaver, I can fetch the React JS scripts on localhost:3000, but I cannot request the NestJS server on localhost:3001.
The NestJS server is listening on 0.0.0.0
await app.listen(3001, '0.0.0.0');
I allowed ports 3000 and 3001 in the firewall. I also tried to request the Node.js server directly through the WSL IP address found with ipconfig, but the problem remains. I can't figure out what's wrong.
Thanks!

Docker-compose builds but app does not serve on localhost

Docker newbie here. The docker-compose file builds without any issues, but when I try to open my app on localhost:4200 I get "localhost didn't send any data" in Chrome and "the server unexpectedly dropped the connection" in Safari. I am working on macOS Catalina. Here is my yml file:
version: '3.0'
services:
  my-portal:
    build: .
    ports:
      - "4200:4200"
    depends_on:
      - backend
  backend:
    build: ./backend
    ports:
      - "3000:3000"
    environment:
      POSTGRES_HOST: host.docker.internal
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
    depends_on:
      - db
  db:
    image: postgres:9.6-alpine
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: "postgres"
      POSTGRES_PASSWORD: mypwd
      POSTGRES_HOST: host.docker.internal
    ports:
      - 5432:5432
    restart: always
    volumes:
      - ./docker/db/data:/var/lib/postgresql/data
Log for Angular:
/docker-entrypoint.sh: Configuration complete; ready for start up
Log for Node: db connected
Log for Postgres: database system is ready to accept connections
Below are my Angular and Node Dockerfiles:
FROM node:latest AS builder
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build --prod
EXPOSE 4200
# Stage 2
FROM nginx:alpine
COPY --from=builder /app/dist/* /usr/share/nginx/html/
Node:
FROM node:12
WORKDIR /backend
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "node", "server.js" ]
When I created the Angular image and ran my app on localhost:4200, it worked fine. Please let me know if I am missing anything.
Your Angular container is built FROM nginx, and you use the default Nginx configuration from the Docker Hub nginx image. That listens on port 80, so that's the port number you need to use in the ports: directive:
services:
  quickcoms-portal:
    build: .
    ports:
      - "4200:80" # <-- second port must match the nginx image's port
    depends_on:
      - backend
The EXPOSE directive in the first stage is completely ignored and you can delete it. The FROM nginx:alpine line causes docker build to start over from a new base image, so your final image is stock Nginx plus the files you COPY --from=builder.

Can't access Ionic from browser while using docker

I'm trying to access localhost:8100, which is my Ionic app running inside a Docker container, but it won't open.
Here is my Dockerfile:
FROM node:10.16.3
WORKDIR /usr/src/ionic-app
COPY ./ /usr/src/ionic-app
RUN npm install -g cordova ionic
RUN npm install
And here is my docker-compose file:
version: '3.6'
services:
  # Backend API
  backend-api:
    container_name: backend
    build:
      context: ./api/
    working_dir: /usr/src/smart-brain-api
    command: npm run debug
    ports:
      - "3000:3000"
    environment:
      REDIS_HOST: redis
      MONGOOSE_URI: 'mongodb://mongo:27017/appcomdill'
    links:
      - mongo
      - redis
  # MongoDB
  mongo:
    container_name: mongo
    image: mongo
    environment:
      MONGOOSE_URI: 'mongodb://mongo:27017/appcomdill'
    ports:
      - "27017:27017"
  # Redis
  redis:
    container_name: redis
    environment:
      REDIS_HOST: redis
    image: redis
    ports:
      - "6379:6379"
  # Ionic Front-end
  ionic:
    container_name: front-end
    build:
      context: ./ionic
    working_dir: /usr/src/ionic-app
    command: ionic serve
    ports:
      - "8100:8100"
Every time I try to connect to http://localhost:8100/ it keeps giving me "localhost didn't send any data".
Try changing your command to ionic serve --external
As Joseph mentioned, you can fix that by specifying the --external option.
This is explained in the Ionic docs:
By default, ionic serve boots up a development server on localhost. To serve to your LAN, specify the --external option, which will use all network interfaces and print the external address(es) on which your app is being served.
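As a sketch, the ionic service from the question's compose file would then look like this (--external makes the dev server listen on all interfaces inside the container instead of only on localhost, so the published port 8100 becomes reachable from the host):
  ionic:
    container_name: front-end
    build:
      context: ./ionic
    working_dir: /usr/src/ionic-app
    # listen on all interfaces inside the container so the published port works from the host
    command: ionic serve --external
    ports:
      - "8100:8100"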

How to access a Docker container app locally?

I have a simple Node.js/Express app:
const express = require('express')
const app = express()
const port = 3000

app.get('/', (req, res) => res.send('Hello World!'))
app.listen(port, () => console.log(`Example app listening on port ${port}!`))
It works fine when I start it like: node src/app.js
Now I'm trying to run it in a Docker container. Dockerfile is:
FROM node:8
WORKDIR /app
ADD src/. /app/src
ADD package.json package-lock.json /app/
RUN npm install
COPY . /app
EXPOSE 3000
CMD [ "node", "src/app.js" ]
It starts fine: docker run <my image>:
Listening on port 3000
But now I cannot access it in my browser:
http://localhost:3000
This site can’t be reached localhost refused to connect.
The same happens if I try to run it with docker-compose:
version: '3.4'
services:
  service1:
    image: xxxxxx
    ports:
      - 8080:8080
    volumes:
      - xxxxxxxx
  myapp:
    build:
      context: .
      dockerfile: Dockerfile
    networks:
      - private
    ports:
      - 3000
    command:
      node src/app.js
I'm not sure if I'm handling the ports correctly in both Docker files.
When you work with Docker, you must define the host for your app as 0.0.0.0 instead of localhost.
For your Express application you can define the host in the app.listen call.
Check the documentation:
app.listen([port[, host[, backlog]]][, callback])
Your express code should be updated to:
const port = 3000
const host = '0.0.0.0'
app.get('/', (req, res) => res.send('Hello World!'))
app.listen(port, host, () => console.log(`Example app listening on ${port}!`))
It's also important to publish the Docker ports:
Running docker: docker run -p 3000:3000 <my image>
Running docker-compose:
services:
  myapp:
    build:
      context: .
      dockerfile: Dockerfile
    networks:
      - private
    ports:
      - 3000:3000
    command:
      node src/app.js
Try this:
services:
  myapp:
    build:
      context: .
      dockerfile: Dockerfile
    networks:
      - private
    ports:
      - 3000:3000 # THIS IS THE CHANGE: you need to map the host (machine) port to the container port
    command:
      node src/app.js
You need to publish the ports:
docker run -p 3000:3000 <my image>
-p stands for publish.

Docker Compose publishing ports on host computer: Node.js + Express

I have built a docker-compose file and want to access my Node.js app on localhost:3000 from my host computer, but publishing the ports doesn't seem to be working.
When I run docker-compose up everything seems to work fine and I get the confirmation "Listening on port 3000". However, when I go to localhost:3000 from a browser, as well as with curl, I get a not-found or timeout response.
Am I missing something here?
My NodeJS server:
var server = app.listen(process.env.PORT || 3000, function(){
    console.log('Listening on port ' + server.address().port);
});
My docker-compose.yml file:
version: "3"
services:
api:
image: baum-test:v0
ports:
- "3000:3000"
networks:
- webnet
mongodb:
image: mongo:latest
ports:
- "27017:27017"
volumes:
- ./data:/data
deploy:
placement:
constraints: [node.role == manager]
networks:
- webnet
networks:
webnet:
If you are running this using the docker stack deploy command, then you can use "constraints: [node.role == manager]" in your docker-compose.yml. If you used the docker-compose up command to bring the composition up, then you are using Swarm features of the compose file with plain docker-compose, where they have no effect. Remove the section below completely:
deploy:
  placement:
    constraints: [node.role == manager]
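For reference, a sketch of the mongodb service from the question with the Swarm-only deploy block removed (everything else left as in the question):
  mongodb:
    image: mongo:latest
    ports:
      - "27017:27017"
    volumes:
      - ./data:/data
    networks:
      - webnet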
Since I am running Docker Toolbox for Windows (the old version without Hyper-V), I had to manually use the Docker VM image's IP address, as the published ports weren't binding to localhost.
