NestJS with Docker and Portainer - node.js

I'm trying to bring my project up on a Virtual Private Server. I've installed Docker and Portainer and I can start the project, but it isn't listening on any port. I set it to run on port 3000, but when I put IP_Of_My_VPS:3000 in the browser nothing happens. I'm new to Docker, and every configuration I did was based on my own research.
This screenshot shows the image running with no port mapped.
This other screenshot shows that my application is running (but I don't know how to access it).
My Docker config:
FROM node:12-alpine
RUN apk --no-cache add curl
RUN apk --no-cache add git
RUN git --version
WORKDIR /app
COPY package*.json ./
RUN npm set progress=false && npm config set depth 0 && npm cache clean --force
RUN npm ci
COPY . .
RUN npm run build && rm -rf src
HEALTHCHECK --interval=30s --timeout=3s --start-period=30s \
CMD curl -f http://localhost:3000/health || exit 1
EXPOSE 3000
CMD ["node", "./dist/main.js"]

When you bring the container up, you need to publish the port (port forwarding).
For example:
docker run -p <your_forwarding_port>:3000 <image>
# docker-compose.yaml
ports:
  - "<your_forwarding_port>:3000"
References: docker container port, docker-compose port.
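For the NestJS Dockerfile above, a minimal stack file (e.g. to deploy from Portainer) might look like this; the service name and host port are assumptions, not the asker's actual config:
# docker-compose.yml — a sketch for the Dockerfile above
version: "3.8"
services:
  api:
    build: .
    ports:
      - "3000:3000"   # host:container — then browse to IP_Of_My_VPS:3000
    restart: unless-stopped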

Related

NPM not found when using npm run start command within shell script from a docker container

I am not sure what I may be doing wrong but I have the following script.sh file sitting at the root of my project:
script.sh
#!/bin/sh
npm run start
envsubst '\$PORT' < /etc/nginx/conf.d/configfile.template > /etc/nginx/conf.d/default.conf
nginx -g 'daemon off;'
Then I referenced the above script in my Dockerfile as shown below:
Dockerfile
# Build environment
FROM node:16.14.2
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY . ./
# server environment
FROM nginx:alpine
COPY nginx.conf /etc/nginx/conf.d/configfile.template
ENV HOST 0.0.0.0
ENV NODE_ENV production
EXPOSE 8080
COPY script.sh /
RUN chmod +x /script.sh
ENTRYPOINT ["/script.sh"]
After building the Docker image successfully, I attempted to run it as a container but all I keep getting back is the following error:
/script.sh: line 2: npm: not found
I expect that the script should be able to pick up the already installed npm from the environment.
What can I do differently to make this work?
You're trying to run two separate programs, so run them in two separate containers.
# Dockerfile.app
FROM node:16.14.2
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY . ./
ENV HOST 0.0.0.0
ENV NODE_ENV production
EXPOSE 8080
CMD npm run start
# Dockerfile.nginx
FROM nginx:alpine
COPY nginx.conf /etc/nginx/conf.d/configfile.template
You might use a system like Docker Compose to run the two parts together:
# docker-compose.yml
version: '3.8'
services:
  app:
    build:
      context: .
      dockerfile: Dockerfile.app
  nginx:
    build:
      context: .
      dockerfile: Dockerfile.nginx
    ports:
      - 8080:80
Running docker-compose up -d will start both containers together. In your Nginx configuration, make sure to proxy_pass http://app:8080, using the Compose service name and the port number the service is listening on, to forward requests to the other container.
(The Nginx Dockerfile looks short, but it's correct. The Docker Hub nginx image already knows how to run the envsubst line from your script in its own entrypoint script and it has a correct default command already.)
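For reference, a minimal sketch of what the nginx.conf template might contain, assuming $PORT is substituted by envsubst as in the original script.sh (the location block is an assumption, not the asker's actual config):
# nginx.conf (template) — a sketch; "app" is the Compose service name above
server {
    listen $PORT;
    location / {
        proxy_pass http://app:8080;   # forward to the Node container on its listening port
    }
}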
There are two basic problems in the setup you show in the question, both related to trying to run two programs in the same container. The first is that you can't merge images: a second FROM line makes Docker start over from the new base image, so your final image contains only Nginx, not Node or your built application, hence the npm not found error. The second is that your script would start your application but never start the Nginx proxy until after the application exits. There are common workarounds for this (like using a background process), but they essentially leave one process or the other unmonitored by Docker, so your application could fail without Docker noticing and restarting it.

404 when serving react application in docker container

I've created a React application with create-react-app and have built a Docker image with the following Dockerfile.
FROM node:alpine AS builder
WORKDIR /app
RUN npm install
COPY . .
RUN npm run build
FROM node:alpine
WORKDIR /app
COPY --from=builder /app/build .
RUN npm install -g serve
EXPOSE 80
CMD serve -p 80 -s build
When running the container and accessing port 80 on localhost, I'm met with "404: The requested path could not be found". The container is run with the command docker run -p 80:80 <image name>, and the output is "Accepting connections at http://localhost:80". What could be the reasons for the 404, and what can I do to fix it?
Looking at the documentation of serve: you are copying /app/build from the builder stage into /app in the new container, and then calling serve with a folder name of build, which does not exist (-s doesn't take a parameter; the directory is a positional argument).
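A possible fix, given that the build output was copied directly into /app (a sketch; either change works on its own):
# Option 1: serve the directory the files were actually copied into
CMD serve -p 80 -s .
# Option 2: keep the CMD and copy the build output into a build/ subdirectory instead
# COPY --from=builder /app/build ./build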

docker port mapping ignored when adding volumes to the run command

When I start my docker container with:
docker run -it -d -p 8081:8080 --name ${APP_CONTAINER_NAME} ${APP_IMAGE}
I can access my web application just fine in my browser on: localhost:8081
But if I instead run it with the two volumes below:
docker run -it -d -p 8081:8080 -v ${PWD}:/app -v /app/node_modules --name ${APP_CONTAINER_NAME} ${APP_IMAGE}
The port mapping is ignored - I cannot access it at localhost:8081 but I can access it at localhost:8080.
My dockerfile has:
FROM node:8-alpine
RUN apk update && apk add bash
RUN npm install -g http-server
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 8080
CMD [ "http-server", "dist" ]
Why does adding the volumes to the second docker run command ignore the port mapping from 8081 to 8080?
As suggested below, running without -d (but with volumes):
docker run -it -p 8081:8080 -v ${PWD}:/app -v /app/node_modules --name ${APP_CONTAINER_NAME} ${APP_IMAGE}
gives:
Starting up http-server, serving dist
Available on:
http://127.0.0.1:8080
http://172.17.0.2:8080
Hit CTRL-C to stop the server
But I cannot access it on localhost:8080 or localhost:8081 even though the container is indeed running:
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
603b1bf02d58 app-image "http-server dist" 11 seconds ago Up 5 seconds 0.0.0.0:8081->8080/tcp app-container
When I instead run it without volumes but still map to 8081 it works:
Starting up http-server, serving dist
Available on:
http://127.0.0.1:8080
http://172.17.0.2:8080
Hit CTRL-C to stop the server
and I can access it on localhost:8081. So something in the application must be messed up when adding the volumes, I'm just not sure what. I have also tried to run:
docker volume prune
before starting the container but it has no effect. Any ideas why creating the volumes prevents the application from being accessed?
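One mechanism worth checking (an observation about bind mounts generally, not a confirmed diagnosis): -v ${PWD}:/app mounts the host directory over the image's /app, hiding the dist folder that npm run build created inside the image:
# The bind mount shadows everything the image placed in /app (the anonymous
# /app/node_modules volume survives, but the baked-in dist/ does not).
# http-server then serves ${PWD}/dist from the host, which may be empty or stale.
docker run -it -d -p 8081:8080 -v ${PWD}:/app -v /app/node_modules \
  --name ${APP_CONTAINER_NAME} ${APP_IMAGE}
# Running npm run build on the host first (so ${PWD}/dist exists) is one way
# to make the mounted setup serve the same content as the image.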

Docker container is not accessible

I have Docker installed on an Ubuntu 16.04 VM and I'm working on a personal project using Node.js; the Docker image is built from the Dockerfile below.
The container runs, but when I try to access it with the VPS's public IP it's not accessible.
I tried to curl and, after a very long time, I get curl: (52) Empty reply from server.
The port is mapped correctly and there are no firewall issues either.
Here is my Dockerfile:
FROM node:10.13-alpine
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY ["package.json", "package-lock.json*", "npm-shrinkwrap.json*", "./"]
RUN apk update && apk upgrade \
    && apk add --no-cache git \
    && apk --no-cache add --virtual builds-deps build-base python \
    && npm install -g nodemon cross-env eslint npm-run-all node-gyp node-pre-gyp \
    && npm install \
    && npm rebuild bcrypt --build-from-source
RUN npm install --production --silent && mv node_modules ../
COPY . .
RUN pwd
EXPOSE 3001
CMD npm start
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
8588419b40c4 xxx:v1 "/bin/sh -c 'npm sta…" 2 days ago Up 2 days 0.0.0.0:3000->3001/tcp youthful_roentgen
Let xxx:v1 be the image name built by the Dockerfile you provided.
If you want to access your app via your host (curl localhost:3001), then you should run:
docker run -p 3001:3000 xxx:v1
This command binds port 3000 in your container to your port 3001 on your host (IIRC, 3000 is the default port used by npm start).
You should then be able to access localhost:3001 from your host with curl.
Note that the EXPOSE directive in the Dockerfile does not automatically publish a port when you docker run. It's just an indication that your container listens on the port you EXPOSEd. Here, your EXPOSE directive is wrong; you should have written:
EXPOSE 3000
because only port 3000 is actually listened on in the container (3000 is the default port used by npm). Which port you choose to bind on the host (if any) is specified at runtime only.
If you don't want to access your app via localhost, but only via the container's IP, there is no need to bind the port (no -p). You only need to do curl <container_ip>:3000 from your host.
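If you do go the container-IP route, the address can be read with docker inspect (standard Docker CLI; the container name comes from the docker ps output above):
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' youthful_roentgen
# then, from the host:
curl http://<container_ip>:3000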

Docker container with Angular2 app and NodeJs does not respond

I created a new Angular2 app with angular-cli and ran it in Docker, but I cannot connect to it from localhost.
First, I initialized the app on my local machine:
ng new project && cd project && "put my Dockerfile there" && docker build -t my-ui .
I start it with the command:
docker run -p 4200:4200 my-ui
Then I try from my localhost:
curl localhost:4200
and receive
curl: (56) Recv failure: Connection reset by peer
Then I tried switching into the running container (docker exec -ti container-id bash) and ran curl localhost:4200, and it works.
I also tried to run the container with the --net=host param:
docker run --net=host -p 4200:4200 my-ui
And it works. What is the problem? I also tried to run the container in daemon mode and it did not help. Thanks.
My Dockerfile
FROM node
RUN npm install -g angular-cli@1.0.0-beta.24 && npm cache clean && rm -rf ~/.npm
RUN mkdir -p /opt/client-ui/src
WORKDIR /opt/client-ui
COPY package.json /opt/client-ui/
COPY angular-cli.json /opt/client-ui/
COPY tslint.json /opt/client-ui/
ADD src/ /opt/client-ui/src
RUN npm install
RUN ng build --prod --aot
EXPOSE 4200
ENV PATH="$PATH:/usr/local/bin/"
CMD ["npm", "start"]
It seems that you use ng serve to run the development server, and by default it binds to the loopback interface (available only from localhost inside the container). You should provide a specific parameter:
ng serve --host 0.0.0.0
to run it on all interfaces.
You need to change angular-cli to serve the app externally, i.e. update your npm start script to ng serve --host 0.0.0.0.
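For example, assuming the start script currently runs a plain ng serve, the package.json change might look like this (a sketch):
{
  "scripts": {
    "start": "ng serve --host 0.0.0.0"
  }
}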
