Good morning. I am trying to run my Dockerfile to start my mock API and my UI.
When I run them in individual terminals, I am able to see the UI up and running. But when I run them inside a Docker container, the API doesn't start for some reason.
Can you help me with this?
# My Dockerfile.
FROM node:11
# Set working directory for API
RUN mkdir /usr/src/api
WORKDIR /usr/src/api
COPY ./YYY/. /usr/src/api/.
RUN npm install
RUN npm start &
# set working directory for UI
RUN mkdir /usr/src/app/
WORKDIR /usr/src/app/
COPY ./ZZZ/. /usr/src/app/.
ENV PATH /usr/src/app/node_modules/.bin:$PATH
EXPOSE 3000
RUN npm install
RUN npm start
Thanks,
Ranjith
The command npm start starts a web server that only listens on the loopback interface inside the container. To fix this, add --host 0.0.0.0 to the start script in package.json. This will allow you to access the app in your browser using the container IP.
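For illustration, a minimal sketch of what such a start script could look like, assuming the UI is served by a dev server that accepts a --host flag (webpack-dev-server is used here purely as an example; the actual tool in your project may differ):
"scripts": {
  "start": "webpack-dev-server --host 0.0.0.0 --port 3000"
}
With that change, and the port published via docker run -p 3000:3000, the UI should be reachable from outside the container.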
I am new to creating Dockerfiles and cannot figure out what command to use to start up the API backend application. I know that backend applications don't use Angular and that the command to start it is not "CMD ng serve --host 0.0.0.0".
I am attaching the code of the backend Dockerfile and also providing the errors that I am getting when trying to run the container in Docker Desktop below.
I have looked at Docker documentation and Node commands but cannot figure out what command to use to make the API backend run. What am I doing wrong?
Code:
# using Node v10
FROM node:10
# Create app directory
WORKDIR /usr/src/lafs
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
# Expose port 3000 outside container
EXPOSE 3000
# Command used to start application
CMD ng serve --host 0.0.0.0
Errors that I am receiving in Docker Desktop:
/bin/sh: 1: ng: not found
From your original screenshot, it looks like you've got a server directory. Assuming that's where your Express app lives, try something like this:
# 12 and older are EOL, 14 is in maintenance
FROM node:16
WORKDIR /usr/src/lafs
# assuming 3000 is your server port
EXPOSE 3000
# copy package.json and package-lock.json
COPY server/package*.json ./
# install production dependencies
RUN npm ci --only=production
# copy the server source code
COPY server .
# start the Express server
CMD ["npm", "start"]
I have my Cypress tests, but I also have some services that work with a remote database and, most importantly, a Node.js server that I need for sending emails in case of errors. The structure of the project looks like this:
I have created a Dockerfile, but it doesn't seem to be working:
FROM cypress/included:9.4.1
WORKDIR /e2e-test
COPY package.json /e2e
RUN npm install
COPY . .
VOLUME [ "/e2e-test" ]
CMD ["npm", "run", "tests"]
So, what I need to do is mount this folder into the docker container and then, via npm run tests, run the local server (NOTE: I need this local server only for nodemailer, because it works only on the server side).
Also, the npm run tests script looks like this; it runs the server and then the tests - "tests": "npm run dev & npx cypress run"
How can I implement this?
This can be done this way:
Dockerfile:
FROM cypress/base:16.13.0
RUN mkdir /e2e-test
WORKDIR /e2e-test
COPY package.json ./
RUN npm install
COPY . .
ENTRYPOINT ["npm", "run", "tests"]
And start it using the following command (in a Makefile, for example): docker run -it -v $PWD:/e2e-test -w /e2e-test e2e-tests
This way the Node server can run and send the emails, and at the same time the project folder stays mounted as a volume.
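One note: the e2e-tests image referenced in that run command has to be built first from the Dockerfile above, for example with:
docker build -t e2e-tests .
After that, the docker run command above (or the Makefile target wrapping it) can be reused for every test run.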
I created a new docker container for a Node.js app.
My Dockerfile is:
FROM node:14
# app directory
WORKDIR /home/my-username/my-proj-name
# Install app dependencies
COPY package*.json ./
RUN npm install
# bundle app source
COPY . .
EXPOSE 3016
CMD ["node", "src/app.js"]
After this I ran:
docker build . -t my-username/node-web-app
Then I ran: docker run -p 8160:3016 -d -v /home/my-username/my-proj-name:/my-proj-name my-username/node-web-app
The app is successfully hosted at my-public-ip:8160.
However, any changes I make on my server do not propagate to the Docker container. For example, if I touch test.txt on my server, I will not be able to GET /test.txt online or see it in the container. The only way I can pick up changes is to rebuild the image, which is quite tedious.
Did I miss something when binding the volume? How can I make it so that the changes I make locally also appear in the container?
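One thing worth checking in the run command above: the bind mount targets /my-proj-name inside the container, while the image's WORKDIR (the directory node src/app.js actually runs from) is /home/my-username/my-proj-name, so the mounted files never overlay the code the app is serving. A hedged sketch of a run command that mounts the project over the working directory instead, assuming the app really does read files from that directory at request time:
docker run -p 8160:3016 -d -v /home/my-username/my-proj-name:/home/my-username/my-proj-name my-username/node-web-app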
I am trying to dockerize both a frontend made with create-react-app and its express API backend.
The docker containers sit on ec2 instances. I have tried several tutorials and also threads here on stackoverflow, but I can't work out what is going wrong.
The error I get is:
Could not find an open port at ec2-x-xx-xx-xxx.eu-west-2.compute.amazonaws.com.
This refers to my .env file, which contains only:
HOST= ec2-x-xx-xx-xxx.eu-west-2.compute.amazonaws.com
My Dockerfile for the frontend is as follows:
FROM node:12-alpine
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --silent
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
whereas the backend's:
FROM node:12-alpine as builder
WORKDIR /app
COPY package.json /app/package.json
RUN npm install
COPY . /app
EXPOSE 3001
CMD ["node", "app.js" ]
I need the backend to run on port 3001 and the frontend on port 3000. The frontend is bound to the backend API via the proxy line in package.json, where I have placed:
"proxy":"http://ec2-x-xx-xx-xxx.eu-west-2.compute.amazonaws.com:3001",
Both containers build fine. However, running the backend with docker run -p 3001:3001 server and then the frontend with docker run -p 3000:3000 client spits out the error:
Attempting to bind to HOST environment variable: ec2-x-xx-xx-xxx.eu-west-2.compute.amazonaws.com
If this was unintentional, check that you haven't mistakenly set it in your shell.
Learn more here: https://xxx.xx/xxx-advanced-config
Could not find an open port at ec2-x-xx-xx-xxx.eu-west-2.compute.amazonaws.com.
Network error message: listen EADDRNOTAVAIL: address not available 10.xx.0.xxx
I have tried running only the server side and then running npm start from my local machine, and that works. The problem seems to be related to Docker networking between the containers.
I also tried running the server side with the command
docker run -p 10.xx.0.xxx:3000:3000 client
to make sure the client pings the right IP address; however, this didn't work.
Could anyone give me some direction please?
If you need more info on the source code, please just leave a comment; I didn't want to clutter the thread by making it longer than it already is.
Thank you
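A side note on the error itself, offered as a hedged sketch rather than a definitive fix: the log above is the dev server trying to bind to whatever is in the HOST variable, and the EC2 public DNS name does not resolve to an address the container itself can listen on, hence EADDRNOTAVAIL. Under the assumption that HOST was only set to make the app reachable externally, letting the dev server listen on all interfaces inside the container and relying on the published port instead would look like this in .env:
HOST=0.0.0.0
The proxy entry then only needs to point at wherever the API container is actually reachable from the client container.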
I have a sample project using ReactJS and Node.js; below is the folder structure.
movielisting
  Dockerfile
  client
    package.json
    package-lock.json
    ... other create-react-app folders like src
  server
    index.js
I start this project with npm run start in the client folder and nodemon index.js in the server folder. All my APIs are written in the server folder. My client runs on port 3000 and the server runs on port 4000, and I have added a proxy in the client's package.json like below:
"proxy": "http://localhost:4000"
So what I am trying to achieve is to start the whole application by running this Dockerfile:
1) create the client build by running npm run build
2) copy it to the workdir inside the container
3) copy the whole server folder
4) run npm install for the server folder
5) proxy the application
How can I do this? Do I need to write some code in Node.js to serve the built index.html file?
Also, how can I run the Dockerfile command to run the application?
Any help appreciated!
In order to build both frontend and backend into a single image, you need to do the following:
The frontend application needs to be built so that it can be served as static files.
The backend application needs to be running inside the container and exposed publicly so the frontend can reach it from the browser, because using localhost:4000 would end up calling the user's localhost, not the container's localhost.
Finally, you need something like supervisor as a service manager in order to run multiple services in a single container.
As an example you might check the following:
FROM node:10.15.1-alpine
COPY . /home/movielisting
#Prepare the client
WORKDIR /home/movielisting/client
RUN npm install --no-cache && npm run build && npm install -g serve
#Prepare the server
WORKDIR /home/movielisting/server
RUN npm install --no-cache
EXPOSE 4000
CMD [ "npm", "start", "index.js" ]
You might need to check the following links:
ReactApp Static Server
Deployment
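Regarding the question about serving the built index.html from Node.js: one common approach, sketched here under the assumption that the server is an Express app in server/index.js and that the client build output ends up in client/build (both paths and the port are assumptions based on the folder structure above, not taken from the actual project):
// server/index.js - hypothetical sketch
const path = require("path");
const express = require("express");

const app = express();

// API routes would be registered here, e.g. app.get("/api/movies", ...)

// Serve the static files produced by `npm run build` in the client folder
app.use(express.static(path.join(__dirname, "..", "client", "build")));

// Fall back to index.html for any other route so client-side routing keeps working
app.get("*", (req, res) => {
  res.sendFile(path.join(__dirname, "..", "client", "build", "index.html"));
});

app.listen(4000, () => console.log("Server listening on port 4000"));
With the client served by the same Express process, the proxy entry in the client's package.json is only needed during local development, not inside the container.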