How to create Dockerfile for Cypress tests with Node.js server - node.js

I have my Cypress tests, but I also have some services that work with a remote database and, most importantly, a Node.js server that I need in order to send emails in case of an error. The structure of the project looks like this:
I have created a Dockerfile, but it doesn't seem to be working:
FROM cypress/included:9.4.1
WORKDIR /e2e-test
COPY package.json /e2e
RUN npm install
COPY . .
VOLUME [ "/e2e-test" ]
CMD ["npm", "run", "tests"]
So, what I need to do is mount this folder into the Docker container and then, via npm run tests, run the local server (NOTE: I need this local server only for nodemailer, because it works only on the server side).
Also, the npm run tests script runs the server and then the tests - "tests": "npm run dev & npx cypress run"
How can I implement this?

This can be done this way:
Dockerfile:
FROM cypress/base:16.13.0
RUN mkdir /e2e-test
WORKDIR /e2e-test
COPY package.json ./
RUN npm install
COPY . .
ENTRYPOINT ["npm", "run", "tests"]
Then start it using the following command (in a Makefile, for example): docker run -it -v $PWD:/e2e-test -w /e2e-test e2e-tests
This way, the Node server can run and send the emails, and at the same time the project directory is available as a mounted volume.
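Put together as a Makefile target, the build and run steps might look like this (a sketch; the image tag e2e-tests matches the run command above, everything else is an assumption):

```makefile
# Build the test image and run it with the project directory mounted
e2e:
	docker build -t e2e-tests .
	docker run -it -v $(PWD):/e2e-test -w /e2e-test e2e-tests
```

The bind mount means edits to specs on the host are picked up without rebuilding the image; only dependency changes in package.json require a rebuild.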

Related

What is the command needed for an API backend Dockerfile?

I am new to creating Dockerfiles and cannot figure out what command to use to start up the API backend application. I know that backend applications don't use Angular and that the command to start it is not "CMD ng serve --host 0.0.0.0".
I am attaching the code of the backend Dockerfile and also providing the errors that I am getting when trying to run the container in Docker Desktop below.
I have looked at Docker documentation and Node commands but cannot figure out what command to use to make the API backend run. What am I doing wrong?
Code:
# using Node v10
FROM node:10
# Create app directory
WORKDIR /usr/src/lafs
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
# Expose port 3000 outside container
EXPOSE 3000
# Command used to start application
CMD ng serve --host 0.0.0.0
Errors that I am receiving in Docker Desktop:
/bin/sh: 1: ng: not found
From your original screenshot, it looks like you've got a server directory. Assuming that's where your Express app lives, try something like this:
# node:16 (12 and older are EOL, 14 is in maintenance)
FROM node:16
WORKDIR /usr/src/lafs
# assuming this is your server port
EXPOSE 3000
# copy package.json and package-lock.json
COPY server/package*.json ./
# install dependencies
RUN npm ci --only=production
# copy source code
COPY server .
# start the Express server
CMD ["npm", "start"]
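To try it out, something like the following should work (a sketch; the image tag lafs-api is an assumption, and 3000 assumes the port above is correct):

```shell
docker build -t lafs-api .
docker run --rm -p 3000:3000 lafs-api
```

Note that comments in a Dockerfile must start at the beginning of a line; a trailing `# comment` after an instruction is treated as part of its arguments and breaks the build.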

Node.js application works in local, but not working when deployed to Docker Container

I have a working Node.js application locally, i.e., if I run the following command, it works in the browser when I open http://127.0.0.1:8082
$node server.js
I'm trying to build the image using the following:
docker build -t dashboard .
And create and run the container using the following:
docker run --name dashcont3 -p 8082:8080 -d dashboard
This is my Dockerfile
FROM node:10
WORKDIR /usr/src/app/
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 80
EXPOSE 8080
CMD [ "node", "server.js" ]
When I build and run the container and try to open the URL in the browser, it gives the following error:
This page isn’t working
127.0.0.1 didn’t send any data.
ERR_EMPTY_RESPONSE
Please let me know what it is that I'm missing. Thanks in advance!
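One thing worth double-checking (a sketch, assuming server.js listens on the same port 8082 inside the container as it does locally): with `-p 8082:8080`, host port 8082 is forwarded to container port 8080, so the app would have to listen on 8080 inside the container. If it listens on 8082, the mapping needs to target that port instead:

```shell
# host-port:container-port must match the port server.js actually listens on;
# server.js must also bind 0.0.0.0 inside the container, not 127.0.0.1
docker run --name dashcont3 -p 8082:8082 -d dashboard
```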

running react and node in a single docker file without docker-compose.yml file

I have a sample project using React and Node.js; below is the folder structure.
movielisting
Dockerfile
client
package.json
package-lock.json
... other create-react-app folders like src..
server
index.js
I start this project with npm run start in the client folder and nodemon index.js in the server folder. All my APIs are written in the server folder. My client runs on port 3000 and my server on port 4000; I have added a proxy in the client's package.json like below:
"proxy": "http://localhost:4000"
So what I am trying to achieve is to start the application by running this Dockerfile:
1) create the client build by running npm run build
2) copy it to the workdir inside the container
3) copy the whole server folder
4) run npm install for the server folder
5) proxy the application
How can I do this? Do I need to write some code in Node.js to serve the built index.html file?
Also, how can I run the Docker command to run the application?
Any help appreciated!
In order to build both frontend and backend into a single image, you need to do the following:
The frontend application needs to be built so it can be served as static files.
The backend application needs to run inside the container and be exposed publicly, so that the frontend can reach it from the browser; using localhost:4000 would end up calling the user's localhost, not the container's.
Finally, you need something like supervisor as a service manager in order to run multiple services in a single container.
As an example you might check the following:
FROM node:10.15.1-alpine
COPY . /home/movielisting
# Prepare the client
WORKDIR /home/movielisting/client
RUN npm install && npm run build && npm install -g serve
# Prepare the server
WORKDIR /home/movielisting/server
RUN npm install
EXPOSE 4000
CMD [ "npm", "start" ]
You might need to check the following links:
ReactApp Static Server
Deployment
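The supervisor approach mentioned above could be sketched with a config like the following (program names, paths, and the serve port are assumptions for illustration, not from the original answer):

```ini
[supervisord]
nodaemon=true

[program:client]
; serve the static React build on port 3000
command=serve -s /home/movielisting/client/build -l 3000

[program:server]
; run the Node API on port 4000
command=node /home/movielisting/server/index.js
```

With supervisord as the container's CMD, both processes are started and restarted by one service manager, which is what lets a single container run the static file server and the API side by side.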

Dockerfile not running my mock api and reactjs app

Good morning. I am trying to run the Dockerfile to start my mock API and my UI.
When I run those inside individual terminals, I am able to see the UI up and running. But when I run those inside a Docker container, the API doesn't start for some reason.
Can you help me with this?
# My Docker file.
FROM node:11
# Set working directory for API
RUN mkdir /usr/src/api
WORKDIR /usr/src/api
COPY ./YYY/. /usr/src/api/.
RUN npm install
RUN npm start &
# set working directory for UI
RUN mkdir /usr/src/app/
WORKDIR /usr/src/app/
COPY ./ZZZ/. /usr/src/app/.
ENV PATH /usr/src/app/node_modules/.bin:$PATH
EXPOSE 3000
RUN npm install
RUN npm start
Thanks,
Ranjith
The command npm start starts a web server that only listens on the loopback interface of the container. To fix this, in package.json, under the start script, add --host 0.0.0.0. This will allow you to access the app in your browser using the container IP.
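For a webpack-dev-server based start script, the change might look like this (a sketch; the exact script depends on your tooling, and Create React App reads a HOST environment variable instead of a flag):

```json
{
  "scripts": {
    "start": "webpack-dev-server --host 0.0.0.0 --port 3000"
  }
}
```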

Docker: ENTRYPOINT can't execute command because it doesn't find the file

I'm trying to create a container from the Node.js image and I have configured my Dockerfile as shown:
FROM node:boron
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app
RUN npm install
# Bundle app source
COPY . /usr/src/app
VOLUME ./:/usr/src/app
ENTRYPOINT [ "npm run watch" ]
In the package.json I have a script called watch than runs the gulp task named watch-less.
If I run npm run watch in my local environment the command works, but when I try running the container it doesn't, and shows the following error:
docker: Error response from daemon: oci runtime error: container_linux.go:247: starting container process caused "exec: \"npm run watch\": executable file not found in $PATH".
ENTRYPOINT [ "npm run watch" ]
This is incorrect JSON (exec form) syntax: it looks for an executable named npm run watch, not the executable npm with the arguments run and watch.
With the JSON syntax you need to separate each argument. Alternatively, you can use the shell syntax:
ENTRYPOINT npm run watch
Or you can fix the JSON syntax (assuming npm is installed in /usr/bin):
ENTRYPOINT [ "/usr/bin/npm", "run", "watch" ]
You also have an incorrect volume definition:
VOLUME ./:/usr/src/app
Dockerfiles cannot specify how the volume is mounted to the host, only that an anonymous volume is defined at a specific directory location, with a syntax like:
VOLUME /usr/src/app
I've got strong opinions against using a volume definition inside the Dockerfile, described in this blog post. In short, you can define the volume better in a docker-compose.yml; all you can do in a Dockerfile is create anonymous volumes that you'd still need to redefine elsewhere if you want to be able to easily reuse them later.
If you use the list notation for ENTRYPOINT, that is, with the [brackets], you must separate the arguments properly.
ENTRYPOINT ["npm", "run", "watch"]
Right now it is trying to find a file literally named "npm run watch" and that does not exist.
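The difference can be demonstrated outside Docker in any shell: asking the OS to execute one string containing spaces as a single program name fails, while separate arguments work (a sketch using /bin/echo as a stand-in for npm):

```shell
# One string: the shell looks for a program literally named "/bin/echo hello"
"/bin/echo hello" 2>/dev/null || echo "not found as a single executable"
# Separate arguments: /bin/echo is found and receives "hello"
/bin/echo hello
```

This is exactly what the exec-form ENTRYPOINT does with its array elements, which is why each argument needs its own string.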