Workdir of Node.js - node.js

I have a Dockerfile for a Node.js app. I'm copying the relevant folders (which are in my repo after an npm install) and starting the Node.js server.
It seems to work fine, but what I don't understand is /usr/src/www.
I see it often on the internet. But does it matter which directory you run your server.js from?
FROM node
# Create app directory
RUN mkdir -p /usr/src/www
WORKDIR /usr/src/www
# copy
COPY node_modules /usr/src/www/node_modules
COPY gulpfile.js /usr/src/www/gulpfile.js
COPY gulp.config.js /usr/src/www/gulp.config.js
COPY server.js /usr/src/www/server.js
EXPOSE 8080
CMD [ "node", "server.js" ]

In Node.js, require() paths are resolved relative to the file that contains them, so the working directory usually matters less. It does affect process.cwd() and any relative paths passed to fs functions, though, so /usr/src/www is just a convention for a predictable app directory.

Related

What is the command needed for an API backend Dockerfile?

I am new to creating Dockerfiles and cannot figure out what command to use to start the API backend application. I know that backend applications don't use Angular, so the command to start it is not "CMD ng serve --host 0.0.0.0".
I am attaching the code of the backend Dockerfile and the errors I get when trying to run the container in Docker Desktop below.
I have looked at the Docker documentation and Node commands but cannot figure out what command makes the API backend run. What am I doing wrong?
Code:
# using Node v10
FROM node:10
# Create app directory
WORKDIR /usr/src/lafs
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
# Expose port 3000 outside container
EXPOSE 3000
# Command used to start application
CMD ng serve --host 0.0.0.0
Errors that I am receiving in Docker Desktop:
/bin/sh: 1: ng: not found
From your original screenshot, it looks like you've got a server directory. Assuming that's where your Express app lives, try something like this (note that Dockerfile comments must sit on their own lines, not after an instruction):
FROM node:16
# node 12 and older are EOL, 14 is in maintenance
WORKDIR /usr/src/lafs
# assuming 3000 is your server port
EXPOSE 3000
# copy package.json and package-lock.json
COPY server/package*.json ./
# install production dependencies only
RUN npm ci --only=production
# copy the source code
COPY server .
# start the Express server
CMD ["npm", "start"]

How to create Dockerfile for Cypress tests with Node.js server

I have my Cypress tests, but also some services that work with a remote database and, most importantly, a Node.js server that I need for sending emails in case of errors. The structure of the project looks like this:
I have created a Dockerfile, but it doesn't seem to be working:
FROM cypress/included:9.4.1
WORKDIR /e2e-test
COPY package.json /e2e
RUN npm install
COPY . .
VOLUME [ "/e2e-test" ]
CMD ["npm", "run", "tests"]
So, what I need to do is mount this folder into the Docker container and then, via npm run tests, run the local server (NOTE: I need this local server only for nodemailer, because it works only on the server side).
Also, the npm run tests script runs the server and then the tests: "tests": "npm run dev & npx cypress run"
How can I implement this?
This can be done this way:
Dockerfile:
FROM cypress/base:16.13.0
RUN mkdir /e2e-test
WORKDIR /e2e-test
COPY package.json ./
RUN npm install
COPY . .
ENTRYPOINT ["npm", "run", "tests"]
Then start it with the following command (in a Makefile, for example): docker run -it -v $PWD:/e2e-test -w /e2e-test e2e-tests
This way the Node server can run and send emails, and at the same time the project directory is available as a mounted volume.

Docker Nginx with nuxt.js and hot reload setup

Problem
Currently I can only create simple Dockerfiles and have no experience with docker-compose. I have taken over a project in which a static page was built with Nuxt.js. I want to set up a development environment with hot reload, or at least a copy step that immediately transfers changes to the nginx service.
Currently, the dist folder is only copied into the nginx/html folder when I build the container; only then can I view the page.
Question:
What can I do so that the nginx/html folder is updated (hot reload) when a Vue file is saved?
I found an interesting approach on SO (option 4): https://stackoverflow.com/a/64010631/8466673
But I don't know how to build a docker-compose setup from my single Dockerfile.
My Dockerfile:
# build project
FROM node:14 AS build
WORKDIR /app
COPY package.json ./
RUN npm install
COPY . .
RUN npm run build
# create nginx server
FROM nginx:1.19.0-alpine AS prod-stage
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
CMD [ "nginx", "-g", "daemon off;" ]

Deploying Angular Universal 9 to Google App Engine

I am not sure whether the situation has changed, but I seem to be stuck with the versions I am using.
Previously, in Angular 7, we could generate the server files for Angular Universal at the root level, so we could put node main.js in app.yaml and Google App Engine found the way to run our web application. This no longer seems possible in Angular 9.
We are using Angular SSR for our production web site. It compiles all the server files into the /dist-server folder. There is a Dockerfile to deploy it on Google App Engine:
FROM node:12-alpine as buildContainer
WORKDIR /app
COPY ./package.json ./package-lock.json /app/
RUN npm install
COPY . /app
# This creates the dist/ and dist-server/ folders in the image
RUN npm run build:ssr
FROM node:12-alpine
WORKDIR /app
COPY --from=buildContainer /app/package.json /app
COPY --from=buildContainer /app/dist /app/dist
COPY --from=buildContainer /app/dist-server /app/dist-server
EXPOSE 4000
CMD ["npm", "run", "serve:ssr"]
In package.json we have :
"serve:ssr": "node dist-server/main.js",
To start the deployment, we type gcloud app deploy in the terminal, and the process works fine. The main problem is that it takes almost 25 minutes to finish, and the main bottleneck is compilation.
I thought we could compile the repo on our local dev machine, copy only the dist/ and dist-server/ folders into the image, and add node dist-server/main.js to the Dockerfile to run our web application. But whenever I tried to copy only the dist and dist-server folders, I got the error message below:
COPY failed: stat /var/lib/docker/tmp/docker-builder{random numbers}/dist: no such file or directory
I also tried to compile main.js, the main server file for Angular Universal, at the same level as app.yaml. I assumed this is required by Google App Engine's Node.js deployment rules, since there is an example repo from Google. But I cannot compile our main.js file into the root folder; it gives the error message below:
An unhandled exception occurred: Output path MUST not be project root directory!
So I am looking for a solution that does not require Google App Engine to rebuild our repo (we can build on our dev machine and upload the compiled files to save time), making the deployment process faster.
Thanks for your help.
I found that the .dockerignore file had dist and dist-server entries in it. I removed those entries, and I can now compile and deploy the Docker image on Google App Engine.
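For context, a .dockerignore along these lines reproduces exactly that COPY failure, because ignored paths are stripped from the build context before COPY ever sees them (a hypothetical reconstruction; deleting the last two entries fixes the build):

```
node_modules
dist
dist-server
```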

files not in docker workdir azure web app

I created an image with this dockerfile:
FROM node:12
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
My workdir is /usr/src/app, and I copy my project using COPY . .
I created a container and uploaded it to an Azure web service. I searched in the Kudu bash console for the workdir, or any other file, and I can't find them. The container runs and does what it should, but the files are nowhere to be seen. How can I find them?
The first thing to know is that the Kudu bash console does not run inside your container; they are two separate environments, so it's expected that you cannot find your files there. If you want to connect to your container, enable SSH inside the container and SSH into it; then you will find the files in the working directory.
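A sketch of what enabling SSH means for an App Service custom container, following Azure's documented convention (sshd listening on port 2222, root password set to exactly "Docker!", reachable only through the platform's internal bridge); the sshd_config and init.sh files are assumed to exist alongside the Dockerfile:

```dockerfile
FROM node:12
WORKDIR /usr/src/app

# App Service's SSH bridge expects sshd on port 2222 and the
# root password "Docker!" (used for internal access only).
RUN apt-get update \
 && apt-get install -y --no-install-recommends openssh-server \
 && echo "root:Docker!" | chpasswd
COPY sshd_config /etc/ssh/sshd_config

COPY package*.json ./
RUN npm install
COPY . .

EXPOSE 3000 2222

# init.sh (assumed) starts sshd and then runs "npm start".
COPY init.sh ./
CMD [ "sh", "init.sh" ]
```

With SSH enabled, the SSH tab in the Azure portal (or az webapp ssh) drops you into the container itself, where /usr/src/app and its files are visible.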
