I have a Node.js Express app that I want to dockerize. For that I created a Dockerfile:
FROM node:18 AS server
ENV NODE_ENV=production
WORKDIR /app
COPY package*.json /
RUN npm ci
COPY . .
I also have a .dockerignore file:
node_modules/
client/node_modules/
Dockerfile
docker-compose.yml
.git
.gitignore
.dockerignore
.env
Everything is run with the help of a docker-compose.yml:
version: '3.8'
services:
  app:
    container_name: my-app
    image: my-org/my-app
    build:
      context: .
      dockerfile: Dockerfile
    command: node index.js
    ports:
      - "3030:3030"
    environment:
      HELLO: world
    env_file:
      - .env
When I run the Dockerfile commands in this order, COPY . . seems to remove the node_modules from the image that were created by the npm ci that runs beforehand. If I instead run COPY . . first and then npm ci, node_modules stays in the image.
My question is: is it better to run npm ci before COPY . ., and if the answer is yes, how can I make node_modules stay?
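(For reference, the conventional cache-friendly ordering is sketched below, assuming index.js is the entrypoint from the compose file. Note that package*.json is copied into the WORKDIR rather than /, so npm ci and the runtime see the same directory, and because node_modules is listed in .dockerignore, COPY . . cannot overwrite the installed modules.)
FROM node:18 AS server
ENV NODE_ENV=production
WORKDIR /app
# Copy only the manifests first so the npm ci layer stays cached
# until package*.json changes
COPY package*.json ./
RUN npm ci
# node_modules is in .dockerignore, so this COPY leaves the
# freshly installed modules in place
COPY . .
CMD ["node", "index.js"]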
I had a similar problem and used a "build container" for that.
You ignored the node_modules folder in your .dockerignore; therefore it does not get copied into your image.
# The instructions for the first stage
FROM node:16-alpine AS builder
ARG NODE_ENV=production
ENV NODE_ENV=${NODE_ENV}
# build tools needed for native addons
RUN apk --no-cache add python3 make g++
WORKDIR /build
COPY ./package*.json ./
RUN npm install
# ------------------------------------
# The instructions for second stage
FROM node:16-alpine
WORKDIR /opt/OpenHaus/backend
# copy the dependencies built in the first stage
COPY --from=builder /build/node_modules node_modules
RUN apk --no-cache add openssl
COPY . ./
#COPY ./package.json ./
#ENV HTTP_PORT=8080
ENV NODE_ENV=production
#ENV DB_HOST=10.0.0.1
#EXPOSE 8080
CMD ["node", "index.js"]
With that, the dependencies are built in the build container and copied from there into your Node.js image.
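A quick way to check that the modules actually land in the final image (the my-app tag is just illustrative):
docker build -t my-app .
# list a few of the installed packages from inside the image
docker run --rm my-app ls node_modules | head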
More to read:
https://medium.com/@kahana.hagai/docker-compose-with-node-js-and-mongodb-dbdadab5ce0a
https://dev.to/alex_barashkov/using-docker-for-nodejs-in-development-and-production-3cgp
My Angular app runs fine locally but I haven't figured out how to do the same with a Docker image. Outside of Docker, the UI runs on port 4200 with ng serve and the API serves data on port 8080 with node server.js.
My Dockerfile is set up so it can get the Node server running and available on 8080, but the Angular UI won't run. I've tried several options but right now I have:
FROM node:14.17.3
COPY package*.json ./
EXPOSE 4200 8080
RUN npm install -g @angular/cli
RUN npm install --only=production
COPY . ./
RUN ng serve
CMD ["node", "server.js"]
It fails on ng serve with the error: The serve command requires to be run in an Angular project, but a project definition could not be found. I do have an angular.json file in the root. I'm not sure what I am missing. I read that ng serve shouldn't be used in this situation but the alternatives I've seen haven't made a difference.
EDIT 8/10/21: Based on the answers here and a bunch of research, this will display the UI with nginx:
FROM node:12.16.1-alpine as build
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
# RUN npm install -g @angular/cli
# RUN npm run build --prod
FROM nginx:1.15.8-alpine
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
# CMD ["node", "server.js"]
However, the npm run build step fails because ng is not found despite installing @angular/cli. I have to run this manually to build the dist folder. And I can't run node server.js alongside this. It seems I can only get the front end or back end, not both.
Use the command below at the end to run ng serve with host 0.0.0.0, which means it listens on all interfaces.
CMD ["ng","serve","--host", "0.0.0.0"]
But I would suggest using nginx.
Steps to follow:
Create a Dockerfile in the root of your project and add the code below. It takes care of downloading dependencies, building the Angular project, and deploying it to an nginx server.
#Download Node Alpine image
FROM node:12.16.1-alpine AS build
#Setup the working directory
WORKDIR /usr/src/ng-app
#Copy package.json
COPY package.json package-lock.json ./
#Install dependencies
RUN npm install
#Copy other files and folder to working directory
COPY . .
#Build Angular application in PROD mode
RUN npm run build
#Download NGINX Image
FROM nginx:1.15.8-alpine
#Copy built angular files to NGINX HTML folder
COPY --from=build /usr/src/ng-app/dist/pokemon-app/ /usr/share/nginx/html
Build docker image:
docker build -t my-ng-app .
Spinning up the container with the command below exposes your app on host port 3000 (nginx listens on port 80 inside the container):
docker run -dp 3000:80 my-ng-app
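To sanity-check that nginx is answering on the mapped host port (3000, per the run command above):
curl -I http://localhost:3000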
Check out my article on this - https://askudhay.com/how-to-dockerize-an-angular-application, and please let me know if you still have any questions.
I figured out a solution that will run the full application. Most answers here focus on running the front end (the nginx suggestion was helpful). It seemed a Docker container could enable the UI or server but not both. I came across Docker Compose, which will run the front and back ends in separate images. My solution:
Dockerfile.ui
# Define node version
FROM node:12.16.1-alpine as build
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, including dev dependencies for @angular-devkit
RUN npm ci
# Run npm install @angular/cli
RUN npm install -g @angular/cli
# Copy all files
COPY . .
# Run ng build through npm to create dist folder
RUN npm run build --prod
# Define nginx for front-end server
FROM nginx:1.15.8-alpine
# Copy dist from ng build to nginx html folder
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
Dockerfile.server
# Define node version
FROM node:12.16.1-alpine
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, prod dependencies only
RUN npm ci --only=production
# Copy all files
COPY . .
# Expose port 8080 for server
EXPOSE 8080
# Run "node server/run.js"
CMD ["node", "server/run.js"]
docker-compose.yml
version: '3'
services:
  server:
    build:
      context: ./
      dockerfile: Dockerfile.server
    container_name: server
    ports:
      - 8080:8080
  ui:
    build:
      context: ./
      dockerfile: Dockerfile.ui
    container_name: ui
    ports:
      - 4200:80
    links:
      - server
docker-compose up will build the images for the server and UI and deploy them concurrently. I also resolved the ng not found errors by installing dev dependencies, particularly @angular-devkit/build-angular.
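When iterating, adding --build forces both images to be rebuilt before the containers start; a minimal invocation:
docker-compose up --build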
This tutorial helped me figure out Docker Compose: https://wkrzywiec.medium.com/how-to-run-database-backend-and-frontend-in-a-single-click-with-docker-compose-4bcda66f6de
I think updating this line
COPY . ./
to
COPY . ./app
should solve that error. It appears that the node "volume" lives in that folder.
Otherwise, setting the WORKDIR also seems like a solution:
FROM node:14
WORKDIR /usr/src/app
COPY package*.json ./
...
Source: https://nodejs.org/en/docs/guides/nodejs-docker-webapp/
I'm using the new SvelteKit framework with the node adapter, and I have a problem with undefined environment variables when using the process.env.APPLICATION_KEY_ID syntax in an endpoint in a production build.
When I use console.log(process.env), I get a list of all variables, including my APPLICATION_KEY_ID:
ALLUSERSPROFILE: 'C:\\ProgramData',
APPDATA: 'C:\\Users\\user\\AppData\\Roaming',
APPLICATION_KEY_ID: 'test',
But when I use console.log(process.env.APPLICATION_KEY_ID), I get undefined.
Can someone give me a hint about what I'm doing wrong?
I'm running the app in Kubernetes; this is my Dockerfile for building the image:
# build the sapper app
FROM mhart/alpine-node:14 AS build
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
# install dependencies
FROM mhart/alpine-node:14 AS deps
WORKDIR /app
COPY package.json .
COPY --from=build /app/package-lock.json package-lock.json
RUN npm ci --prod
COPY --from=build /app/build build
COPY --from=build /app/node_modules node_modules
# copy node_modules/ and other build files over
FROM mhart/alpine-node:slim-14
WORKDIR /app
COPY --from=deps /app .
EXPOSE 3000
ENV HOST=0.0.0.0
CMD ["node", "build"]
SvelteKit uses Vite as its bundler. It is probably best to stick to how Vite deals with environment variables; that is, all env variables prefixed with VITE_ will be available in your code via import.meta.env.VITE_xxx.
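As a sketch, assuming the variable is renamed with the VITE_ prefix (VITE_APPLICATION_KEY_ID here is the assumed rename of APPLICATION_KEY_ID):
// endpoint snippet: Vite statically replaces this exact property access
// at build time, which is why a dynamic process.env lookup can come back undefined
const keyId = import.meta.env.VITE_APPLICATION_KEY_ID;
console.log(keyId);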
I'm using the node:latest image and get:
ModuleBuildError: Module build failed: ModuleBuildError: Module build failed: Error: spawn /hobover_web_client/node_modules/pngquant-bin/vendor/pngquant ENOENT.
Dockerfile
FROM node:latest
# set working directory
RUN mkdir -p /hobover_web_client
WORKDIR /hobover_web_client
ENV NPM_CONFIG_LOGLEVEL=warn
COPY package.json yarn.lock /hobover_web_client/
# install app dependencies
RUN rm -rf node_modules/ && yarn install --ignore-scripts && yarn global add babel babel-cli webpack nodemon pngquant optipng recjpeg
ADD . /hobover_web_client
In docker-compose.yml
version: '2'
services:
  hobover_web_client:
    container_name: hobover_web_client
    build: ./hobover_web_client
    command: yarn start
    ports:
      - "8080:8080"
    volumes:
      - ./hobover_web_client:/hobover_web_client
      - /hobover_web_client/node_modules
The build works successfully, but up causes an error.
How can I fix this? Without Docker it works.
Your issue is the mount of the app and node_modules into the same directory. When you use the below in docker-compose:
- ./hobover_web_client:/hobover_web_client
you are overshadowing the existing node_modules, so you need to use NODE_PATH to relocate your packages. Change your Dockerfile to the below:
FROM node:latest
# set working directory
RUN mkdir -p /hobover_web_client /node_modules
WORKDIR /hobover_web_client
ENV NPM_CONFIG_LOGLEVEL=warn NODE_PATH=/node_modules
COPY package.json yarn.lock /hobover_web_client/
# install app dependencies
RUN yarn install --ignore-scripts && yarn global add babel babel-cli webpack nodemon pngquant optipng recjpeg
ADD . /hobover_web_client
Change your compose to the below:
version: '2'
services:
  hobover_web_client:
    container_name: hobover_web_client
    build: ./hobover_web_client
    command: yarn start
    ports:
      - "8080:8080"
    volumes:
      - ./hobover_web_client:/hobover_web_client
      - /node_modules
Now /node_modules goes to an anonymous volume. You don't strictly need that volume and can remove it, because the path is in a different folder from the bind mount.
I am using Docker with Docker Compose and these are my files:
#DOCKERFILE
FROM mhart/alpine-node
# Create app directory
RUN mkdir -p /home/app
# Bundle app source
COPY . /home/app
# From now on we work in /home/app
WORKDIR /home/app
# Install yarn and node modules
RUN echo -e 'http://dl-cdn.alpinelinux.org/alpine/edge/main\nhttp://dl-cdn.alpinelinux.org/alpine/edge/community\nhttp://dl-cdn.alpinelinux.org/alpine/edge/testing' > /etc/apk/repositories \
    && apk add --no-cache yarn \
    && yarn
EXPOSE 8080
This is the docker-compose file for dev:
app:
  build: .
  command: yarn start:dev
  environment:
    NODE_ENV: development
  ports:
    - '8080:8080'
  volumes:
    - .:/home/app
    - /home/app/node_modules
The problem I am having is that this setup seems to work just once: no matter which new module I add to package.json, when I run docker-compose build it will not install the new package.
The reason I am using the volumes is that nodemon would not work without .:/home/app, but if the node modules are not installed on the host it will fail, which is why I need /home/app/node_modules. I suspect this could be the cause of my error, but I am not sure how to circumvent it.
I solved this by moving my src code inside an src directory.
This means my docker-compose.yml file now looks like this:
app:
  build: .
  command: yarn start:dev
  environment:
    NODE_ENV: development
  ports:
    - '8080:8080'
  volumes:
    - ./src:/home/app/src
Since I am not mounting the whole dir with the node_modules, new ones seem to be installed correctly.
package.json should be copied into the app directory and npm install should be invoked in the Dockerfile before the bundle COPY line.
#DOCKERFILE
FROM mhart/alpine-node
# Create app directory
RUN mkdir -p /home/app
WORKDIR /home/app
# Install app dependencies
COPY package.json /home/app
RUN npm install
# Bundle app source
COPY . /home/app
# Install yarn and node modules
RUN echo -e 'http://dl-cdn.alpinelinux.org/alpine/edge/main\nhttp://dl-cdn.alpinelinux.org/alpine/edge/community\nhttp://dl-cdn.alpinelinux.org/alpine/edge/testing' > /etc/apk/repositories \
    && apk add --no-cache yarn \
    && yarn
EXPOSE 8080
If any new dependency is registered in package.json, it will be installed when the docker build command is invoked, because the changed package.json invalidates that COPY layer and every layer after it.
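A minimal sketch of that workflow, using the app service name from the compose file above:
# editing package.json changes the COPY package.json layer, so this
# rebuild re-runs npm install and every later instruction
docker-compose build app
docker-compose up -d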
I have just started learning docker-compose and I am using a Node.js image. I want to install gulp to create some tasks and have one of them working in the background.
When I run: docker-compose run --rm -d server gulp watch-less
I get this error: ERROR: oci runtime error: container_linux.go:247: starting container process caused "exec: \"gulp\": executable file not found in $PATH"
Here are my file:
# Dockerfile
FROM node:6.10.2
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app
RUN npm install --quiet
COPY . /usr/src/app
CMD ["npm", "start"]
# docker-compose.yml
version: "2"
services:
server:
build: .
ports:
- "5000:5000"
volumes:
- ./:/usr/src/app
I also have a .dockerignore to ignore the node_modules folder and the npm-debug.log
EDIT:
When I run docker-compose run --rm server npm install package-name I don't have any problem and the package is installed.
Try adding a global gulp install in the Dockerfile:
# Dockerfile
FROM node:6.10.2
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
RUN npm install -g gulp
COPY package.json /usr/src/app
RUN npm install --quiet
COPY . /usr/src/app
CMD ["npm", "start"]
I have found a solution that works, though maybe it is not the best one. I created a script in package.json that runs a gulp task, and if I run:
docker-compose run --rm server npm run gulp_task
it works and does what it has to do.
I find that just referencing gulp via
./node_modules/.bin/gulp
instead of directly works fine; ./node_modules/.bin isn't on the PATH by default. Another option would be to add that directory to the PATH.
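As a sketch of that PATH option, a single Dockerfile line (the path assumes the WORKDIR /usr/src/app from the question):
# prepend the project's local binaries so `gulp` resolves without a global install
ENV PATH=/usr/src/app/node_modules/.bin:$PATH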