How to write Dockerfile to serve Angular app and Node server - node.js

My Angular app runs fine locally but I haven't figured out how to do the same with a Docker image. Outside of Docker, the UI runs on port 4200 with ng serve and the API serves data from 8080 with node server.js.
My Dockerfile is set up so it can get the Node server running and available on 8080, but the Angular UI won't run. I've tried several options but right now I have:
FROM node:14.17.3
COPY package*.json ./
EXPOSE 4200 8080
RUN npm install -g @angular/cli
RUN npm install --only=production
COPY . ./
RUN ng serve
CMD ["node", "server.js"]
It fails on ng serve with the error: The serve command requires to be run in an Angular project, but a project definition could not be found. I do have an angular.json file in the root. I'm not sure what I am missing. I read that ng serve shouldn't be used in this situation but the alternatives I've seen haven't made a difference.
EDIT 8/10/21: Based on the answers here and a bunch of research, this will display the UI with nginx:
FROM node:12.16.1-alpine as build
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
# RUN npm install -g @angular/cli
# RUN npm run build --prod
FROM nginx:1.15.8-alpine
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
# CMD ["node", "server.js"]
However, the npm run build step fails because ng is not found despite installing @angular/cli. I have to run that step manually to build the dist folder. And I can't run node server.js alongside this. It seems I can only get the front end or the back end, not both.

Use the command below at the end to run ng serve with host 0.0.0.0, which means it listens on all interfaces.
CMD ["ng","serve","--host", "0.0.0.0"]
But I would suggest using nginx.
Steps to follow:
Create a Dockerfile in the root of your project and add the code below. It takes care of downloading dependencies, building the Angular project, and deploying it to an nginx server.
#Download Node Alpine image
FROM node:12.16.1-alpine AS build
#Setup the working directory
WORKDIR /usr/src/ng-app
#Copy package.json
COPY package.json package-lock.json ./
#Install dependencies
RUN npm install
#Copy other files and folder to working directory
COPY . .
#Build Angular application in PROD mode
RUN npm run build
#Download NGINX Image
FROM nginx:1.15.8-alpine
#Copy built angular files to NGINX HTML folder
COPY --from=build /usr/src/ng-app/dist/pokemon-app/ /usr/share/nginx/html
Build docker image:
docker build -t my-ng-app .
Spinning up the Docker container with the command below exposes your app on host port 3000 (mapped to port 80 inside the container):
docker run -dp 3000:80 my-ng-app
Check out my article on this - https://askudhay.com/how-to-dockerize-an-angular-application, and please let me know if you still have any questions.

I figured out a solution that will run the full application. Most answers here focus on running the front end (the nginx suggestion was helpful). It seemed a single Docker container could run the UI or the server but not both. I came across Docker Compose, which runs the front and back ends as separate images. My solution:
Dockerfile.ui
# Define node version
FROM node:12.16.1-alpine as build
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, including dev dependencies for @angular-devkit
RUN npm ci
# Run npm install @angular/cli
RUN npm install -g @angular/cli
# Copy all files
COPY . .
# Run ng build through npm to create dist folder
RUN npm run build --prod
# Define nginx for front-end server
FROM nginx:1.15.8-alpine
# Copy dist from ng build to nginx html folder
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
Dockerfile.server
# Define node version
FROM node:12.16.1-alpine
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, prod dependencies only
RUN npm ci --only=production
# Copy all files
COPY . .
# Expose port 8080 for server
EXPOSE 8080
# Run "node server/run.js"
CMD ["node", "server/run.js"]
docker-compose.yml
version: '3'
services:
  server:
    build:
      context: ./
      dockerfile: Dockerfile.server
    container_name: server
    ports:
      - 8080:8080
  ui:
    build:
      context: ./
      dockerfile: Dockerfile.ui
    container_name: ui
    ports:
      - 4200:80
    links:
      - server
docker-compose up builds images for both the server and the UI and deploys them concurrently. I also resolved the ng not found errors by installing dev dependencies, particularly @angular-devkit/build-angular.
This tutorial helped me figure out Docker Compose: https://wkrzywiec.medium.com/how-to-run-database-backend-and-frontend-in-a-single-click-with-docker-compose-4bcda66f6de

I think updating this line
COPY . ./
with
COPY . ./app
should solve that error. It appears that the node "volume" is in that folder.
Otherwise setting the workdir also seems like a solution:
FROM node:14
WORKDIR /usr/src/app
COPY package*.json ./
...
Source: https://nodejs.org/en/docs/guides/nodejs-docker-webapp/

Related

SvelteKit change port to different one

I want to change the port in the production build of node-adapter. I use port 3000 internally for a different service, so I need to change the Svelte app to use port 3001.
My Dockerfile:
# Container setup ---------
FROM node:16-alpine as build
RUN mkdir /appbuild
COPY . /appbuild
WORKDIR /appbuild
# clean install all dependencies
RUN npm ci
# remove potential security issues
RUN npm audit fix
RUN npm run build
# Container setup ---------
FROM node:16-alpine
WORKDIR /app
# copy dependency list
COPY --from=build /appbuild/package*.json ./
# clean install dependencies, no devDependencies, no prepare script
RUN npm ci --production --ignore-scripts
# remove potential security issues
RUN npm audit fix
# copy built SvelteKit app to /app
COPY --from=build /appbuild/build ./
CMD ["node", "./index.js"]
I found that if I start the app with PORT=3001 node build from the root (after calling npm run build) it works, but in the Dockerfile a CMD must be provided and PORT cannot be passed that way.
How to change default port to another one?
Have you tried using ENV?
ENV PORT=3001
CMD ["node", "./index.js"]
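Alternatively, since ENV values can be overridden at run time, the port can also be supplied when starting the container instead of baking it into the image (my-sveltekit-app is a hypothetical image name):

```shell
# Override PORT at run time; the adapter-node build reads it on startup
docker run -d -e PORT=3001 -p 3001:3001 my-sveltekit-app
```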

cannot replace to directory /var/lib/docker/overlay2/if2ip5okvavl8u6jpdtpczuog/merged/app/node_modules/@ampproject/remapping with file

On my Windows machine, I am attempting to build a containerized node.js application with the following Dockerfile:
# use latest version of nodejs
FROM node:lts-alpine
# install aurelia-cli to build the app & http-server to serve static contents
RUN npm i -g http-server
RUN npm i -g aurelia-cli
# set working directory to app
# henceforth all commands will run inside this folder
WORKDIR /app
# copy package.json related files first and install all required dependencies
COPY package*.json ./
RUN npm install
# copy the rest of the files and folders & install dependencies
COPY . ./
RUN npm run build
# by default http-server will serve contents on port 8080
# so we expose this port to host machine
EXPOSE 8080
CMD [ "http-server" , "dist" ]
However, docker build . fails at the line COPY . ./ with the message cannot replace to directory /var/lib/docker/overlay2/if2ip5okvavl8u6jpdtpczuog/merged/app/node_modules/@ampproject/remapping with file.
What do I need to do to get my container image to build?
Add node_modules to a .dockerignore file in the same directory as your Dockerfile (h/t David Maze).
Less gracefully, simply delete the project's node_modules directory then rerun docker build.
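For reference, a minimal .dockerignore for a Node project might look like this (the entries beyond node_modules are common additions, not requirements):

```
node_modules
dist
.git
npm-debug.log
```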

Docker Build CLI Doesn’t update the code (TS, node)

When I run docker build on my codebase, it doesn’t apply the file modifications. I’m using TypeScript with Node as the language/framework. Here is my Dockerfile:
#Imports the node runtime/os base image
FROM node:14
#like cd into the working directory
WORKDIR /usr/src/app
ENV PORT 8080
#copies the package.json from the local dev machine and pastes it in the ./ directory on google cloud run
COPY package*.json ./
#runs on the cloud run instance terminal
RUN npm install --only=production
#copy the actual code from the . directory (root) and places it in the cloud run root.
COPY . .
EXPOSE 8080
RUN rm -rf src
#Start the service on instance startup
CMD ["npm", "start"]
The issue was that the TS code was not getting compiled into JS code. After explicitly running the compiler and checking the TS config, the problem was resolved.
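For illustration, here is a sketch of how the compile step could be baked into the image. It assumes package.json has a build script that runs tsc and that npm start runs the emitted JS (both assumptions, not stated in the original post):

```dockerfile
FROM node:14
WORKDIR /usr/src/app
ENV PORT 8080
COPY package*.json ./
# install devDependencies too, so the TypeScript compiler is available
RUN npm install
COPY . .
# compile TS -> JS (assumes "build": "tsc" in package.json)
RUN npm run build
# drop dev dependencies now that compilation is done
RUN npm prune --production
EXPOSE 8080
CMD ["npm", "start"]
```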

Docker app won't work but application works if not in Docker

I have built a React and Node app and it works. I am trying to build a Docker image and run it; it compiles, but when I try to access it through the browser it says This site can't be reached: localhost refused to connect. This is the Dockerfile that I've written, because I think this is the problem:
# pull official base image
FROM node:13.12.0-alpine
# set working directory
WORKDIR /client
# add `/app/node_modules/.bin` to $PATH
ENV PATH /client/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY . ./
# start app
CMD ["npm", "start"]
FROM node:12
# Create app directory
WORKDIR ./
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 5000
CMD [ "node", "index.js" ]
First make sure you can reach your server - your app may be running locally, but it doesn't use the same server as the one from your Dockerfile, e.g. npm start starts a local dev server.
Remove everything related to the React app stage from your Dockerfile and just make sure you can hit your server on port 5000 and that it serves pages - you can host an index.html with <body>Hello world</body> or something. When you're able to do that, it's just a matter of adding the React app's bundled files to the server's static (public) folder.
Is your node server listening on port 5000 ?
Typically with node js server you have to set a PORT and HOST env variables that are used by the server like process.env.PORT and process.env.HOST
Other issues
# start app
CMD ["npm", "start"]
npm start is (usually) used to start a local server while you develop your react app
Instead of starting the app you should run a bundle command that will produce static files to be hosted - js html etc...
Typically npm run build
This should make a folder ./dist or ./build with all the static content you should put on the server (perhaps that's what your /client folder is for?)
Some tweaks
# pull official base image
FROM node:13.12.0-alpine AS builder
# set working directory
WORKDIR /client
# Copy source
COPY . .
# add `/app/node_modules/.bin` to $PATH (you probably don't need this)
ENV PATH /client/node_modules/.bin:$PATH
# install app dependencies
RUN npm install --silent
# you can move react-scripts to package.json dependencies instead
RUN npm install react-scripts@3.4.1 -g --silent
# Build bundle
RUN npm run build
# Next stage
FROM node:12
# Create app directory
WORKDIR ./
# Copy source
COPY . .
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Copy js bundle
COPY --from=builder /client/dist ./public
EXPOSE 5000
CMD [ "node", "index.js" ]
This might need a little more tweaking but that's the general idea
You don't need to install react-scripts globally - it can just be a local dependency. It's used by npm start and npm run build to build your app, so you depend on it; it should be part of package.json dependencies.
Adding /client/node_modules/.bin to PATH shouldn't be needed - maybe it's a leftover, or a tell that something else isn't properly set up.

Docker container install -g copy to other container

I am installing Express in a container.
I need the smallest possible development container, so I am using a full container to install the dependencies:
# container to install dependencies
FROM node:10 as installer
WORKDIR /src
COPY package.json package-lock.json ./
# install all dependencies
RUN npm install
# i need nodemon as global
RUN npm install -g nodemon
# working container
FROM node:10-alpine
RUN mkdir /src
WORKDIR /src
# copy everything (node_modules)
COPY --from=installer /src .
COPY . .
EXPOSE 3000
CMD ["nodemon", "start"]
I copied all the node_modules, but how can I copy the globally installed (-g) nodemon?
How can I remove the "installer" container after the installation process?
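One option (a sketch, not from the question) is to skip copying the global install and simply reinstall nodemon in the final stage; alternatively, nodemon can be made a regular devDependency and invoked via npx:

```dockerfile
# working container
FROM node:10-alpine
WORKDIR /src
# local dependencies from the installer stage
COPY --from=installer /src/node_modules ./node_modules
# reinstall nodemon globally in this stage instead of copying it across
RUN npm install -g nodemon
COPY . .
EXPOSE 3000
CMD ["nodemon", "start"]
```

As for removing the "installer" stage: it never runs as a container - it only leaves cached intermediate layers behind, which docker image prune -f (or docker builder prune with BuildKit) can clean up.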
