Dockerizing any Angular SPA app that runs in Node (node.js)

There are a lot of Angular 2+ templates (for example http://coreui.io/) that run fine in Node, simply by:
npm install
npm start
However, making them run in a Docker container is a challenge. I tried the standard approach to creating a Dockerfile, but it doesn't work.
Shouldn't there be a simple way of dockerizing any app that runs on Node?
What am I missing?
This is what my Dockerfile looks like (generated by yo docker):
FROM node:latest
WORKDIR /src
EXPOSE 4200
ENTRYPOINT ["npm", "start"]
COPY . /src
RUN npm install

First thing I would suggest: don't use node:latest. What version of Node do you normally run? Specify that version.
You probably need to create the /src folder,
and you should also change ENTRYPOINT to CMD:
FROM node:7.9
RUN mkdir /src
WORKDIR /src
# cache the node modules for faster re-builds
COPY ./package.json /src
RUN npm install
COPY . /src
EXPOSE 4200
CMD ["npm", "start"]
This should cover the most basic of Node apps.
Things do get complex quickly, though. If you need more info on building out a proper Dockerfile, I have a full course on Docker with Node.js: https://sub.watchmecode.net/guides/build-node-apps-in-docker/
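One small addition worth making alongside a Dockerfile like that (my suggestion, not part of the answer above) is a .dockerignore file, so a node_modules folder installed on your machine never gets copied over the freshly installed one in the image and the build context stays small:
# .dockerignore -- adjust to your project layout
node_modules
dist
.git
npm-debug.log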

Related

Docker Build CLI Doesn’t update the code (TS, node)

When I run docker build on my codebase, it doesn’t apply the file modifications. I’m using TypeScript with Node as the language/framework. Here is my Dockerfile:
#Imports the node runtime/os base image
FROM node:14
#like cd into the working directory
WORKDIR /usr/src/app
ENV PORT 8080
#copies the package.json from the local dev machine and pastes it in the ./ directory on google cloud run
COPY package*.json ./
#runs on the cloud run instance terminal
RUN npm install --only=production
#copy the actual code from the . directory (root) and places it in the cloud run root.
COPY . .
EXPOSE 8080
RUN rm -rf src
#Start the service on instance startup
CMD ["npm", "start"]
The issue was that the TS code was not getting compiled into JS. After explicitly running the compiler and checking the TS config, the problem was resolved.
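For illustration, a Dockerfile that compiles the TypeScript inside the image could look roughly like this. It assumes package.json has a build script that runs tsc and that the start script points at the compiled output; both are assumptions, not details taken from the question:
FROM node:14
WORKDIR /usr/src/app
ENV PORT 8080
COPY package*.json ./
# install dev dependencies too, so the TypeScript compiler is available at build time
RUN npm install
COPY . .
# compile TypeScript to JavaScript (assumes a "build": "tsc" script in package.json)
RUN npm run build
EXPOSE 8080
# assumes "start" runs the compiled output, e.g. "node dist/index.js"
CMD ["npm", "start"]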

How to write Dockerfile to serve Angular app and Node server

My Angular app runs fine locally but I haven't figured out how to do the same with a Docker image. Outside of Docker, the UI runs on port 4200 with ng serve and the API serves data from 8080 with node server.js.
My Dockerfile is set up so it can get the Node server running and available on 8080, but the Angular UI won't run. I've tried several options but right now I have:
FROM node:14.17.3
COPY package*.json ./
EXPOSE 4200 8080
RUN npm install -g @angular/cli
RUN npm install --only=production
COPY . ./
RUN ng serve
CMD ["node", "server.js"]
It fails on ng serve with the error: The serve command requires to be run in an Angular project, but a project definition could not be found. I do have an angular.json file in the root. I'm not sure what I am missing. I read that ng serve shouldn't be used in this situation but the alternatives I've seen haven't made a difference.
EDIT 8/10/21: Based on the answers here and a bunch of research, this will display the UI with nginx:
FROM node:12.16.1-alpine as build
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
# RUN npm install -g @angular/cli
# RUN npm run build --prod
FROM nginx:1.15.8-alpine
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
# CMD ["node", "server.js"]
However, the npm run build step fails because ng is not found despite installing @angular/cli. I have to run this manually to build the dist folder. And I can't run node server.js alongside this. It seems I can only get the front end or the back end, not both.
Use the command below at the end to run ng serve with host 0.0.0.0, which means it listens on all interfaces.
CMD ["ng","serve","--host", "0.0.0.0"]
But I would suggest using nginx.
Steps to follow:
Create a Dockerfile under the root of your project and add the code below. It takes care of downloading dependencies, building the Angular project, and deploying it to an nginx server.
#Download Node Alpine image
FROM node:12.16.1-alpine As build
#Setup the working directory
WORKDIR /usr/src/ng-app
#Copy package.json
COPY package.json package-lock.json ./
#Install dependencies
RUN npm install
#Copy other files and folder to working directory
COPY . .
#Build Angular application in PROD mode
RUN npm run build
#Download NGINX Image
FROM nginx:1.15.8-alpine
#Copy built angular files to NGINX HTML folder
COPY --from=build /usr/src/ng-app/dist/pokemon-app/ /usr/share/nginx/html
Build docker image:
docker build -t my-ng-app .
Spinning up the Docker container with the command below maps host port 3000 to container port 80, so the app will be available at http://localhost:3000:
docker run -dp 3000:80 my-ng-app
Check out my article on this - https://askudhay.com/how-to-dockerize-an-angular-application, and please let me know if you still have any questions.
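If the Angular app uses client-side routing, you will likely also want a small nginx config so deep links fall back to index.html. A minimal sketch (the file name nginx/default.conf and the COPY line are my assumptions, not part of the answer above):
# nginx/default.conf
server {
  listen 80;
  root /usr/share/nginx/html;
  index index.html;
  location / {
    # let Angular's router handle unknown paths instead of returning 404
    try_files $uri $uri/ /index.html;
  }
}
Copy it into the nginx stage with: COPY nginx/default.conf /etc/nginx/conf.d/default.conf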
I figured out a solution that will run the full application. Most answers here focus on running the front end (the nginx suggestion was helpful). It seemed a Docker container could enable the UI or server but not both. I came across Docker Compose, which will run the front and back ends in separate images. My solution:
Dockerfile.ui
# Define node version
FROM node:12.16.1-alpine as build
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, including dev dependencies for @angular-devkit
RUN npm ci
# Run npm install @angular/cli
RUN npm install -g @angular/cli
# Copy all files
COPY . .
# Run ng build through npm to create dist folder
RUN npm run build --prod
# Define nginx for front-end server
FROM nginx:1.15.8-alpine
# Copy dist from ng build to nginx html folder
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
Dockerfile.server
# Define node version
FROM node:12.16.1-alpine
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, prod dependencies only
RUN npm ci --only=production
# Copy all files
COPY . .
# Expose port 8080 for server
EXPOSE 8080
# Run "node server/run.js"
CMD ["node", "server/run.js"]
docker-compose.yml
version: '3'
services:
  server:
    build:
      context: ./
      dockerfile: Dockerfile.server
    container_name: server
    ports:
      - 8080:8080
  ui:
    build:
      context: ./
      dockerfile: Dockerfile.ui
    container_name: ui
    ports:
      - 4200:80
    links:
      - server
docker-compose up will build an image for the server and the UI and deploy them concurrently. I also resolved the ng not found errors by installing dev dependencies, particularly @angular-devkit/build-angular.
This tutorial helped me figure out Docker Compose: https://wkrzywiec.medium.com/how-to-run-database-backend-and-frontend-in-a-single-click-with-docker-compose-4bcda66f6de
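If the browser-served UI needs to reach the API on the same origin, one option (an assumption on my part, not something the setup above configures) is to let the UI's nginx proxy an /api prefix to the server service defined in docker-compose.yml. A sketch of the location block to add inside the server block of the UI's nginx config:
location /api/ {
  # "server" resolves to the API container on the compose network
  proxy_pass http://server:8080/;
  proxy_set_header Host $host;
}
Whether your API actually expects an /api prefix depends on your routes, so adjust accordingly.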
I think updating this line
COPY . ./
with
COPY . ./app
should solve that error. It appears that the node "volume" is in that folder.
Otherwise setting the workdir also seems like a solution:
FROM node:14
WORKDIR /usr/src/app
COPY package*.json ./
...
Source: https://nodejs.org/en/docs/guides/nodejs-docker-webapp/

Dockerfile for Node.js with Python deploying to AWS Elastic Beanstalk

I want to be able to run my app on the web. I am under the impression that, as long as I use Docker on EB, everything should run similarly to localhost, as long as all processes are defined in the Dockerfile. I like to use AWS Elastic Beanstalk; I am very new to this, and EB with Docker seems to be very easy to get going and maintain. So far I've got the Node portion going. I just made a zip file and uploaded/deployed it on EB. But Python calls don't work for 3rd-party libraries, i.e. I call a .py file from a route but it returns an error because the import didn't work. It's my understanding that it's possible to have a multi-stage Docker environment, e.g. https://hub.docker.com/r/nikolaik/python-nodejs/. I understand the general premise but can't figure out how to adapt it for my case.
I tried to add the Python portion to the Dockerfile and load the necessary libraries from requirements.txt. But now I can't deploy on AWS EB.
Here is my docker file:
FROM python:3.7 as pyth
RUN mkdir /project
WORKDIR /project
COPY requirements.txt /project/requirements.txt
RUN pip install -r requirements.txt
COPY . /project/
FROM node:8-alpine
WORKDIR /opt/app
COPY package.json package-lock.json* ./
RUN npm cache clean --force && npm install
COPY . /opt/app
ENV PORT 80
EXPOSE 80
COPY --from=pyth /project /opt/app
CMD [ "npm", "start" ]
Any help is greatly appreciated.
There are existing images you can use that already contain both dependencies. See https://hub.docker.com/r/nikolaik/python-nodejs/
Here is an untested example of how you can use it:
FROM nikolaik/python-nodejs:python3.7-nodejs8
RUN mkdir /project
WORKDIR /project
COPY requirements.txt /project/requirements.txt
RUN pip install -r requirements.txt
RUN mkdir /opt/app
WORKDIR /opt/app
COPY package.json package-lock.json ./
RUN npm cache clean --force && npm install
COPY . /opt/app
ENV PORT 80
EXPOSE 80
CMD [ "npm", "start" ]
Note that you don't need a multistage Dockerfile.
If you want to go further and build your own image, take a look at this Dockerfile that is used to build the image in the example I gave.
Hope it helps
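Before zipping this up for Elastic Beanstalk, it may be worth a quick local sanity check (the image name python-node-app is just a placeholder):
docker build -t python-node-app .
docker run -p 8080:80 python-node-app
# then open http://localhost:8080 and hit the route that calls the .py file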

Local nodejs module not being found by docker

I have a nodejs module called my-common that contains a couple of js files. These js files export functions that are used throughout a lot of other modules.
My other module (called demo) contains a dependency on the common module like this:
"dependencies": {
"my-common": "file:../my-common/",
}
When I go to the demo directory and run npm start, it works fine. I then build a Docker image using the following Dockerfile:
FROM node:8
ENV NODE_ENV=production
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
When I start the image I get an error that my-common can not be found. I'm guessing that the my-common module isn't being copied into the node_modules directory of the demo module.
I have tried npm link; however, I think it's a really bad idea to need sudo permission to install a global module, because this could cause problems on other systems.
I have tried npm install my-common/ in the root directory, and that installs the module into my HOME_DIR/node_modules; however, that isn't installed into the Docker container either.
Everywhere I look, there doesn't seem to be an answer to this very simple question. How can I fix this?
So, I see a couple different things.
When Docker runs npm install --only=production in the image, Docker sees file:../my-common/ and looks at the parent directory of the WORKDIR of the Docker image, which is /usr/src/app. Since nothing besides package.json has been copied into the image at that point, it can't find the module. If you want to install everything locally and then move it into the image, you can do that by removing the npm install --only=production command from the Dockerfile, and make sure your .dockerignore file doesn't ignore the node_modules directory.
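A rough sketch of that first approach (install on the host, let COPY bring node_modules in with the rest of the source) might look like this; treat it as an illustration of what the paragraph above describes, and note the caveat that newer npm versions install file: dependencies as symlinks, which would not survive being copied into the image:
# Dockerfile with no npm install -- node_modules comes from the build context
FROM node:8
ENV NODE_ENV=production
WORKDIR /usr/src/app
# node_modules is copied in here, so it must NOT be listed in .dockerignore
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]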
If you want to install modules in the image, you need to specifically copy the my-common directory into the Docker image. However, Docker doesn't allow you to copy something from a parent directory into an image. Any local content has to be in the context of the Dockerfile. You have a couple of options:
Option 1: Move my-common/ into the root of your project, update your Dockerfile to copy that folder and update package.json to point to the correct location.
Dockerfile:
FROM node:8
ENV NODE_ENV=production
WORKDIR /usr/src/app
COPY my-common/ ./my-common/
COPY package*.json ./
RUN npm install --only=production
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
package.json:
"dependencies": {
"my-common": "file:./my-common/",
}
Option 2: Move the context of the Docker image up one directory. By this I mean move the Dockerfile to the same level as my-common directory and update your Dockerfile and package.json to reflect that change.
Dockerfile:
FROM node:8
ENV NODE_ENV=production
WORKDIR /usr/src/app
RUN mkdir my-common
COPY ./my-common ./my-common
COPY ./<projectName>/package*.json .
RUN npm install --only=production
COPY ./<projectName> .
EXPOSE 3000
CMD [ "npm", "start" ]
package.json:
"dependencies": {
"my-common": "file:./my-common/",
}
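With option 2 the image then has to be built from that parent directory, so that both my-common and the project folder are inside the build context. Assuming the Dockerfile now lives in that parent directory, the build would be something like (demo is just a placeholder tag):
cd /path/to/parent   # the directory containing my-common/ and <projectName>/
docker build -t demo .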

Node.js + Docker Compose: node_modules disappears

I'm attempting to use Docker Compose to bring together a number of Node.js apps in my development environment. I'm running into an issue, however, with node_modules.
Here's what's happening:
npm install is run as part of the Dockerfile.
I do not have node_modules in my local directory. (I shouldn't because the installation of dependencies should happen in the container, right? It seems to defeat the purpose otherwise, since I'd need to have Node.js installed locally.)
In docker-compose.yml, I'm setting up a volume with the source code.
docker-compose build runs fine.
When I docker-compose up, the node_modules directory disappears in the container — I'm assuming because the volume is mounted and I don't have it in my local directory.
How do I ensure that node_modules sticks around?
Dockerfile
FROM node:0.10.37
COPY package.json /src/package.json
WORKDIR /src
RUN npm install -g grunt-cli && npm install
COPY . /src
EXPOSE 9001
CMD ["npm", "start"]
docker-compose.yml
api:
  build: .
  command: grunt
  links:
    - elasticsearch
  ports:
    - "9002:9002"
  volumes:
    - .:/src
elasticsearch:
  image: elasticsearch:1.5
Due to the way Node.js loads modules, simply place node_modules higher in the source code path. For example, put your source at /app/src and your package.json in /app, so /app/node_modules is where they're installed.
I tried your fix, but the issue is that most people run npm install in the /usr/src/app directory, resulting in the node_modules folder ending up in the /app directory. As a result, the node_modules folder ends up in both the /usr/src and /usr/src/app directories in the container, and you end up with the same issue you started with.
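Another common workaround, not spelled out above, is to add an anonymous volume for node_modules in docker-compose.yml, so the bind mount of the source does not hide the modules installed during the image build. A sketch against the compose file from the question:
api:
  build: .
  command: grunt
  links:
    - elasticsearch
  ports:
    - "9002:9002"
  volumes:
    - .:/src
    # anonymous volume: keeps the image's /src/node_modules from being masked by the bind mount
    - /src/node_modules
elasticsearch:
  image: elasticsearch:1.5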
