Very Slow ng build --prod in Docker - node.js

When I try to build an angular7 project inside docker it takes around 40 minutes. The line that takes 40 minutes is
ng build --prod
92% chunk asset optimization TerserPlugin
I've run ng build --prod outside Docker on the same laptop and it takes 2 minutes.
I've tried adding --build-optimizer false
and --sourceMap=false
Neither makes any difference.
Here is my Dockerfile
FROM node:carbon
WORKDIR /usr/src/app
COPY package.json package-lock.json ./
RUN npm install
RUN npm install -g @angular/cli@6.1.0
COPY . .
RUN ng build --prod
EXPOSE 4200
CMD [ "npm", "start" ]
HEALTHCHECK --interval=5s --timeout=30s --retries=20 CMD curl --fail http://localhost:4200 || exit 1

This issue with extremely slow builds is almost always caused by the build process running out of memory.
By default, Node will not allocate much memory to a single process (around 512 MB on 32-bit systems and 1 GB on 64-bit systems), but running ng build with production settings uses a lot of memory.
You can use the Node parameter max_old_space_size to set how much RAM the process is allowed to use, but you have to pass the parameter directly to node, so replace
ng build --prod
with
node --max_old_space_size=8192 ./node_modules/@angular/cli/bin/ng build --prod
This lets the process use up to 8 GB of RAM, which should make the build run much faster.
You can also add this to your scripts in package.json:
"scripts": {
....
"build:prod": "node --max_old_space_size=4096 ./node_modules/#angular/cli/bin/ng build --prod"
}
(If increasing the memory limit doesn't work, try running ng build --prod --verbose to see exact timings for different phases of the compilation)
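(A quick sanity check, which is my addition rather than part of the original answer: you can ask Node for its effective heap limit to confirm the flag took effect.)

```shell
# Print the effective V8 heap limit in MB; with --max_old_space_size=8192
# this reports a value at or above 8192.
node --max_old_space_size=8192 -e \
  'console.log(Math.round(require("v8").getHeapStatistics().heap_size_limit / 1048576))'
```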

As Daniel mentioned in his answer, you can use the Node parameter --max_old_space_size, but I prefer to set it via an environment variable:
NODE_OPTIONS=--max-old-space-size=4096
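In a Dockerfile this approach needs only a single ENV line that covers every later RUN step (a sketch; the 4096 value is arbitrary):

```shell
# NODE_OPTIONS is honored by every node process started afterwards, so the
# Angular CLI picks it up without editing any npm script.
# In a Dockerfile: ENV NODE_OPTIONS=--max-old-space-size=4096
NODE_OPTIONS=--max-old-space-size=4096 node -e \
  'console.log(Math.round(require("v8").getHeapStatistics().heap_size_limit / 1048576))'
```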


Using max-old-space-size in NPX

I was updating the scripts in package.json that use "./node_modules" paths to use npx commands instead, and came across ng build with the "max-old-space-size" argument. I converted the command from
node --max-old-space-size=10240 ./node_modules/@angular/cli/bin/ng build --prod
to
npx --max-old-space-size=10240 ng build --prod
I have two questions:
The npx command runs without errors, but I couldn't verify whether it's actually using the max-old-space-size argument. So, am I doing it right?
Is there any better way to reduce the repeated "./node_modules" in scripts?
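(On the second question, a note of mine rather than from the thread: npm run prepends node_modules/.bin to PATH, so scripts can usually call local binaries by bare name instead of spelling out the "./node_modules" path. A toy demonstration with a throwaway project:)

```shell
# Demonstrate that "npm run" puts node_modules/.bin on PATH (hypothetical project)
mkdir -p /tmp/binpath-demo/node_modules/.bin && cd /tmp/binpath-demo
printf '#!/bin/sh\necho local-bin\n' > node_modules/.bin/hello
chmod +x node_modules/.bin/hello
cat > package.json <<'EOF'
{ "name": "demo", "version": "1.0.0", "scripts": { "hi": "hello" } }
EOF
npm run --silent hi   # the script calls "hello" by bare name; npm finds it in node_modules/.bin
```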

How to make Angular watch multiple libraries for changes and recompile when needed

This question is much the same as Make angular app watch for libraries changes and update itself. But that question was never successfully answered as it applies to the use of multiple libraries. I also reviewed Angular library and live reload and surveyed the answers and links from both questions.
My app is using two libraries: lib-1 and lib-2. When those files are edited, they are ignored and the app does not recompile. To see changes, I have to restart the server which really slows things down.
My expectation is that the app should recompile when library files are edited, just as it does when other app-internal files are edited.
This is an Angular project that I have inherited, and the original author is no longer available. I am using Angular v10 and npm 6.14.11
The initial npm scripts are:
"start:staging": "ng serve --configuration-staging --host 0.0.0.0 --port 8080 --disableHostCheck",
"build:lib-1": "ng build lib-1 && cpx projects/lib-1/src/lib/theme.scss dist/lib-1",
"build:lib-2": "ng build lib-2 && cpx projects/lib-2/src/lib/theme.scss dist/lib-2",
"build:libs": "npm run build:lib-1 && npm run build:lib-2",
With those, I first run npm run build:libs, then npm run start:staging. As mentioned, this does not "watch" my libraries for changes.
I reviewed the suggestions and the other SO questions (above), have ensured that the npm-run-all, wait-on and rimraf libraries are now installed.
I have written these new npm scripts:
"clean": "rimraf dist",
"start-app": "wait-on dist/lib-1/fesm2015 dist/lib-2/fesm2015 && start:staging --poll 2000",
"watch:lib-1": "npm run build:lib-1 --watch",
"watch:lib-2": "npm run build:lib-2 --watch",
"watch-libs": "npm-run-all --parallel watch:lib-1 watch:lib-2",
"watch-all": "npm-run-all clean --parallel watch-libs start-app"
And, I am using the pre-existing start:staging script, as written.
I run npm run watch-all.
The script runs and proceeds to the point of building the libraries in parallel (bad idea?), and then throws error: sh: start:staging: command not found.
I removed the --parallel switches and tried again, and got the same error.
The start:staging script is indeed in the scripts object, and I cannot figure out why it's not being found.
I'm hoping to get some sage advice on correcting my syntax so that the app will compile and watch my library files along with the other files that are inside the app's src folder.
After a lot of sleuthing, I came across Nikola Kolev's Angular 6: build — watch multiple dependent libraries in one shell post.
While I don't have it down to one npm script like Nikola was able to do, I am able to do it by running two scripts (there are 7 total scripts involved), and that's good enough for now. I'll work on condensing to one when I get more time.
First, be sure to have wait-on, rimraf and npm-run-all installed. We're also using cpx, but that's not about getting the libraries "watched" -- I'm just including it to be thorough.
Here are all the scripts:
"clean": "rimraf dist",
"watch-lib:lib-1": "ng build lib-1 --watch",
"watch-lib:lib-2": "ng build lib-2 --watch",
"watch-libs": "npm-run-all clean --parallel watch-lib:*",
"copy-styles:lib-1": "cpx projects/lib-1/src/lib/theme.scss dist/lib-1",
"copy-styles:lib-2": "cpx projects/lib-2/src/lib/theme.scss dist/lib-2",
"start-staging": "ng serve --configuration-staging --host 0.0.0.0 --port 8080 --disableHostCheck",
"watch-staging": "npm-run-all copy-styles:* start:staging"
When I want to work on the libraries and have them be "watched", I run npm run watch-libs in one terminal. When that is finished, I run npm run watch-staging in a second terminal. Then I'm able to launch the app in a browser, and any edits to the code, in the libraries or in the app itself, are caught, and the app recompiles as desired.

What is the --prod flag in "npm run build --prod" for?

I'm doing the fullstackopen course. There's a part where you create the production build files of a React application and copy them to the backend directory so that they can be served as static files. To optimize the task, they suggest adding this npm script to the backend directory:
"build:ui": "rm -rf build && cd ../../osa2/materiaali/notes-new && npm run build --prod && cp -r build ../../../osa3/notes-backend/",
If I understand correctly, this removes the build folder from the backend, then changes directory to the frontend where it creates a new production build, and then copies the folder to the backend. But what is the --prod flag doing? I made a small test, running npm run build and npm run build --prod, and the output seems to be the same.
It seems the --prod flag is ignored during builds. You need to call the build command as npm run build -- --prod. The extra "--" makes sure the --prod flag is passed on to the build script instead of being consumed by npm.

How to shrink size of Docker image with NodeJs

I created new Angular2 app by angular-cli and run it in Docker.
At first I init app on my local machine:
ng new project && cd project && "put my Dockerfile there" && docker build -t my-ui . && docker run my-ui
My Dockerfile
FROM node
RUN npm install -g angular-cli@v1.0.0-beta.24 && npm cache clean && rm -rf ~/.npm
RUN mkdir -p /opt/client-ui/src
WORKDIR /opt/client-ui
COPY package.json /opt/client-ui/
COPY angular-cli.json /opt/client-ui/
COPY tslint.json /opt/client-ui/
ADD src/ /opt/client-ui/src
RUN npm install
RUN ng build --prod --aot
EXPOSE 4200
ENV PATH="$PATH:/usr/local/bin/"
CMD ["npm", "start"]
Everything is OK; the problem is the size of the image: 939 MB!!! I tried to use FROM ubuntu:16.04 and install Node.js on it (it works), but my image still has ~450 MB. I know that node:alpine exists, but I am not able to install angular-cli in it.
How can I shrink the image size? Is it necessary to run "npm install" and "ng build" in the Dockerfile? I would expect to build the app on localhost and copy it into the image. I tried to copy the dist dir and package.json etc. files, but it does not work (the app fails to start). Thanks.
You can certainly use my alpine-ng image if you like.
You can also check out the dockerfile, if you want to try and modify it in some way.
I regret to inform you that even based on alpine, it is still 610MB. An improvement to be sure, but there is no getting around the fact that the angular compiler is grossly huge.
For production, you do not need to distribute an image with Node.js, NPM dependencies, etc. You simply need an image that can be used to start a data volume container providing the compiled sources, release source maps and other assets (effectively no more than what you would redistribute as an NPM package) that you can attach to your webserver.
So, for your CI host, you can pick one of the node:alpine distributions, copy in the sources and install the dependencies there. You can then re-use that image for running containers that test the builds, until you finally run a container that performs a production compilation, which you can name.
docker run --name=compile-${RELEASE} ci-${RELEASE} npm run production
After you have finished compiling the sources within a container, run a container that has the volumes from the compilation container attached and copy the sources to a volume on the container and push that to your Docker upstream:
docker run --name=release-${RELEASE} --volumes-from=compile-${RELEASE} -v /srv/public busybox cp -R /myapp/dist /srv/public
docker commit release-${RELEASE} myapp:${RELEASE}
Try FROM mhart/alpine-node:base-6 maybe it will work.
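(A later option, not available when these answers were written: with Docker 17.05+ you can use a multi-stage build so the CLI and node_modules never reach the final image. The sketch below is illustrative, assuming an Angular CLI version that installs cleanly on alpine; paths and tags are hypothetical.)

```shell
# Write a hypothetical multi-stage Dockerfile: stage 1 builds with the CLI,
# stage 2 is just nginx serving the compiled dist/ output.
cat > Dockerfile.multistage <<'EOF'
FROM node:alpine AS build
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install
COPY . .
RUN npx ng build --prod

FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
EOF
grep -c '^FROM' Dockerfile.multistage   # two stages; only the last one is shipped
```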

Node and docker - how to handle babel or typescript build?

I have a node application that I want to host in a Docker container, which should be straight forward, as seen in this article:
https://nodejs.org/en/docs/guides/nodejs-docker-webapp/
In my project, however, the sources can not be run directly, they must be compiled from ES6 and/or Typescript. I use gulp to build with babel, browserify and tsify - with different setups for browser and server.
What would be the best workflow for building and automating docker images in this case? Are there any resources on the web that describes such a workflow? Should the Dockerimage do the building after npm install or should I create a shell script to do all this and simply have the Dockerfile pack it all together?
If the Dockerfile should do the build, the image would need to contain all the dev dependencies, which is not ideal?
Note: I have been able to set up a docker container, and run it - but this required all files to be installed and built beforehand.
The modern recommendation for this sort of thing (as of Docker 17.05) is to use a multi-stage build. This way you can use all your dev/build dependencies in the one Dockerfile but have the end result optimised and free of unnecessary code.
I'm not so familiar with typescript, but here's an example implementation using yarn and babel. Using this Dockerfile, we can build a development image (with docker build --target development .) for running nodemon, tests etc locally; but with a straight docker build . we get a lean, optimised production image, which runs the app with pm2.
# common base image for development and production
FROM node:10.11.0-alpine AS base
WORKDIR /app
# dev image contains everything needed for testing, development and building
FROM base AS development
COPY package.json yarn.lock ./
# first set aside prod dependencies so we can copy in to the prod image
RUN yarn install --pure-lockfile --production
RUN cp -R node_modules /tmp/node_modules
# install all dependencies and add source code
RUN yarn install --pure-lockfile
COPY . .
# builder runs unit tests and linter, then builds production code
FROM development as builder
RUN yarn lint
RUN yarn test:unit --colors
RUN yarn babel ./src --out-dir ./dist --copy-files
# release includes bare minimum required to run the app, copied from builder
FROM base AS release
COPY --from=builder /tmp/node_modules ./node_modules
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/package.json ./
CMD ["yarn", "pm2-runtime", "dist/index.js"]
One possible solution is to wrap your build procedure in a special Docker image, often referred to as a builder image. It should contain all your build dependencies: Node.js, npm, gulp, babel, tsc, etc. It encapsulates the whole build process, removing the need to install these tools on the host.
First you run the builder image, mounting the source code directory as a volume. The same or a separate volume can be used as output directory.
The builder image takes your code and runs all the build commands.
As a final step, you take the built code and pack it into a production Docker image, as you do now.
Here is an example of docker builder image for TypeScript: https://hub.docker.com/r/sandrokeil/typescript/
It is fine to share the same builder image between several projects, as it is typically designed as a general-purpose wrapper around common tools.
But it is also fine to build your own that encapsulates a more complicated procedure.
The good thing about a builder image is that your host environment remains unpolluted and you are free to try newer compiler versions, different tools, a changed order, or parallel tasks just by modifying the Dockerfile of your builder image. And at any time you can roll back your experiment with the build procedure.
I personally prefer to just remove dev dependencies after running babel during build:
FROM node:7
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
# Copy app source
COPY src /usr/src/app/src
# Compile app sources
RUN npm run compile
# Remove dev dependencies
RUN npm prune --production
# Expose port and CMD
EXPOSE 8080
CMD [ "npm", "start" ]
Follow these steps:
Step 1: make sure your babel dependencies are inside dependencies, not devDependencies, in package.json. Also add a deploy script that references babel from the node_modules folder; you will call this script from within Docker.
This is what my package.json file looks like
{
  "name": "tmeasy_api",
  "version": "1.0.0",
  "description": "Trade made easy Application",
  "main": "build/index.js",
  "scripts": {
    "build": "babel -w src/ -d build/ -s inline",
    "deploy": "node_modules/babel-cli/bin/babel.js src/ -d build/"
  },
  "devDependencies": {
    "nodemon": "^1.9.2"
  },
  "dependencies": {
    "babel-cli": "^6.10.1",
    "babel-polyfill": "^6.9.1",
    "babel-preset-es2015": "^6.9.0",
    "babel-preset-stage-0": "^6.5.0",
    "babel-preset-stage-3": "^6.22.0"
  }
}
build is for development purposes on your local machine, and deploy is meant to be called from within your Dockerfile.
Step 2: since we want to do the babel transformation ourselves, make sure to add a .dockerignore file excluding the build folder you use during development.
This is what my .dockerignore file looks like.
build
node_modules
Step 3: construct your Dockerfile. Below is a sample of mine.
FROM node:6
MAINTAINER stackoverflow
ENV NODE_ENV=production
ENV PORT=3000
# use changes to package.json to force Docker not to use the cache
# when we change our application's nodejs dependencies:
ADD package.json /tmp/package.json
RUN cd /tmp && npm install
RUN mkdir -p /var/www && cp -a /tmp/node_modules /var/www
# copy current working directory into docker; but it first checks for
# .dockerignore so build will not be included.
COPY . /var/www/
WORKDIR /var/www/
# remove any previous builds and create a new build folder and then
# call our node script deploy
RUN rm -rf build
RUN mkdir build
RUN chmod 777 /var/www/build
RUN npm run deploy
VOLUME /var/www/uploads
EXPOSE $PORT
ENTRYPOINT ["node","build/index.js"]
I just released a great seed app for Typescript and Node.js using Docker.
You can find it on GitHub.
The project explains all of the commands that the Dockerfile uses and it combines tsc with gulp for some added benefits.
If you don't want to check out the repo, here's the details:
Dockerfile
FROM node:8
ENV USER=app
ENV SUBDIR=appDir
RUN useradd --user-group --create-home --shell /bin/false $USER &&\
npm install --global tsc-watch npm ntypescript typescript gulp-cli
ENV HOME=/home/$USER
COPY package.json gulpfile.js $HOME/$SUBDIR/
RUN chown -R $USER:$USER $HOME/*
USER $USER
WORKDIR $HOME/$SUBDIR
RUN npm install
CMD ["node", "dist/index.js"]
docker-compose.yml
version: '3.1'
services:
  app:
    build: .
    command: npm run build
    environment:
      NODE_ENV: development
    ports:
      - '3000:3000'
    volumes:
      - .:/home/app/appDir
      - /home/app/appDir/node_modules
package.json
{
  "name": "docker-node-typescript",
  "version": "1.0.0",
  "description": "",
  "scripts": {
    "build": "gulp copy; gulp watch & tsc-watch -p . --onSuccess \"node dist/index.js\"",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "Stephen Gardner (opensourceaugie@gmail.com)",
  "license": "ISC",
  "dependencies": {
    "express": "^4.10.2",
    "gulp": "^3.9.1",
    "socket.io": "^1.2.0"
  },
  "devDependencies": {
    "@types/express": "^4.11.0",
    "@types/node": "^8.5.8"
  }
}
tsconfig.json
{
  "compileOnSave": false,
  "compilerOptions": {
    "outDir": "./dist/",
    "sourceMap": true,
    "declaration": false,
    "module": "commonjs",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "target": "ES6"
  },
  "include": [
    "**/*.ts"
  ],
  "exclude": [
    "node_modules",
    "**/*.spec.ts"
  ]
}
To get more towards the answer to your question: the TS is compiled via the docker-compose.yml file's call to npm run build, which then runs tsc. tsc compiles our files into the dist folder, and a simple node dist/index.js command runs the app. Instead of using nodemon, we use tsc-watch and gulp.watch to watch for changes in the app and fire node dist/index.js again after every re-compilation.
Hope that helps :) If you have any questions, let me know!
For the moment, I'm using a workflow where:
npm install and tsd install locally
gulp build locally
In Dockerfile, copy all program files, but not typings/node_modules to docker image
In Dockerfile, npm install --production
This way I get only the wanted files in the image, but it would be nicer if the Dockerfile could do the build itself.
Dockerfile:
FROM node:5.1
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Bundle app
COPY package.json index.js /usr/src/app/
COPY views/ /usr/src/app/views/
COPY build/ /usr/src/app/build/
COPY public/ /usr/src/app/public/
# Install app dependencies
RUN npm install --production --silent
EXPOSE 3000
CMD [ "node", "index.js" ]
I guess the "imaging process" could be fully automated by doing the build inside the Dockerfile and then deleting the unwanted files before installing again.
In my project, however, the sources can not be run directly, they must be compiled from ES6 and/or Typescript. I use gulp to build with babel, browserify and tsify - with different setups for browser and server. What would be the best workflow for building and automating docker images in this case?
If I understand you right, you want to deploy your web app inside a Docker container and provide different flavours for different target environments (you mentioned browser and server). (1)
If the Dockerfile should do the build - the image would need to contain all the dev-dependencies, which are not ideal?
It depends. If you want to provide a ready-to-go image, it has to contain everything your web app needs to run. One advantage is that later you only need to start the container, pass some parameters, and you are ready to go.
During the development phase, that image is not really necessary, because of your usually pre-defined dev environment. It costs time and resources to generate such an image after each change.
Suggested approach: I would suggest a two way setup:
During development: Use a fixed environment to develop your app. All software can run locally or inside a Docker container/VM. I suggest using a Docker container with your dev setup, especially if you work in a team and everybody needs the same dev baseline.
Deploy web app: As I understood (1), you want to deploy the app for different environments and therefore need to create/provide different configurations. To realize something like that, you could start with a shell script that packages your app into different Docker containers. You run the script before you deploy. If you have Jenkins running, it can call your shell script after each commit, once all tests pass.
Docker container for both development and deploy phase: I would like to refer to a project of mine and a colleague: https://github.com/k00ni/Docker-Nodejs-environment
This image provides a whole development and deploy environment by maintaining:
Node.js
NPM
Gulp
Babel (auto transpiling from ECMA6 to JavaScript on a file change)
Webpack
and other JavaScript helpers inside the Docker container. You just link your project folder via a volume inside the container. It initializes your environment (e.g. installs all dependencies from package.json) and you are good to go.
You can use it for development purposes so that you and your team are using the same environment (Node.js version, NPM version, ...). Another advantage is that file changes lead to a re-compilation of ECMA6/ReactJS/... files to JavaScript (no need to do this by hand after each change). We use Babel for that.
For deployment purposes, just extend this Docker image and change the required parts. Instead of linking your app inside the container, you can pull it via Git (or similar). You will use the same base for all your work.
I found this article that should guide you in both development and production phases: https://www.sentinelstand.com/article/docker-with-node-in-development-and-production
In this article we'll create a production Docker image for a
Node/Express app. We'll also add Docker to the development process
using Docker Compose so we can easily spin up our services, including
the Node app itself, on our local machine in an isolated and
reproducible manner.
The app will be written using newer JavaScript syntax to demonstrate
how Babel can be included in the build process. Your current Node
version may not support certain modern JavaScript features, like
ECMAScript modules (import and export), so Babel will be used to
convert the code into a backwards compatible version.
