npm install hangs when using Docker to install libraries - node.js

Here's the deal. I'm trying to set up my environment to develop a React Native application using Expo. Since I'm working on Windows, I'd like to set it up using Docker and docker-compose.
First I build my image, installing expo-cli from a node image. Then I use docker-compose to specify a volume and the ports to expose.
Since at this point I have no project initialized yet, I use docker-compose run to initialize my project and then the same command to install specific project libraries. But when I do, the installation hangs at some random point.
It seems like it's Docker related, but I'm not sure what I'm doing wrong.
Dockerfile
FROM node:12.18.3
ENV PATH /app/node_modules/.bin:$PATH
RUN npm install --no-save -g expo-cli
RUN mkdir /app
WORKDIR /app
RUN adduser user
USER user
docker-compose.yml
version: "3"
services:
app:
build:
context: .
expose:
- "19000"
- "19001"
- "19002"
- "19003"
ports:
- "19000:19000"
- "19000:19001"
- "19000:19002"
- "19000:19003"
volumes:
- ./app:/app
command: sh -c "cd myapp && npm start"
Assuming my app is called myapp
Here are the commands I use to install additional npm packages and initialize project:
docker-compose run --rm app sh -c "npx expo-cli init myapp"
docker-compose run --rm app sh -c "cd /app/myapp && npm install react-navigation --verbose"
I see that several things happen, but it always hangs somewhere, and never at the same place each time I start from scratch.
Please help!

Related

Using Nodemon with Docker is not refreshing the project when the codebase changes

I am building a Node.js application using Postgres as the database, with Docker/docker-compose for the development environment. I could dockerise my environment, but one issue remains: the project is not refreshing or showing the latest changes with Nodemon. Basically, it is not detecting code changes.
I have a docker-compose.yaml file with the following code.
version: '3.8'
services:
  web:
    container_name: web-server
    build: .
    ports:
      - 4000:4000
    restart: always
This is my Dockerfile.
FROM node:14-alpine
RUN apk update && apk upgrade
RUN apk add nodejs
RUN rm -rf /var/cache/apk/*
COPY . /
RUN cd /; npm install;
RUN npm install -g nodemon
EXPOSE 4000
CMD ["npm", "run", "docker-dev"]
This is my docker-dev script in package.json file
"docker-dev": "nodemon -L --inspect=0.0.0.0 index.js"
When I spin up the environment with docker-compose up -d and go to localhost:4000 in the browser, I can see my application up and running. But when I make changes to the code and refresh the page, the new changes are not there. It seems like Nodemon is not working. I had to remove the Docker images and spin up the environment again to see the latest changes.
How can I fix it?
During development, I often use a raw node image and run it like this
docker run --rm -d -v $(pwd):/app --workdir /app -u $(id -u):$(id -g) -p 3000:3000 --entrypoint /bin/bash node -c "npm install && npm start"
The different parts of the command do the following
--rm remove the container when it exits
-d run detached
-v $(pwd):/app map the current directory to /app in the container
--workdir /app set the working directory in the container to /app
-u $(id -u):$(id -g) run the container with my UID and GID so any files created in the host directory will be owned by me
-p 3000:3000 map port 3000 to the host
--entrypoint /bin/bash set the entrypoint to bash
node run the official node image
-c "npm install && npm start" on container start, install npm packages and start the app
You can do something similar. If we replace a few things, this should match your project
docker run --rm -d -v $(pwd):/app --workdir /app -u $(id -u):$(id -g) -p 4000:4000 --entrypoint /bin/sh node:14-alpine -c "npm install && npm install -g nodemon && npm run docker-dev"
I've changed the entrypoint because Alpine doesn't have bash installed. The image is node:14-alpine. The port is 4000. And on start, you want it to install nodemon and run the docker-dev script.
I put the command in a shell script so I don't have to type it all out every time.
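Something along these lines, for example (dev.sh is just a placeholder name; the flags mirror the command above):
#!/bin/sh
# dev.sh - wrapper so the full docker run invocation doesn't have to be retyped
docker run --rm -d \
  -v "$(pwd)":/app \
  --workdir /app \
  -u "$(id -u)":"$(id -g)" \
  -p 4000:4000 \
  --entrypoint /bin/sh \
  node:14-alpine \
  -c "npm install && npm install -g nodemon && npm run docker-dev"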
You are making your changes in your local file system, not in the Docker container's file system.
You need to use a volume if you want to see your changes reflected in real time.
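For example, a bind mount in the compose file would do it. This is only a sketch; it assumes the image is adjusted to put the app under /app (WORKDIR /app) instead of copying it into /, so the mount lines up with where the code runs:
version: '3.8'
services:
  web:
    container_name: web-server
    build: .
    ports:
      - 4000:4000
    restart: always
    volumes:
      # bind-mount the source so nodemon sees edits made on the host
      - .:/app
      # keep the container's installed node_modules rather than the host's
      - /app/node_modules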

Docker webpack doesn't create bundle in right place

I am using Docker to run my React universal application, with docker-compose as a control tool. Unfortunately, after webpack successfully creates the bundles, pm2 inside Docker can't find the script and my container exits with an error.
What is the best way to debug this or solve the problem?
Dockerfile
FROM node:erbium
WORKDIR /app
RUN npm install pm2 -g
RUN npm install bower -g
COPY package*.json /app/
RUN npm install
COPY . /app/
RUN bower install
RUN npm run build
EXPOSE 8080
CMD ["pm2-runtime", "start", "ecosystem.config.js", "--env", "production"]
docker-compose service
client:
  container_name: client_folder
  image: client_image
  build: ./finstead-client
  volumes:
    - ./client_folder:/app
    - /app/node_modules
  restart: "always"
  ports:
    - "4000:8080"
The application is inside the folder client_folder and I use volumes to attach my files to the Docker container. But in my opinion the volumes are not attached at build time, only at runtime, and that is why the container can't find the files produced during the build.
I would appreciate it a lot if you could give me advice or help me find the root of the problem.
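One quick way to check that theory (just a sketch, using the image and service names from the compose snippet above) is to compare what the plain image contains with what the container sees once the bind mount is in place:
# what /app contains as baked into the image at build time (should include the webpack output)
docker run --rm client_image ls /app
# what /app contains at runtime, after ./client_folder is bind-mounted over it
docker-compose run --rm client ls /app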

docker-compose volume started acting weird, no permissions on files after switching branches

We have been using docker-compose for a year and had no problems. In the past week we each started getting a weird error related to permissions:
at internal/main/run_main_module.js:17:47 : Error: EPERM: operation not permitted, open <PATH TO FILE>
It only happens when I switch branches.
Compose structure:
version: "2.4"
# template:
x-base: &base-service-template
env_file:
- ./.env
working_dir: /app/
volumes:
- ./src:/app/src:cached
services:
service1:
image: service1
<<: *base-service-template
service2:
image: service2
<<: *base-service-template
We are all working on macOS. We tried giving Docker permissions over the filesystem and it still didn't work.
But there is something that works: restarting the daemon. I just don't want to restart the daemon each time I switch branches.
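(By "restarting the daemon" I mean restarting Docker Desktop, e.g. something along these lines on macOS:)
# quit and relaunch Docker Desktop - a workaround, not a fix
osascript -e 'quit app "Docker"'
open -a Docker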
Additional info:
The base Dockerfile of each service looks like this:
FROM node:12-alpine as builder
ENV TZ=Europe/London
RUN npm i npm@latest -g
RUN mkdir /app && chown node:node /app
WORKDIR /app
RUN apk add --no-cache python3 make g++ tini \
&& apk add --update tzdata
USER node
COPY package*.json ./
RUN npm install --no-optional && npm cache clean --force
ENV PATH /app/node_modules/.bin:$PATH
COPY . .
FROM builder as dev
USER node
CMD ["nodemon", "src/services/service/service.js"]
FROM builder as prod
USER node
ENTRYPOINT ["/sbin/tini", "--"]
CMD ["node", "src/services/service/service.js"]
I run on the dev layer so that we can leverage the nodemon code reload.
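Building that stage looks roughly like this (a sketch only; the context path and image tag are placeholders):
# build only up to the "dev" stage so the nodemon CMD is used
docker build --target dev -t service1 ./service1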
docker version: Docker version 19.03.13, build 4484c46d9d
docker-compose version: docker-compose version 1.27.4, build 40524192
So after trying some weird stuff, the answer I found was:
after doing this, remember to restart the shell that you run your compose from.
An edit from the future: it still misbehaves sometimes, so this may not be a fix for everyone.

Install a new node dependency inside a Docker container

Considering the following development environment with Docker:
# /.env
CONTAINER_PROJECT_PATH=/projects/my_project
# /docker-compose.yml
version: '2'
services:
  web:
    restart: always
    build: ./docker/web
    working_dir: ${CONTAINER_PROJECT_PATH}
    ports:
      - "3000:3000"
    volumes:
      - .:${CONTAINER_PROJECT_PATH}
      - ./node_modules:${CONTAINER_PROJECT_PATH}/node_modules
# /docker/web/Dockerfile
FROM keymetrics/pm2:latest-alpine
FROM node:8.11.4
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
RUN mkdir -p /projects/my_project
RUN chown node:node /projects/my_project
# If you are going to change WORKDIR value,
# please also consider to change .env file
WORKDIR /projects/my_project
USER node
RUN npm install pm2 -g
CMD npm install && node /projects/my_project/src/index.js
How do I install a new module inside my container? Running npm install on the host won't work because node_modules belongs to the root user. Is there any way to run it from the container?
Edit: Is there a "one-liner" that I could run from outside the container to install a module?
Assuming you don't mean to edit the Dockerfile, you can always execute a command in a running container with docker exec and the -it flags:
$ docker exec -it <container name> npm install my_module
First, get a shell inside the container:
$ docker exec -it <container name> /bin/bash
Then, inside the container:
$ npm install pm2 -g
You either
a) want to install pm2 globally, which you need to do as root, so place the install before USER node so that it runs as the image's default user (root), or
b) just want to install pm2 for use in your project, in which case drop the -g flag (which tells npm to install globally) and it will end up in your project's node_modules. You can then run it with npx pm2 or node_modules/.bin/pm2, or programmatically from your index.js (for the latter I'd suggest adding it to your package.json and not installing it manually at all).
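For option a), the relevant part of the question's Dockerfile would become something like this (a sketch; everything else stays as it is):
# install pm2 globally while still running as root (the image's default user)...
RUN npm install pm2 -g
# ...then drop privileges for everything that follows
USER node
CMD npm install && node /projects/my_project/src/index.js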

docker compose vs docker run - node container

I am trying to set up a container for a React app. My goal is to use docker-compose (I want to execute one command and have everything working).
The problem is that when I try to do it with the docker-compose file and Dockerfile below, I get the error shown under Result.
docker-compose.yml
version: '2'
services:
  react:
    build: ./images/react
    ports:
      - "3000:3000"
    volumes:
      - "/home/my/path:/app"
Dockerfile
FROM node:6.9
WORKDIR /app
RUN npm install
EXPOSE 3000
CMD [ "npm", "start" ]
Result
npm WARN enoent ENOENT: no such file or directory, open '/app/package.json'
But when I did it with docker run and volume mapping, I was able to see package.json and run the npm install command.
docker run -it --rm -v /home/my/path:/app node:6.9 bash
Why is it not working with docker-compose?
Note that the volume that you're describing in your docker-compose.yml file will be mounted at run time, not at build time. This means that when building the image, there will not be any package.json file there (yet), from which you could install your dependencies.
When running the container image with -v /home/my/path:/app, you're actually mounting the directory first, and subsequent npm install invocations will complete successfully.
If you intend to mount your application (including package.json) into your container, the npm install needs to happen at run time (CMD), and not at build time (RUN).
The easiest way to accomplish this would be to simply add the npm install statement to your CMD instruction (and drop the RUN npm install instruction):
CMD npm install && npm start
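Applied to the Dockerfile above, that would end up roughly like this (a sketch; everything else from the original stays the same):
FROM node:6.9
WORKDIR /app
EXPOSE 3000
# dependencies are installed at run time, once the volume containing package.json is mounted
CMD npm install && npm start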
