Install a new node dependency inside a Docker container - node.js

Consider the following development environment with Docker:
# /.env
CONTAINER_PROJECT_PATH=/projects/my_project
# /docker-compose.yml
version: '2'
services:
  web:
    restart: always
    build: ./docker/web
    working_dir: ${CONTAINER_PROJECT_PATH}
    ports:
      - "3000:3000"
    volumes:
      - .:${CONTAINER_PROJECT_PATH}
      - ./node_modules:${CONTAINER_PROJECT_PATH}/node_modules
# /docker/web/Dockerfile
FROM keymetrics/pm2:latest-alpine
FROM node:8.11.4
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
RUN mkdir -p /projects/my_project
RUN chown node:node /projects/my_project
# If you are going to change the WORKDIR value,
# please also consider changing the .env file
WORKDIR /projects/my_project
USER node
RUN npm install pm2 -g
CMD npm install && node /projects/my_project/src/index.js
How do I install a new module inside my container? Running npm install on the host won't work because node_modules belongs to the root user. Is there any way to run it from the container?
Edit: Is there something "one-liner" that I could run outside the container to install a module?

Assuming you don't mean to edit the Dockerfile, you can always execute a command in a running container with docker exec and the -it flags:
$ docker exec -it <container name> npm install my_module

First, open a shell inside the container:
$ docker exec -it <container name> /bin/bash
Then inside container:
$ npm install pm2 -g
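For the "one-liner" asked about in the edit, docker-compose can run the same thing from the host. A sketch, assuming the service is named web as in the question's docker-compose.yml ("left-pad" is a placeholder module name):

```shell
# One-liner to run from the host: execute npm inside the running "web"
# service container ("left-pad" is a placeholder module name).
install_cmd="docker-compose exec web npm install left-pad"
# Guarded so the snippet is a no-op on machines without docker-compose:
if command -v docker-compose >/dev/null 2>&1; then
  $install_cmd
fi
```

docker-compose exec runs the command inside the already-running service container, so the module lands in the container-owned node_modules rather than failing with a host permission error.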

You either
a) want to install pm2 globally, which requires root; move the install above USER node so it runs as the default user (root), or
b) only want pm2 for this project; in that case just drop the -g flag (which tells npm to install globally), and it will land in your project's node_modules instead. You can then run it with npx pm2 or node_modules/.bin/pm2, or programmatically from your index.js (for the latter I'd suggest adding it to your package.json and not installing it manually at all).
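For option a), the tail of the question's Dockerfile would be reordered roughly like this (a sketch, not the full file):

```dockerfile
# Still running as root here, so the global install is allowed
RUN npm install pm2 -g
# Drop privileges only after the global install
USER node
CMD npm install && node /projects/my_project/src/index.js
```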

Related

Using Nodemon with Docker is not refreshing the project when the codebase changes

I am building a Node.js application using Postgres as the database, with Docker/Docker Compose for the development environment. I was able to dockerise my environment, but one issue remains: the project is not refreshing or showing the latest changes via Nodemon. Basically, it is not detecting code changes.
I have a docker-compose.yaml file with the following code.
version: '3.8'
services:
  web:
    container_name: web-server
    build: .
    ports:
      - 4000:4000
    restart: always
This is my Dockerfile.
FROM node:14-alpine
RUN apk update && apk upgrade
RUN apk add nodejs
RUN rm -rf /var/cache/apk/*
COPY . /
RUN cd /; npm install;
RUN npm install -g nodemon
EXPOSE 4000
CMD ["npm", "run", "docker-dev"]
This is my docker-dev script in package.json file
"docker-dev": "nodemon -L --inspect=0.0.0.0 index.js"
When I spin up the environment by running docker-compose up -d and go to localhost:4000 in the browser, I can see my application up and running there. But when I make changes to the code and refresh the page, the new changes are not there. It seems like Nodemon is not working: I have to remove the Docker images and spin up the environment again to see the latest changes.
How can I fix it?
During development, I often use a raw node image and run it like this
docker run --rm -d -v $(pwd):/app --workdir /app -u $(id -u):$(id -g) -p 3000:3000 --entrypoint /bin/bash node -c "npm install && npm start"
The different parts of the command do the following
--rm remove the container when it exits
-d run detached
-v $(pwd):/app map the current directory to /app in the container
--workdir /app set the working directory in the container to /app
-u $(id -u):$(id -g) run the container with my UID and GID so any files created in the host directory will be owned by me
-p 3000:3000 map port 3000 to the host
--entrypoint /bin/bash set the entrypoint to bash
node run the official node image
-c "npm install && npm start" on container start, install npm packages and start the app
You can do something similar. If we replace a few things, this should match your project
docker run --rm -d -v $(pwd):/app --workdir /app -u $(id -u):$(id -g) -p 4000:4000 --entrypoint /bin/sh node:14-alpine -c "npm install && npm install -g nodemon && npm run docker-dev"
I've changed the entrypoint because Alpine doesn't have bash installed. The image is node:14-alpine. The port is 4000. And on start, you want it to install nodemon and run the docker-dev script.
I put the command in a shell script so I don't have to type it all out every time.
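Such a wrapper might look like the following sketch (the image, port, and script name are taken from the question; the dry-run default is my own addition so the command can be inspected before running):

```shell
#!/bin/sh
# dev.sh - hypothetical wrapper around the docker run command above.
# Prints the command by default; pass "go" as the first argument to run it.
cmd="docker run --rm -d -v $(pwd):/app --workdir /app \
 -u $(id -u):$(id -g) -p 4000:4000 --entrypoint /bin/sh node:14-alpine \
 -c 'npm install && npm install -g nodemon && npm run docker-dev'"
if [ "$1" = "go" ]; then
  eval "$cmd"
else
  echo "$cmd"
fi
```

Note that $(pwd), $(id -u), and $(id -g) expand when the script runs, so the mount and ownership always match the directory you launch it from.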
You are making your changes in your local file system, not in the Docker container's file system.
You need to use a volume if you want to see your changes reflected in real time.
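For the compose file in the question, that means adding a bind mount, something like this sketch (it assumes the app lives in /app inside the image; the question's Dockerfile copies to /, which you would want to change to a dedicated directory first):

```yaml
version: '3.8'
services:
  web:
    container_name: web-server
    build: .
    ports:
      - 4000:4000
    restart: always
    volumes:
      - .:/app             # bind-mount the source so nodemon sees host edits
      - /app/node_modules  # keep the image's installed modules out of the mount
```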

npm install hangs when using Docker to install libraries

Here's the deal. I'm trying to setup my environment to develop a react native application using expo. Since I'm working on a Windows OS, I'd like to setup using Docker and docker-compose.
First I build my image, installing expo-cli from a node image. Then I use docker-compose to specify a volume and ports to expose.
Since at this point I don't have the project initialized yet, what I do is use docker-compose run to initialize my project, and then the same to install specific project libraries. But when I do, at some random point, the installation hangs.
Now it seems like it's docker related, but I'm not sure what I'm doing wrong.
Dockerfile
FROM node:12.18.3
ENV PATH /app/node_modules/.bin:$PATH
RUN npm install --no-save -g expo-cli
RUN mkdir /app
WORKDIR /app
RUN adduser user
USER user
docker-compose.yml
version: "3"
services:
  app:
    build:
      context: .
    expose:
      - "19000"
      - "19001"
      - "19002"
      - "19003"
    ports:
      - "19000:19000"
      - "19000:19001"
      - "19000:19002"
      - "19000:19003"
    volumes:
      - ./app:/app
    command: sh -c "cd myapp && npm start"
Assuming my app is called myapp
Here are the commands I use to install additional npm packages and initialize project:
docker-compose run --rm app sh -c "npx expo-cli init myapp"
docker-compose run --rm app sh -c "cd /app/myapp && npm install react-navigation --verbose"
I see that several things happen, but it always hangs somewhere, and never at the same place every time I start from scratch.
Please help!

How can I tell in a Dockerfile which command to execute when I run "docker run"?

This is the beginning of my Dockerfile:
FROM ubuntu:latest
RUN apt-get update
RUN apt-get install node
RUN apt-get install npm
RUN apt-get install mongo
RUN mkdir app
COPY . app/
WORKDIR app
When I run docker run <image name>, my container must run node index.js
What must I do to achieve this?
Janez's comment will work; however, if your container must start with node index.js, you will want to use ENTRYPOINT:
FROM ubuntu:latest
RUN apt-get update
RUN apt-get install node
RUN apt-get install npm
RUN apt-get install mongo
RUN mkdir app
COPY . app/
WORKDIR app
ENTRYPOINT ["node", "index.js"]
There is already an extensive discussion of ENTRYPOINT vs CMD on Stack Overflow if you'd like to know more about the difference between the two, or how they can be combined, e.g.
ENTRYPOINT ["node"]
CMD ["index.js"]
With that, you could instead call the container with an alternate js file, i.e.
docker run -it <image-name> test.js
and now the container would start and run node test.js instead.

docker compose vs docker run - node container

I am trying to set up a container for a React app. My goal is to use docker-compose (I want to execute one command and have everything working).
The problem is that when I try to do it with docker-compose and a Dockerfile (both below), I get the error shown under Result:
docker-compose.yml
version: '2'
services:
  react:
    build: ./images/react
    ports:
      - "3000:3000"
    volumes:
      - "/home/my/path:/app"
Dockerfile
FROM node:6.9
WORKDIR /app
RUN npm install
EXPOSE 3000
CMD [ "npm", "start" ]
Result
npm WARN enoent ENOENT: no such file or directory, open '/app/package.json'
But when I did it with docker run and volume mapping, I was able to see package.json and run the npm install command.
docker run -it --rm -v /home/my/path:/app node:6.9 bash
Why is it not working with docker-compose?
Note that the volume that you're describing in your docker-compose.yml file will be mounted at run time, not at build time. This means that when building the image, there will not be any package.json file there (yet), from which you could install your dependencies.
When running the container image with -v /home/my/path:/app, you're actually mounting the directory first, and subsequent npm install invocations will complete successfully.
If you intend to mount your application (including package.json) into your container, the npm install needs to happen at run time (CMD), and not at build time (RUN).
The easiest way to accomplish this would be to simply add the npm install statement to your CMD instruction (and drop the RUN npm install instruction):
CMD npm install && npm start
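If you would rather not touch the Dockerfile, the same effect can be had by overriding the command in the compose file instead (a sketch based on the question's docker-compose.yml):

```yaml
version: '2'
services:
  react:
    build: ./images/react
    # Install into the mounted volume at run time, then start the app
    command: sh -c "npm install && npm start"
    ports:
      - "3000:3000"
    volumes:
      - "/home/my/path:/app"
```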

how to install global angular-cli inside docker container?

I created an image from my Angular 2 app (created with angular-cli) using the Dockerfile below.
FROM node:latest
MAINTAINER BHUSHAN GADEKAR
ENV NODE_ENV=production
ENV PORT=4200
COPY . /var/www
WORKDIR /var/www
RUN npm install
EXPOSE $PORT
ENTRYPOINT ["npm","start"]
After successful creation of the image, I tried to run it using the command below:
docker run --rm -p 8080:4200 -w "/var/www" bhushan001/angular2-cli npm start
Now I can see my container starting, but it runs into this error:
ng serve "npm" "start" sh: 1: ng: not found
So I know that angular-cli is not present inside my container, even though I had installed it locally with npm install in the Dockerfile.
Any inputs?
Thanks in advance.
How can I troubleshoot this?
Use bash -c -l 'npm start' - this ensures your ENV is populated, which you need.
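If that doesn't help, two other things are worth checking (these are my own suggestions, not part of the answer above): ENV NODE_ENV=production makes npm install skip devDependencies, which is where angular-cli typically lives, and installing the CLI globally makes ng resolvable regardless of PATH. A sketch of the question's Dockerfile with both changes:

```dockerfile
FROM node:latest
ENV PORT=4200
COPY . /var/www
WORKDIR /var/www
# Global install puts "ng" on the PATH; omitting NODE_ENV=production
# lets npm install pick up devDependencies such as angular-cli.
RUN npm install -g @angular/cli && npm install
EXPOSE $PORT
ENTRYPOINT ["npm", "start"]
```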
