docker compose vs docker run - node container - node.js

I am trying to set up a container for a React app. My goal is to use docker-compose (I want to execute one command and have everything working).
The problem is that when I try to do it with docker-compose and the Dockerfile below, I get the error shown under Result:
docker-compose.yml
version: '2'
services:
  react:
    build: ./images/react
    ports:
      - "3000:3000"
    volumes:
      - "/home/my/path:/app"
Dockerfile
FROM node:6.9
WORKDIR /app
RUN npm install
EXPOSE 3000
CMD [ "npm", "start" ]
Result
npm WARN enoent ENOENT: no such file or directory, open '/app/package.json'
But when I did it with docker run and a volume mapping, I was able to see package.json and run the npm install command.
docker run -it --rm -v /home/my/path:/app node:6.9 bash
Why is it not working with docker-compose?

Note that the volume that you're describing in your docker-compose.yml file will be mounted at run time, not at build time. This means that when building the image, there will not be any package.json file there (yet), from which you could install your dependencies.
When running the container image with -v /home/my/path:/app, you're actually mounting the directory first, and subsequent npm install invocations will complete successfully.
If you intend to mount your application (including package.json) into your container, the npm install needs to happen at run time (CMD), and not at build time (RUN).
The easiest way to accomplish this would be to simply add the npm install statement to your CMD instruction (and drop the RUN npm install instruction):
CMD npm install && npm start
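Putting this together, a minimal sketch of the adjusted Dockerfile (assuming the volume from the docker-compose.yml above supplies package.json at run time) could look like:
FROM node:6.9
WORKDIR /app
EXPOSE 3000
# package.json arrives via the bind mount at run time,
# so dependencies must be installed at run time as well
CMD npm install && npm start
With this, docker-compose up can build the image without a package.json present, and the install happens once the volume is mounted.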

Related

npm install hangs when using Docker to install libraries

Here's the deal. I'm trying to set up my environment to develop a React Native application using Expo. Since I'm working on a Windows OS, I'd like to set it up using Docker and docker-compose.
First I build my image, installing expo-cli from a node image. Then I use docker-compose to specify a volume and ports to expose.
Since at this point I have yet to initialize the project, what I do is use docker-compose run to initialize it, and then the same to install specific project libraries. But when I do, the installation hangs at some random point.
Now it seems like it's docker related, but I'm not sure what I'm doing wrong.
Dockerfile
FROM node:12.18.3
ENV PATH /app/node_modules/.bin:$PATH
RUN npm install --no-save -g expo-cli
RUN mkdir /app
WORKDIR /app
RUN adduser user
USER user
docker-compose.yml
version: "3"
services:
  app:
    build:
      context: .
    expose:
      - "19000"
      - "19001"
      - "19002"
      - "19003"
    ports:
      - "19000:19000"
      - "19000:19001"
      - "19000:19002"
      - "19000:19003"
    volumes:
      - ./app:/app
    command: sh -c "cd myapp && npm start"
Assuming my app is called myapp
Here are the commands I use to initialize the project and install additional npm packages:
docker-compose run --rm app sh -c "npx expo-cli init myapp"
docker-compose run --rm app sh -c "cd /app/myapp && npm install react-navigation --verbose"
I see that several things happen, but it always hangs somewhere, and never at the same place every time I start from scratch.
Please help!

Can't build Node container with volume with package.json file to run npm install

How does Docker build containers? I can't figure it out. I want to:
build a container
pass a local folder into it
run npm install in the container (using the Dockerfile) in the volume folder (so I can see it on my local drive)
run a command from my YAML config file
I've tried to list the contents of the folders with the ls command, but /src/ is always empty (it prints: src).
My docker-compose.yml:
version: '3'
services:
  node:
    build:
      context: .
      dockerfile: Dockerfile.node
    volumes:
      - ./src:/src
    command: run develop
    tty: true
My Dockerfile.node:
FROM node:12
WORKDIR /src
COPY ./src/package*.json ./src/
RUN ls
RUN cd ./src
RUN ls
RUN npm install
RUN ls
On the RUN npm install command I got this error:
npm WARN saveError ENOENT: no such file or directory, open '/src/package.json'
I start the project with the command docker-compose up --build
My folder structure is:
/
  src/
    package.json
  docker-compose.yml
  Dockerfile.node
Please help, thank you in advance.
cd ./src is only effective within its own RUN command: in a Dockerfile, each RUN command runs in a separate shell. So by the time npm install runs, the working directory is back to the WORKDIR, which is /src, not the /src/src you were expecting after cd ./src.
RUN pwd
#/src
RUN cd ./src #here /src/src
RUN ls
#/src <-- back to WORKDIR, while you are expecting /src/src
RUN npm install
In short, in a Dockerfile you use WORKDIR, not cd.
You have two options. Either change the command:
RUN cd ./src && npm i
or change the COPY command and leave the rest as it is:
COPY ./src/package*.json .
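Taking the second option, the Dockerfile.node from the question could be reduced to something like this (a sketch, keeping the paths from the question):
FROM node:12
WORKDIR /src
# copy the package files into WORKDIR (/src) itself, not /src/src
COPY ./src/package*.json .
RUN npm install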

docker-compose for a node project getting cannot find package.json on windows

I'm trying to get docker-compose working on my Windows 8 box. I have the following docker-compose file
version: '3'
services:
  testweb:
    build: .
    command: npm run install
    volumes:
      - .:/usr/app/
    working_dir: /app
    ports:
      - "3000:3000"
However, when I run this using docker-compose I get an error saying it cannot find package.json. I know this has something to do with how the paths are mapped. So I moved my folder to c:\users and tried, with the same issue. I then moved it to c:\users\ and tried, and ended up with the same issue. The mapping in my VirtualBox is as follows
Does anyone know how to fix this?
Attached is my Dockerfile
FROM node:7.7.2-alpine
WORKDIR /usr/app
RUN apk update && apk add postgresql
COPY package.json .
RUN npm install --quiet
COPY . .
It may be because you set working_dir to /app in your docker-compose file, which overrides your Dockerfile's WORKDIR /usr/app, where package.json was copied. So the container runs npm run install from the wrong directory.
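One way to apply that fix is to make working_dir match the Dockerfile's WORKDIR (a sketch; everything else is kept as in the question):
version: '3'
services:
  testweb:
    build: .
    command: npm run install
    volumes:
      - .:/usr/app/
    working_dir: /usr/app  # match the Dockerfile's WORKDIR
    ports:
      - "3000:3000"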
For a Docker container to stay up you need a running process. In this case it's the command npm start, which will keep running as long as the container is up. So you need a CMD or ENTRYPOINT that starts a long-running process. Please try something like this. Let me know if you have any questions.
FROM node:7.7.2-alpine
WORKDIR /usr/app
RUN apk update && apk add postgresql
COPY package.json .
RUN npm install --quiet
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
Put the docker-compose.yml file in the project/app directory.
Then run docker compose up in that directory.
Reference: https://forums.docker.com/t/use-docker-compose-in-get-started-tutorial-solved/129415
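Concretely (assuming project/app is the directory that now contains docker-compose.yml):
cd project/app
docker compose up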

Install a new node dependency inside a Docker container

Consider the following development environment with Docker:
# /.env
CONTAINER_PROJECT_PATH=/projects/my_project

# /docker-compose.yml
version: '2'
services:
  web:
    restart: always
    build: ./docker/web
    working_dir: ${CONTAINER_PROJECT_PATH}
    ports:
      - "3000:3000"
    volumes:
      - .:${CONTAINER_PROJECT_PATH}
      - ./node_modules:${CONTAINER_PROJECT_PATH}/node_modules
# /docker/web/Dockerfile
FROM keymetrics/pm2:latest-alpine
FROM node:8.11.4
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
RUN mkdir -p /projects/my_project
RUN chown node:node /projects/my_project
# If you are going to change WORKDIR value,
# please also consider to change .env file
WORKDIR /projects/my_project
USER node
RUN npm install pm2 -g
CMD npm install && node /projects/my_project/src/index.js
How do I install a new module inside my container? npm install on the host won't work because node_modules belongs to the root user. Is there any way to run it from the container?
Edit: Is there a one-liner that I could run outside the container to install a module?
Assuming you don't mean to edit the Dockerfile, you can always execute a command in a running container with docker exec and the -it flags:
$ docker exec -it <container name> npm install my_module
First, get a shell inside the container:
$ docker exec -it <container name> /bin/bash
Then, inside the container:
$ npm install pm2 -g
You either
a) want to install pm2 globally, which you need to run as root, so place your install before USER node so that it runs as the default user (root), or
b) just want to install pm2 to use in your project, in which case just drop the -g flag (which tells npm to install globally) and it will instead land in your project's node_modules. You can then run it with npx pm2 or node_modules/.bin/pm2, or programmatically from your index.js (for the latter I'd suggest adding it to your package.json rather than manually installing it at all). A sketch of option (a) follows.
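Option (a) would reorder the Dockerfile from the question roughly like this (a sketch; the unused first FROM keymetrics/pm2 stage is dropped, and only the pm2 install moves above USER node):
FROM node:8.11.4
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
RUN mkdir -p /projects/my_project
RUN chown node:node /projects/my_project
WORKDIR /projects/my_project
# install pm2 globally while still running as root
RUN npm install pm2 -g
USER node
CMD npm install && node /projects/my_project/src/index.js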

Using SemanticUI with Docker

I set up a Docker container running an Express app. Here is the Dockerfile:
FROM node:latest
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
RUN npm install -g nodemon
RUN npm install --global gulp
RUN npm i gulp
# Bundle app source
COPY . /usr/src/app
EXPOSE 3000
CMD [ "nodemon", "-L", "./bin/www" ]
As you can see, it uses the Node.js image and creates an app folder in my container containing my application. The Docker container, on start, runs npm install and installs my modules in the container (thanks to that, I don't have a node_modules folder in my local folder). I want to integrate Semantic UI, which uses gulp. I installed gulp in my container, but the files created by Semantic are only present in my container; they are not in my local folder. How can I make the files created in my container appear locally so I can modify them?
I thought that one of Docker's strengths was not having node_modules installed in your local app folder... Maybe I am wrong; if so, please correct me.
You can use a data volume to share files between the container and the host machine.
Something like this:
docker run ... -v /path/on/host:/path/in/container ...
Or, if you are using docker-compose, see the volume configuration reference:
...
volumes:
  - /path/on/host:/path/in/container
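Applied to the question above, a mapping for the Semantic UI output might look like this (the semantic paths are assumptions for illustration, not taken from the question):
volumes:
  - ./semantic:/usr/src/app/semantic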
