Docker Compose node_modules in container empty after file change - node.js

I am trying to run a Node.js app in a Docker container via Docker Compose.
The node_modules should be created in the image and the source code should be synced from the host.
Therefore, I use two volumes in docker-compose.yml: one for my project source and another for the node_modules in the image.
Everything seems to be working. The node_modules are installed and nodemon starts the app. In my Docker container I have a node_modules folder with all dependencies. On my host an empty node_modules folder is created (I am not sure if this is expected).
But when I change a file in the project, nodemon detects the change and restarts the app. Now the app crashes because it can't find its modules: the node_modules folder in the Docker container is empty.
What am I doing wrong?
My folder structure looks like this:
/
├── docker-compose.yml
└── project/
    ├── package.json
    └── Dockerfile
docker-compose.yml
version: '3'
services:
  app:
    build: ./project
    volumes:
      - ./project/:/app/
      - /app/node_modules/
project/Dockerfile
# base image
FROM node:9
ENV APP_ROOT /app
# set working directory
RUN mkdir $APP_ROOT
WORKDIR $APP_ROOT
# install and cache app dependencies
COPY package.json $APP_ROOT
COPY package-lock.json $APP_ROOT
RUN npm install
# add app
COPY . $APP_ROOT
# start app
CMD ["npm", "run", "start"]
project/package.json
...
"scripts": {
"start": "nodemon index.js"
}
...

Mapping a volume makes files from the host available in the container, not the other way round.
You can fix your issue by running npm install as part of the CMD. You can achieve this by having a "startup" script (e.g. start.sh) that runs npm install && npm run start. The script should be copied into the container with a normal COPY command and be made executable.
When you start your container, you should then see files in the node_modules folder on the host as well.
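A minimal sketch of such a startup script, assuming the image's WORKDIR is /app as in the Dockerfile above and that the script ships alongside package.json:
#!/bin/sh
# start.sh - install dependencies into the bind-mounted project directory
# at container start, then launch the app with nodemon via npm
npm install
npm run start
In the Dockerfile you would then add COPY start.sh $APP_ROOT and RUN chmod +x $APP_ROOT/start.sh, and replace CMD ["npm", "run", "start"] with CMD ["./start.sh"].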

You could use either of these two solutions:
Run npm install on the host and make a volume that the container can use and that references the host node_modules folder (see the sketch after this list).
Run npm install in the image/container build process, which can be a pain for a development setup since it will run npm install every time you restart the container (if you change some files).
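For the first option, a minimal sketch, assuming you drop the separate /app/node_modules/ volume, rely only on the ./project/:/app/ bind mount, and have Node.js installed on the host:
# run on the host before starting the stack, so that the bind mount
# ./project/:/app/ already contains a populated node_modules folder
cd project
npm install
cd ..
docker-compose up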

Related

Command in dockerfile doesn't take effect

Below is my Dockerfile. After the dependencies are installed, I want to delete a specific binary (ffmpeg) from the node_modules folder in the container, and then reinstall it using the install.js file that exists under the same folder in node_modules.
FROM node:16-alpine
WORKDIR /web
COPY package.json package-lock.json ./
ARG NODE_ENV
ENV NODE_ENV ${NODE_ENV:-development}
ARG _ENV
ENV _ENV ${_ENV:-dev}
RUN npm install
RUN rm /web/node_modules/ffmpeg-static/ffmpeg
RUN node /web/node_modules/ffmpeg-static/install.js
COPY . .
EXPOSE 8081
ENTRYPOINT [ "npm" ]
I want the rm and node commands to take effect in the container after startup, once all the dependencies are installed and copied into the container, but that doesn't happen. I suspect the commands after RUN are only executed during the build process, not after the container starts up.
After the container is up, when I ssh into it and execute the two commands above directly (the rm and node ones), my changes take effect and everything works perfectly. I basically want these commands to run automatically after the container is up and running.
The docker-compose.yaml file looks like this:
version: '3'
services:
  serverless:
    build: web
    user: root
    ports:
      - '8081:8081'
    env_file:
      - env/web.env
    environment:
      - _ENV=dev
    command:
      - run
      - start
    volumes:
      - './web:/web'
      - './env/mount:/secrets/'
If this Dockerfile builds, it means you have a package-lock.json. That is evidence that npm install was executed locally in the project's root directory, which means a local node_modules exists and is being copied into the image by the final COPY . . instruction, overwriting what was installed during the build.
You can avoid this by creating a .dockerignore file that lists files and directories (such as node_modules) that you'd like to exclude from the Docker build context, which will make sure they don't get copied into your final image.
You can follow this example:
https://stackoverflow.com/a/43747867/3669093
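For instance, a minimal .dockerignore next to the Dockerfile might contain:
node_modules
npm-debug.log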
Update
Dockerfile: Replace the ENTRYPOINT with the following
ADD ./start.sh /start.sh
CMD ["/start.sh"]
start.sh
#!/bin/sh
rm /web/node_modules/ffmpeg-static/ffmpeg
node /web/node_modules/ffmpeg-static/install.js
npm run start
Try rm -rf /web/node_modules/ffmpeg-static/ffmpeg instead.
I assume that it is a directory, not a file.

Local nodejs module not being found by docker

I have a Node.js module called my-common that contains a couple of JS files. These JS files export functions that are used throughout a lot of other modules.
My other module (called demo) declares a dependency on the common module like this:
"dependencies": {
"my-common": "file:../my-common/",
}
When I go to the demo directory and run npm start it works fine. I then build a Docker image using the following Dockerfile:
FROM node:8
ENV NODE_ENV=production
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --only=production
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
When I start the image I get an error that my-common cannot be found. I'm guessing that the my-common module isn't being copied into the node_modules directory of the demo module.
I have tried npm link, but I think it's a really bad idea to need sudo permission to install a global module, because this could cause problems on other systems.
I have tried npm install my-common/ in the root directory, and that installs the module into my HOME_DIR/node_modules, but that isn't copied into the Docker container either.
Everywhere I look there doesn't seem to be an answer to this very simple question. How can I fix this?
So, I see a couple of different things.
When Docker runs npm install --only=production in the image, it sees file:../my-common/ and looks in the parent directory of the image's WORKDIR, which is /usr/src/app. Since nothing besides package.json has been copied into the image at that point, it can't find the module. If you want to install everything locally and then move it into the image, you can do that by removing the npm install --only=production command from the Dockerfile and making sure your .dockerignore file doesn't ignore the node_modules directory.
If you want to install the modules in the image, you need to specifically copy the my-common directory into the Docker image. However, Docker doesn't allow you to copy anything from a parent directory into an image: any local content has to be inside the build context of the Dockerfile. You have a couple of options:
Option 1: Move my-common/ into the root of your project, update your Dockerfile to copy that folder and update package.json to point to the correct location.
Dockerfile:
FROM node:8
ENV NODE_ENV=production
WORKDIR /usr/src/app
COPY my-common/ ./my-common/
COPY package*.json ./
RUN npm install --only=production
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
package.json:
"dependencies": {
"my-common": "file:./my-common/",
}
Option 2: Move the context of the Docker image up one directory. By this I mean move the Dockerfile to the same level as the my-common directory, and update your Dockerfile and package.json to reflect that change.
Dockerfile:
FROM node:8
ENV NODE_ENV=production
WORKDIR /usr/src/app
RUN mkdir my-common
COPY ./my-common ./my-common
COPY ./<projectName>/package*.json .
RUN npm install --only=production
COPY ./<projectName> .
EXPOSE 3000
CMD [ "npm", "start" ]
package.json:
"dependencies": {
"my-common": "file:./my-common/",
}
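With option 2 the build context must be the parent directory that contains both my-common/ and the project folder, so the image has to be built from there. A hedged sketch (the tag demo-app is a placeholder):
# run from the directory that contains my-common/, <projectName>/ and the Dockerfile
docker build -t demo-app .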

How can I copy node_modules folder out of my docker container onto the build machine?

I am moving an application to a new build pipeline. On CI I am not able to install Node.js to complete the npm install step.
My idea is to move the npm install step into a Docker image that uses Node.js, install the node modules there, and then copy the node_modules back to the host so another process can package up the application.
This is my Dockerfile:
FROM node:9
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY ./dashboard-interface/package.json /usr/src/app/
RUN npm install --silent --production
# Bundle app src
COPY node_modules ./dashboard-interface/node_modules #I thought this would copy the new node_modules back to the host
This runs fine and installs the node modules, but when I try to copy the node_modules directory back to the host I see an error saying:
COPY node_modules ./dashboard-interface/node_modules
COPY failed: stat /var/lib/docker/tmp/docker-builder718557240/node_modules: no such file or directory
So it's clear that the copy process cannot find the node_modules directory that the modules were just installed into.
According to the documentation of the COPY instruction, COPY copies files from the host into the container, not the other way around.
If you want files from the container to be available outside your container, you can use volumes. Volumes give your container storage that is independent of the container itself, so you can also use it for other containers in the future.
Let me try to solve the issue you are having.
Here is the Dockerfile
# Use alpine for a slimmer image
FROM node:9-alpine
RUN mkdir /app
WORKDIR /app
COPY /dashboard-folder/package.json .
RUN npm i --production
This assumes that your project structure is like so:
|root
|   Dockerfile
|
\---dashboard-folder
        package.json
where root is your working directory that will receive node_modules.
Build this image with docker build . -t NAME, and subsequently use it like so:
docker run -it --rm -v ${PWD}:/app/root NAME mv node_modules ./root
This should do the trick.
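Putting it together, a hedged end-to-end sketch (the tag node-modules-builder is a placeholder):
# build the image; npm install runs inside it
docker build . -t node-modules-builder
# mount the current directory at /app/root and move the freshly installed
# node_modules into it, so it appears on the host
docker run -it --rm -v ${PWD}:/app/root node-modules-builder mv node_modules ./root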
The simple and sure way is to do volume mapping. For example, the docker-compose YAML file would have a volumes section that looks like this:
...
volumes:
  - ./:/usr/src/app
  - /usr/src/app/node_modules
For the docker run command, use:
-v $(pwd):/usr/src/app
and in the Dockerfile, define:
VOLUME /usr/src/app
VOLUME /usr/src/app/node_modules
But first confirm that running npm install actually created the node_modules directory on the host system.
The main reason you are hitting the problem depends on the OS running on your host. If your host is running Linux, there will be no problem, but if your host is Mac or Windows, Docker is actually running inside a VM that is hidden from you, so the path cannot be mapped directly to the host system. Instead, you can use a volume.

Using SemanticUI with Docker

I set up a Docker container running an Express app. Here is the Dockerfile:
FROM node:latest
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
RUN npm install -g nodemon
RUN npm install --global gulp
RUN npm i gulp
# Bundle app source
COPY . /usr/src/app
EXPOSE 3000
CMD [ "nodemon", "-L", "./bin/www" ]
As you can see, it uses the Node.js image and creates an app folder in my container containing my application. The Docker container runs npm install and installs my modules in the container (thanks to that, I don't have a node_modules folder in my local folder). I want to integrate SemanticUI, which uses gulp, so I installed gulp in my container, but the files created by Semantic are only present in my container. They are not in my local folder. How can I make the files created in my container also present locally so that I can modify them?
I thought that one of Docker's great features was not having node_modules installed in your local app folder... maybe I am wrong; if so, please correct me.
You can use a data volume to share files between the container and the host machine.
Something like this:
docker run ... -v /path/on/host:/path/in/container ...
or if you are using docker-compose, see the volume configuration reference.
...
volumes:
  - /path/on/host:/path/in/container
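For the setup above, a hedged sketch, assuming the image is tagged my-express-app (a placeholder) and you still want node_modules to live only inside the container:
# bind-mount the project so files generated by gulp/Semantic show up on the host,
# while an anonymous volume keeps node_modules inside the container
docker run -d -p 3000:3000 \
  -v $(pwd):/usr/src/app \
  -v /usr/src/app/node_modules \
  my-express-app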

Node.js + Docker Compose: node_modules disappears

I'm attempting to use Docker Compose to bring together a number of Node.js apps in my development environment. I'm running into an issue, however, with node_modules.
Here's what's happening:
npm install is run as part of the Dockerfile.
I do not have node_modules in my local directory. (I shouldn't because the installation of dependencies should happen in the container, right? It seems to defeat the purpose otherwise, since I'd need to have Node.js installed locally.)
In docker-compose.yml, I'm setting up a volume with the source code.
docker-compose build runs fine.
When I docker-compose up, the node_modules directory disappears in the container — I'm assuming because the volume is mounted and I don't have it in my local directory.
How do I ensure that node_modules sticks around?
Dockerfile
FROM node:0.10.37
COPY package.json /src/package.json
WORKDIR /src
RUN npm install -g grunt-cli && npm install
COPY . /src
EXPOSE 9001
CMD ["npm", "start"]
docker-compose.yml
api:
  build: .
  command: grunt
  links:
    - elasticsearch
  ports:
    - "9002:9002"
  volumes:
    - .:/src
elasticsearch:
  image: elasticsearch:1.5
Due to the way Node.js loads modules, simply place node_modules higher in the source code path. For example, put your source at /app/src and your package.json in /app, so /app/node_modules is where they're installed.
I tried your fix, but the issue is that most people run npm install in the /usr/src/app directory, resulting in the node_modules folder ending up in the /app directory. As a result the node_modules folder ends up in both the /usr/src and /usr/src/app directory in the container and you end up with the same issue you started with.
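For reference, a minimal sketch of the layout the answer suggests, adapted from the Dockerfile above (paths are illustrative, not the asker's exact project):
FROM node:0.10.37
# install dependencies one level above the source directory
WORKDIR /app
COPY package.json /app/package.json
RUN npm install -g grunt-cli && npm install
# keep the app source (and the bind mount) under /app/src
WORKDIR /app/src
COPY . /app/src
EXPOSE 9001
CMD ["npm", "start"]
In docker-compose.yml the volume would then be .:/app/src, so the bind mount no longer shadows the installed modules, and Node's module resolution walks up from /app/src and finds /app/node_modules.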

Resources