Using SemanticUI with Docker - node.js

I set up a Docker container running an Express app. Here is the Dockerfile:
FROM node:latest
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
RUN npm install -g nodemon
RUN npm install --global gulp
RUN npm i gulp
# Bundle app source
COPY . /usr/src/app
EXPOSE 3000
CMD [ "nodemon", "-L", "./bin/www" ]
As you can see, it uses the Node.js image and creates an app folder in my container, which contains my application. The container, on start, runs npm install and installs my modules in the container (thanks to that, I don't have a node_modules folder in my local folder). I want to integrate Semantic UI, which uses gulp, so I installed gulp in my container. But the files created by Semantic UI are only present in my container; they are not in my local folder. How can I make the files created in my container dynamically appear in my local folder so that I can modify them?
I thought that one of Docker's great benefits was not having node_modules installed in your local app folder. Maybe I am wrong; if so, please correct me.

You can use a data volume to share files between the container and the host machine.
Something like this:
docker run ... -v /path/on/host:/path/in/container ...
or, if you are using docker-compose, see the volume configuration reference:
...
volumes:
  - /path/on/host:/path/in/container
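For the Semantic UI case from the question, a minimal sketch of such a mapping in docker-compose.yml (the service name web and the semantic/ output path are assumptions; adjust them to your project):
version: '3'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      # files gulp writes here inside the container also appear on the host
      - ./semantic:/usr/src/app/semantic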

Related

cannot replace to directory /var/lib/docker/overlay2/if2ip5okvavl8u6jpdtpczuog/merged/app/node_modules/@ampproject/remapping with file

On my Windows machine, I am attempting to build a containerized Node.js application with the following Dockerfile:
# use latest version of nodejs
FROM node:lts-alpine
# install aurelia-cli to build the app & http-server to serve static contents
RUN npm i -g http-server
RUN npm i -g aurelia-cli
# set working directory to app
# henceforth all commands will run inside this folder
WORKDIR /app
# copy package.json related files first and install all required dependencies
COPY package*.json ./
RUN npm install
# copy the rest of the files and folders & install dependencies
COPY . ./
RUN npm run build
# by default http-server will serve contents on port 8080
# so we expose this port to host machine
EXPOSE 8080
CMD [ "http-server" , "dist" ]
However, docker build . fails at the line COPY . ./ with the message cannot replace to directory /var/lib/docker/overlay2/if2ip5okvavl8u6jpdtpczuog/merged/app/node_modules/@ampproject/remapping with file.
What do I need to do to get my container image to build?
Add node_modules to a .dockerignore file in the same directory as your Dockerfile (h/t David Maze).
Less gracefully, simply delete the project's node_modules directory then rerun docker build.
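For reference, a minimal .dockerignore for this setup might look like this (the dist entry is an assumption, matching the build output that http-server serves):
node_modules
dist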

Create Node.js app using Docker without installing Node on host machine

I am new to Docker. I wanted to create a simple Node.js app using Docker on my Ubuntu OS, but all the tutorials/YouTube videos first install Node on the host machine, then run and test the app, and then dockerize it. In other words, all the tutorials simply teach "how to dockerize an existing app".
If anyone can provide a little guidance on this then it would be really helpful.
First create a directory for your source code and put something like the standard Node.js starter app in app.js. You have to make one change though: hostname has to be 0.0.0.0 rather than 127.0.0.1.
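For reference, a minimal sketch of such an app.js, based on the standard Node.js hello-world server with the hostname change applied:
const http = require('http');

const hostname = '0.0.0.0'; // bind to all interfaces so the container port mapping works
const port = 3000;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello world\n');
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});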
Then run an interactive node container using this command
docker run -it --rm -v $(pwd):/app -p 3000:3000 node /bin/bash
This container has your host directory mapped to /app, so if you do
cd /app
ls -al
you should see your app.js file.
Now you can run it using
node app.js
If you then switch back to the host, open a browser, and go to http://localhost:3000/, you should get a 'Hello world' response.
Follow this:
# base image (the Dockerfile needs a FROM line; any recent Node.js tag works)
FROM node:lts
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "node", "server.js" ]
docker build . -t <your username>/node-web-app
docker run -p 49160:8080 -d <your username>/node-web-app
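You can then test the app from the host; the -i flag prints the response headers, and port 49160 comes from the docker run mapping above:
curl -i localhost:49160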
You can create the package.json (and package-lock.json) files manually if you don't want to install node/npm on your local machine.
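For example, a minimal hand-written package.json might look like this (the name and entry point are placeholders):
{
  "name": "node-web-app",
  "version": "1.0.0",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  }
}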

Docker - build fails with operating system is not supported and The command '/bin/sh -c npm install' returned a non-zero code

I am trying to build an image for a client app (a Next.js app), but the build keeps failing.
This is the Dockerfile:
FROM node:12.18.3
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json /app/
COPY package-lock.json /app/
RUN npm install
COPY . /app
RUN npm build
# start app
CMD [ "npm", "start" ]
It fails on the first step with this error:
Step 1/9 : FROM node:12.18.3
operating system is not supported
I followed this post https://stackoverflow.com/a/51071057/9608006, changed the experimental setting to true, and it got past the failing step.
But now it fails at the npm install step:
npm notice
The command '/bin/sh -c npm install' returned a non-zero code: 4294967295: failed to shutdown container: container c425947f7f17ed39ed51ac0a67231f78ba7239ad199c7df979b3b442969a0a57 encountered an error during hcsshim::System::waitBackground: failure in a Windows system call: The virtual machine or container with the specified identifier is not running. (0xc0370110): subsequent terminate failed container c425947f7f17ed39ed51ac0a67231f78ba7239ad199c7df979b3b442969a0a57 encountered an error during hcsshim::System::waitBackground: failure in a Windows system call: The virtual machine or container with the specified identifier is not running. (0xc0370110)
I also get this warning at the start of this step:
Step 6/9 : RUN npm install
---> [Warning] The requested image's platform (linux/amd64) does not match the detected host platform (windows/amd64) and no specific platform was requested
I use Windows 10,
Docker v20.10.5.
What is the issue?
EDIT 1 - Folder structure
The following is the base folder layout of the client app:
.next
.vercel
components
enums
hooks
node_modules
pages
public
store
styles
utils
.dockerIgnore
.env.local
next.config.js
package.json
server.js
You are trying to build a Linux-based image under Windows.
It seems there is a problem in the multiarch Node.js images with the version 12 tags.
Try the answer under the post that you already tried:
Click on the Docker icon in the tray and switch to Linux containers.
https://stackoverflow.com/a/57548944/3040844
If you are using Docker Desktop, just switch the Docker Desktop option from Windows containers (the default) to Linux containers and run your Dockerfile again.
I think the problem is related to your base image. I used this Dockerfile for a Next.js app on my side and it works correctly:
# Dockerfile
# base image
FROM node:alpine
# create & set working directory
RUN mkdir -p /app
WORKDIR /app
# copy source files
COPY . /app
# install dependencies
RUN npm install
# start app
RUN npm run build
EXPOSE 3000
CMD npm run start
I hope that helps you resolve your issue.
According to your Dockerfile:
FROM node:12.18.3
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json /app/
COPY package-lock.json /app/
RUN npm install
COPY . /app
RUN npm build
# start app
CMD [ "npm", "start" ]
The problem is the base image: FROM node:12.18.3.
The correct way is FROM node:alpine3.12 or FROM ubuntu:18.04.
FROM: the FROM directive is probably the most crucial among all Dockerfile instructions. It defines the base image to use to start the build process. It can be any image, including ones you have created previously. If a FROM image is not found on the host, Docker will try to find it (and download it) from Docker Hub or another container registry. It needs to be the first command declared inside a Dockerfile.
Simplest Dockerfile with Node Image
FROM node:alpine3.12
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
EXPOSE 3000
CMD npm run start
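To build and run it (the image name next-app is a placeholder):
docker build -t next-app .
docker run -p 3000:3000 next-app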

Docker Compose node_modules in container empty after file change

I am trying to run a Node.js app in a Docker container via Docker Compose.
The node_modules should be created in the image and the source code should be synced from the host.
Therefore, I use two volumes in docker-compose.yml: one for my project source and one for the node_modules in the image.
Everything seems to be working: the node_modules are installed and nodemon starts the app. In my Docker container I have a node_modules folder with all dependencies. On my host an empty node_modules folder is created (I am not sure if this is expected).
But when I change a file in the project, the nodemon process detects the change and restarts the app. Now the app crashes because it can't find its modules; the node_modules folder in the Docker container is empty now.
What am I doing wrong?
My folder structure looks like this
/
├── docker-compose.yml
├── project/
│ ├── package.json
│ ├── Dockerfile
docker-compose.yml
version: '3'
services:
  app:
    build: ./project
    volumes:
      - ./project/:/app/
      - /app/node_modules/
project/Dockerfile
# base image
FROM node:9
ENV APP_ROOT /app
# set working directory
RUN mkdir $APP_ROOT
WORKDIR $APP_ROOT
# install and cache app dependencies
COPY package.json $APP_ROOT
COPY package-lock.json $APP_ROOT
RUN npm install
# add app
COPY . $APP_ROOT
# start app
CMD ["npm", "run", "start"]
project/package.json
...
"scripts": {
"start": "nodemon index.js"
}
...
Mapping a volume makes files available to the container, not the other way round.
You can fix your issue by running npm install as part of the CMD. You can achieve this by having a "startup" script (e.g. start.sh) that runs npm install && npm run start. The script should be copied into the container with a normal COPY command and made executable.
When you start your container you should see files in the node_modules folder (on host).
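A minimal sketch of that approach (the script name start.sh comes from the answer; the exact Dockerfile changes are assumptions):
#!/bin/sh
# start.sh: install dependencies into the mounted volume, then start the app
npm install
npm run start
and in project/Dockerfile, copy it in and use it as the CMD:
COPY start.sh $APP_ROOT
RUN chmod +x $APP_ROOT/start.sh
CMD ["./start.sh"]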
You could use either of these two solutions:
npm install on the host, plus a volume that the container can use and that references the host node_modules folder (see the sketch below).
npm install in the image/container build process, which can be a pain for a development setup since it will npm install every time you restart the container (if you change some files).
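A sketch of the first option (it assumes you have already run npm install on the host inside ./project):
version: '3'
services:
  app:
    build: ./project
    volumes:
      # no anonymous node_modules volume: the host's node_modules is used directly
      - ./project/:/app/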

How can I copy node_modules folder out of my docker container onto the build machine?

I am moving an application to a new build pipeline. On CI I am not able to install Node to complete the npm install step.
My idea is to move the npm install step into a Docker image that uses Node, install the node modules there, and then copy them back to the host so another process can package up the application.
This is my Dockerfile:
FROM node:9
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY ./dashboard-interface/package.json /usr/src/app/
RUN npm install --silent --production
# Bundle app src
COPY node_modules ./dashboard-interface/node_modules #I thought this would copy the new node_modules back to the host
This runs fine and installs the node modules, but when I try to copy the node_modules directory back to the host I see an error saying:
COPY node_modules ./dashboard-interface/node_modules
COPY failed: stat /var/lib/docker/tmp/docker-builder718557240/node_modules: no such file or directory
So it's clear that the copy process cannot find the node_modules directory into which it has just installed the node modules.
According to the documentation of the COPY instruction, COPY copies files from the host into the image, not from the image back to the host.
If you want files from the container to be available outside your container, you can use volumes. Volumes give your container storage that is independent of the container itself, and thus you can also use it for other containers in the future.
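A minimal sketch of that idea (the image tag deps-builder and the out/ directory are assumptions; /usr/src/app matches the Dockerfile above):
# build the image, then run it with a host directory mounted,
# and copy the installed modules into the mount
docker build -t deps-builder .
docker run --rm -v "$(pwd)/out:/out" deps-builder cp -r /usr/src/app/node_modules /out/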
Let me try to solve the issue you are having.
Here is the Dockerfile
# Use alpine for slimer image
FROM node:9-alpine
RUN mkdir /app
WORKDIR /app
COPY /dashboard-folder/package.json .
RUN npm i --production
COPY node_modules ./root
This assumes that your project structure is like so:
|root
|   Dockerfile
|
\---dashboard-folder
        package.json
where root is your working directory that will receive node_modules.
Build this image with docker build . -t NAME and subsequently use it like so:
docker run -it --rm -v ${PWD}:/app/root NAME mv node_modules ./root
Should do the trick.
The simple and sure way is volume mapping. For example, the docker-compose YAML file will have a volumes section that looks like this:
...
volumes:
  - ./:/usr/src/app
  - /usr/src/app/node_modules
For the docker run command, use:
-v "$(pwd)":/usr/src/app
and in the Dockerfile, define:
VOLUME /usr/src/app
VOLUME /usr/src/app/node_modules
But first confirm that running npm install actually created the node_modules directory on the host system.
Whether you hit this problem depends on the OS running on your host. If your host runs Linux, there will be no problem. But if your host is on Mac or Windows, Docker is actually running in a VM that is hidden from you, and hence paths cannot be mapped directly to the host system. Instead, you can use a volume.
