I want to generate a .env file and add it to the Docker image during the docker build with a Dockerfile.
The problem is that the .env file is not copied into the Docker image.
I tried to add COPY .env /website/.env but the build can't find the file.
During the build I run a Node.js script that fetches the environment values from AWS and then writes a .env file.
FROM node:10.15.3-alpine
ARG SERVER_ENV
# Create app directory
RUN mkdir /website
WORKDIR /website
COPY . /website/
RUN node aws ${SERVER_ENV}
# COPY sources are read from the host build context, not from the image,
# so the .env generated by the previous RUN step is not visible here
COPY .env /website/.env
COPY package.json yarn.lock ./
RUN ls -al
RUN pwd
# Install Yarn
RUN npm install -g yarn@1.15.2
# Install app dependencies
RUN yarn install
# Build source files
RUN yarn run build
UPDATE
I finally found a way to fix it.
I split the tasks in my Dockerfile into stages like this:
# First stage: fetch the env values from AWS and generate the .env file
FROM mhart/alpine-node:10 as env
ARG SERVER_ENV
WORKDIR /usr/src
COPY aws.js /usr/src
RUN yarn add aws-sdk
COPY . .
RUN node aws ${SERVER_ENV}
# Second stage: install dependencies and build; the generated .env is copied in from the env stage
FROM mhart/alpine-node:10 as base
WORKDIR /usr/src
COPY package.json yarn.lock /usr/src/
RUN yarn install
COPY . .
COPY --from=env /usr/src .
RUN yarn build
# Final stage: start from a clean image and copy over the built app
FROM mhart/alpine-node:10
WORKDIR /usr/src
COPY --from=base /usr/src .
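For reference, a build of this multi-stage Dockerfile could then be started along these lines, where production and website are purely illustrative values for the build arg and the image tag:
docker build --build-arg SERVER_ENV=production -t website .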
Maybe you have a .dockerignore file that blocks the .env file from being copied to the image?
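If that is the case, a quick check is whether the .dockerignore next to the Dockerfile contains an entry like the one below; removing it would let COPY see the file again (shown only as an illustration):
.env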
On my Windows machine, I am attempting to build a containerized node.js application with the following Dockerfile:
# use latest version of nodejs
FROM node:lts-alpine
# install aurelia-cli to build the app & http-server to serve static contents
RUN npm i -g http-server
RUN npm i -g aurelia-cli
# set working directory to app
# henceforth all commands will run inside this folder
WORKDIR /app
# copy package.json related files first and install all required dependencies
COPY package*.json ./
RUN npm install
# copy the rest of the files and folders & install dependencies
COPY . ./
RUN npm run build
# by default http-server will serve contents on port 8080
# so we expose this port to host machine
EXPOSE 8080
CMD [ "http-server" , "dist" ]
However, docker build . fails at the line COPY . ./ with the message cannot replace to directory /var/lib/docker/overlay2/if2ip5okvavl8u6jpdtpczuog/merged/app/node_modules/@ampproject/remapping with file.
What do I need to do to get my container image to build?
Add node_modules to a .dockerignore file in the same directory as your Dockerfile, as outlined here (h/t David Maze).
Less gracefully, simply delete the project's node_modules directory then rerun docker build.
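A minimal .dockerignore for this case, placed in the same directory as the Dockerfile, could look like the following; node_modules is the essential entry here, and .git is a common extra rather than a requirement:
node_modules
.git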
I'm trying to containerize an angular-9 application.
Dockerfile
FROM node:14.17.0-alpine as build-step
RUN mkdir -p /app
WORKDIR /app
COPY package.json /app
COPY . /app
# Install & build packages
RUN npm install
# Listing 1
RUN ls -la
RUN npm run build:docker
# Listing 2
RUN ls -la
# Setup nginx
FROM nginx:alpine
COPY nginx.conf /etc/nginx/nginx.conf
COPY --from=build-step /app/dist/ /usr/share/nginx/html/
# Expose port 80
EXPOSE 80
package.json
"scripts": {
"build:docker": "node --max-old-space-size=10240 ./node_modules/@angular/cli/bin/ng build --prod --output-path=dist"
}
Listing 1 (in the Dockerfile) shows the node_modules folder, but Listing 2 (in the Dockerfile) doesn't.
At the end, the build also throws this error:
COPY failed: stat app/dist/: file does not exist
I'm unable to understand why it is not picking up the files that were built inside Docker.
I'm adding a screenshot of the logs below.
What about this Dockerfile?
FROM node:14.17.0-alpine as build-step
WORKDIR /app
COPY package.json .
COPY package-lock.json .
RUN npm ci
# node_modules installed
COPY . .
RUN npm run build:docker
# dist filled with your build app
FROM nginx:alpine
COPY nginx.conf /etc/nginx/nginx.conf
COPY --from=build-step /app/dist/ /usr/share/nginx/html
You should have a .dockerignore file in the same directory containing:
dist
node_modules
npm ci installs exactly the versions pinned in package-lock.json. Thanks to the .dockerignore, the dist and node_modules directories are not copied into the build context, which also speeds up your build.
If none of this helps, please provide the output of RUN npm run build:docker.
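To sanity-check the result, the image could be built and run roughly like this, where my-angular-app is just an illustrative tag and host port 8080 is an arbitrary choice mapped to the container's port 80:
docker build -t my-angular-app .
docker run -p 8080:80 my-angular-app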
I'm building an app with React, Next.js and npm, and I also have to modify the .npmrc for the app to run.
I don't know how to write a Dockerfile for these technologies that also changes the .npmrc.
My Dockerfile:
FROM node:lts as dependencies
WORKDIR /emercore-arg-manager
COPY package.json yarn.lock ./
RUN echo "#lala-lalal:registry=https://npm.pkg.github.com/" >> ~/.npmrc
RUN echo "//npm.pkg.github.com/:_authToken=asdasdasdasdasdsad" >> ~/.npmrc
RUN echo "//registry.npmjs.org/:_authToken=assdasdasdasdsaasd" >> ~/.npmrc
RUN yarn install --frozen-lockfile
FROM node:lts as builder
WORKDIR /emercore-arg-manager
COPY . .
COPY --from=dependencies /emercore-arg-manager/node_modules ./node_modules
RUN yarn build
FROM node:lts as runner
WORKDIR /emercore-arg-manager
ENV NODE_ENV production
# If you are using a custom next.config.js file, uncomment this line.
# COPY --from=builder /my-project/next.config.js ./
COPY --from=builder /emercore-arg-manager/public ./public
COPY --from=builder /emercore-arg-manager/.next ./.next
COPY --from=builder /emercore-arg-manager/node_modules ./node_modules
COPY --from=builder /emercore-arg-manager/package.json ./package.json
EXPOSE 3000
CMD ["yarn", "start:dev"]
It does not work for me, and it feels like a lot of content for a Dockerfile with these technologies. Could someone help me put together a shorter one that works?
The commands I use on my desktop are yarn install and yarn start:dev (and they work).
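Not a full answer, but one possible sketch of the registry setup is to pass the token in as a build arg instead of hard-coding it; NPM_TOKEN here is a hypothetical name and the rest mirrors the dependencies stage above:
FROM node:lts as dependencies
WORKDIR /emercore-arg-manager
# hypothetical build arg carrying the GitHub Packages token
ARG NPM_TOKEN
COPY package.json yarn.lock ./
RUN echo "@lala-lalal:registry=https://npm.pkg.github.com/" >> ~/.npmrc && \
    echo "//npm.pkg.github.com/:_authToken=${NPM_TOKEN}" >> ~/.npmrc
RUN yarn install --frozen-lockfile
built with something like docker build --build-arg NPM_TOKEN=<your-token> .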
I have a private repository for a node module, which I install by including it in package.json:
ssh://git@github.com/iamsaquib/<private-repo>.git
When I copy all the server files into the Docker image and run npm install, it is unable to install the package and complains that I don't have the proper access rights. I think I have to authorize the build by copying my id_rsa.pub into the Dockerfile and adding it as an authorized key. What is the correct way to do this?
Dockerfile
FROM node:12-slim
ENV NODE_ENV=development
WORKDIR /app
USER root
COPY . .
RUN ./install.sh
RUN ./build.sh
EXPOSE 8000
CMD ["./run.sh"]
You need to mount the SSH private key (/home/yourname/.ssh/id_rsa).
You should avoid putting the private key in Docker images. One workaround could be a multi-stage image (the security of this might still be debatable):
FROM node:12-slim as installer
ENV NODE_ENV=development
WORKDIR /app
USER root
# NOTE: COPY sources are resolved relative to the build context, so the .ssh
# and .gitconfig files must be placed inside the context for these lines to work
COPY /home/yourname/.ssh /home/root/.ssh
COPY /home/yourname/.gitconfig /home/root/.gitconfig
COPY . .
RUN ./install.sh
RUN ./build.sh
RUN rm -rf /home/root/.ssh
RUN rm -rf /home/root/.gitconfig
# Final image
FROM node:12-slim
WORKDIR /app
ENV NODE_ENV=development
USER root
COPY --from=installer /app .
EXPOSE 8000
CMD ["./run.sh"]
I am a Docker beginner.
I was able to implement Docker for my Node.js project, but when I try to pull it I am getting the error
Error: Cannot find module 'my_db'
(my_db is a module that I wrote that handles my MySQL functionality).
So I am guessing my modules are not bundled into the Docker image, right?
I moved my modules to a folder named my_node_modules/ so they won't be ignored.
I also modified the Dockerfile as follow:
FROM node:11.10.1
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY ["package.json", "package-lock.json*", "npm-shrinkwrap.json*", "./my_node_modules/*", "./"]
RUN npm install --production --silent && mv node_modules ../
COPY . .
EXPOSE 3000
CMD node index.js
What am I missing?
Thanks
I would do something like this. First create a .dockerignore:
.git
node_modules
The above ensures that the node_modules folder is excluded from the actual build context.
You should add any temporary things to your .dockerignore. This will also speed up the actual build, since the build context will be smaller.
In my Dockerfile I would then first copy only package.json and any existing lock file, so that this layer can be cached:
FROM node:11.10.1
ENV NODE_ENV production
WORKDIR /usr/src/app
# Only copy package* before installing to make better use of cache
COPY package*.json ./
RUN npm install --production --silent
# Copy everything
COPY . .
EXPOSE 3000
CMD node index.js
As I also wrote in my comment, I have no idea why you are doing this mv node_modules ../. It moves the node_modules directory out of the /usr/src/app folder, which is not what you want.
It would also be nice to see how you are actually including your module.
If your own module resides in the folder my_node_modules/my_db, it will be copied by the COPY . . step in the above Dockerfile. Then in your index.js file you should be able to use the module like this:
const db = require('./my_node_modules/my_db');
The COPY . . step will override everything in the current directory, and copying node_modules from the host is not recommended: it may break the container if the host binaries were compiled for Windows while you are running a Linux container.
So it is better to refactor your Dockerfile and install the modules inside Docker instead of copying them from the host.
FROM node:11.10.1
ENV NODE_ENV production
WORKDIR /usr/src/app
COPY . .
RUN npm install --production --silent
EXPOSE 3000
CMD node index.js
I would also suggest using a .dockerignore:
# add git-ignore syntax here of things you don't want copied into docker image
.git
*Dockerfile*
*docker-compose*
node_modules