How to install git and gatsby from docker - node.js

I am trying to install git and gatsby in a Docker image. I am able to install git, which works when I run "git status" after running docker exec -it sh. But gatsby does not work.
FROM node:alpine
# Also exposing VSCode debug ports
EXPOSE 8000 9929 9230
ARG SSG_HOME=/opt/ssg
WORKDIR $SSG_HOME
# Install Git
RUN apk update && apk upgrade && \
apk add --no-cache bash git openssh
# Install Gatsby
RUN apk add --update npm
RUN npm install gatsby-cli
COPY . $SSG_HOME
RUN npm run setup
ENTRYPOINT ["npm","run"]
CMD ["start-docker"]
I expect it to recognize the gatsby command, but it reports gatsby not found:
$ docker exec -it db6e5a3518c0 sh
/opt/ssg # gatsby
sh: gatsby: not found
/opt/ssg #

You're only installing it in a particular directory. Instead, go global:
RUN npm install -g gatsby-cli
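Applied to the Dockerfile in the question, the Gatsby step becomes (a minimal sketch; the rest of the file is unchanged):

```dockerfile
# Install Gatsby globally so the gatsby binary lands on PATH (/usr/local/bin)
RUN apk add --update npm
RUN npm install -g gatsby-cli
```

Alternatively, a locally installed gatsby-cli can still be run as npx gatsby or from an npm script, since npm puts node_modules/.bin on PATH while running scripts.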

Related

Docker yarn install stuck forever on M1 mac

I have an M1 Mac and am trying to build a Docker image for x86_64. Everything works fine until yarn install, which hangs forever. Here's my Dockerfile:
FROM --platform=linux/x86_64 public.ecr.aws/docker/library/node:16
RUN curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && unzip awscliv2.zip
RUN ./aws/install && aws --version
RUN mkdir code
WORKDIR /code
COPY package.json yarn.lock /code/
RUN yarn install --production --frozen-lockfile && yarn cache clean
EXPOSE 3000
ADD lib /code/lib
USER node
CMD ["node", "lib/index.js"]
If I don't set --platform=linux/x86_64 it's failing on installing aws-cli. If I skip aws cli (and don't set platform), yarn install finishes within a minute, and the build succeeds.
I've also tried --platform=linux/amd64, same result.
What causes yarn install to take forever?

Run npm update in docker without using the cache on that specific update

Background:
I'm writing code in Node.js, using npm and Docker. I'm trying to get my Dockerfile to use the cache when I build it so builds don't take too long.
We have a "common" repo that we use to keep logic that is shared across a variety of repositories, and this gets propagated as npm packages.
The problem:
I want the Dockerfile NOT to use the cache for my "common" package.
Docker file:
FROM node:12-alpine as X
RUN npm i npm@latest -g
RUN mkdir /app && chown node:node /app
WORKDIR /app
RUN apk add --no-cache python3 make g++ tini \
&& apk add --update tzdata
USER node
COPY package*.json ./
COPY .npmrc .npmrc
RUN npm install --no-optional && npm cache clean --force
ENV PATH /app/node_modules/.bin:$PATH
COPY . .
package.json has this line:
"dependencies": {
"#myorg/myorg-common-repo": "~1.0.13",
I have tried adding these lines in a variety of places and nothing seems to work:
RUN npm uninstall @myorg/myorg-common-repo && npm install @myorg/myorg-common-repo
RUN npm update @myorg/myorg-common-repo --force
Any ideas on how I can get Docker to build without using the cache for @myorg/myorg-common-repo?
So I finally managed to solve this using this answer:
What we want to do is invalidate the cache for a specific block in the Docker file and then run our update command. This is done by adding a build argument to the command (CLI or Makefile) like so:
docker-compose -f docker-compose-dev.yml build --build-arg CACHEBUST=0
Then add this additional block to the Dockerfile:
ARG CACHEBUST=1
USER node
RUN npm update @myorg/myorg-common-repo
This does what we want.
The ARG CACHEBUST=1 line invalidates the cache whenever the build argument changes, so the npm update @myorg/myorg-common-repo command runs without the cache.
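A common refinement of this trick (a sketch, assuming the same docker-compose-dev.yml as above) is to use the current timestamp as the build argument, so the cache is busted automatically on every build without editing anything:

```shell
# Use the current Unix timestamp as the cache-busting value; every run
# produces a new value, which invalidates the layers after ARG CACHEBUST
CACHEBUST=$(date +%s)
echo "CACHEBUST=$CACHEBUST"
# Then pass it to the build (same compose file as above):
# docker-compose -f docker-compose-dev.yml build --build-arg CACHEBUST=$CACHEBUST
```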

Running Angular app as docker image using Node js

I'm trying to build an Angular application in Docker and run it as a container locally using Node.js.
I built the image using the Dockerfile below, but I'm not sure what I'm missing when running it. Can someone point it out?
Dockerfile:
FROM node:10.15.3
ENV HOME=/home
WORKDIR $HOME
RUN npm config set strict-ssl false \
&& npm config set proxy http://proxy.xxxxxx.com:8080
COPY package.json .
RUN npm install
The image was created successfully with the command below:
docker build -t example .
I am trying to run the image using the command below, but it does not work:
docker run -p 4201:4200 example
Your Dockerfile does not run/serve your application. In order to do that you have to:
install angular/cli
copy the app
run/serve the app
FROM node:10.15.3
RUN npm config set strict-ssl false \
&& npm config set proxy http://proxy.xxxxxx.com:8080
# get the app
WORKDIR /src
COPY . .
# install packages
RUN npm ci
RUN npm install -g @angular/cli
# start app
CMD ng serve --host 0.0.0.0
hope this helps.
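One detail worth noting (based on Angular CLI defaults, not stated in the question): ng serve listens on port 4200 inside the container, so the docker run -p 4201:4200 mapping from the question lines up with a Dockerfile that exposes it:

```dockerfile
# ng serve binds to port 4200 by default; EXPOSE documents it for -p mappings
EXPOSE 4200
CMD ["ng", "serve", "--host", "0.0.0.0"]
```

Without --host 0.0.0.0, the dev server binds only to localhost inside the container and is unreachable from the host.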
A container needs a foreground process running; otherwise it will exit immediately.
In your case, you need to COPY your Node.js project into the container at docker build time, and also start the project in CMD, like CMD [ "npm", "start" ]. As long as the web server does not exit, your container will not exit.
There is a good article here for reference on how to dockerize a Node.js web app.
Just update your Dockerfile to achieve your goal; for more options see here:
# base image
FROM node:12.2.0
RUN npm config set strict-ssl false \
&& npm config set proxy http://proxy.xxxxxx.com:8080
# install chrome for protractor tests
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
RUN sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list'
RUN apt-get update && apt-get install -yq google-chrome-stable
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /app/package.json
RUN npm install
RUN npm install -g @angular/cli@7.3.9
# add app
COPY . /app
# start app
CMD ng serve --host 0.0.0.0
Give the following Dockerfile a shot as well! (Note: the package files must be copied in before npm ci can run, and -o would try to open a browser inside the container, so the app is served on 0.0.0.0 instead.)
FROM node:alpine
# get the app
WORKDIR /src
# install packages
COPY package*.json ./
RUN npm ci
RUN npm install -g @angular/cli
COPY . .
# start app
CMD ["ng", "serve", "--host", "0.0.0.0"]

The command 'dotnet' not found in Dockerfile

I am trying to build an image with Docker to run a project, but when I run docker build, step 5/12 fails with:
/bin/sh: 1: dotnet: not found
The command '/bin/sh -c dotnet restore' returned a non-zero code: 127
Note: I'm using .NET Core and Node modules, so I put everything in one Dockerfile; is that OK? I'm new to Docker.
This is my dockerfile:
# Get the .NET Core SDK, version 2.1
FROM microsoft/dotnet:2.1-sdk
FROM node:11
WORKDIR /FrontEnd
# Copy the .csproj file and restore packages.
COPY *.csproj ./
RUN dotnet restore
COPY package.json ./
RUN npm install
# Copy everything else
COPY . ./
FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /FrontEnd
ENTRYPOINT [ "dotnet", "run" ]
EDIT
This is my new Dockerfile.
It seems to work in the console, but not in Chrome.
This is the command I run in the console:
docker run --rm name_proj:latest
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /FrontEnd
COPY *.csproj ./
COPY package.json ./
RUN dotnet restore
RUN apt-get update -yq && apt-get upgrade -yq && apt-get install -yq curl git nano
RUN curl -sL https://deb.nodesource.com/setup_11.x | bash - && apt-get install -yq nodejs build-essential
RUN npm install -g npm
RUN npm install
COPY . ./
#FROM microsoft/dotnet:2.1-aspnetcore-runtime
ENTRYPOINT [ "dotnet", "run" ]
The log in the console:
When I open localhost:5001, it does not work.
You can only inherit from one image via FROM. Every time you use a FROM, you switch to using a new image as the base. When you do
FROM microsoft/dotnet:2.1-sdk
FROM node:11
The effect is the same as just
FROM node:11
And in the node container, there's no .NET Core installed.
If you want a container where both the dotnet and npm commands are available, you need to build it yourself. Here's an answer on Stack Overflow that shows you how to do that: How to integrate 'npm install' into ASP.NET CORE 2.1 Docker build. The short version is that you start from one image (maybe FROM microsoft/dotnet:2.1-sdk) and you install Node into it manually.
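A minimal sketch of such a combined image, following the questioner's own edited Dockerfile (the 2.1 SDK base and Node 11 from NodeSource are taken from the question; treat the exact versions as illustrative):

```dockerfile
# Start from the .NET Core SDK image so dotnet is on PATH...
FROM microsoft/dotnet:2.1-sdk
WORKDIR /FrontEnd
# ...then layer Node.js on top so npm is available too
RUN curl -sL https://deb.nodesource.com/setup_11.x | bash - \
    && apt-get install -yq nodejs
COPY *.csproj ./
RUN dotnet restore
COPY package.json ./
RUN npm install
COPY . ./
ENTRYPOINT ["dotnet", "run"]
```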

How to cache the RUN npm install instruction when docker build a Dockerfile

I am currently developing a Node backend for my application.
When dockerizing it (docker build .) the longest phase is the RUN npm install. The RUN npm install instruction runs on every small server code change, which impedes productivity through increased build time.
I found that running npm install where the application code lives and adding the node_modules to the container with the ADD instruction solves this issue, but it is far from best practice. It kind of breaks the whole idea of dockerizing it, and it causes the container to weigh much more.
Any other solutions?
Ok so I found this great article about efficiency when writing a docker file.
This is an example of a bad Dockerfile, adding the application code before running the RUN npm install instruction:
FROM ubuntu
RUN echo "deb http://archive.ubuntu.com/ubuntu precise main universe" > /etc/apt/sources.list
RUN apt-get update
RUN apt-get -y install python-software-properties git build-essential
RUN add-apt-repository -y ppa:chris-lea/node.js
RUN apt-get update
RUN apt-get -y install nodejs
WORKDIR /opt/app
COPY . /opt/app
RUN npm install
EXPOSE 3001
CMD ["node", "server.js"]
By dividing the copy of the application into two COPY instructions (one for the package.json file and the other for the rest of the files) and running the npm install instruction before adding the actual code, code changes won't trigger the RUN npm install instruction; only changes to package.json will trigger it. A better-practice Dockerfile:
FROM ubuntu
MAINTAINER David Weinstein <david@bitjudo.com>
# install our dependencies and nodejs
RUN echo "deb http://archive.ubuntu.com/ubuntu precise main universe" > /etc/apt/sources.list
RUN apt-get update
RUN apt-get -y install python-software-properties git build-essential
RUN add-apt-repository -y ppa:chris-lea/node.js
RUN apt-get update
RUN apt-get -y install nodejs
# use changes to package.json to force Docker not to use the cache
# when we change our application's nodejs dependencies:
COPY package.json /tmp/package.json
RUN cd /tmp && npm install
RUN mkdir -p /opt/app && cp -a /tmp/node_modules /opt/app/
# From here we load our application's code in, therefore the previous docker
# "layer" that's been cached will be used if possible
WORKDIR /opt/app
COPY . /opt/app
EXPOSE 3000
CMD ["node", "server.js"]
This is where the package.json file is added, its dependencies are installed, and they are copied into the container's WORKDIR, where the app lives:
ADD package.json /tmp/package.json
RUN cd /tmp && npm install
RUN mkdir -p /opt/app && cp -a /tmp/node_modules /opt/app/
To avoid the npm install phase on every docker build, just copy those lines and change /opt/app to the location where your app lives inside the container.
Weird! No one mentions multi-stage builds.
# ---- Base Node ----
FROM alpine:3.5 AS base
# install node
RUN apk add --no-cache nodejs-current tini
# set working directory
WORKDIR /root/chat
# Set tini as entrypoint
ENTRYPOINT ["/sbin/tini", "--"]
# copy project file
COPY package.json .
#
# ---- Dependencies ----
FROM base AS dependencies
# install node packages
RUN npm set progress=false && npm config set depth 0
RUN npm install --only=production
# copy production node_modules aside
RUN cp -R node_modules prod_node_modules
# install ALL node_modules, including 'devDependencies'
RUN npm install
#
# ---- Test ----
# run linters, setup and tests
FROM dependencies AS test
COPY . .
RUN npm run lint && npm run setup && npm run test
#
# ---- Release ----
FROM base AS release
# copy production node_modules
COPY --from=dependencies /root/chat/prod_node_modules ./node_modules
# copy app sources
COPY . .
# expose port and define CMD
EXPOSE 5000
CMD npm run start
Awesome tutorial here: https://codefresh.io/docker-tutorial/node_docker_multistage/
I've found that the simplest approach is to leverage Docker's copy semantics:
The COPY instruction copies new files or directories from <src> and adds them to the filesystem of the container at the path <dest>.
This means that if you first explicitly copy the package.json file and then run the npm install step, that layer can be cached, and only then copy the rest of the source directory. If the package.json file has changed, the layer is invalidated and npm install re-runs, caching the new result for future builds.
A snippet from the end of a Dockerfile would look like:
# install node modules
WORKDIR /usr/app
COPY package.json /usr/app/package.json
RUN npm install
# install application
COPY . /usr/app
I imagine you may already know, but you could include a .dockerignore file in the same folder containing
node_modules
npm-debug.log
to avoid bloating your image when you push to Docker Hub.
You don't need to use a tmp folder; just copy package.json to your container's application folder, run the install, and copy all the files later.
COPY app/package.json /opt/app/package.json
RUN cd /opt/app && npm install
COPY app /opt/app
I wanted to use volumes instead of COPY and keep using Docker Compose, and I could do it by chaining the commands at the end:
FROM debian:latest
RUN apt -y update \
&& apt -y install curl \
&& curl -sL https://deb.nodesource.com/setup_12.x | bash - \
&& apt -y install nodejs
RUN apt -y update \
&& apt -y install wget \
build-essential \
net-tools
RUN npm install pm2 -g
RUN mkdir -p /home/services_monitor/ && touch /home/services_monitor/
RUN chown -R root:root /home/services_monitor/
WORKDIR /home/services_monitor/
CMD npm install \
&& pm2-runtime /home/services_monitor/start.json
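For completeness, the Compose side of this pattern might look like the following sketch (the service name and host path are hypothetical, not from the original post):

```yaml
# docker-compose.yml (hypothetical names): mount the code as a volume so the
# chained "CMD npm install && pm2-runtime ..." runs against the mounted files
version: "3"
services:
  services_monitor:
    build: .
    volumes:
      - ./services_monitor:/home/services_monitor
```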
