How to build a React/Vue application outside of a Docker container - node.js

I have several applications (Vue & React; each is built on a different version of Node). I want to set up deployment so that I can run a Docker container with the correct Node version for each project. The build (npm i and npm run build) should happen in the container, but I want to hand the result from the container over to /var/www/project_name on the server itself.
Next, I will set up a container with nginx which, depending on the subdomain, will serve the desired build.
My question is: how do I get the folder with the built files out of the container and onto the host operating system?
my docker-compose file:
version: "3.1"
services:
  redis:
    restart: always
    image: redis:alpine
    container_name: redis
  build-adminapp:
    build: adminapp/
    container_name: adminapp
    working_dir: /var/www/adminapp
    volumes:
      - ./adminapp:/var/www/adminapp
  build-clientapp:
    build: clientapp/
    container_name: clientapp
    working_dir: /var/www/clientapp
    volumes:
      - ./clientapp:/var/www/clientapp
my Dockerfiles:
FROM node:10-alpine as build
# Create app directory
WORKDIR /var/www/adminapp/
COPY . /var/www/adminapp/
RUN npm install
RUN npm run build
second Dockerfile:
FROM node:12-alpine as build
# Create app directory
WORKDIR /var/www/clientapp/
COPY . /var/www/clientapp/
RUN npm install
RUN npm run build

If you already have a running container, you can use the docker cp command to move files between the local machine and Docker containers.
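For example, assuming the adminapp container from the compose file above and that the build output lands in a build/ directory (adjust the path for dist/ or whatever your tooling produces):
# copy the build output from the container onto the host
docker cp adminapp:/var/www/adminapp/build /var/www/project_name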

Related

Using Docker with Node image to develop a VuejS (NuxtJs) app

The situation
I have to work on a VueJS (NuxtJS) SPA, so I'm trying to use Docker with a Node image to avoid installing it on my PC, but I can't figure out how to make it work.
The project
The source code is in its own application folder, since it is versioned, and at the root level there is the docker-compose.yaml file.
The folder structure
my-project-folder
├ application
| └ ...
└ docker-compose.yaml
The docker-compose.yaml
version: "3.3"
services:
  node:
    # container_name: prova_node
    restart: 'no'
    image: node:lts-alpine
    working_dir: /app
    volumes:
      - ./application:/app
The problem
The container starts but quits immediately with exit status 0 (so it executed correctly), but this way I can't use it to work on the project.
Probably there is something I'm missing about the Node image or Docker in general; what I would like to do is connect to the Docker container to run npm commands like install, run start, etc., and then check the application in the browser at localhost:3000 or whatever it is.
I would suggest using a Dockerfile with node as the base image and then creating your entrypoint, which runs the application. That will eliminate the need for volumes, which are used when we want to maintain some state for our containers.
Your Dockerfile may look something like this:
FROM node:lts-alpine
RUN mkdir /app
COPY application/ /app/
EXPOSE 3000
CMD npm start --prefix /app
You can then either run it directly through the docker run command or use a docker-compose.yaml like the following:
version: "3.3"
services:
  node:
    # container_name: prova_node
    restart: 'no'
    build:
      context: .
    ports:
      - 3000:3000
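If you still want to step into the container to run one-off npm commands, as described in the question, docker-compose run works with the same service; the dev script name below is only an assumption, use whatever scripts your Nuxt project defines:
# run a one-off command in a throwaway container
docker-compose run --rm node npm install
# start the dev server and publish the ports declared in the compose file
docker-compose run --rm --service-ports node npm run dev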

Node application won't start in Docker if there's a shared volume

I generated a Gatsby website locally. Then I set up a Docker container to run it. It's a standard setup: copy package.json, install the modules, copy the local files and start the dev script with a shared volume.
But when it starts, I run into an error:
gatsby_1 | ERROR
gatsby_1 |
gatsby_1 | There was a problem loading the local develop command. Gatsby may not be installed in your site's "node_modules" directory. Perhaps you need to run "npm install"? You might need to delete your "package-lock.json" as well.
Here is the Dockerfile:
FROM node:14.11.0
RUN npm install -g gatsby-cli
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8000
CMD ["gatsby", "develop", "-H", "0.0.0.0"]
And the docker-compose.yml:
version: "3"
services:
  db:
    image: mariadb:latest
    environment:
      MYSQL_ROOT_PASSWORD: root
    volumes:
      - ./db:/var/lib/mysql
  adminer:
    image: adminer:latest
    ports:
      - 8082:8080
  wordpress:
    image: wordpress:5.5.1-php7.4-apache
    ports:
      - 8081:80
    volumes:
      - ./wordpess/plugins:/var/www/html/wp-content/plugins
      - ./wordpess/themes:/var/www/html/wp-content/themes
  gatsby:
    build: ./gatsby
    ports:
      - 8080:8000
    volumes:
      - ./gatsby:/app
But if I remove the volume for the Gatsby container, everything runs well.
So my guess is there's something wrong with permissions somewhere, but I can't figure it out.
Here is a repo with the whole project.
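One likely culprit, offered purely as an assumption: the bind mount ./gatsby:/app hides the node_modules directory that was installed into the image at build time, so Gatsby can't find its local install. A common workaround is to add an anonymous volume for node_modules, roughly like this:
  gatsby:
    build: ./gatsby
    ports:
      - 8080:8000
    volumes:
      - ./gatsby:/app
      # anonymous volume: keeps the node_modules baked into the image
      - /app/node_modules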

Node.js docker container not updating to changes in volume

I am trying to host a development environment on my Windows machine, consisting of a frontend and a backend container. So far I have only been working on the backend. All files are on the C drive, which is shared via Docker Desktop.
I have the following docker-compose file and Dockerfile; the latter is inside a directory called backend within the root directory.
Dockerfile:
FROM node:12.15.0-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install
EXPOSE 5000
CMD [ "npm", "start" ]
docker-compose.yml:
version: "3"
services:
  backend:
    container_name: backend
    build:
      context: ./backend
      dockerfile: Dockerfile
    volumes:
      - ./backend:/usr/app
    environment:
      - APP_PORT=80
    ports:
      - '5000:5000'
  client:
    container_name: client
    build:
      context: ./client
      dockerfile: Dockerfile
    volumes:
      - ./client:/app
    ports:
      - '80:8080'
For some reason, when I make changes to my local files they are not reflected inside the container. I am testing this by slightly modifying the output of one of my files, but I have to rebuild the container each time to see the changes take effect.
I have worked with Docker in PHP applications before, and have basically done the same thing, so I am unsure why this is not working with my Node.js app. I am wondering if I am just missing something glaringly obvious as to why this is not working.
Any help would be appreciated.
The difference between Node and PHP here is that PHP automatically picks up file system changes between requests, but a Node server doesn't.
I think you'll see that the file changes get picked up if you restart node by bouncing the container with docker-compose down then up (no need to rebuild things!).
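In compose terms that is roughly:
# recreate the containers without rebuilding the images
docker-compose down
docker-compose up -d
# or restart just the backend service so node re-reads the mounted files
docker-compose restart backend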
If you want node to pick up file system changes without needing to bounce the server you can use some of the node tooling. nodemon is one: https://www.npmjs.com/package/nodemon. Follow the installation instructions for local installation and update your start script to use nodemon instead of node.
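A minimal sketch of that change, assuming the entry point is server.js (substitute your actual file):
# install nodemon as a dev dependency
npm install --save-dev nodemon
Then change the start script in package.json from node server.js to nodemon server.js, so the server restarts whenever a file in the mounted volume changes.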
Plus, I really do think you have a mistake in your Dockerfile and need to copy the source code into your working directory. I'm assuming you got your initial recipe from here: https://dev.to/alex_barashkov/using-docker-for-nodejs-in-development-and-production-3cgp. The Dockerfile from there is below. You missed a step!
FROM node:10-alpine
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
CMD [ "npm", "start" ]

docker container crashes with exited with code 139

I have this Dockerfile to build and run a Node application:
# ---- Base Node ----
FROM node:10 AS base
# Create app directory
WORKDIR /app
# ---- Dependencies ----
FROM base AS dependencies
COPY package.json ./
# install app dependencies including 'devDependencies'
RUN npm install
# ---- Copy Files/Build ----
FROM dependencies AS build
WORKDIR /app
COPY . /app
# Build the app
RUN npm run build
WORKDIR /app/dist
# install npm models in dist
RUN npm install --only=production
# --- Release with Alpine ----
FROM node:10-alpine AS release
# Create app directory
WORKDIR /app
# optional
ENV NODE_ENV=development
ENV MONGO_HOST=mongodb://localhost/chronas-api
ENV MONGO_PORT=27017
ENV PORT=80
# copy app from build
COPY --from=build /app/dist/ ./
CMD ["node", "index.js"]
and I am using this docker-compose file:
version: '3'
services:
  database:
    image: mongo
    container_name: mongo
    ports:
      - "27017:27017"
  app:
    build: .
    container_name: chronas_api
    ports:
      - "80:80"
      - "5858:5858"
    links:
      - database
    environment:
      - JWT_SECRET='placeholder'
      - MONGO_HOST=mongodb://database/chronas-api
      - APPINSIGHTS_INSTRUMENTATIONKEY='placeholder'
      - TWITTER_CONSUMER_KEY=placeholder
      - TWITTER_CONSUMER_SECRET=placeholder
      - TWITTER_CALLBACK_URL=placeholder
      - PORT=80
    depends_on:
      - database
    stdin_open: true
    tty: true
Whenever I try to write to MongoDB, the Node container crashes with this error:
exited with code 139
Can anyone help? When I run the application with Docker alone (without Compose), it works fine.
The issue may be caused by the fact that node:10-alpine is a tag whose underlying image may change with updates, so when you built the same app without Compose it didn't pull the most recent image, whereas docker-compose will pull from Docker Hub instead.
Images based on Alpine may have dependency issues that are quite hard to debug from one version to another; you can find some possibilities for this particular issue here.
I was using the tag node:8-alpine in my application and I found out that the current latest node:8.15.1-alpine causes the "exited with code 139" issue that was not present in the previous image node:8.15.0-alpine. Downgrading may be the easiest way to solve this kind of issue; also check whether you are using bcrypt.
Another option is to use a Debian-based image, which will be less likely to have this kind of issue (just consider that it is slightly bigger in size).
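A sketch of both options applied to the release stage of the Dockerfile above; the exact tags here are only examples, pin whichever versions you have verified yourself:
# Option 1: pin an exact patch version instead of the floating node:10-alpine tag
FROM node:10.15.0-alpine AS release
# Option 2: use a Debian-based image instead (slightly larger, fewer musl-related surprises)
# FROM node:10-slim AS release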

docker compose run with command error

From this documentation, it seems that I can execute a single command from a service like this:
docker-compose run SERVICE CMD
But when I run
docker-compose up pwa npm test
I get the error
ERROR: No such service: npm
From my configurations, it will execute npm start, but I'd like to know how to execute other commands.
Files
Dockerfile:
FROM node:8
WORKDIR /app
COPY package.json /app/
RUN npm install --quiet
CMD npm start
docker-compose.yml:
version: '3'
services:
  pwa:
    build: .
    ports:
      - '3000:3000'
    volumes:
      - ./src:/app/src
      - ./public:/app/public
Versions
Docker version: 17.03
Docker compose version: 1.11.2
As the docs say, the command is docker-compose run, not docker-compose up. The latter expects only service names as arguments, which is why it complains that there is no service called npm.
Do it like this:
docker-compose run pwa npm test
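If the pwa container is already running (for example after docker-compose up -d), you can also execute the command inside the existing container with exec instead of starting a new one:
docker-compose exec pwa npm test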
