SvelteKit change port to different one - node.js

I want to change the port in the production build made with adapter-node. Port 3000 is used internally by a different service, so I need the Svelte app to run on port 3001.
My Dockerfile:
# Container setup ---------
FROM node:16-alpine as build
RUN mkdir /appbuild
COPY . /appbuild
WORKDIR /appbuild
# clean install all dependencies
RUN npm ci
# remove potential security issues
RUN npm audit fix
RUN npm run build
# Container setup ---------
FROM node:16-alpine
WORKDIR /app
# copy dependency list
COPY --from=build /appbuild/package*.json ./
# clean install dependencies, no devDependencies, no prepare script
RUN npm ci --production --ignore-scripts
# remove potential security issues
RUN npm audit fix
# copy built SvelteKit app to /app
COPY --from=build /appbuild/build ./
CMD ["node", "./index.js"]
I found that starting the app with PORT=3001 node build from the project root (after running npm run build) works, but in the Dockerfile a CMD must be provided and PORT cannot be passed on the command line like that.
How do I change the default port to another one?

Have you tried using ENV?
ENV PORT=3001
CMD ["node", "./index.js"]
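For completeness, the relevant part of the runtime stage could look like this (a sketch assuming the adapter-node output lands in /app as in the question's Dockerfile):

```dockerfile
FROM node:16-alpine
WORKDIR /app
COPY --from=build /appbuild/build ./
# adapter-node reads PORT from the environment at startup
ENV PORT=3001
CMD ["node", "./index.js"]
```

The ENV value is only a default; docker run -e PORT=3002 ... can still override it at container start without rebuilding the image.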

Related

An Angular app takes too much time to build in a dockerized container

I gradually updated my Angular app from version 9 to 14, and whenever I build it in the Docker environment, the npm installation takes too much time to finish. Here is my Dockerfile:
FROM node:14.21.2-slim as buildContainer
WORKDIR /app
COPY ./package.json /app/
RUN rm -rf node_modules
RUN npm cache clean --force
RUN npm set audit false
RUN npm install --verbose
COPY . /app
RUN npm run build
FROM nginx:1.16.0-alpine
# Get all the code needed to run the app
COPY --from=buildContainer /app/dist/ /usr/share/nginx/html
COPY ./dockers/templates/nginx.conf /etc/nginx/nginx.conf
# Expose the port the app runs in
EXPOSE 80
# Serve the app
CMD ["nginx", "-g", "daemon off;"]
It takes almost 20 minutes to finish everything. I don't know what's going wrong here. If anyone has a solution it will be a great help.
Thanks in advance.

cannot replace to directory /var/lib/docker/overlay2/if2ip5okvavl8u6jpdtpczuog/merged/app/node_modules/@ampproject/remapping with file

On my Windows machine, I am attempting to build a containerized node.js application with the following Dockerfile:
# use latest version of nodejs
FROM node:lts-alpine
# install aurelia-cli to build the app & http-server to serve static contents
RUN npm i -g http-server
RUN npm i -g aurelia-cli
# set working directory to app
# henceforth all commands will run inside this folder
WORKDIR /app
# copy package.json related files first and install all required dependencies
COPY package*.json ./
RUN npm install
# copy the rest of the files and folders & install dependencies
COPY . ./
RUN npm run build
# by default http-server will serve contents on port 8080
# so we expose this port to host machine
EXPOSE 8080
CMD [ "http-server" , "dist" ]
However, docker build . fails at the line COPY . ./ with the message cannot replace to directory /var/lib/docker/overlay2/if2ip5okvavl8u6jpdtpczuog/merged/app/node_modules/@ampproject/remapping with file.
What do I need to do to get my container image to build?
Add node_modules to a .dockerignore file in the same directory as your Dockerfile (h/t David Maze).
Less gracefully, simply delete the project's node_modules directory and rerun docker build.
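A minimal .dockerignore for a Node project might look like this (the entries besides node_modules are common additions, not requirements from the question):

```
node_modules
build
dist
.git
```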

How to write Dockerfile to serve Angular app and Node server

My Angular app runs fine locally but I haven't figured out how to do the same with a Docker image. Outside of Docker, the UI runs on port 4200 with ng serve and the API serves data from 8080 with node server.js.
My Dockerfile is set up so it can get the Node server running and available on 8080, but the Angular UI won't run. I've tried several options but right now I have:
FROM node:14.17.3
COPY package*.json ./
EXPOSE 4200 8080
RUN npm install -g @angular/cli
RUN npm install --only=production
COPY . ./
RUN ng serve
CMD ["node", "server.js"]
It fails on ng serve with the error: The serve command requires to be run in an Angular project, but a project definition could not be found. I do have an angular.json file in the root. I'm not sure what I am missing. I read that ng serve shouldn't be used in this situation but the alternatives I've seen haven't made a difference.
EDIT 8/10/21: Based on the answers here and a bunch of research, this will display the UI with nginx:
FROM node:12.16.1-alpine as build
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
# RUN npm install -g @angular/cli
# RUN npm run build --prod
FROM nginx:1.15.8-alpine
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
# CMD ["node", "server.js"]
However, the npm run build step fails because ng is not found, despite installing @angular/cli. I have to run this manually to build the dist folder. And I can't run node server.js alongside this. It seems I can only get the front end or the back end, not both.
Use the command below at the end to run ng serve with host 0.0.0.0, which means it listens on all interfaces.
CMD ["ng","serve","--host", "0.0.0.0"]
But I would suggest using nginx.
Steps to follow:
Create a Dockerfile in the root of your project and add the code below. It takes care of downloading dependencies, building the Angular project, and deploying it to an nginx server.
#Download Node Alpine image
FROM node:12.16.1-alpine As build
#Setup the working directory
WORKDIR /usr/src/ng-app
#Copy package.json
COPY package.json package-lock.json ./
#Install dependencies
RUN npm install
#Copy other files and folder to working directory
COPY . .
#Build Angular application in PROD mode
RUN npm run build
#Download NGINX Image
FROM nginx:1.15.8-alpine
#Copy built angular files to NGINX HTML folder
COPY --from=build /usr/src/ng-app/dist/pokemon-app/ /usr/share/nginx/html
Build docker image:
docker build -t my-ng-app .
Spinning up the Docker container with the command below exposes your app on host port 3000 (mapped to container port 80):
docker run -dp 3000:80 my-ng-app
Check out my article on this - https://askudhay.com/how-to-dockerize-an-angular-application, and please let me know if you still have any questions.
I figured out a solution that will run the full application. Most answers here focus on running the front end (the nginx suggestion was helpful). It seemed a Docker container could enable the UI or server but not both. I came across Docker Compose, which will run the front and back ends in separate images. My solution:
Dockerfile.ui
# Define node version
FROM node:12.16.1-alpine as build
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, including dev dependencies for @angular-devkit
RUN npm ci
# Install @angular/cli globally
RUN npm install -g @angular/cli
# Copy all files
COPY . .
# Run ng build through npm to create dist folder
RUN npm run build --prod
# Define nginx for front-end server
FROM nginx:1.15.8-alpine
# Copy dist from ng build to nginx html folder
COPY --from=build /usr/src/app/dist /usr/share/nginx/html
Dockerfile.server
# Define node version
FROM node:12.16.1-alpine
# Define container directory
WORKDIR /usr/src/app
# Copy package*.json for npm install
COPY package*.json ./
# Run npm clean install, prod dependencies only
RUN npm ci --only=production
# Copy all files
COPY . .
# Expose port 8080 for server
EXPOSE 8080
# Run "node server/run.js"
CMD ["node", "server/run.js"]
docker-compose.yml
version: '3'
services:
  server:
    build:
      context: ./
      dockerfile: Dockerfile.server
    container_name: server
    ports:
      - 8080:8080
  ui:
    build:
      context: ./
      dockerfile: Dockerfile.ui
    container_name: ui
    ports:
      - 4200:80
    links:
      - server
docker-compose up will build images for the server and UI and deploy them concurrently. I also resolved the "ng not found" errors by installing dev dependencies, particularly @angular-devkit/build-angular.
This tutorial helped me figure out Docker Compose: https://wkrzywiec.medium.com/how-to-run-database-backend-and-frontend-in-a-single-click-with-docker-compose-4bcda66f6de
I think updating this line
COPY . ./
with
COPY . ./app
should solve that error. It appears that the node "volume" is in that folder.
Otherwise setting the workdir also seems like a solution:
FROM node:14
WORKDIR /usr/src/app
COPY package*.json ./
...
Source: https://nodejs.org/en/docs/guides/nodejs-docker-webapp/

Docker app won't work but the application works outside Docker

I have built a React and Node app and it works. I am trying to build a Docker image and run it; it compiles, but when I try to access it through the browser it says "This site can't be reached: localhost refused to connect". This is the Dockerfile I've written, because I think this is the problem:
# pull official base image
FROM node:13.12.0-alpine
# set working directory
WORKDIR /client
# add `/app/node_modules/.bin` to $PATH
ENV PATH /client/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY . ./
# start app
CMD ["npm", "start"]
FROM node:12
# Create app directory
WORKDIR ./
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 5000
CMD [ "node", "index.js" ]
First, make sure you can reach your server. Your app may run locally, but it doesn't use the same server as the one from your Dockerfile; e.g. npm start starts a local development server.
Remove everything related to the React app stage from your Dockerfile and just make sure you can hit your server on port 5000 and that it serves pages; you can host an index.html with <body>Hello world</body> or something. Once you're able to do that, it's just a matter of adding the React app's bundled files to the server's static (public) folder.
Is your Node server listening on port 5000?
Typically with a Node.js server you set PORT and HOST environment variables, which the server reads via process.env.PORT and process.env.HOST.
Other issues
# start app
CMD ["npm", "start"]
npm start is (usually) used to start a local development server while you work on your React app.
Instead of starting the app, you should run a build command that produces static files to be hosted (JS, HTML, etc.).
Typically npm run build.
This creates a ./dist or ./build folder with all the static content you should put on the server (perhaps that's what your /client folder is for?).
Some tweaks
# pull official base image
FROM node:13.12.0-alpine AS builder
# set working directory
WORKDIR /client
# Copy source
COPY . .
# add `/app/node_modules/.bin` to $PATH (you probably don't need this)
ENV PATH /client/node_modules/.bin:$PATH
# install app dependencies
RUN npm install --silent
# you could move react-scripts to package.json dependencies instead
RUN npm install react-scripts@3.4.1 -g --silent
# Build bundle
RUN npm run build
# Next stage
FROM node:12
# Create app directory
WORKDIR ./
# Copy source
COPY . .
RUN npm install
# If you are building your code for production
# RUN npm ci --only=production
# Copy js bundle
COPY --from=builder /client/dist ./public
EXPOSE 5000
CMD [ "node", "index.js" ]
This might need a little more tweaking, but that's the general idea.
You don't need to install react-scripts globally; it can just be a local dependency. It's used by npm start and npm run build to build your app, so you depend on it and it should be part of the dependencies in package.json.
Adding /client/node_modules/.bin to PATH shouldn't be needed; maybe it's a leftover, or a sign that something else isn't set up properly.

Creating React application for production with Docker build?

I am creating a React application using docker build with the following Dockerfile:
# build env
FROM node:13.12.0-alpine as build
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json ./
COPY package-lock.json ./
RUN npm ci
RUN npm install react-scripts -g
RUN npm install --save @fortawesome/fontawesome-free
RUN apk add nano
RUN apk add vim
COPY . ./
RUN npm run build
# production env
FROM nginx:stable-alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
I believe the Dockerfile is not of critical importance here, however. In my source code there is a master configuration file which I want to leave out of the Docker image, to be able to deploy my React app easily. This causes a compilation error during the Dockerfile step RUN npm run build, since the compiler cannot find a file that is referenced by another file. For development versions this was not an issue, since npm start is not that sensitive.
I would add the configuration file as a Docker volume in the final application, so the code can find it without problems. I am just wondering how to approach a situation like this, since it hasn't come up on my path before.
Also feel free to comment on or optimize my Dockerfile, as I am unsure of e.g. whether Nginx is the way to go in these production builds for website front-end applications.
If your app currently requires the configuration file, it's akin to "hard-coding" the values into it at build time, as you've noticed. If you do need to be able to dynamically swap in another configuration file at runtime, you would need to use e.g. fetch() to load it, not bundle it (as require does).
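The fetch() approach could look like this (a sketch; loadConfig and the /config.json path are made-up names, not from the question):

```javascript
// Hypothetical runtime config loader: the config file is served as a static
// asset (e.g. mounted as a Docker volume into nginx's html folder) instead of
// being bundled into the JS at build time.
async function loadConfig(url = '/config.json') {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`config load failed: ${res.status}`);
  return res.json();
}

// Usage (startApp is hypothetical):
// loadConfig().then((config) => startApp(config));
```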
If configuring things at build-time is fine, then I'd also suggest looking at CRA custom environment variables; you could then inject the suitable values as environment variables at build time.
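At build time that could be as simple as the following (the variable names here are made up; CRA only inlines variables prefixed with REACT_APP_ during npm run build):

```javascript
// These values are frozen into the bundle when `npm run build` runs,
// so different deployments need different builds (or the fetch() approach
// mentioned in the answer).
const apiUrl = process.env.REACT_APP_API_URL || 'http://localhost:8080';
const betaEnabled = process.env.REACT_APP_ENABLE_BETA === 'true';
```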
Beyond that, if you're looking for critique for your Dockerfile, from one Aarni to another:
Your package.json is broken if you need to do anything beyond npm ci or yarn during a build to install stuff. react-scripts should be a dev dependency and Font Awesome should be a regular dependency.
You don't need nano and vim in the temporary container, and even if you did, it'd be better to apk add them in a single step.
You shouldn't need to modify the PATH in the build container.
Using Nginx is absolutely fine.
# build env
FROM node:13.12.0-alpine as build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . ./
RUN npm run build
# production env
FROM nginx:stable-alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
Here is a sample of my React Dockerfile; maybe you can use it if you want to optimize.
PS: I am running it from Kubernetes.
# ############################# Stage 0, Build the app #####################
# pull official base image
FROM node:13.12.0-alpine as build-stage
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package*.json ./
#RUN npm install
RUN npm install
# add app
COPY . ./
#build for production
RUN npm run-script build
# #### Stage 1, push the compressed built app into nginx ####
FROM nginx:1.17
COPY --from=build-stage /app/build/ /usr/share/nginx/html
