Unable to deploy React Application to Kubernetes - node.js

I am trying to deploy an application created with Create React App to Kubernetes via Docker.
When the container is created through the Jenkins pipeline, it fails with the error below:
"Starting the development server...
Failed to compile.
./src/index.js
Module not found: Can't resolve './App.js' in '/app/src'
The folder structure is identical to the default Create React App folder structure.
Also below is the Dockerfile:
FROM node:10.6.0-jessie
# set working directory
RUN mkdir /app
WORKDIR /app
COPY . .
# add `/usr/src/app/node_modules/.bin` to $PATH
#ENV PATH /usr/src/app/node_modules/.bin:$PATH
RUN npm install
#RUN npm install react-scripts -g --silent
# start app
CMD ["npm", "start"]
I am unable to understand where I might be going wrong.
Edit 1: I would also like to mention that I am able to run the docker container on my local machine using this config.
So any help would be appreciated.
Update 1:
I was able to run kubectl exec -it pod_name -- bash to get a shell inside the container in the pod. It turned out that, for some reason, the App.js file had been copied into the container as app.js. Since Linux is case sensitive, the import could not be resolved. Changing the import statement in index.js fixed the problem, but I still have no idea what caused the file to be copied with a lower-case name, since on my local machine it exists as App.js.
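For reference, the fix was a one-line change in index.js (a sketch assuming the default Create React App layout):
// index.js — make the import match the file name as it actually exists inside the image
import App from './app'; // was: import App from './App';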

The problem you're having will go away once you adjust your deployment process to a more production-ready setup.
What you're doing currently is installing all (development) dependencies on every Kubernetes node, compiling your application, and then starting a development webserver. This makes your deployed builds inconsistent and adds load and bloat to the deployment nodes.
Instead, create a production-ready build by running npm run build on a build machine, which compiles your application and outputs it to the build folder in your project. Then transfer this folder to your server in a .zip file; the server needs a production-ready webserver (Nginx is highly recommended and an industry standard) to serve the static files from your build.
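Since the question deploys through Docker anyway, one way to apply this advice is a multi-stage Dockerfile that compiles the bundle and then serves it with Nginx. This is only a sketch; image tags and paths are illustrative:
# Stage 1: install dev dependencies and compile the static bundle
FROM node:10 AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2: serve the build folder with Nginx
FROM nginx:stable
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]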

Related

How to generate a production build of an API done with NESTJS

I am generating the production version of an API I made using the NestJS framework and would like to know which files I should upload to the server. When I run the "npm run start:prod" compile, it generates the "dist" folder, but when I tried to run the app with only that folder it was not enough. Do I need to upload all the files to the server? I did several tests removing the folders I used during development, but I only managed to run in production mode when everything was the same as in dev mode.
I looked in the documentation for something about this but found nothing. Can anybody help me?
Thank you
Honestly, you should only really need the dist folder, as that holds the 'compiled' JS files. To run your application you'd commonly use node dist/main.js. Which files you upload is up to you. Personally, I use a lot of continuous integration, so I just clone the repo into my container/server and use yarn start:prod. That way, every time I deploy, I'm generating the files required to run in a production environment.
Like @Kim Kern mentioned, some node modules are native and built using node-gyp, so it's always best to build your node_modules on the server/container when deploying. Your deployment script should look something like this:
git clone git@github.com:myuser/myrepo.git /var/www/ && \
cd /var/www/ && \
node -v && \
yarn && \
yarn build && \
yarn start:prod
The above script should
1) pull the required repo into a 'hosted' directory
2) check the node version
3) install node_modules and build native scripts etc
4) build the production distribution
5) run the production JS scripts
If you look in your package.json file you'll notice the different scripts that are run when you use yarn start, yarn start:dev and yarn start:prod. In dev you'll notice the use of ts-node, which runs TypeScript directly in Node, and the start:dev script uses nodemon to restart ts-node when files change. You'll also see that the start:prod script uses node dist/main.js and that the prestart:prod script runs rm -rf dist && tsc, which removes the dist folder and 'compiles' the JavaScript required for a production environment.
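For illustration, the scripts block of a NestJS starter from that era looked roughly like this; exact contents vary by version, so treat this as an assumption:
{
  "scripts": {
    "start": "ts-node -r tsconfig-paths/register src/main.ts",
    "start:dev": "nodemon",
    "prestart:prod": "rm -rf dist && tsc",
    "start:prod": "node dist/main.js"
  }
}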
However, the drawback of building a TypeScript application on your server without continuous integration is the possibility of compilation errors which you wouldn't see or know about until running the prod scripts. I would recommend putting a procedure in place to compile the JavaScript from TypeScript before making a deployment, as you don't want to delete the current dist build before knowing the next release will build and run!
For me this approach worked, and all you need is the dist folder (a sketch of the commands follows this list):
Create a prod build of your application using npm run start:prod; this creates a dist folder within your application source.
Copy the dist folder to your server.
To get all the node_modules dependencies on your server, just copy your package.json file into the dist folder (that you have copied onto the server) and then run npm install from there.
If you are using pm2 to run your node applications, just run pm2 start main.js from within the dist folder.
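A minimal sketch of those steps, assuming the server path and user are placeholders:
# On the build machine: produce dist/
npm run start:prod

# Copy dist/ and package.json to the server (path is illustrative)
scp -r dist/ package.json user@server:/srv/api/

# On the server: install production deps inside dist/, then run with pm2
cd /srv/api
cp package.json dist/
cd dist
npm install --production
pm2 start main.js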
Mostly, you will only need the dependencies in node_modules. You should build the libraries on your server, though, instead of copying them from your dev machine. Libraries like bcrypt have machine-specific code and probably won't run on a different machine. (About 30% of npm libraries have native bindings.)
So for your deployment I would recommend checking out your git repository on your server and then just running npm run start:prod (which builds the project every time) directly there.
Just use the Nest CLI and build with
nest build
Afterwards you get a dist folder with the compiled code.
You can then place it on a server and run it, e.g. with the PM2 process manager:
production=true pm2 start dist/main.js
In the former command the environment variable production is set to true. That can be useful, e.g., when running the Nest.js server over HTTPS.
If you want to run an HTTPS-secured server, you also have to include the certificates when starting the server. When the environment variable production is set to true, the certificates get included in the startup of the Nest.js application in main.ts, like the following:
import { NestFactory } from '@nestjs/core'
import { NestExpressApplication } from '@nestjs/platform-express'
import * as fs from 'fs'
import { AppModule } from './app.module'

async function bootstrap() {
  let appConfig = {}
  if (process.env.production) {
    console.log('process env production: ', process.env.production)
    // Load the TLS certificates only in production
    const httpsOptions = {
      key: fs.readFileSync('/etc/certs/letsencrypt/live/testtest.de/privkey.pem'),
      cert: fs.readFileSync('/etc/certs/letsencrypt/live/testtest.de/fullchain.pem'),
    }
    // prod config
    appConfig = {
      httpsOptions,
    }
  }
  const app = await NestFactory.create<NestExpressApplication>(
    AppModule,
    appConfig,
  )
  app.enableCors()
  app.setGlobalPrefix('v1')
  await app.listen(3300)
}
bootstrap()
We don't build our application in production; instead, we build it when creating our docker container.
The steps for us roughly are:
Run npm install and whatever tooling you need to build the application.
Create docker container and copy dist/, node_modules and package.json
Inside the docker container run npm rebuild bcrypt --update-binary (a rough sketch of such a Dockerfile is below).
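A rough Dockerfile for steps 2 and 3 might look like this; the base image and paths are assumptions, not the exact setup described above:
FROM node:12-buster
WORKDIR /app
# Copy the artifacts produced by the build tooling
COPY dist/ ./dist/
COPY node_modules/ ./node_modules/
COPY package.json ./
# Rebuild native bindings against the container's environment
RUN npm rebuild bcrypt --update-binary
CMD ["node", "dist/main.js"]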
We are using NX for the monorepo where we keep our APIs, and Docker for our images and containers. When we have to create a docker image, we only run npx nx build <project>, which generates the build in dist/apps/<project>. This folder goes into the docker image, along with the package.json, and that's it. You don't need to add node_modules, because the dependencies are listed in package.json; just be sure to include npm install in your Dockerfile, as in the sketch below.
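A sketch of such a Dockerfile, with <project> and the base image as placeholders:
FROM node:16-alpine
WORKDIR /app
# Only the built output and the manifests go into the image
COPY dist/apps/<project>/ ./
COPY package.json package-lock.json ./
RUN npm install --production
CMD ["node", "main.js"]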

Can't find module error when building docker for NodeJS app

I wrote a Dockerfile for a node application. This is the Dockerfile:
FROM node:10.15.0
COPY frontend/ frontend/
WORKDIR frontend/
RUN npm install
RUN npm start
When I try to build this Dockerfile, I get this error: ERROR in ./app/main.js Module not found: Error: Can't resolve './ResetPwd' in '/frontend/app'
So I added RUN ls and RUN ls /app to the Dockerfile. Both of the files are there! I'm not familiar with NodeJS and its build process at all. Can anybody help me with this?
Point: I'm not sure if it helps or not, but I'm using Webpack too.
The problem was that our front-end developer assumed that Node imports are case-insensitive, and he was using Windows. I tried to run the Dockerfile on a Mac, and that's why it couldn't find the modules. The module name was resetPass!
This question saved me!
Hope this helps somebody else.
I have an Angular app and I was trying to containerize it using Docker.
I built the app on a Windows machine, and I was trying to build it inside a Linux container.
The app was building fine on my Windows machine but failing with the following error in the Docker environment:
ERROR in folder1/folder2/name.component.ts: - error TS2307: Cannot find module '../../../folder1/File.name'.
import { Interface1} from '../../../folder1/File.name';
Cannot find module '../../../node_modules/rxjs/Observable.d.ts'.
import { Observable } from 'rxjs/observable';
It was driving me nuts.
I saw this question and at first did not think it was what was going on. The next day I decided to build the same app in a Linux environment just to make sure. I used WSL 2 and boom:
the real problem!
ERROR in error TS1149: File name '/../../node_modules/rxjs/observable.d.ts' differs from already included file name '/../../node_modules/rxjs/Observable.d.ts' only in casing.
6 import { Observable } from 'rxjs/observable';
So it was a casing issue. I corrected the casing and it built fine!
I can't say if this will work for sure, since I don't know whether npm start actually triggers webpack; if it doesn't, you'll have to add an extra RUN line after the COPY frontend/ ./ line.
There are a few issues here; try using this docker file instead:
FROM node:10.15.0
# Copy dependency manifests and install packages first (better layer caching)
WORKDIR /frontend
COPY frontend/package*.json ./
RUN npm install
# Copy the rest of the source
COPY frontend/ ./
# Command that executes when the container starts up
CMD ["npm", "start"]
Make sure that you also update your .dockerignore to include node_modules. You'll have to build and run the container with the following commands.
docker build -t frontendApp .
docker run -p 8080:8080 frontendApp
The -p 8080:8080 flag maps the container's internal port to the host so you can view the app in a browser; just change it to whatever port webpack is using to serve your content.
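A minimal .dockerignore for the step above, so the host's node_modules never enters the build context:
node_modules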
I had to rebuild the disruptive package, like in this issue for node-sass
The command would be npm rebuild <package-name>
For me, this was npm rebuild node-sass

How to use remote node_modules (inside container) on WebStorm?

I installed all npm dependencies inside the container, because I don't want to install them on my host machine. Everything is okay, it works. But there is a problem with WebStorm.
It says "Unresolved function" for npm dependencies.
How do I fix that problem? How can I say "Hey WebStorm, the node_modules directory is inside the container :)"
WebStorm expects node_modules to be located in the project folder.
You can try setting up NODE_PATH in the Node.js run configuration template: Run | Edit Configurations..., expand the Templates node, select the Node.js configuration, and specify NODE_PATH in the Environment variables field.
Please see comments in https://youtrack.jetbrains.com/issue/WEB-19476.
But I'm not sure it will work for modules installed in a container...
Even though you expose the container's node_modules folder, it's likely not to work as expected, because npm dependencies are built for their host environment, which will not be the same as your local dev machine.
This applies even more strongly if you want to run CLI development tools, which are sometimes compiled binaries.
TLDR;
The trick is to update the path inside the docker container from /your-path to /opt/project.
Detailed solution
The issue with WebStorm is that it doesn't let you define the path from which to pick up node_modules, but it has a default path from which it does. I was facing the same issue while setting up remote debugging for a backend node service running inside docker.
You need to update your Dockerfile. Suppose your Dockerfile was something like this:
# pull official base image
FROM node:12.11-buster
# set working directory
WORKDIR /app
COPY ./package.json ./package-lock.json /app/
RUN npm install
Now this won't let WebStorm detect node_modules for remote debugging or any other integration you need.
But if you update the Dockerfile to something like this:
# pull official base image
FROM node:12.11-buster
# set working directory
# This is done specifically to enable debugging in WebStorm.
WORKDIR /opt/project
COPY ./package.json ./package-lock.json ./.npmrc /opt/project/
RUN npm install
Then WebStorm is able to detect everything as expected.
The trick is to update the path from /your-path to /opt/project.
And your docker-compose file should look something like this:
version: "3.7"
services:
backend-service:
build:
dockerfile: ./Dockerfile.local
context: ./
command: nodemon app.js
volumes:
- ./:/opt/project
- /opt/project/node_modules/
ports:
- 6060:6060
You can check more details around this in this blog

Syncing local code inside docker container without having container service running

I have created a docker image which has an executable Node.js app.
I have multiple modules which are independent of each other. These modules are created as packages inside docker using the npm link command, and hence can be required in my Node.js index file.
The directory structure is as follows:
|- node_modules
|- src
   |- app
      |- index.js
   |- independent_modules
      |- some_independent_task
      |- some_other_independent_task
While building the image I created an npm link for every independent module in the root node_modules. This creates a node_modules folder inside every independent module, which is not present locally; it exists only inside the container.
I require these modules in src/app/index.js and proceed with my task.
This docker image does not use a server to keep the container running, hence the container stops when the process ends.
I build the image using
docker build -t demoapp .
To run index.js in the dev environment, I need to mount the local src directory onto the docker src directory so that changes are reflected without rebuilding the image.
For mounting and running I use the command
docker run -v $(pwd)/src:/src demoapp node src/index.js
The problem here is that locally no dependencies are installed, i.e. no node_modules folder is present. So when the local directory is mounted into docker, it replaces the container's src with the local one, and the dependencies installed in node_modules inside docker vanish.
I tried using .dockerignore to keep the node_modules folder from being mounted, but it didn't work. Keeping an empty node_modules locally also doesn't work.
I also tried using docker-compose to keep volumes synced while hiding node_modules from the sync, but I think this only syncs while the docker container keeps running, e.g. when it runs a server.
This is the docker-compose.yml I used
# docker-compose.yml
version: "2"
services:
  demoapp_container:
    build: .
    image: demoapp
    volumes:
      - "./src:/src"
      - "/src/independent_modules/some_independent_task/node_modules"
      - "/src/independent_modules/some_other_independent_task/node_modules"
    container_name: demoapp_container
    command: echo 'ready'
    environment:
      - NODE_ENV=development
I read here that with this setup node_modules is skipped from syncing. But this also doesn't work for me.
I need to execute this index.js every time against a stopped docker container, with the local code synced to the docker workdir and the dependencies folder, i.e. node_modules, skipped.
One more thing that would be somewhat helpful: every time I do docker-compose up or docker-compose run, it prints ready. Can I have something where I can override the command in docker-compose with a command passed from the CLI?
Something like docker-compose run | {some command}.
You've defined a docker-compose file but you're not actually using it.
Since you use docker run, this is the command you should try:
docker run \
  -v $(pwd)/src:/src \
  -v /src/independent_modules/some_independent_task/node_modules \
  -v /src/independent_modules/some_other_independent_task/node_modules \
  demoapp \
  node src/index.js
If you want to use docker-compose, you should change command to node src/index.js; then you can use docker-compose up instead of the whole docker run .... As for your last question, you can also override the command from the CLI with docker-compose run, e.g. docker-compose run demoapp_container node src/index.js.
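Applied to the compose file from the question, that change looks like this:
# docker-compose.yml
version: "2"
services:
  demoapp_container:
    build: .
    image: demoapp
    volumes:
      - "./src:/src"
      - "/src/independent_modules/some_independent_task/node_modules"
      - "/src/independent_modules/some_other_independent_task/node_modules"
    container_name: demoapp_container
    command: node src/index.js
    environment:
      - NODE_ENV=development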

How can you get Grunt livereload to work inside Docker?

I'm trying to use Docker as a dev environment in Windows.
The app I'm developing uses Node, NPM and Bower for setting up the dev tools, and Grunt for its task running, and includes a live reload so the app updates when the code changes. Pretty standard. It works fine outside of Docker, but I keep running into the Grunt error Fatal error: Unable to find local grunt no matter how I try to do it inside Docker.
My latest effort involves installing all the npm and bower dependencies to an app directory in the image at build time, as well as copying the app's Gruntfile.js to that directory.
Then in Docker Compose I create a volume that is linked to the host app, and ask Grunt to watch that volume using Grunt's --base option. It still won't work; I still get the fatal error.
Here are the Docker files in question:
Dockerfile:
# Pull base image.
FROM node:5.1
# Setup environment
ENV NODE_ENV development
# Setup build folder
RUN mkdir /app
WORKDIR /app
# Build apps
#globals
RUN npm install -g bower
RUN echo '{ "allow_root": true }' > /root/.bowerrc
RUN npm install -g grunt
RUN npm install -g grunt-cli
RUN apt-get update
RUN apt-get install ruby-compass -y
#locals
ADD package.json /app/
ADD Gruntfile.js /app/
RUN npm install
ADD bower.json /app/
RUN bower install
docker-compose.yml:
angular:
  build: .
  command: sh /host_app/startup.sh
  volumes:
    - .:/host_app
  net: "host"
startup.sh:
#!/bin/bash
grunt --base /host_app serve
The only way I can actually get the app to run at all in Docker is to copy all the files over to the image at build time, create the dev dependencies there and then, and run Grunt against the copied files. But then I have to run a new build every time I change anything in my app.
There must be a way! My Django app is able to do a live reload in Docker with no problems, as per Docker's own Django quick-start instructions, so I know live reload can work with Docker.
PS: I have tried leaving the Gruntfile on the volume and using Grunt's --gruntfile option, but it still crashes. I have also tried creating the dependencies at Docker Compose time, in the shared volume, but I run into npm errors to do with unpacking tars. I get the impression that the VM can't cope with the amount of data running over the shared file system and chokes, or maybe the Windows file system can't store the Linux files properly. Or something.
