How to use remote node_modules (inside container) on WebStorm? - node.js

I installed all npm dependencies inside the container, since I don't want to install them on my host machine. Everything works, but there is a problem with WebStorm.
It reports "Unresolved function" for npm dependencies.
How can I fix that? How can I tell WebStorm "Hey, the node_modules directory is inside the container :)"?

WebStorm expects node_modules to be located in the project folder.
You can try setting NODE_PATH in the Node.js run configuration template: Run | Edit Configurations..., expand the Templates node, select the Node.js configuration, and specify NODE_PATH in the Environment variables field.
Please see comments in https://youtrack.jetbrains.com/issue/WEB-19476.
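For illustration (an assumed path, not one from the question): if the container's node_modules ends up reachable by the IDE at /opt/project/node_modules, the Environment variables field would contain something like:

```
NODE_PATH=/opt/project/node_modules
```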
But I'm not sure it will work for modules installed in container...

Even if you expose the container's node_modules folder, it is likely not to work as expected, because npm dependencies are built for their host environment, which will not be the same as your local dev machine. This applies even more strongly if you want to run CLI development tools, which are sometimes compiled binary files.

TL;DR
The trick is to change the path inside the Docker container from /your-path to /opt/project.
Detailed solution
The issue with WebStorm is that it doesn't let you define the path from which node_modules is picked up; it only looks in a default location. I faced the same issue when I wanted to set up remote debugging for a backend Node service running inside Docker.
You need to update your Dockerfile. Suppose your Dockerfile was something like this:
# pull official base image
FROM node:12.11-buster
# set working directory
WORKDIR /app
COPY ./package.json ./package-lock.json /app/
RUN npm install
Now this won't let WebStorm detect node_modules for remote debugging or any other integration you need. But if you update the Dockerfile to something like this:
# pull official base image
FROM node:12.11-buster
# set working directory
# This done particularly for enabling debugging in webstorm.
WORKDIR /opt/project
COPY ./package.json ./package-lock.json ./.npmrc /opt/project/
RUN npm install
then WebStorm is able to detect everything as expected.
The trick is to change the path from /your-path to /opt/project.
And your docker-compose file should look something like this:
version: "3.7"
services:
  backend-service:
    build:
      dockerfile: ./Dockerfile.local
      context: ./
    command: nodemon app.js
    volumes:
      - ./:/opt/project
      - /opt/project/node_modules/
    ports:
      - 6060:6060
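The second volumes entry is the important part: the bind mount ./:/opt/project would otherwise shadow the node_modules installed into the image at build time, so an anonymous volume is layered on top of it to preserve the container's copy:

```yaml
    volumes:
      # bind-mount the project source into the container
      - ./:/opt/project
      # anonymous volume: masks node_modules inside the bind mount, so
      # the modules installed during `docker build` remain visible
      - /opt/project/node_modules/
```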
You can check more details around this in this blog

Related

Syncing node_modules in docker container with host machine

I would like to dockerize my React application, and I have one question about doing so. I would like to install node_modules in the container and then have them synced to the host, so that I can run the npm commands in the container rather than on the host machine. I achieved this, but the node_modules folder that is synced to my computer is empty, while it is filled in the container. This is an issue because I get "not installed" warnings in the IDE, since the node_modules folder on the host machine is empty.
docker-compose.yml:
version: '3.9'
services:
  frontend:
    build:
      dockerfile: Dockerfile
      context: ./frontend
    volumes:
      - /usr/src/app/node_modules
      - ./frontend:/usr/src/app
Dockerfile:
FROM node:18-alpine3.15
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install && \
    mkdir -p node_modules/.cache && \
    chmod -R 777 node_modules/.cache
COPY ./ ./
CMD npm run start
I would appreciate any tips and/or help.
You can't really "share" node_modules like this because certain OS-specific steps happen during installation. Some modules have compilation steps that need to target the host machine. Other modules have bin declarations that are symlinked, and symlinks cannot be "mounted" or shared between a host and a container. Even different versions of Node cannot share node_modules without rebuilding.
If you are wanting to develop within docker, you have two options:
Editing inside a container with VSCode (maybe other editors do this too?). I've tried this before and it's not very fun and is kind of tedious - it doesn't quite work the way you want.
Edit files on your host machine which are mounted inside docker. Nodemon/webpack/etc will see the changes and rebuild accordingly.
I recommend #2 - I've seen it used at many companies, and it is a very "standard" way of doing development. It does require that you run npm install on the host machine - don't get bogged down trying to avoid an extra npm install.
If you want to make installs and builds faster, your best bet is to mount your npm cache directory into your docker container. You will need to find the npm cache location on both your host and your docker container by running npm get cache in both places. You can do this on your docker container by doing:
docker run --rm -it <your_image> npm get cache
You would mount the cache folder like you would any other volume. You can run a separate install in both docker and on your host machine - the files will only be downloaded once, and all compilations and symlinking will happen correctly.
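As a sketch of that mount (the cache paths are assumptions; verify both sides with npm get cache): ~/.npm is the usual host location on Linux/macOS, and /root/.npm the usual location inside node images running as root:

```yaml
services:
  frontend:
    build:
      dockerfile: Dockerfile
      context: ./frontend
    volumes:
      - /usr/src/app/node_modules
      - ./frontend:/usr/src/app
      # share the npm download cache between host and container: tarballs
      # are downloaded once, while each side still runs its own
      # `npm install` so compilation and symlinking happen correctly
      - ~/.npm:/root/.npm
```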

Is it necessary to copy dependencies in the Dockerfile when using containers for dev only?

I want to create a dev environment for a Node app with Docker. I have seen examples of Dockerfiles with configurations similar to the following:
FROM node:14-alpine
WORKDIR /express-api-server
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["node", "index.js"]
I know that you can use volumes in docker-compose.yml to map host directories to container directories, so you can make changes to your code, save data in a Mongo database, and preserve those changes locally when deleting or stopping the container.
My question is: if I want to use the container for dev purposes only, is there any benefit to copying package.json and package-lock.json and installing the dependencies?
I can use volumes to map node_modules and the package files alongside the code, so there's no need to take those steps when building the image the first time.
Correct, both work.
You just need to balance the pros and cons.
The most obvious advantage is that having one Dockerfile for dev and prod is easier and guarantees that the environments are the same.
I personally have a single Dockerfile for dev / test / prod for maximum coherence, and I mount a volume with the code and dependencies for dev.
When I do npm install, I do it on the host. It instantly restarts the project without needing a rebuild. Then, when I want to publish to prod, I do a docker build, which rebuilds everything, ignoring mounts.
If you do it like me, check that the host's Node.js version and Docker's Node.js version are the same.
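One way to catch such a mismatch early (a sketch using npm's standard engines field; the version is just an example) is to declare the expected Node version in package.json. npm will warn when the running version differs, or refuse to install if engine-strict=true is set in .npmrc:

```json
{
  "engines": {
    "node": "14.x"
  }
}
```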

Unable to run (Linux container) or create image (Windows container) a Gatsby React site (win binaries error, matching manifest error) through Docker

I have my website wrapped up and wanted to containerize it for the experience, as I've never used Docker before. It's built on Gatsby. I did a fresh install of Docker and am running into two issues:
If I try to create an image in a Linux container, it seems to work, but I can't actually run it. I get the following error: "Error in "/app/node_modules/gatsby-transformer-sharp/gatsby-node.js": 'win32-x64' binaries cannot be used on the 'linuxmusl-x64' platform. Please remove the 'node_modules/sharp' directory and run 'npm install' on the 'linuxmusl-x64' platform."
I tried the above, uninstalling and reinstalling sharp in my project, to no avail. I'm not even using sharp, nor do I know what it is.
If I switch to Windows containers, I can't even create an image as I get the following:
"no matching manifest for windows/amd64 10.0.18363 in the manifest list entries"
My Dockerfile is as follows:
FROM node:13.12.0-alpine
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY . ./
# start app
CMD ["npm", "start"]
and my .dockerignore contains
node_modules
build
Dockerfile
Dockerfile.prod
.git
Things I've tried:
This tutorial > https://mherman.org/blog/dockerizing-a-react-app/ (where I got the Dockerfile text)
This tutorial > https://www.robinwieruch.de/docker-create-react-app-development (and its Dockerfile at one point)
Changing the FROM for node to 14.4.0 or 14, with or without -alpine.
Uninstalling and re-installing sharp
Uninstalling sharp entirely and trying to run it that way (I still get the sharp error for some reason)
Reading the documentation, which for whatever reason only tells you how to launch a default application (such as create-react-app) or one pulled from somewhere, but not how to do so for your own website.
Thanks

Can't find module error when building docker for NodeJS app

I wrote a Dockerfile for a Node application. This is the Dockerfile:
FROM node:10.15.0
COPY frontend/ frontend/
WORKDIR frontend/
RUN npm install
RUN npm start
When I try to build this Dockerfile, I get this error: ERROR in ./app/main.js Module not found: Error: Can't resolve './ResetPwd' in '/frontend/app'
So I added RUN ls and RUN ls /app to the Dockerfile. Both of the files are there! I'm not familiar with Node.js and its build process at all. Can anybody help me with this?
Note: I'm not sure if it helps, but I'm using webpack too.
The problem was that our front-end developer assumed that Node imports are case-insensitive, and he was using Windows. I tried to run the Dockerfile on a Mac, and that's why it couldn't find the modules. The module name was resetPass!
This question saved me! Hope this helps somebody else.
I have an Angular app that I was trying to containerize using Docker.
I built the app on a Windows machine, and I was trying to build it inside a Linux container.
The app was building fine on my Windows machine but failing with the following error in the Docker environment:
ERROR in folder1/folder2/name.component.ts: - error TS2307: Cannot find module '../../../folder1/File.name'.
import { Interface1} from '../../../folder1/File.name';
Cannot find module '../../../node_modules/rxjs/Observable.d.ts'.
import { Observable } from 'rxjs/observable';
It was driving me nuts.
I saw this question and at first did not think it was what was going on. The next day, I decided to build the same app in a Linux environment just to make sure. I used WSL 2, and boom:
the real problem!
ERROR in error TS1149: File name '/../../node_modules/rxjs/observable.d.ts' differs from already included file name '/../../node_modules/rxjs/Observable.d.ts' only in casing.
6 import { Observable } from 'rxjs/observable';
So it was a casing issue. I corrected the casing and it builds fine!
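A quick way to check for this class of problem before it bites on Linux (a sketch, assuming a git repository and GNU sort/uniq):

```shell
# Print tracked paths that differ only in casing; empty output means
# no Windows-vs-Linux casing traps among the files git knows about.
git ls-files | sort -f | uniq -di
```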
I can't say whether this will work for sure, since I don't know if npm start actually triggers webpack, but if it doesn't, you'll have to add an extra RUN line after the COPY frontend / line.
There are a few issues here; try using this Dockerfile instead:
FROM node:10.15.0
# Copy dependency files and install packages
WORKDIR /frontend
COPY frontend/package*.json ./
RUN npm install
# Copy the source and remaining files
COPY frontend/ ./
# Command that executes when the container starts up
CMD ["npm", "start"]
Make sure that you also update your .dockerignore to include node_modules. You'll have to build and run the container with the following commands.
docker build -t frontendApp .
docker run -p 8080:8080 frontendApp
The -p 8080:8080 option maps an internal container port to the outside world so you can view the app in a browser; just change it to whatever port webpack is using to serve your content.
I had to rebuild the disruptive package, like in this issue for node-sass
The command would be npm rebuild <package-name>
For me, this was npm rebuild node-sass

Unable to deploy React Application to Kubernetes

I am trying to deploy an application created using Create-React App to Kubernetes through Docker.
When the Jenkins pipeline tries to create the container from the Dockerfile, it fails with the error below:
"Starting the development server...
Failed to compile.
./src/index.js
Module not found: Can't resolve './App.js' in '/app/src'
The folder structure is exactly the same as the default create-react-app folder structure.
Also below is the Dockerfile:
FROM node:10.6.0-jessie
# set working directory
RUN mkdir /app
WORKDIR /app
COPY . .
# add `/usr/src/app/node_modules/.bin` to $PATH
#ENV PATH /usr/src/app/node_modules/.bin:$PATH
RUN npm install
#RUN npm install react-scripts -g --silent
# start app
CMD ["npm", "start"]
I am unable to understand where I might be going wrong.
Edit 1: I would also like to mention that I am able to run the docker container on my local machine using this config.
So any help would be appreciated.
Update 1:
I was able to do kubectl exec -it pod_name -- bash into the container inside the pod. I found out that, for some reason, the "App.js" file was copied to the container as "app.js". Since Linux is case-sensitive, it could not find the file. Changing the import statement in index.js fixed the problem. But I still have no idea what might have caused the file to be copied with a lower-case name, since locally the file exists as "App.js".
The problem you're having will disappear once you adjust your deployment process to a more production-ready setup.
What you're doing currently is installing all (development) dependencies on every Kubernetes node, compiling your application, and then starting a development webserver. This makes your deployed builds inconsistent and increases load and bloat on the deployment nodes.
Instead, what you want to do is create a production-ready build by running npm run build on a build machine, which compiles your application and outputs it to the build folder in your project. You then want to transfer this folder to your server (for instance as a .zip file), which needs a production-ready webserver installed (Nginx is highly recommended and industry standard) to serve the static files from your build.
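As a sketch of that setup (the image tags are assumptions, and create-react-app outputs to the build folder by default), a multi-stage Dockerfile compiles the app in a node stage and ships only the static files with Nginx:

```dockerfile
# build stage: install dependencies and compile the production bundle
FROM node:10.6.0-jessie AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# serve stage: only the static build output, no node_modules in the
# final image
FROM nginx:stable-alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
```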
