npm ERR! missing script: serve - node.js

I am getting this error while starting the Docker container. I am using nodemon to listen for file changes.
Dockerfile
FROM node:alpine
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm","run","serve"]
package.json
{
  "dependencies": {
    "express": "*",
    "nodemon": "*"
  },
  "scripts": {
    "serve": "nodemon index.js",
    "start": "node index.js"
  }
}
build command
docker build -f Dockerfile.dev -t test/nodeapp1 .
run command
docker run -p 3000:8080 -v /app/node_modules -v pwd:/app test/nodeapp1
I am new to Docker and not able to figure out the cause.

Make these changes in your Dockerfile:
FROM node:alpine
# Put global npm packages under the node user's home so no root is needed
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
ENV HOME=/home/node/app
ENV PATH="/home/node/.npm-global/bin:${PATH}"
USER node
RUN npm install -g nodemon
# Create the app directory and install the dependencies there
RUN mkdir -p ${HOME}
WORKDIR ${HOME}
COPY package.json ${HOME}
RUN npm install
CMD ["npm", "run", "serve"]
Build the docker container
docker build -f Dockerfile -t prac/nodeapp .
Run the docker container
docker run -p 3000:8080 -v /home/node/app/node_modules -v $(pwd):/home/node/app prac/nodeapp

Changing the WORKDIR to a new value worked, because the volumes mounted over /app no longer shadow the files that were COPYed into the image:
FROM node:alpine
WORKDIR /dir
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "serve"]

Your docker run -v options are wrong. As written, -v pwd:/app mounts a named volume called pwd over /app rather than your current directory, so the container never sees your files. You probably actually meant to write
docker run ... -v $PWD:/app ...
docker run ... -v $(pwd):/app ...
to use the current directory (from the PWD environment variable or from the pwd command, respectively) as a bind mount.
I tend not to recommend this pattern, especially for Node applications, where the host dependencies are minimal and you're not interacting much with other containers. It's probably easier to just install Node locally (if you don't already have it) and do live development against that. When you want to use Docker to deploy your application, use the version you've COPYed into the image, and don't separately use a -v option to inject your code over it.
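For example, once the image from the question is built, running it without any -v options serves exactly the code that was COPYed in at build time:
docker build -f Dockerfile.dev -t test/nodeapp1 .
docker run -p 3000:8080 test/nodeapp1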

Related

Using Nodemon with Docker is not refreshing the project when the codebase changes

I am building a Node.js application using Postgres as the database, with Docker / Docker Compose for the development environment. I could dockerise my environment, but one issue remains: the project is not refreshing or showing the latest changes via Nodemon. Basically, it is not detecting code changes.
I have a docker-compose.yaml file with the following code.
version: '3.8'
services:
  web:
    container_name: web-server
    build: .
    ports:
      - 4000:4000
    restart: always
This is my Dockerfile.
FROM node:14-alpine
RUN apk update && apk upgrade
RUN apk add nodejs
RUN rm -rf /var/cache/apk/*
COPY . /
RUN cd /; npm install;
RUN npm install -g nodemon
EXPOSE 4000
CMD ["npm", "run", "docker-dev"]
This is my docker-dev script in package.json file
"docker-dev": "nodemon -L --inspect=0.0.0.0 index.js"
When I spin up the environment by running docker-compose up -d and go to localhost:4000 in the browser, I can see my application up and running there. But when I make changes to the code and refresh the page, the new changes are not there. It seems like Nodemon is not working; I had to remove the Docker images and spin up the environment again to see the latest changes.
How can I fix it?
During development, I often use a raw node image and run it like this
docker run --rm -d -v $(pwd):/app --workdir /app -u $(id -u):$(id -g) -p 3000:3000 --entrypoint /bin/bash node -c "npm install && npm start"
The different parts of the command do the following
--rm remove the container when it exits
-d run detached
-v $(pwd):/app map the current directory to /app in the container
--workdir /app set the working directory in the container to /app
-u $(id -u):$(id -g) run the container with my UID and GID so any files created in the host directory will be owned by me
-p 3000:3000 map port 3000 to the host
--entrypoint /bin/bash set the entrypoint to bash
node run the official node image
-c "npm install && npm start" on container start, install npm packages and start the app
You can do something similar. If we replace a few things, this should match your project:
docker run --rm -d -v $(pwd):/app --workdir /app -u $(id -u):$(id -g) -p 4000:4000 --entrypoint /bin/sh node:14-alpine -c "npm install && npm install -g nodemon && npm run docker-dev"
I've changed the entrypoint because Alpine doesn't have bash installed. The image is node:14-alpine. The port is 4000. And on start, you want it to install nodemon and run the docker-dev script.
I put the command in a shell script so I don't have to type it all out every time.
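A minimal sketch of such a wrapper (the file name dev.sh is just an illustration):
#!/bin/sh
# dev.sh -- wraps the long docker run invocation above
docker run --rm -d \
  -v "$(pwd)":/app \
  --workdir /app \
  -u "$(id -u)":"$(id -g)" \
  -p 4000:4000 \
  --entrypoint /bin/sh \
  node:14-alpine \
  -c "npm install && npm install -g nodemon && npm run docker-dev"
Make it executable with chmod +x dev.sh and run ./dev.sh.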
You are making your changes in your local file system, not in the Docker container's file system.
You need to use a volume if you want to see your changes reflected in real time, as sketched below.
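A sketch of what that could look like in the docker-compose.yaml from the question, assuming the Dockerfile is reworked to install the app under a WORKDIR such as /app instead of the image root:
services:
  web:
    build: .
    ports:
      - 4000:4000
    restart: always
    volumes:
      - .:/app             # bind-mount the source so nodemon sees your edits
      - /app/node_modules  # anonymous volume keeps the image's node_modules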

expo build:web docker cannot find correct entry point

NOOB to building an expo web app. I'm trying to build a docker container with my expo web app. After I run "expo build:web" and the build successfully finishes, I get a run error when I try to execute.
I have a custom entry point defined in my app.json: "entryPoint": "./index.js",
When I run the app in the docker container and connect to http://localhost:19006 I get:
./node_modules/expo/AppEntry.js:3
Module not found: Can't resolve '../../App'
I can see the entry point defined in the "asset-manifest.json" file. But I don't know the steps to call that as my starting point and why it's trying to use "../../App" instead.
Here is my Dockerfile if it helps:
FROM node:12.20.2 as build
ARG NODE_ENV=production
ENV NODE_ENV $NODE_ENV
ARG PORT=19006
ENV PORT $PORT
EXPOSE $PORT 19001 19002
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
ENV PATH /home/node/.npm-global/bin:$PATH
RUN npm i --unsafe-perm -g npm@latest expo-cli@latest sharp-cli
RUN mkdir /opt/web && chown node:node /opt/web
WORKDIR /opt/web
ENV PATH /opt/web/.bin:$PATH
USER node
COPY package.json ./
COPY .env.production ./.env
COPY ./private ./private
RUN yarn install --silent
RUN ls -al
WORKDIR /opt/web/app
COPY ./web-build .
RUN ls -al
ENTRYPOINT ["npm", "run"]
CMD ["web"]
I figured it out!
In my package.json file, I had the line:
"main": "node_modules/expo/AppEntry.js",
I changed it to:
"main": "index.js",
This worked in yarn start mode but not in production so I never gave it another look.
I also had to change the Dockerfile a little, using "serve":
FROM node:12.20.2 as build
ARG NODE_ENV=production
ENV NODE_ENV $NODE_ENV
ARG PORT=19006
ENV PORT $PORT
EXPOSE $PORT 19001 19002
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
ENV PATH /home/node/.npm-global/bin:$PATH
RUN npm i --unsafe-perm -g npm@latest expo-cli@latest serve
RUN mkdir /opt/web && chown node:node /opt/web
WORKDIR /opt/web
ENV PATH /opt/web/.bin:$PATH
USER node
COPY package.json ./
COPY .env.production ./.env
COPY ./private ./private
RUN yarn install --silent
RUN ls -al
WORKDIR /opt/web/app
COPY ./web-build .
RUN ls -al
CMD ["serve","--no-port-switching","-p","19006"]

How to run Docker image and configure it with nginx

I have made a Docker image for a Node.js app. It runs perfectly locally, but in production I have to configure it with nginx (which I installed on the host machine). We normally do it like this:
location /location_of_app_folder {
    proxy_pass http://api.prv:51967/info;
}
How do I configure this in nginx for the Docker image, and how do I run the image? We use pm2 in Node.js, which I added in the Dockerfile, but the container only runs until I press Ctrl+C.
FROM keymetrics/pm2:latest-alpine
RUN mkdir -p /app
WORKDIR /app
COPY package.json ./
COPY .npmrc ./
RUN npm config set registry http://private.repo/:_authToken=authtoken.
RUN npm install utilities@0.1.9
RUN apk update && apk add yarn python g++ make && rm -rf /var/cache/apk/*
ENV NODE_ENV=production
RUN npm config set registry https://registry.npmjs.org/
RUN npm install
COPY . /app
RUN ls -al -R
EXPOSE 51967
CMD [ "pm2-runtime", "start", "pm2.json" ]
I am running the container with the command:
sudo docker run -it --network=host docker_repo_name
Publish the container's port and use the same nginx configuration, e.g.:
sudo docker run -it -p 51967:51967 docker_repo_name
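Since nginx runs on the host and the port is now published, the existing location block can point at localhost instead (a sketch using the placeholder path from the question):
location /location_of_app_folder {
    proxy_pass http://127.0.0.1:51967/info;
}
And to keep the container running after you close the terminal, run it detached with -d instead of -it:
sudo docker run -d -p 51967:51967 docker_repo_name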

Running gulp in Docker compose - does not create files

I have an issue where gulp is not producing any files. It says it finished, but no files are being created.
If I log in to my docker instance using:
docker exec -t -i myservice-service /bin/bash
and run the gulp command, it works properly: all the files defined in gulpfile.js are created. In other words, public/dist/ is populated with main.js and the other CSS files.
This is my Dockerfile.
FROM node:9
RUN mkdir -p /usr/src/app
RUN mkdir -p /usr/src/logs
WORKDIR /usr/src/app
# GULP Installation
RUN npm install -g gulp
RUN npm install gulp
COPY package*.json /usr/src/app/
COPY .npmrc /usr/src/app/
RUN cd /usr/src/app/ && npm install && npm install -g nodemon
COPY . /usr/src/app
RUN chown -R node:node /usr/src/app && chown -R node:node /usr/src/logs
USER node
EXPOSE 3000
RUN gulp
CMD ["npm", "run-script", "start" ]
And this is my Compose file (development):
version: "3"
services:
myservice-service:
build: .
image: myservice-service
container_name: myservice-service
volumes:
- .:/usr/src/app
- /usr/src/app/node_modules
environment:
- NODE_ENV=dev
ports:
- 3000:3000
command: nodemon --delay 2 ./bin/www
I run it as:
docker-compose -f docker-compose.development.yml up --build
When I run it like that, it does not create any files, even though I get the same output on the screen as when I run the command manually.
I have spent hours trying to make it work; I tried setting permissions and whatnot, but it just does not work.
My expectation was to have public/dist/ populated with files.
Any help is appreciated.
UPDATE: it works, but I have doubts.
I managed to make it work by using command inside the Compose file itself.
So in my case:
command: bash -c "gulp && nodemon --delay 2 ./bin/www"
My reasoning is that gulp should be run inside the Dockerfile itself, not in the Compose files. But then again, this is outside my scope of knowledge.
The Dockerfile is run at build time: it COPYs all the files from your local directory into the container and then runs gulp, which creates the output files.
You then mount the local folder over the container's file system, pretty much overwriting what was done in the Dockerfile with the original files. Since gulp ran on the files in the container, it did not affect the original files, so you are undoing its changes.
The solutions are either to do as you have mentioned in your question (add it to the command in docker-compose.yml or run it via docker-compose exec) or to write a custom entrypoint script that runs gulp and then the command, something like:
bin/entrypoint.sh
#!/bin/sh
# Run the gulp build first, then hand control over to the image's CMD
gulp
exec "$@"
Dockerfile
FROM node:9
COPY bin/entrypoint.sh /entrypoint.sh
RUN chmod 755 /entrypoint.sh
RUN mkdir -p /usr/src/app
RUN mkdir -p /usr/src/logs
WORKDIR /usr/src/app
# GULP Installation
RUN npm install -g gulp
RUN npm install gulp
COPY package*.json /usr/src/app/
COPY .npmrc /usr/src/app/
RUN cd /usr/src/app/ && npm install && npm install -g nodemon
COPY . /usr/src/app
RUN chown -R node:node /usr/src/app && chown -R node:node /usr/src/logs
USER node
EXPOSE 3000
ENTRYPOINT ["/entrypoint.sh"]
CMD ["npm", "run-script", "start" ]
This will make your container start-up a little less predictable, though, as it will run gulp each time the container starts (e.g. after every deployment) if you use the same Dockerfile in dev and production.
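For completeness, the docker-compose exec variant mentioned above would look roughly like this, using the service name from your Compose file:
docker-compose -f docker-compose.development.yml up -d --build
docker-compose -f docker-compose.development.yml exec myservice-service gulp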

Installing npm dependencies inside docker and testing from volume

I want to use Docker to create development environments for a simple node.js project. I'd like to install my project's dependencies (they are all npm packages) inside the docker container (so they won't touch my host) and still mount my code using a volume. So, the container should be able to find the node_modules folder at the path where I mount the volume, but I should not see it from the host.
This is my Dockerfile:
FROM node:6
RUN mkdir /code
COPY package.json /code/package.json
WORKDIR /code
RUN npm install
This is how I run it:
docker build --tag my-dev-env .
docker run --rm --interactive --tty --volume $(pwd):/code my-dev-env npm test
And this is my package.json:
{
  "private": true,
  "name": "my-project",
  "version": "0.0.0",
  "description": "My project",
  "scripts": {
    "test": "jasmine"
  },
  "devDependencies": {
    "jasmine": "2.4"
  },
  "license": "MIT"
}
It fails because it can't find jasmine, so it's not really installing it:
> jasmine
sh: 1: jasmine: not found
Can what I want be accomplished with Docker? An alternative would be to install the packages globally. I also tried npm install -g to no avail.
I'm on Debian with Docker version 1.12.1, build 23cf638.
The solution is to also declare /code/node_modules as a volume, only without bind-mounting it to any directory in the host. Like this:
docker run --rm --interactive --tty --volume /code/node_modules --volume $(pwd):/code my-dev-env npm test
As indicated by @JesusRT, npm install was working just fine but bind-mounting $(pwd) to /code alone was shadowing the existing contents of /code in the image. We can recover whatever we want from /code in the container by declaring it as a data volume -- in this case, just /code/node_modules, as shown above.
A very similar problem is already discussed in Docker-compose: node_modules not present in a volume after npm install succeeds.
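For reference, a docker-compose equivalent of that docker run command might look like this (the service name dev is hypothetical):
version: "3"
services:
  dev:
    image: my-dev-env
    command: npm test
    volumes:
      - .:/code            # the source tree, bind-mounted from the host
      - /code/node_modules # anonymous volume preserves the image's node_modules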
The issue here is that you are overwriting the /code folder.
Note that you are executing the npm install command at build time, so the image that is created has the node_modules folder inside /code. The problem is that you are mounting a volume onto the /code folder when executing docker run, so this folder is overwritten with the content of your local machine.
One approach could be executing npm install before the npm test command inside the container:
docker run --rm --interactive --tty --volume $(pwd):/code my-dev-env sh -c "npm install && npm test"
Also, in order for the jasmine command to work properly, you will have to modify your package.json as follows:
"scripts": {
  "test": "./node_modules/.bin/jasmine"
}
