I have an Angular project that I built a Docker image for. It runs locally: I can launch it with npm start and see my app on localhost:4200.
But when I run the built image, I can't reach the project in the browser at the same URL. The browser just tells me it's unable to connect. I also tried Postman and got the same error.
Dockerfile :
FROM node:latest
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
# Bundle app source
COPY . /usr/src/app
EXPOSE 4200
CMD [ "npm", "start" ]
The command
docker build -t test-front .
...
docker run -it -p 4200:4200 test-front
It builds the image and runs it fine, but I can't access the app in the browser.
I used the answer to this question to remake my Dockerfile:
Cannot find module for a node js app running in a docker compose environment
What I did:
I ran `docker exec -it NAME_OF_THE_CONTAINER /bin/bash` to get inside the container, then retried `npm start` from there.
Since port 4200 was already in use, it asked me to use a different port:
root@a66d5cff3e64:/usr/src/app# npm start
> xxx@0.0.0 start
> ng serve
? Port 4200 is already in use.
Would you like to use a different port? Yes
✔ Browser application bundle generation complete.
...
** Angular Live Development Server is listening on localhost:43225, open your browser on http://localhost:43225/ **
The project launched correctly inside the container, with no error message.
I think the container isn't exposing the port correctly.
Checking port
I used the command:
user@user:~/front$ docker port <container_name>
4200/tcp -> 0.0.0.0:4200
4200/tcp -> :::4200
This is what I have.
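A quick way to double-check where the dev server is actually listening, assuming curl is available in the node image, is to compare a request from inside the container with one from the host:
docker exec <container_name> curl -I http://localhost:4200    # answers from inside the container
curl -I http://localhost:4200                                 # connection refused from the host
If the first request succeeds while the second is refused even though docker port shows the mapping, the server is only bound to the container's loopback interface.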
Small detail
I'd like to point out that using node:boron in the Dockerfile broke the container: it didn't work when I used the docker run command.
It failed with the message:
> xxxxxx@0.0.0 start /usr/src/app
> ng serve
/usr/src/app/node_modules/@angular/cli/bin/ng:26
);
^
SyntaxError: Unexpected token )
at createScript (vm.js:56:10)
at Object.runInThisContext (vm.js:97:10)
at Module._compile (module.js:549:28)
at Object.Module._extensions..js (module.js:586:10)
at Module.load (module.js:494:32)
at tryModuleLoad (module.js:453:12)
at Function.Module._load (module.js:445:3)
at Module.runMain (module.js:611:10)
at run (bootstrap_node.js:394:7)
at startup (bootstrap_node.js:160:9)
npm ERR! Linux 5.11.0-22-generic
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "start"
npm ERR! node v6.17.1
npm ERR! npm v3.10.10
npm ERR! code ELIFECYCLE
npm ERR! xxxx@0.0.0 start: `ng serve`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the xxxx@0.0.0 start script 'ng serve'.
npm ERR! Make sure you have the latest version of node.js and npm installed.
npm ERR! If you do, this is most likely a problem with the xxx package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR! ng serve
npm ERR! You can get information on how to open an issue for this project with:
npm ERR! npm bugs xxx
npm ERR! Or if that isn't available, you can get their info via:
npm ERR! npm owner ls xxx
npm ERR! There is likely additional logging output above.
npm ERR! Please include the following file with any support request:
npm ERR! /usr/src/app/npm-debug.log
I've changed my last line to CMD [ "npm", "run", "ng", "serve", "-host", "0.0.0.0" ].
I can build the image, but when I run it, I get:
> t-maj-voltron-front@0.0.0 ng
> ng "serve" "0.0.0.0"
Project '0.0.0.0' does not exist.
npm notice
npm notice New minor version of npm available! 7.15.1 -> 7.19.0
npm notice Changelog: https://github.com/npm/cli/releases/tag/v7.19.0
npm notice Run npm install -g npm@7.19.0 to update!
npm notice
A working Dockerfile:
# base image
FROM node:12.14
# install chrome for protractor tests
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
RUN sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list'
RUN apt-get update && apt-get install -yq google-chrome-stable
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /app/package.json
RUN npm install
RUN npm install -g @angular/cli@7.3.9
# add app
COPY . /app
EXPOSE 4200
# start app
CMD ng serve --host 0.0.0.0
I believe you're missing the port forwarding between the exposed port and your local host.
You can forward ports via the -p (or --publish) flag with <localport>:<containerport>.
In your case:
docker run -it -p 4200:4200 test-front
This will forward traffic from localhost:4200 into your container.
EDIT
I did some more reading, and it looks like there's an additional step when working with Angular in Docker.
Try changing your npm start command to ng serve --host 0.0.0.0. As explained very thoroughly by Juan, adding the --host flag makes the dev server "listen to all the interfaces from the container".
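Concretely, either of the following should do it (sketches, assuming the start script runs ng serve as the container output above shows). Hard-code the host in the package.json start script:
"scripts": {
  "start": "ng serve --host 0.0.0.0"
}
Or pass the flag through npm from the Dockerfile, using -- so npm forwards it to ng instead of consuming it:
CMD [ "npm", "start", "--", "--host", "0.0.0.0" ]
Without the --, npm keeps the -host flag for itself, which is why the earlier CMD attempt ended up running ng "serve" "0.0.0.0" and failing with "Project '0.0.0.0' does not exist".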
Related
I am trying to create a task in our Azure pipeline to validate our JavaScript.
We have a Node container which performs an npm install when spun up:
node:
image: node:12-alpine
user: "node"
working_dir: /home/node/app
environment:
- NODE_ENV=development
volumes:
- ./:/home/node/app
expose:
- "8081"
command: "npm install"
To perform my task I have created a make command in the Makefile:
js-check: ## Run Jshint
docker-compose run node npm install && npm run jshint
Which I then call in the build job as follows:
- script: make js-check
displayName: 'Run JSHint'
Locally, when I call make js-check, it performs the install followed by the jshint run, which reports 0 vulnerabilities found. However, when the pipeline reaches this task remotely, it fails, claiming missing write access to /home/node/app:
npm WARN checkPermissions Missing write access to /home/node/app
npm ERR! code EACCES
npm ERR! syscall access
npm ERR! path /home/node/app
npm ERR! errno -13
npm ERR! Error: EACCES: permission denied, access '/home/node/app'
npm ERR! [Error: EACCES: permission denied, access '/home/node/app'] {
npm ERR! errno: -13,
npm ERR! code: 'EACCES',
npm ERR! syscall: 'access',
npm ERR! path: '/home/node/app'
npm ERR! }
npm ERR!
npm ERR! The operation was rejected by your operating system.
npm ERR! It is likely you do not have the permissions to access this file as the current user
npm ERR!
npm ERR! If you believe this might be a permissions issue, please double-check the
npm ERR! permissions of the file and its containing directories, or try running
npm ERR! the command again as root/Administrator.
Your Makefile target runs two commands; the shell interprets the && before Docker ever sees it. That line is equivalent to:
js-check: ## Run Jshint
docker-compose run node npm install
npm run jshint # (without Docker)
It looks like your environment already has Node installed. You need to resolve the permissions issues (generally the CI system will check out source trees as a user that can run commands), and then you can use this native Node:
js-install: package.json
npm install
js-check: js-install
npm run jshint
This has the advantage of only depending on normal JavaScript development tools; you don't need the extra docker-compose.yml file or administrator privileges just to run your unit tests.
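In the Azure pipeline itself, that can be as simple as a script step that uses the agent's own Node (a sketch, assuming Node and npm are already available on the build agent):
- script: |
    npm install
    npm run jshint
  displayName: 'Run JSHint'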
If you really need to run this in Docker, you can either run this as two separate commands, or make the single container command be a shell that can interpret the && itself:
js-install1: package.json docker-compose.yml
docker-compose run node npm install
js-check1: js-install1
docker-compose run node npm run jshint
js-check2: package.json docker-compose.yml
docker-compose run node \
sh -c "npm install && npm run jshint"
The above error occurs because the /home/node directory is owned by the node user in the default node image, while the /app directory is created and owned by root. See this open issue about the error for more information.
I reproduced the same error with your compose file. When I changed the user to root, the error was gone.
So you can try changing the user to root instead of node in your compose file:
node:
image: node:12-alpine
user: "root"
working_dir: /home/node/app
As David pointed out, if you want to run the commands in Docker, you also need to change your Makefile to docker-compose run node sh -c "npm install && npm run jshint".
Another workaround is to build and run your container from a Dockerfile instead of the compose file. See the simple example Dockerfile below:
FROM node:12-alpine
ENV NODE_ENV=development
RUN mkdir -p /home/node/app
RUN chown -R node:node /home/node/app
USER node
WORKDIR /home/node/app
COPY . ./
RUN npm install
CMD [ "npm", "run", "jshint" ]
Then change the Makefile, for example:
js-check: ## Run Jshint
docker build --tag nodejshint:1.0 . && docker run --detach --name jshintContainer nodejshint:1.0
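Note that with --detach the docker run returns immediately, so the pipeline step won't see jshint's exit status. If the step should fail the build when jshint fails, a foreground run is one option (a sketch of an alternative target):
js-check: ## Run Jshint
docker build --tag nodejshint:1.0 . && docker run --rm nodejshint:1.0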
I'm building a Docker container for my Node.js + Vue application.
Since I have a global CSS library in another repository, I have added this line to my package.json file:
"lib-css": "git+ssh://git@git.lib.com:9922/username/lib-css.git#development",
That way, when I run npm install, I also install my CSS library. The problem is that in my local environment it asks for my password and I can enter it, but in the Docker build the process fails with the following error:
Step 7/10 : RUN npm install
---> Running in db10ca83586d
npm WARN deprecated babel-preset-es2015@6.24.1: 🙌 Thanks for using Babel: we recommend using babel-preset-env now: please read babeljs.io/env to update!
npm ERR! Error while executing:
npm ERR! /usr/bin/git ls-remote -h -t ssh://git@git.lib.com:9922/username/lib-css.git
npm ERR!
npm ERR! Host key verification failed.
npm ERR! fatal: Could not read from remote repository.
npm ERR!
npm ERR! Please make sure you have the correct access rights
npm ERR! and the repository exists.
npm ERR!
npm ERR! exited with error code: 128
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2017-12-11T08_49_11_152Z-debug.log
This is my current Dockerfile:
FROM node:carbon
WORKDIR /usr/src/app
RUN mkdir -p /root/.ssh
COPY .secrets /root/.ssh/id_rsa
RUN chmod 700 /root/.ssh && chmod 600 /root/.ssh/*
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm#5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 8081
CMD [ "npm", "run dev" ]
My .secrets file contains the private key associated with the repository.
How can I make this work?
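The "Host key verification failed" message points at a missing known_hosts entry rather than a problem with the key itself. A common fix, sketched under the assumption that ssh-keyscan can reach git.lib.com on port 9922 from the build environment, is to register the host's key before npm install runs:
RUN mkdir -p /root/.ssh
COPY .secrets /root/.ssh/id_rsa
RUN chmod 700 /root/.ssh && chmod 600 /root/.ssh/*
# accept the Git server's host key non-interactively so git+ssh can clone during the build
RUN ssh-keyscan -p 9922 git.lib.com >> /root/.ssh/known_hosts
Keep in mind that a private key copied in like this remains in the image layers.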
When running "docker-compose up", I get the following error:
npm info lifecycle server@1.0.0~dev: server@1.0.0
> server@1.0.0 dev /code/app
> nodemon -L ./bin/www --exec babel-node
sh: 0: getcwd() failed: No such file or directory
path.js:1144
cwd = process.cwd();
^
Error: ENOENT: no such file or directory, uv_cwd at Error (native)
at Object.resolve (path.js:1144:25)
at Function.Module._resolveLookupPaths (module.js:361:17)
at Function.Module._resolveFilename (module.js:431:31)
at Function.Module._load (module.js:388:25)
at Module.require (module.js:468:17)
at require (internal/module.js:20:19)
at Object.<anonymous>
(/usr/local/lib/node_modules/nodemon/bin/nodemon.js:3:11)
at Module._compile (module.js:541:32)
at Object.Module._extensions..js (module.js:550:10)
npm info lifecycle server@1.0.0~dev: Failed to exec dev script
npm ERR! Linux 4.9.36-moby
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "run" "dev"
npm ERR! node v6.3.1
npm ERR! npm v3.10.3
npm ERR! code ELIFECYCLE
npm ERR! server@1.0.0 dev: `nodemon -L ./bin/www --exec babel-node`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the server@1.0.0 dev script 'nodemon -L ./bin/www --exec babel-node'.
My Dockerfile looks like this:
FROM joakimbeng/node-zeromq
RUN mkdir /code/
RUN mkdir /code/app/
COPY package.json /code/
WORKDIR /code
RUN npm install -g nodemon babel-cli
RUN npm install
WORKDIR /code/app
CMD ["npm", "run", "dev"]
And my service like this:
node:
build: ./node/
ports:
- "3000:3000"
volumes:
- ../code:/code/app
links:
- mongodb
- python
environment:
- NODE_ENV=dev
- NODE_PATH=/code/node_modules
- MONGODB_ADDRESS=mongodb
- PYTHON_ADDRESS=python
I've tried to delete all containers and images and run the whole thing again, but the same error appears. It seems to build fine when running "docker-compose build".
What I'm trying to accomplish here is:
1. Let the container handle all the dependencies (node modules)
2. Mount my code base to the container
3. Use nodemon for hot reload
I ended up with something similar to what I did initially. I'm not sure what caused the error in my OP, but the difference seems to be that I keep my dependencies in a different directory.
Dockerfile:
FROM joakimbeng/node-zeromq
RUN mkdir /code/
RUN mkdir /dependencies/
COPY package.json /dependencies/
WORKDIR /dependencies/
RUN npm install -g nodemon babel-cli
RUN npm install
WORKDIR /code/
CMD bash -c "npm run dev"
Service in docker-compose:
node:
build: ./node/
ports:
- "3000:3000"
volumes:
- ../code/:/code
links:
- mongodb
- python
environment:
- NODE_ENV=dev
- NODE_PATH=/dependencies/node_modules
- MONGODB_ADDRESS=mongodb
- PYTHON_ADDRESS=python
This way my dependencies are only installed on build.
Your issue is the volume sharing. When you share a volume from the host to the container, and the folder already exists in the container, the host folder will shadow the container folder.
If you have 10 files inside the container and 0 files on your host, then after volume mapping your container will see 0 files, because the host folder is mounted and it has nothing in it. So your Dockerfile statement
RUN npm install
is effectively gone if the host volume doesn't have the npm install done. Luckily, the solution is simple. You can change your CMD to the following:
CMD bash -c "npm install && npm run dev"
In case you don't want to change the Dockerfile, you can add the following to your docker-compose.yml file for the node service:
command: bash -c "npm install && npm run dev"
Edit (14-Aug):
If you want your dependencies to be in the image, then you need to make a few changes in your docker-compose.yml. What you need is to leave the internal code alone and just link the node_modules from that directory into your app directory:
node:
build: ./node/
ports:
- "3000:3000"
volumes:
- ../code:/code/app
command: bash -c "ln -fs /code/node_modules /code/app/node_modules && exec npm run dev"
links:
- mongodb
- python
environment:
- NODE_ENV=dev
- NODE_PATH=/code/node_modules
- MONGODB_ADDRESS=mongodb
- PYTHON_ADDRESS=python
Another point I notice is that you're running the package.json install in /code but putting your code in /code/app, which is probably wrong when you run the image. But with the new edit I suggested above, this should work.
I'm learning Docker and I'm having some trouble with volumes in a development environment with Node.js.
I have the following simple Dockerfile that aims to start my unit tests from a Node.js parent image:
FROM node:4-onbuild
VOLUME ["/usr/src/app"]
CMD [ "npm", "test" ]
I'm running my container this way:
docker run -v /C/Users/myUser/dockertest:/usr/src/app notest
But I keep receiving the following error:
npm info it worked if it ends with ok
npm info using npm@2.15.9
npm info using node@v4.5.0
npm ERR! Linux 4.4.15-moby
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "test"
npm ERR! node v4.5.0
npm ERR! npm v2.15.9
npm ERR! path /usr/src/app/package.json
npm ERR! code ENOENT
npm ERR! errno -2
npm ERR! syscall open
npm ERR! enoent ENOENT: no such file or directory, open '/usr/src/app/package.json'
npm ERR! enoent This is most likely not a problem with npm itself
npm ERR! enoent and is related to npm not being able to find a file.
npm ERR! enoent
npm ERR! Please include the following file with any support request:
npm ERR! /usr/src/app/npm-debug.log
I don't understand why I'm not able to find the package.json file inside the /usr/src/app folder in my container.
The fact is that if I use the following Dockerfile:
FROM node:4-onbuild
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
CMD [ "npm", "test" ]
My tests will run in the container, but only after the build step (so I have to rebuild each time I make a modification...). I need to be able to run these tests any time I want by quickly launching a docker command (restart? exec?).
Thanks for your help
You want to COPY everything that is build-relevant and mount your codebase on container start.
FROM node:4.4
# Environment variables
ENV HOMEDIR /data
RUN mkdir -p ${HOMEDIR}
WORKDIR ${HOMEDIR}
# install all dependencies
COPY package.json ./
RUN npm install
# add node content initially
COPY . .
CMD ["npm", "test"]
And then link it on start:
version: '2'
services:
myApp:
build:
context: .
dockerfile: Dockerfile
image: "myApp"
volumes:
- .:/data
You should now be able to change your codebase without restarting/rebuilding your container.
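With the bind mount in place, a one-off test run against the current code could then look like this (a sketch using the service name from the compose file above):
docker-compose run --rm myApp npm test
Note that, as discussed in the previous answer, the bind mount replaces /data, so the node_modules installed at build time are hidden unless they also exist in the mounted host directory.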
I have a problem with Docker on Windows (through Docker Toolbox). Maybe someone can help.
My Dockerfile without ONBUILD:
FROM node:5.9.1
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm install
COPY . /usr/src/app
CMD [ "npm", "start" ]
EXPOSE 3000
This works OK (docker build -t test ., then start it with docker run -it --rm --name testrun test).
But if I change the Dockerfile to use the ONBUILD option:
FROM node:5.9.1
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
ONBUILD COPY package.json /usr/src/app/
ONBUILD RUN npm install
ONBUILD COPY . /usr/src/app
CMD [ "npm", "start" ]
EXPOSE 3000
I get errors:
npm info it worked if it ends with ok
npm info using npm@3.7.3
npm info using node@v5.9.1
npm ERR! Linux 4.1.19-boot2docker
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "start"
npm ERR! node v5.9.1
npm ERR! npm v3.7.3
npm ERR! path /usr/src/app/package.json
npm ERR! code ENOENT
npm ERR! errno -2
npm ERR! syscall open
npm ERR! enoent ENOENT: no such file or directory, open '/usr/src/app/package.json'
npm ERR! enoent ENOENT: no such file or directory, open '/usr/src/app/package.json'
npm ERR! enoent This is most likely not a problem with npm itself
npm ERR! enoent and is related to npm not being able to find a file.
npm ERR! enoent
npm ERR! Please include the following file with any support request:
npm ERR! /usr/src/app/npm-debug.log
What am I doing wrong? (I'm a novice with Docker :) ) Maybe I'm using ONBUILD incorrectly? But it's not clear to me from the docs.
As mentioned in the Dockerfile man page:
The ONBUILD instruction adds to the image a trigger instruction to be executed at a later time, when the image is used as the base for another build.
Since you are not building another image starting with "FROM test", those instructions are never executed, meaning the test image does not include what those commands were supposed to do.
Unless you left out some details in your question, you're not using ONBUILD correctly.
The ONBUILD instruction queues commands to run in a subsequent build. The commands you specified above wouldn't be executed unless you wrapped your image by referencing it in the FROM line of another Dockerfile.
Please see the Dockerfile reference on this subject for additional information.
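A minimal sketch of the two-stage flow: the ONBUILD instructions in the base image only fire when a second Dockerfile builds FROM it.
# Base image Dockerfile (build with: docker build -t test .)
FROM node:5.9.1
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
ONBUILD COPY package.json /usr/src/app/
ONBUILD RUN npm install
ONBUILD COPY . /usr/src/app
CMD [ "npm", "start" ]
EXPOSE 3000
# Application Dockerfile in the project directory: the three ONBUILD
# triggers run during this build, copying the sources and running npm install
FROM test
Building an image from that second Dockerfile then behaves just like the non-ONBUILD version.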