I am looking for a way to initialize projects using the docker run command. I will be using node as an example.
Attempt
This is the command I tried.
docker run --rm -it -v "$PWD":/usr/app -w "/usr/app" --name foo_bar node:lts "npm init"
This, however, results in the following error.
node:internal/modules/cjs/loader:1050
throw err;
^
Error: Cannot find module '/npm init'
at Module._resolveFilename (node:internal/modules/cjs/loader:1047:15)
at Module._load (node:internal/modules/cjs/loader:893:27)
at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:81:12)
at node:internal/main/run_main_module:23:47 {
code: 'MODULE_NOT_FOUND',
requireStack: []
}
Node.js v18.14.0
I'm not sure why it is complaining about '/npm init', so an explanation of what is really going wrong here is also welcome.
Expected Result
The npm init CLI tool would start asking me to set the project name, version, et cetera, and the resulting output would end up in my current working directory.
I tried a variant of the command
docker run --rm -it -v "$PWD":/usr/app -w "/usr/app" --name foo_bar node:lts bash
This connects me to the container, where I can run npm init to create the project. That works, but I want to do it all in one go.
docker container run has the following syntax
$ docker run [OPTIONS] IMAGE [COMMAND] [ARG...]
When you wrap your command along with its arguments in quotes, it is treated as a single command: "npm init" is parsed as COMMAND = npm init instead of COMMAND = npm; ARG = init.
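The shell's argument grouping can be seen with a tiny sketch (show_args is a hypothetical helper standing in for docker's argument handling, not a real command):

```shell
# Each argv entry is printed in brackets, so you can see how quoting
# changes what the program actually receives.
show_args() { printf '[%s]' "$@"; echo; }

show_args npm init      # two argv entries -> [npm][init]
show_args "npm init"    # one argv entry  -> [npm init]
```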
Fix: remove the quotes so the command and its argument are parsed separately.
docker run --rm -it -v "$PWD":/usr/app -w "/usr/app" --name foo_bar node:lts npm init
Now, why does "npm init" produce Error: Cannot find module '/npm init' rather than command not found? It's because of the way the node:lts image is built. If you look at node's Dockerfile on GitHub, at the bottom there are two lines.
ENTRYPOINT ["docker-entrypoint.sh"]
CMD [ "node" ]
The node image's ENTRYPOINT executes the docker-entrypoint.sh script, and "node" is its default parameter. On inspecting docker-entrypoint.sh, there's a comment:
# Run command with node if the first argument contains a "-" or is not a system command.
This is why you get the module-not-found error when you write "npm init": it is not a system command, so it is run with node, which then tries (and fails) to resolve npm init as a module path. When the first argument is a system command, it is executed as-is.
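A simplified, paraphrased sketch of that entrypoint decision (an assumed simplification, not the verbatim script from the node image):

```shell
# Prepend "node" when the first argument starts with "-" or is not a
# known command; otherwise run the arguments as-is.
decide() {
  first="$1"
  if [ "${first#-}" != "$first" ] || [ -z "$(command -v "$first" 2>/dev/null)" ]; then
    set -- node "$@"
  fi
  echo "$@"
}

decide "npm init"   # "npm init" is one word, not a command -> node npm init
decide ls /         # ls is a real command -> ls /
```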
Related
I'm trying to run the npm init command to spit out a package.json file in my local machine's current working directory, by running a Node image on docker.
I have attempted to do it like so but haven't had the desired results described above.
docker run -d -v $pwd:~/tmp node:18-alpine3.14 "cd ~/tmp && npm init"
The command I'm passing above at the end gets passed to the Node application rather than the container it is held inside. I know this because the container exits with Error: Cannot find module '/cd ~/tmp && npm init'.
How can I execute commands for the container to receive, rather than the Node process inside it, in this example?
You could use sh -c "some command" as the command, but I think the approach below is cleaner: use the --workdir flag, and run as your local user and group ID so that you don't have to fix file permissions later on.
docker run --rm \
  --user "$(id -u):$(id -g)" \
  --workdir /tmp \
  --volume "$PWD:/tmp" \
  --tty \
  --interactive \
  node:18-alpine3.14 npm init
I am also using --tty and --interactive so that you can answer the questions from npm init. If you don't want the questions, use npm init -y.
We have a project which, to ease development, uses node to handle some dependencies as well as our test scripts, which spin up a docker image to run the tests. The goal is that the devs don't need to run bats locally and can just run npm test, which'll spin up the image and run the tests.
I can get it working locally in my Git Bash multiple ways; however, I cannot get it working via the npm script, which has cost me a day or two of trying different SO approaches. Here's what I'm working with:
{
  "scripts": {
    "test": "bats test/**/*.bats",
    "test-ci": "bats --formatter junit -T test/**/*.bats",
    "test-docker": "docker run -it -v /c/source/git/devops/dotnetcore-nuget-push/:/app node:slim npm test",
    "test-docker-test": "docker run -it -v $(pwd):/app -w //app node:slim npm test",
    "test-docker-debug": "docker run -it -v /c/source/git/devops/dotnetcore-nuget-push:/app -w //app node:slim bash -c \"echo $(pwd) && ls\"",
    "test-docker2": "docker run -it -v /c/source/git/devops/dotnetcore-nuget-push/:/app -w //app node:slim npm test"
  }
}
Ideally we'll just have test in the end - just trying a few things to make sure they work the same.
Here's what works locally in Git Bash:
docker run -it -v /$(pwd):/app -w //app node:slim npm test
docker run -it -v /$(pwd):/$(pwd) -w /$(pwd) node:slim npm test
MSYS_NO_PATHCONV=1 docker run -it -v $(pwd):$(pwd) -w $(pwd) node:slim npm test
So I've tried adding some of those, plus a debugging one, to my npm scripts, but the only thing that works is when I hardcode the volume path: -v /c/source/git/devops/dotnetcore-nuget-push/:/app. Ideally this wouldn't be hardcoded, but when I use $(pwd) I get errors that it can't find my package.json. With $(pwd) it tries to look in the workdir /app path; however, the workdir IS /app, so it should be looking for the package.json right where the app is mounted, just like it does with the hardcoded path.
I've looked at the following resources, but none of them are using npm:
Using absolute path with docker run command not working
Docker mounted volume adds ;C to end of windows path when translating from linux style path
https://github.com/moby/moby/issues/24029
TL;DR
Replace $(pwd) with %cd%.
Explanation
It seems you are running npm from Windows. There is a pitfall here: Even if you use Git Bash (or whatever shell you like), the subshells spawned by npm won't necessarily use the same shell - they'll use the default shell (in Windows it's specified by the COMSPEC environment variable). In a Linux environment that will usually be bash, but in Windows it'll be cmd.exe and not Git Bash! And cmd.exe won't know what to do with $(pwd) in the command line and will forward it to docker verbatim.
That means that scripts in your package.json cannot use shell-specific features if you intend to use them cross-platform. For more complex operations it's usually easiest to have node .scripts/someScript.js there or the like, and to provide the actual logic in a JavaScript file in a platform-agnostic way.
If it will suffice for your needs to have a script that runs correctly only under Windows, then you can use %cd% instead of $(pwd) (the cmd way to reference the current working directory). It's best to then put the whole argument in double quotes though because cmd also passes the command line arguments differently to the new process.
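For example, a Windows-only script entry might look like this (the script name is illustrative):

```json
{
  "scripts": {
    "test-docker": "docker run -it -v \"%cd%:/app\" -w /app node:slim npm test"
  }
}
```

Note the escaped double quotes around %cd%:/app, matching the advice above about how cmd passes arguments.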
Further reading: https://docgov.dev/posts/npm-scripts/
I am currently in the folder of my vue.js project. The source files are located in the ./src/ folder and the test files within ./tests/.
Usually to run unit tests locally I run the following commands:
npm install
npm ci
npm run test:ci
and it produces, among other files, ./report/coverage.lcov
However, I want to use the node:12-alpine docker image to run the unit tests inside of it. DO NOT offer to use a Dockerfile. I want to run it using docker run --rm node:12-alpine .... and copy the content of the ./report folder into my local folder when the docker run command is complete. However, I could not figure out how to do that. What docker run arguments should I use?
Why not mount your report target as a volume while running?
I am able to run it locally by creating a script.sh file:
cd /tmp/run
npm install
npm ci
npm run test:ci
in the project directory and running:
docker run --rm -v ${PWD}/:/tmp/run -u 0 node:12-alpine sh /tmp/run/script.sh
This is great, as I do not need to install another node version locally and the container is deleted after the run. However, I could not replace the last line with:
cd /tmp/run && npm install ..., as it raised an error. I did not really want to introduce an extra script. Using --entrypoint outputs another error.
Yes, you can do this, and it is very simple:
docker run --rm -v ${PWD}/:/tmp/run -u 0 --workdir=/tmp/run node:14-alpine sh -c "npm install && npm ci && npm run test:ci"
The chain is wrapped in sh -c so that the && operators are evaluated inside the container rather than by your local shell.
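One caveat worth spelling out: if the && chain is left unquoted, your local shell parses it, and only the first command reaches the container. A sketch (run is a hypothetical stand-in for docker run, not a real command):

```shell
# Echo what "the container" would receive, to show how the host shell
# splits an unquoted && chain but passes a quoted sh -c chain through whole.
run() { echo "container got: $*"; }

run npm install && run npm ci            # host splits this into two runs
run sh -c "npm install && npm ci"        # the whole chain arrives as one
```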
I am new to docker, and I ran these two commands in my Mac terminal:
docker pull amazonlinux
docker run -v $(pwd):/lambda-project -it amazonlinux
After running these two commands, I entered the Linux terminal, where I installed Node.js and a few node modules:
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.6/install.sh | bash
. ~/.nvm/nvm.sh
nvm install 6.11.5
npm -v
npm install serverless -global
Everything worked fine so far: I was able to run npm -v and it showed me the npm version, and serverless -v worked fine too.
Then I did exit and I came out of the container into my local terminal.
Then I entered my container again by using the below command:
docker run -v $(pwd):/lambda-project -it amazonlinux
This time my installations are gone. npm -v gave me the command not found.
My question is: how can I save the state or the installed modules of a container, and how can I log back into the container to continue working after exiting it?
With each docker run command you are starting another new container. You can run the command docker ps --all. You will see all containers (including exited ones) and their IDs. You can restart an exited container with the command docker restart <id>. The container is now running. With the command docker attach <id> you are back in the container. All installed libraries should still be present, but:
The downloaded shell script sets some shell variables. After attaching to the container, you can run the shell script again: . ~/.nvm/nvm.sh. Now you can access npm. This shell command prints out what it did and what you should do to keep those changes.
If you want to keep all those changes and use it regularly you can write a Dockerfile which builds an image with all those libs already installed. This official page gets you started in writing Dockerfiles: https://docs.docker.com/develop/develop-images/dockerfile_best-practices/
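A minimal sketch of such a Dockerfile, baking in the same nvm/serverless setup from the transcript above (versions and URLs are the ones from the commands you ran; this sketch is untested):

```dockerfile
FROM amazonlinux
# Bake the interactive session's setup into the image so it survives restarts
RUN curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.6/install.sh | bash
RUN . ~/.nvm/nvm.sh && nvm install 6.11.5 && npm install -g serverless
```

Build it once with docker build, and every container started from the image will already have node and serverless available.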
I am very new to docker and playing with it. I am trying to run a nodejs app in a docker container. I took ubuntu:14.04 as the base image and built my own Node.js-baked image. My Dockerfile content looks like below:
FROM ubuntu:14.04
MAINTAINER nmrony
#install packages, nodejs and npm
RUN apt-get -y update && \
apt-get -y install build-essential && \
curl -sL https://deb.nodesource.com/setup | bash - && \
apt-get install -y nodejs
#Copy the sources to Container
COPY ./src /src
CMD ["cd /src"]
CMD ["npm install"]
CMD ["nodejs", "/src/server.js"]
I run container using following command
docker run -p 8080:8080 -d --name nodejs_expreriments nmrony/exp-nodejs
It runs fine. But when I try to browse http://localhost:8080 it does not respond.
When I run docker logs nodejs_expreriments, I got the following error
Error: Cannot find module 'express'
at Function.Module._resolveFilename (module.js:338:15)
at Function.Module._load (module.js:280:25)
at Module.require (module.js:364:17)
at require (module.js:380:17)
at Object.<anonymous> (/src/server.js:1:77)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
I ran another container with an interactive shell and found that npm is not installed. Can someone help me understand why npm is not installed in the container? Am I doing something wrong?
Your fundamental problem is that you can only have exactly one CMD in a Dockerfile. Each RUN/COPY command builds up a layer during docker build, so you can have as many of those as you want. However, exactly one CMD gets executed during a docker run. Since you have three CMD statements, only one of them actually gets executed (the last one).
(IMO, if the Dockerfile team had chosen the word BUILD instead of RUN and RUN instead of CMD, so that docker build ran BUILD statements and docker run ran RUN statements, this might have been less confusing to new users. Oh, well.)
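A toy Dockerfile makes the one-CMD rule visible; with the three CMDs below, docker run only executes the last one:

```dockerfile
FROM alpine
CMD ["echo", "first"]    # silently ignored
CMD ["echo", "second"]   # silently ignored
CMD ["echo", "third"]    # only this CMD takes effect at docker run
```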
You either want to convert your first two CMDs to RUNs (if you expect them to happen during the docker build and be baked into the image) or perhaps put all three CMDs in a script that you run. Here are a few solutions:
(1) The simplest change is probably to use WORKDIR instead of cd and make your npm install a RUN command. If you want to be able to npm install during building so that your server starts up quickly when you run, you'll want to do:
#Copy the sources to Container
COPY ./src /src
WORKDIR /src
RUN npm install
CMD nodejs server.js
(2) If you're doing active development, you may want to consider something like:
#Copy the sources to Container
WORKDIR /src
COPY ./src/package.json /src/package.json
RUN npm install
COPY ./src /src
CMD nodejs server.js
So that you only have to do the npm install if your package.json changes. Otherwise, every time anything in your image changes, you rebuild everything.
(3) Another option that's useful if you're changing your package file often and don't want to be bothered with both building and running all the time is to keep your source outside of the image on a volume, so that you can run without rebuilding:
...
WORKDIR /src
VOLUME /src
CMD build_and_serve.sh
Where the contents of build_and_serve.sh are:
#!/bin/bash
npm install && nodejs server.js
And you run it like:
docker run -v /path/to/your/src:/src -p 8080:8080 -d --name nodejs_expreriments nmrony/exp-nodejs
Of course, that last option doesn't give you a portable docker image that you can give someone with your server, since your code is outside the image, on a volume.
Lots of options!
For me this worked:
RUN apt-get update \
&& apt-get upgrade -y \
&& curl -sL https://deb.nodesource.com/setup_8.x | bash - \
&& apt-get install -y nodejs \
&& npm install -g react-tools
My Debian image's apt-get was getting a broken/old version of npm, so pointing apt at the NodeSource download path fixed it.