install node packages without restarting docker compose - node.js

Is there a way to install node packages in a running docker environment without restarting?
I have a few containers running via docker-compose and need to use npm i <packagename> while the containers are running.
So far, I have found no consistent answer on Google.

docker exec -it [container-id] /bin/bash
cd to the workspace and run npm install packagename. This should install the package you want and also add it to package.json.
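Since the containers were started with docker-compose, the exec step can also target the compose service directly instead of a container ID. A sketch, assuming a service named app (the service name is hypothetical):

```shell
# docker compose resolves the service name to the running container itself,
# so no container-ID lookup is needed
docker compose exec app npm install <packagename>
```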

Use docker exec + npm install <package-name>.
https://docs.docker.com/engine/reference/commandline/exec/

You can run any command in an active container using docker exec. In your case it will be:
// Replace <your-container-id> and <your-package-name>
docker exec -it <your-container-id> npm install <your-package-name>
or, if you want to use a container name instead of a container id, you can use:
// Replace <your-container-name> and <your-package-name>
docker exec -it $(docker ps | grep <your-container-name> | awk '{ print $1 }') npm install <your-package-name>
(Note: don't quote the whole npm install <your-package-name> string as a single argument; docker exec would then look for an executable literally named that. Pass the words unquoted, or wrap them with sh -c.)
You can find more information in the docker exec documentation. (Docker Docs)
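A side note on the $(docker ps | grep … | awk …) lookup above: awk '{ print $1 }' keeps only the first whitespace-separated column of each matching line, which in docker ps output is the container ID. A local sketch with a fabricated docker ps line (all values hypothetical):

```shell
# One fabricated line of `docker ps` output (ID, image, name); the ID is column 1
ps_line='f3a9c1d2e4b5   node:18-alpine   myapp'
echo "$ps_line" | awk '{ print $1 }'   # → f3a9c1d2e4b5
```

For the real thing, docker ps -q --filter name=<your-container-name> does the same lookup without the grep/awk pipeline.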

Related

How to generate a package.json file on local machine in current directory, using single line "docker run" command from node image

I'm trying to run the npm init command to spit out a package.json file in my local machine's current working directory, by running a Node image on docker.
I have attempted to do it like so but haven't had the desired results described above.
docker run -d -v $pwd:~/tmp node:18-alpine3.14 "cd ~/tmp && npm init"
The command I'm passing above at the end gets passed to the Node application rather than the container it is held inside. I know this because the container exits with Error: Cannot find module '/cd ~/tmp && npm init'.
How can I execute commands so that the container receives them, rather than Node inside it, in this example?
You could use sh -c "some command" as the command, but I think the version below is cleaner.
It uses the --workdir flag, and also your local user and group IDs so that you don't have to fix file permissions later on.
docker run --rm \
--user "$(id -u):$(id -g)" \
--workdir /tmp \
--volume "$PWD:/tmp" \
--tty \
--interactive \
node:18-alpine3.14 npm init
I am also using --tty and --interactive so that you can answer the questions from npm init. If you don't want the questions, use npm init -y.
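For reference, the --user flag above receives a plain "<uid>:<gid>" string; this is all $(id -u):$(id -g) expands to (the actual numbers vary per machine):

```shell
# Prints the value handed to --user, e.g. 1000:1000 on many Linux desktops
echo "$(id -u):$(id -g)"
```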

How to use npm install with docker? Installing node_modules without installing npm

I'm trying to run npm install without installing npm locally:
sudo docker run -it -v $PWD/../src:/usr/src/app node:latest npm install
However, I don't know where the WORKDIR of node:latest is located. I want node_modules installed in the folder $PWD/../src, and I don't want to create a Dockerfile just for that.
This is actually a valid use case for using Docker where you just want to have a quick temporary environment to execute your scripts.
In case you do not know the WORKDIR of an image, you can still overwrite it when creating the container with the -w/--workdir flag, as described in the docker run documentation.
sudo docker run --rm -it \
-w /any/directory \
-v "$PWD/../src:/any/directory" \
node:latest \
npm install
NOTE: I added the flag --rm so that the container is automatically cleaned up once the npm install command finishes running. (The volume path is quoted so it survives spaces in the directory name.)
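If you would rather find out an image's default WORKDIR than override it, docker image inspect can print it without starting a container. A sketch (it requires the image to be present locally):

```shell
# Prints the image's configured working directory; empty output means none was set
docker image inspect --format '{{.Config.WorkingDir}}' node:latest
```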

Docker - Restore installed libraries after exit

I am new to docker and I ran these two commands in my mac terminal
docker pull amazonlinux
docker run -v $(pwd):/lambda-project -it amazonlinux
After running these two commands, I entered the Linux terminal, where I installed Node.js and a few node modules
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.6/install.sh | bash
. ~/.nvm/nvm.sh
nvm install 6.11.5
npm -v
npm install serverless -global
everything worked fine so far, I was able to run npm -v and it showed me npm version and also serverless -v worked fine.
Then I did exit and I came out of the container into my local terminal.
Then I entered into my container again by using below command
docker run -v $(pwd):/lambda-project -it amazonlinux
This time my installations are gone. npm -v gave me the command not found.
My question is: how can I save the state or the installed modules in a container, and how can I log in to the container again to continue working after exiting?
With each docker run command you are starting another new container. You can run the command docker ps --all. You will see all containers (including exited ones) and their IDs. You can restart an exited container with the command docker restart <id>. The container is now running. With the command docker attach <id> you are back in the container. All installed libraries should still be present, but:
The downloaded shell script sets some shell variables. After attaching to the container, you can run the shell script again: . ~/.nvm/nvm.sh. Now you can access npm. This shell command prints out what it did and what you should do to keep those changes.
If you want to keep all those changes and use it regularly you can write a Dockerfile which builds an image with all those libs already installed. This official page gets you started in writing Dockerfiles: https://docs.docker.com/develop/develop-images/dockerfile_best-practices/
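A minimal sketch of such a Dockerfile, assuming the serverless CLI is the only tool needed (the nvm version and install URL are taken from the question above; untested):

```dockerfile
FROM amazonlinux
# nvm expects bash, so switch the shell used for RUN steps
SHELL ["/bin/bash", "-c"]
RUN curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.6/install.sh | bash \
 && . ~/.nvm/nvm.sh \
 && nvm install 6.11.5 \
 && npm install -g serverless
```

Containers started from an image built this way (docker build -t my-lambda-env .) have npm and serverless available immediately, with no per-container setup.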

Running commands for docker container

This is how I'm running a command in a docker container:
$ docker run -it --rm --name myapp myimage:latest
$ node --version
Is it possible to run this as one command? Can I pass a command to the docker run command?
Something like
$ docker run -it --rm --name myapp myimage:latest "node --version"
Of course this is just a simple example. Later I will execute some more complex commands...
The docker run command starts the container and executes the CMD or ENTRYPOINT from the Dockerfile. Unless that command launches a shell, "running" the container may not get you a prompt.
For example, if you want a command prompt every time you run the container, put the line below in your Dockerfile:
CMD ["bash"]
If you want to run the same commands every time you start the container, you could put them in a script file, copy it into the image, and execute the script file as the CMD directive.
The general form of the command is actually:
docker run [OPTIONS] IMAGE[:TAG|@DIGEST] [COMMAND] [ARG...]
See documentation for more details.
The answer to the minimal example in your question is simply:
docker run -it --rm --name myapp myimage:latest node --version
If you want to run multiple commands in sequence, you can:
Run container
Execute your commands against the running container using docker exec
Remove it
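When the later, more complex commands are short enough, an alternative to the run/exec/remove sequence is to hand a shell the whole command string, e.g. docker run --rm myimage:latest sh -c 'node --version && npm --version'. Stripped of Docker, the sh -c pattern behaves like this:

```shell
# sh -c runs its argument as one shell command line, so && sequencing works
sh -c 'echo first && echo second'
```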
I am having some trouble understanding what you are trying to do, but:
you can run docker run -d --name myapp myimage:latest tail -f /dev/null; your container then stays up and you can run any command in it with docker exec
you can pass a command to docker run, but once the command ends the container will exit

Docker image for sailsjs development on macosx hangs

I have a docker image built on Arch Linux (sailsjs-dev) with Node and Sails.js, which I would like to use for development, mounting the app directory inside the container as follows:
docker run --rm --name testapp -p 1337:1337 -v $PWD:/app \
sailsjs-dev sails lift
$PWD is the directory with the sails project.
This works fine on Linux, but if I try to run it on macOS (with docker-machine) it hangs forever at the very beginning, even with the log level set to silly (in config/log.js):
info: Starting app...
There is no other output, this is all we get.
Note: the same docker image works perfectly on the Mac with an Express app. What could be peculiar about Sails that causes the problem?
I can also add that on a Mac, docker-machine runs the Docker daemon inside a VirtualBox VM.
We solved it by running npm install from within the docker container:
docker run --rm --name testapp -p 1337:1337 -ti -v $PWD:/app \
sailsjs-dev /bin/bash
npm install --no-bin-links
--no-bin-links avoids the creation of symlinks.