Terminal progress bars not displaying in the Docker console - node.js

I'm trying to include one of npm's terminal progress bars to better visualize how a long process is progressing. When I run it with the standard "node index.js" it goes off without a hitch, but when running from a simple Docker image, nothing is printed to the terminal. My index.js reads as follows:
const _cliProgress = require('cli-progress');
// create a new progress bar instance and use shades_classic theme
const bar1 = new _cliProgress.Bar({}, _cliProgress.Presets.shades_classic);
// start the progress bar with a total value of 200 and start value of 0
bar1.start(200, 0);
// update the current value in your application..
bar1.update(100);
// stop the progress bar
bar1.stop();
And this is my docker file:
FROM node:latest
#create work directory
RUN mkdir -p /src
#establish the app folder as the work directory
WORKDIR /src
COPY package.json /src
COPY package-lock.json /src
RUN npm i
COPY . /src
CMD [ "node", "index.js" ]
The terminal displays nothing from these packages but does display the normal console.logs. This problem exists for the other package I tried to use as well.
Any information on why this is producing a different result than expected would be greatly appreciated. Thanks.

You have to run docker with the --tty (-t) flag, which will allocate a pseudo-TTY:
docker run -t --rm test
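If you want the same script to behave sensibly whether or not a TTY is attached, you can also check Node's process.stdout.isTTY before starting the bar; a minimal sketch (the plain-log fallback is just an illustration, not part of the original code):
const _cliProgress = require('cli-progress');

if (process.stdout.isTTY) {
  // stdout is an interactive terminal, so the bar's cursor-control output will render
  const bar = new _cliProgress.Bar({}, _cliProgress.Presets.shades_classic);
  bar.start(200, 0);
  bar.update(100);
  bar.stop();
} else {
  // no pseudo-TTY allocated (e.g. docker run without -t): fall back to plain logging
  console.log('progress: 100/200');
}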
You can check the following questions for a more detailed explanation of that flag:
Confused about Docker -t option to Allocate a pseudo-TTY
What does it mean to attach a tty/std-in-out to dockers or lxc?

Related

How can I execute a command inside a docker from a node app?

I have a node app running, and I need to access a command that lives in an alpine docker image.
Do I have to use exec inside of javascript?
How can I install latex on an alpine container and use it from a node app?
I pulled an alpine docker image, started it and installed latex.
Now I have a docker container running on my host. I want to access this latex compiler from inside my node app (dockerized or not) and be able to compile *.tex files into *.pdf
If I sh into the alpine image I can compile *.tex into *.pdf just fine, but how can I access this software from outside the container, e.g. from a node app?
If you just want to run the LaTeX engine over files that you have in your local container filesystem, you should install it directly in your image and run it as an ordinary subprocess.
For example, this JavaScript code will run in any environment that has LaTeX installed locally, Docker or otherwise:
const { execFileSync } = require('node:child_process');
const { mkdtemp, open } = require('node:fs/promises');

async function buildPdf() {
  // write the .tex source into a fresh temporary directory
  const tmpdir = await mkdtemp('/tmp/latex-');
  let input;
  try {
    input = await open(tmpdir + '/input.tex', 'w');
    await input.write('\\documentclass{article}\n\\begin{document}\n...\n\\end{document}\n');
  } finally {
    await input?.close();
  }
  // run pdflatex as an ordinary subprocess; produces tmpdir + '/input.pdf'
  execFileSync('pdflatex', ['input'], { cwd: tmpdir, stdio: 'inherit' });
}

buildPdf();
In a Docker context, you'd have to make sure LaTeX is installed in the same image as your Node application. You mention using an Alpine-based LaTeX setup, so you could use a Dockerfile along these lines:
FROM node:lts-alpine
RUN apk add texlive-full # or maybe a smaller subset
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY ./ ./
CMD ["node", "main.js"]
You should not try to directly run commands in other Docker containers. There are several aspects of this that are tricky, including security concerns and managing the input and output files. If it's possible to directly invoke a command in a new or existing container, it's also very straightforward to use that permission to compromise the entire host.

How can I pass docker environment variables to an npm script?

As the title says... As a noobie, I've given this a good attempt but can't seem to figure it out.
I have a dockerfile
FROM node:12
WORKDIR /app
COPY . /app
RUN npm config set registry https://registry.npmjs.org/
RUN npm install
EXPOSE 8080
ENTRYPOINT ["npm", "start",]
CMD [ "--", "-e=$ENVIRONMENT", "-t=$TESTS" ]
and a script in my package.json like so:
"scripts": {
"start": "node main.js"
}
main.js is expecting two arguments: e and t.
I am struggling to pass these into the container and then on to the script that runs main.js (note: there is a reason why I'm running it through a script; I've just kept this example simple).
To run my npm script I can do this:
npm start -- -e=abc -t=xyz
So I have tried this but no joy:
docker run -e ENVIRONMENT=abc -e TESTS=xyz myimage
Thanks
When you use the JSON (exec) form of CMD (or ENTRYPOINT or RUN), there is no interpolation at all; your script will literally see the string -e=$ENVIRONMENT as its argument. Instead you need to use the shell form, which wraps the command in a shell that expands environment variables. You can't do this with this particular split of ENTRYPOINT and CMD, but at the same time it's not really necessary; just put the whole thing in CMD.
# No ENTRYPOINT
# No quoting; Docker wraps this in `sh -c ...`
CMD npm start -- -e="$ENVIRONMENT" -t="$TESTS"
You can also handle these directly in your application. The yargs library for example has a .env() function that allows environment variables to be used directly as options. You could also make process.env.TESTS be the default value for the option if it's not provided directly. This approach gets around the trouble of constructing (and possibly extending) a valid command line with the combination of arguments you need.
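For instance, here is a minimal sketch of a main.js that prefers explicit -e=/-t= arguments and falls back to the environment variables passed with docker run -e (the argument parsing is illustrative, not the asker's actual code):
// main.js -- illustrative sketch, not the asker's actual file
const args = {};
for (const arg of process.argv.slice(2)) {
  // turn "-e=abc" / "--e=abc" into { e: 'abc' }
  const [key, value] = arg.replace(/^-+/, '').split('=');
  if (value !== undefined) args[key] = value;
}

// fall back to the variables from `docker run -e ENVIRONMENT=abc -e TESTS=xyz myimage`
const environment = args.e || process.env.ENVIRONMENT;
const tests = args.t || process.env.TESTS;
console.log('environment:', environment, 'tests:', tests);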

NodeJS App path broken in Docker container

I got this strange issue when trying to run my NodeJS App inside a Docker Container.
All the paths are broken,
e.g.:
const myLocalLib = require('./lib/locallib');
Results in Error:
Cannot find module './lib/locallib'
All the files are there (shown by the ls command inside the lib directory).
I'm new to Docker so I may have missed something in my setup
Here's my Dockerfile:
FROM node:latest
COPY out/* out/
COPY src/.env.example src/
COPY package.json .
RUN yarn
ENTRYPOINT yarn start
Per request: file structure (screenshot omitted).
Thank you.
You are using the COPY command wrong. With a glob, COPY copies the contents of each matched directory rather than the directory itself, so out/lib gets flattened into out/ and require('./lib/locallib') can no longer resolve. It should be:
COPY out out

Why isn't my server restarting / code updating using Docker + Nodejs?

My docker file is super simple:
FROM node:4-onbuild
RUN npm install gulp -g;
EXPOSE 8888
This image will automatically run the start script in package.json which I have set simply as gulp.
If I run gulp on my host machine and make a change to a node file, it automatically restarts the server:
var gulp = require('gulp');
var nodemon = require('gulp-nodemon');

gulp.task('default', function() {
  nodemon({
    script: 'server.js', // starts up server on port 4000
    env: { 'NODE_ENV': 'development' }
  })
});
Figuring everything is okay I run this: docker run -d -p 1234:4000 -v $(pwd):/usr/src/app my-image
Going to http://192.168.99.100:1234/ shows 'Hello World!' from my server.js file. Updating the file does NOT update what I see by hitting that URL again. If I exec into the container, I see the file is updated. Since the container started node via the same gulp command, I don't understand why the node server wouldn't have restarted and shown the update.
The TL;DR of this is that you need to set nodemon to poll the filesystem for changes as follows: https://github.com/remy/nodemon#application-isnt-restarting
In some networked environments (such as a container running nodemon reading across a mounted drive), you will need to use legacyWatch: true, which enables Chokidar's polling.
Via the CLI, use either --legacy-watch or -L
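Applied to the gulpfile from the question, that would look something like the sketch below; gulp-nodemon passes its options through to nodemon, so legacyWatch should be honored (treat that pass-through as an assumption worth verifying):
var gulp = require('gulp');
var nodemon = require('gulp-nodemon');

gulp.task('default', function() {
  nodemon({
    script: 'server.js',
    env: { 'NODE_ENV': 'development' },
    // assumption: forwarded to nodemon; switches from inotify events
    // (which don't cross the VM boundary) to filesystem polling
    legacyWatch: true
  })
});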
The longer version is this (with one key assumption - you're using docker on Mac or similar):
On Mac or similar, Docker doesn't run natively and instead runs inside a virtual machine (generally VirtualBox via docker-machine). Virtual machines generally don't propagate filesystem inotify events, which is what most watchers rely on to restart or perform an action when a file changes. Since the virtual machine doesn't propagate the events from the host, Docker never receives them. Your original Dockerfile would probably work on a native Linux machine.
There's an open issue and much more detailed discussion of this here.

Swagger express gives error inside docker container

I have an app that uses the swagger-express-mw library and I start my app the following way:
const SwaggerExpress = require('swagger-express-mw');

SwaggerExpress.create({ appRoot: __dirname }, (err, swaggerExpress) => {
  // initialize application, connect to db etc...
});
Everything works fine on my local OSX machine. But when I use boot2docker to build an image from my app and run it, I get the following error:
/usr/src/app/node_modules/swagger-node-runner/index.js:154
config.swagger = _.defaults(readEnvConfig(),
^
TypeError: Cannot assign to read only property 'swagger' of [object Object]
My dockerfile looks like this (nothing special):
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm set progress=false
RUN npm install
COPY . /usr/src/app
EXPOSE 5000
CMD ["npm", "run", "dev"]
Has anyone else encountered similar situations where the local environment worked but the app threw an error inside the docker container?
Thanks!
Your issue doesn't appear to be something wrong with your Docker container or machine setup. The error is not a Docker error; it is a JavaScript error.
The Docker container appears to be running your JavaScript module in strict mode, in which you cannot assign to read-only object properties (https://msdn.microsoft.com/library/br230269%28v=vs.94%29.aspx). On your OSX host, from the limited information we have, it looks like the module is not running in strict mode.
There are several ways to specify the "strictness" of your scripts. I've seen that you can start Node.js with the --use_strict flag, but I'm not sure how dependable that is. It could also be that npm installed a different version of a dependent module, and that version specifies different rules for strict mode. Strict mode can also be declared for a single function, a module, or the app as a whole.
You can test for strict mode by using suggestions here: Is there any way to check if strict mode is enforced?.
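One common probe from that question, which you could drop into the container to see which mode the surrounding module runs in (illustrative, not from the original answer):
// In sloppy mode, `this` inside a plain function call is the global object;
// in strict mode it is undefined, so this reports the mode of the enclosing module.
function inStrictMode() {
  return (function () { return this === undefined; })();
}
console.log('strict mode:', inStrictMode());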
So, in summary: your issue is not inherently a Docker issue; it is an issue with the fact that your JavaScript environments are running with different strict-mode settings. How you fix that will depend on where strict mode is being defined.
Hope that helps!
