NodeJS App path broken in Docker container

I got this strange issue when trying to run my NodeJS app inside a Docker container.
All the paths are broken, e.g.:
const myLocalLib = require('./lib/locallib');
results in the error:
Cannot find module './lib/locallib'
All the files are there (shown by running ls inside the lib directory).
I'm new to Docker, so I may have missed something in my setup.
Here's my Dockerfile:
FROM node:latest
COPY out/* out/
COPY src/.env.example src/
COPY package.json .
RUN yarn
ENTRYPOINT yarn start
Per request : File structure
Thank you.

You are using the COPY command wrong. With COPY out/* out/, the glob expands each entry under out/, and Docker copies the contents of any matched directory rather than the directory itself, so subdirectories like out/lib get flattened. Copy the directory in one step instead:
COPY out out
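A corrected Dockerfile along those lines might look like this (a sketch, assuming the compiled app lives in out/ and package.json defines a start script; the /app working directory is a conventional choice, not from the original):

```dockerfile
FROM node:latest
WORKDIR /app
# Install dependencies first so this layer is cached between builds
COPY package.json ./
RUN yarn
# Copying the directory itself preserves its subdirectory tree,
# unlike COPY out/* out/, which flattens it
COPY out out
COPY src/.env.example src/
CMD ["yarn", "start"]
```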

Related

How can I execute a command inside a docker from a node app?

I have a node app running, and I need to access a command that lives in an Alpine Docker image.
Do I have to use exec inside of JavaScript?
How can I install LaTeX on an Alpine container and use it from a node app?
I pulled an Alpine Docker image, started it, and installed LaTeX.
Now I have a Docker container running on my host. I want to access this LaTeX compiler from inside my node app (dockerized or not) and be able to compile *.tex files into *.pdf.
If I sh into the Alpine image I can compile *.tex into *.pdf just fine, but how can I access this software from outside the container, e.g. from a node app?
If you just want to run the LaTeX engine over files that you have in your local container filesystem, you should install it directly in your image and run it as an ordinary subprocess.
For example, this JavaScript code will run in any environment that has LaTeX installed locally, Docker or otherwise:
const { execFileSync } = require('node:child_process');
const { mkdtemp, open } = require('node:fs/promises');

// Top-level await is not available in CommonJS, so wrap the async work
async function main() {
  const tmpdir = await mkdtemp('/tmp/latex-');
  let input;
  try {
    input = await open(tmpdir + '/input.tex', 'w');
    await input.write('\\documentclass{article}\n\\begin{document}\n...\n\\end{document}\n');
  } finally {
    await input?.close();
  }
  execFileSync('pdflatex', ['input'], { cwd: tmpdir, stdio: 'inherit' });
  // produces tmpdir + '/input.pdf'
}

main();
In a Docker context, you'd have to make sure LaTeX is installed in the same image as your Node application. You mention using an Alpine-based LaTeX setup, so you could
FROM node:lts-alpine
RUN apk add texlive-full # or maybe a smaller subset
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY ./ ./
CMD ["node", "main.js"]
You should not try to directly run commands in other Docker containers. There are several aspects of this that are tricky, including security concerns and managing the input and output files. If it's possible to directly invoke a command in a new or existing container, it's also very straightforward to use that permission to compromise the entire host.

ENV variables within Cloud Run server are not accessible

So,
I am using Nuxt.
I am deploying to Google Cloud Run.
I am using the dotenv package with a .env file in development and it works fine.
I use process.env.VARIABLE_NAME within my dev server in Nuxt and it works great. I make sure that the .env file is in .gitignore so that it doesn't get uploaded.
However, when I then deploy my application to Google Cloud Run, I go to the Environment tab and add exactly the same variables that are in the .env file.
However, the variables come back as undefined.
I have tried all sorts of ways of fixing this, but the only one that works is to upload my .env with the project, which I do not wish to do, as Nuxt exposes this file in the client-side JS.
Has anyone come across this issue and know how to sort it out?
DOCKERFILE:
# base node image
FROM node:10
WORKDIR /user/src/app
ENV PORT 8080
ENV HOST 0.0.0.0
COPY package*.json ./
RUN npm install
# Copy local nuxt code to the container
COPY . .
# Build production app
RUN npm run build
# Start the service
CMD npm start
Kind Regards,
Josh
Finally I found a solution.
I was using Nuxt 2.11.x.
From version 2.13 onwards, Nuxt comes with runtime configuration, and this is what you need.
in your nuxt.config.js:
export default {
  publicRuntimeConfig: {
    BASE_URL: 'some'
  },
  privateRuntimeConfig: {
    TOKEN: 'some'
  }
}
then you can access the values like:
this.$config.BASE_URL || context.$config.TOKEN
More details here
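To pick up the variables you set in the Cloud Run Environment tab, point the runtime config at process.env, which is read when the server starts rather than at build time (the variable names below are illustrative):

```javascript
// nuxt.config.js -- values are read from the environment at server start,
// so Cloud Run's variables are picked up without baking them into the image
export default {
  publicRuntimeConfig: {
    BASE_URL: process.env.BASE_URL || 'http://localhost:3000'
  },
  privateRuntimeConfig: {
    TOKEN: process.env.TOKEN
  }
}
```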
Inserting values into the environment variables does not have to be done in the Dockerfile. You can do it from the command line at deployment time.
For example here is the Dockerfile that I used.
FROM node:10
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["npm","start"]
this is the app.js file
const express = require('express')
const app = express()
const port = 8080

app.get('/', (req, res) => {
  const envtest = process.env.ENV_TEST;
  res.json({ message: 'Hello world', envtest });
});

app.listen(port, () => console.log(`Example app listening on port ${port}`))
To deploy, use a command like this:
gcloud run deploy [SERVICE] --image gcr.io/[PROJECT-ID]/[IMAGE] --update-env-vars ENV_TEST=TESTVARIABLE
And the output will be like the following:
{"message":"Hello world","envtest":"TESTVARIABLE"}
You can check more detail on the official documentation:
https://cloud.google.com/run/docs/configuring/environment-variables#command-line

Terminal progress bars not displaying in the Docker console.

I'm trying to include one of npm's terminal progress bars to better visualize how a long process is progressing. When I run it with a plain "node index.js" it works without a hitch, but when running from a simple Docker image, nothing is posted to the terminal. My index.js reads as such:
const _cliProgress = require('cli-progress');
// create a new progress bar instance and use shades_classic theme
const bar1 = new _cliProgress.Bar({}, _cliProgress.Presets.shades_classic);
// start the progress bar with a total value of 200 and start value of 0
bar1.start(200, 0);
// update the current value in your application..
bar1.update(100);
// stop the progress bar
bar1.stop();
And this is my docker file:
FROM node:latest
#create work directory
RUN mkdir -p /src
#establish the app folder as the work directory
WORKDIR /src
COPY package.json /src
COPY package-lock.json /src
RUN npm i
COPY . /src
CMD [ "node", "index.js" ]
The terminal displays nothing from these packages but does display the normal console.logs. The same problem occurs with the other package I tried.
Any information on why this behaves differently than expected would be greatly appreciated. Thanks.
You have to run docker run with the --tty/-t flag, which will allocate a pseudo-TTY:
docker run -t --rm test
You can check the following questions for a more detailed explanation on that flag:
Confused about Docker -t option to Allocate a pseudo-TTY
What does it mean to attach a tty/std-in-out to dockers or lxc?
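The reason the bar disappears without -t is that cli-progress, like most progress-bar libraries, only renders when stdout is attached to a terminal. A minimal sketch of that check (the helper name is mine, not part of cli-progress):

```javascript
// process.stdout.isTTY is true in an interactive terminal and
// undefined when output is piped or no pseudo-TTY was allocated
function shouldShowProgressBar() {
  return Boolean(process.stdout.isTTY);
}

console.log(shouldShowProgressBar()
  ? 'TTY detected: progress bar will render'
  : 'No TTY: falling back to plain logging');
```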

Trying to dockerize a node.js file but keep getting error

FROM node:7
WORKDIR ~/Desktop/CS612
COPY package.json ~/Desktop/CS612
RUN npm install
COPY . ~/Desktop/CS612
CMD node server.js
EXPOSE 3000
Okay I have switched it and was able to get this far:
Step 5/7 : COPY . ~/Desktop/CS612/
---> 885080c48872
Step 6/7 : CMD node server.js
---> Running in 7ffbaeec889f
---> 61654068c183
Removing intermediate container 7ffbaeec889f
Step 7/7 : EXPOSE 3000
---> Running in 6862095ac871
---> abb84902c53b
Removing intermediate container 6862095ac871
Successfully built abb84902c53b
Successfully tagged restaurants:latest
Danas-MacBook-Air:CS612 DanaCarlin$ docker run restaurants
module.js:538
throw err;
^
Error: Cannot find module '/~/Desktop/CS612/server.js'
at Function.Module._resolveFilename (module.js:536:15)
at Function.Module._load (module.js:466:25)
at Function.Module.runMain (module.js:676:10)
at startup (bootstrap_node.js:187:16)
at bootstrap_node.js:608:3
Why am I getting this error now? It makes no sense; server.js is definitely there and holds the requests and responses.
WORKDIR ~/Desktop/CS612
WORKDIR specifies the working directory inside the container, not a directory on Danas-MacBook-Air. The closest host-side equivalent is what Docker calls the build context.
Also, Docker needs absolute paths within the container, and ~ is not expanded there. You're creating a literal directory /~/Desktop/CS612 in the container, which is where all the subsequent Dockerfile commands will run as the build finishes. Probably not what you want.
tl;dr ditch the relative paths (~/) in the Dockerfile. Example: COPY . /Desktop/CS612.
edit: to reflect changes in the original question
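Putting that together, a Dockerfile with absolute container paths could look like this (a sketch; the /app directory is just a conventional choice, not required):

```dockerfile
FROM node:7
# Absolute path inside the container; ~ is not expanded by Docker
WORKDIR /app
COPY package.json ./
RUN npm install
COPY . ./
EXPOSE 3000
CMD ["node", "server.js"]
```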

Swagger express gives error inside docker container

I have an app that uses the swagger-express-mw library and I start my app the following way:
SwaggerExpress.create({ appRoot: __dirname }, (err, swaggerExpress) => {
  // initialize application, connect to db etc...
});
Everything works fine on my local OSX machine. But when I use boot2docker to build an image from my app and run it, I get the following error:
/usr/src/app/node_modules/swagger-node-runner/index.js:154
config.swagger = _.defaults(readEnvConfig(),
^
TypeError: Cannot assign to read only property 'swagger' of [object Object]
My dockerfile looks like this (nothing special):
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm set progress=false
RUN npm install
COPY . /usr/src/app
EXPOSE 5000
CMD ["npm", "run", "dev"]
Has anyone else encountered similar situations where the local environment worked but the app threw an error inside the docker container?
Thanks!
Your issue doesn't appear to be something wrong with your Docker container or machine setup. The error is not a Docker error; it is a JavaScript error.
The Docker container appears to be running your JavaScript module in strict mode, in which you cannot assign to read-only object properties (https://msdn.microsoft.com/library/br230269%28v=vs.94%29.aspx). On your OSX host, from the limited information we have, it looks like it is not running in strict mode.
There are several ways to specify the "strictness" of your scripts. You can start Node.js with the --use_strict flag, but I'm not sure how dependable that is. It could also be that npm installed a different version of a dependent module, and the different version specifies different rules for strict mode. There are several other ways to define the "strictness" of a function, a module, or the app as a whole.
You can test for strict mode by using suggestions here: Is there any way to check if strict mode is enforced?.
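As a quick sketch of one of the checks from that thread: in a plain (non-method) function call, this is the global object in sloppy mode but undefined in strict mode, so strictness can be detected at runtime (the function name here is mine):

```javascript
// Returns true only when the enclosing code runs in strict mode:
// the inner function inherits strictness from its surroundings
function inStrictMode() {
  return (function () { return this === undefined; })();
}

console.log('strict mode:', inStrictMode());
```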
So, in summary: your issue is not inherently a Docker issue; it is that your JavaScript environments are running with different strict-mode settings. How you fix it will depend on where strict mode is being defined.
Hope that helps!
