Node-RED plugin in container: node missing - node.js

I wrote a custom node for Node-RED and everything works fine.
Now I need to put everything into a Docker container. Node-RED is running and the dependency is installed, but the nodes do not show up in the interface. I get no error messages, even when I leave out files whose absence causes an error in the standalone version.
My package.json:
{
  "name": "boolean_nodes",
  "version": "1.0.0",
  "description": "Nodes for boolean operation.",
  "dependencies": {
    "node-red": "*",
    "node-red-contrib-home-assistant-websocket": "*",
    "mqtt": "*"
  },
  "scripts": {
    "start": "node-red"
  },
  "author": "",
  "license": "ISC",
  "node-red": {
    "nodes": {
      "BOOL-Switch": "./data/bool/switch/bool_switch.js",
      "BOOL-AND": "./data/bool/and/bool_and.js",
      "BOOL-OR": "./data/bool/or/bool_or.js",
      "ML-Interface": "mlinterface.js"
    }
  }
}
My Dockerfile:
FROM nodered/node-red
# Copy package.json to the WORKDIR so npm builds all
# of your added nodes modules for Node-RED
COPY package.json .
RUN npm install --unsafe-perm --no-update-notifier --no-fund --only=production
# Copy _your_ Node-RED project files into place
COPY /data/bool/switch/bool_switch.js /data/bool/switch/bool_switch.js
COPY /data/bool/switch/bool_switch.html /data/bool/switch/bool_switch.html
COPY /data/bool/and/bool_and.js /data/bool/and/bool_and.js
COPY /data/bool/and/bool_and.html /data/bool/and/bool_and.html
COPY /data/bool/or/bool_or.js /data/bool/or/bool_or.js
COPY /data/bool/or/bool_or.html /data/bool/or/bool_or.html
I've experimented with different paths for the files, but that does not change the behavior.
How do I get my plugin into the container?

Normally you'd package your node as an npm module and then npm install it.
In this instance, you can still load what we call 'local' nodes, which aren't packaged properly. By default, Node-RED looks in the nodes directory under the Node-RED user directory.
In the Docker image, /data is used as the user directory.
So you should copy your files somewhere under /data/nodes/.
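For example, a minimal sketch of the Dockerfile under that convention (the host-side source paths are assumptions based on the question's layout):
FROM nodered/node-red
COPY package.json .
RUN npm install --unsafe-perm --no-update-notifier --no-fund --only=production
# Copy the .js/.html pairs into the user directory,
# where Node-RED discovers unpackaged 'local' nodes
COPY data/bool/switch/bool_switch.js /data/nodes/bool_switch.js
COPY data/bool/switch/bool_switch.html /data/nodes/bool_switch.html
COPY data/bool/and/bool_and.js /data/nodes/bool_and.js
COPY data/bool/and/bool_and.html /data/nodes/bool_and.html
COPY data/bool/or/bool_or.js /data/nodes/bool_or.js
COPY data/bool/or/bool_or.html /data/nodes/bool_or.html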

Related

Jest not found while running Jest in a Docker container

I have created the simple Dockerfile below:
FROM node:16.7.0
WORKDIR /app
COPY . .
RUN npm install
# ENTRYPOINT [ "npm" ]
CMD ["sh", "-c", "tail -f /dev/null"]
I added the CMD line with "tail -f /dev/null" to keep the container running so I can check exactly what the issue is when I run npm test inside it.
As soon as I run npm test inside the container, it throws the error below:
# npm test
> docker-jest@1.0.0 test
> jest --verbose
sh: 1: jest: not found
My package.json:
{
  "name": "docker-jest",
  "version": "1.0.0",
  "description": "Package for Jest",
  "scripts": {
    "test": "jest --verbose"
  },
  "Dependencies": {
    "@babel/node": "*",
    "@babel/core": "*",
    "@babel/preset-env": "*",
    "babel-jest": "*",
    "jest": "*"
  },
  "license": "ISC"
}
sum.js
function sum(a, b) {
  return a + b;
}
module.exports = sum;
sum.test.js
const sum = require('./sum');
test('adds 1 + 2 to equal 3', () => {
  expect(sum(1, 2)).toBe(3);
});
Even if I disable CMD and enable ENTRYPOINT and, after the build, issue:
docker run -it <imagename> test
it throws the same error. I can see that npm install runs and that the node modules are deployed under /usr/local/lib/node_modules/ inside the container, but jest is not found there, and if I issue jest directly it also says jest: not found. If I run the same thing without the container (just on the command line, running npm install and then npm run test), it works fine.
Can anyone assist me with why I'm getting this error and how to fix it?
-----------UPDATE-------------
Found the fix: it was my corrupted package-lock.json file. While testing locally without Docker, I somehow corrupted the lock file, and later, when I built and ran through Docker, the corrupted lock file caused a whole lot of issues. I deleted it and ran the Docker build again, and it works as expected.
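A minimal sketch of that recovery (the image tag is an assumption):
# regenerate the lock file from package.json
rm package-lock.json
npm install
# rebuild without cached layers so the fresh lock file is picked up
docker build --no-cache -t docker-jest .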
I had the same issue and the fix for me was running npm install -g jest (or yarn global add jest).
To add this to your package.json do the following:
"scripts: {
"test": "npm install -g jest && jest --verbose"
},
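Note that in the package.json above, the packages are listed under a capitalized "Dependencies" key, which npm does not recognize (the field names are case-sensitive), so jest never gets installed by npm install. A sketch of the corrected block, which avoids the global install:
"devDependencies": {
  "@babel/node": "*",
  "@babel/core": "*",
  "@babel/preset-env": "*",
  "babel-jest": "*",
  "jest": "*"
}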

Npm workspaces - call workspace script from root package

I'm struggling with multiple npm packages in a single git repository, with custom dev scripts to handle launch, compile, build, and so on. Now I came across npm workspaces and wanted to use this stunning new feature in the following project structure, but I can't get it to work:
projectx (root)
- package.json
- apps
  - backend
    - src
    - package.json (name: @projectx/backend, scripts: "dev": "ts-node or whatever")
  - common
    - src
    - package.json (name: @projectx/common)
  - frontend
    - src
    - package.json (name: @projectx/frontend, scripts: "dev": "webpack")
My root package.json contains:
{
  "name": "packagex",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "private": "true",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "back:dev": "npm workspace @projectx/backend dev",
    "front:dev": "npm workspace @projectx/frontend dev",
    "dev": "run-p back:dev front:dev"
  },
  "workspaces": [
    "apps/*"
  ],
  "repository": {
    "type": "git",
    "url": "git_url"
  },
  "author": "me",
  "license": "ISC",
  "devDependencies": {
    "npm-run-all": "^4.1.5"
  }
}
And now I want to start the backend and frontend with npm-run-all via the root command npm run dev, which fails with an error.
I also want to share the common package with backend and frontend, which should be possible in this case. Maybe somebody else has faced the same problem or has some ideas about what I am doing wrong here.
npm@7.7.0 added a way to call scripts from child packages/workspaces. Here are some examples based on your original:
Running a script named "dev" in all workspaces located under apps/backend:
npm run dev -w apps/backend
Running a script named "dev" in all workspaces:
npm run dev --ws
Running a script named "dev" in a package named @projectx/frontend:
npm run dev -w @projectx/frontend
More info:
Related CHANGELOG entry: https://github.com/npm/cli/releases/tag/v7.7.0
Docs: https://docs.npmjs.com/cli/v7/commands/npm-run-script#workspaces-support
Blog post: https://dev.to/ruyadorno/npm-workspaces-npm-run-and-exec-1lg0
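Applied to the root package.json from the question, the scripts could look like this (a sketch assuming npm 7.7+ and npm-run-all, as in the original):
"scripts": {
  "back:dev": "npm run dev -w @projectx/backend",
  "front:dev": "npm run dev -w @projectx/frontend",
  "dev": "run-p back:dev front:dev"
}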
Your "workspaces" property in package.json looks right. I'm using NPM Workspaces and it's working well, but it's still missing a lot of features so you need to wire things up yourself. I also don't think npm worksace is a command (but maybe for the future?), so here's a checklist to get it to work:
Make sure you're using Node 15+ and NPM 7+
Set all package.json to "private": true,
Delete all package-lock.json inside of your project, go to the root, then npm install. It should generate one root level package-lock.json that contains all dependencies for your workspaces
Since you're using npm-run-all, add this to your scripts:
"scripts": {
"back:dev": "cd apps/backend && npm run dev",
"front:dev": "cd apps/fontend && npm run dev",
"dev": "npm-run-all build --parallel back:dev front:dev"
}
Then start it with npm run dev.
Note, you may want to consider using start scripts instead of dev to shorten the command you need to type (e.g. npm start instead of npm run dev), but npm run dev will still be fine.
In the root package.json you can also add a short name for each package:
"scripts": {
  "api": "npm --workspace=@app/api run"
}
@app/api is the name in that package's package.json.
And run scripts in the ./packages/api folder from the root like so:
npm run api lint
npm run api dev
I think you wish to:
keep scripts and dependencies separate (thus the 4 package.json files), for ease of maintenance
May I suggest a work-around without workspaces that might do what you're after:
{
  ...
  "scripts": {
    "//back:dev": "npm workspace @projectx/backend dev",
    "back:dev": "npm --prefix apps/backend run dev",
    "//front:dev": "npm workspace @projectx/frontend dev",
    "front:dev": "npm --prefix apps/frontend run dev",
    "dev": "run-p back:dev front:dev"
  },
  "//workspaces": [
    "apps/*"
  ],
  "devDependencies": {
    "@local/back": "file:apps/backend",
    "@local/front": "file:apps/frontend",
    "npm-run-all": "^4.1.5"
  }
}
npm --prefix runs npm scripts in a folder other than the current one.
The @local/back dependencies are not necessary for that, but I've found them useful if e.g. one package depends on another. You might use that trick to reach the common package with:
"dependencies": {
"#local/common": "file:../common"
}
I wished a week ago that workspaces would offer a better solution, but didn't find any benefit over the above mechanisms.
I would also like workspaces to:
only expose those files in the files entry of the particular package.json (now, all are shown)
only allow import to paths in the exports of the particular package.json, if it has one
See
NPM Workspaces monorepo - share local package's distribution folder as root instead of the entire source files
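For sharing the common package through workspaces themselves (rather than the file: work-around), a sketch of apps/backend/package.json (the version number is an assumption; with workspaces enabled, npm install at the root symlinks apps/common into node_modules):
{
  "name": "@projectx/backend",
  "version": "1.0.0",
  "dependencies": {
    "@projectx/common": "1.0.0"
  }
}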

Node.js app doesn't work when I try to host it via GCP deploy command. Error: Cannot find module 'express'

I have a Node.js app written in TypeScript and based on the Express framework. I want to host it in GCP with the gcloud app deploy command.
So, first of all, I build my TS sources to JavaScript (is that the correct way of doing it?).
Then, from the build folder (with the built JS source code), I run npm start; it works successfully, and I'm also able to check it with Preview.
It works well. So far, so good.
Then I run gcloud app deploy from the build folder, and I didn't see any errors during the deploy.
But afterward, I receive a 500 error on each request whenever I try to reach the deployed app. I've taken a look into the logs and I see the following error:
Error: Cannot find module 'express'
What seems to be the problem?
I tried the following commands in the build folder:
npm install
npm install express --save
npm install -g express
sudo apt-get install node-express
Nothing works for me.
Here is my package.json file:
{
  "name": "full-node",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "build": "tsc",
    "dev": "node -r ts-node/register ./src/server.ts",
    "debug": "ts-node --inspect ./src/server.ts",
    "start": "node build/server.js",
    "prod": "npm run build && npm run start"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "ts-node": "^7.0.1",
    "typescript": "^3.0.1"
  },
  "dependencies": {
    "@types/lodash": "^4.14.116",
    "body-parser": "^1.18.3",
    "connect": "^3.6.6",
    "cors": "^2.8.4",
    "crypto": "^1.0.1",
    "express": "^4.16.3",
    "firebase-admin": "^6.0.0",
    "lodash": "^4.17.10"
  }
}
Any idea what I missed? Is this the correct way to deploy an app written in TypeScript to GCP?
app.yaml:
# [START app_yaml]
runtime: nodejs8
# [END app_yaml]
Since you are running gcloud app deploy from within the build folder, probably the package.json is not being deployed. App Engine runs npm install first, so there is no other way express could be missing. You can go to the GCP console and, under App Engine, view the version, and then under Diagnose you can view the source (the files that were actually deployed to App Engine). Keep in mind that this is only possible for the standard environment, not flex; I can see from your app.yaml that you are using standard. If some files are missing, go to your app's root directory and, in your .gcloudignore file, ignore the files/folders you do not want to deploy. Then run gcloud app deploy from within the root directory of your project.
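A sketch of such a .gcloudignore at the project root (the entries are assumptions for a typical TypeScript layout; App Engine runs npm install itself, so node_modules should not be uploaded):
.gcloudignore
.git
.gitignore
node_modules/
# the compiled output in build/ is what runs; TS sources are not needed
src/
*.ts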
The problem was pretty simple. It seems gcloud app deploy uses the npm run build and npm run start commands somewhere inside to start the application. To host Node.js written in TS, we first need to build it to plain JS using the tsc command, then, in the build folder, rewrite the package.json file to use the correct commands. Look at my start command: "start": "node build/server.js". I was using it inside the build folder as well, which means the gcloud command was searching in the /build/build/ folder. I changed the start command to "start": "node server.js" and then everything works.
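A sketch of the corrected package.json inside the build folder (other dependencies elided; they must still be listed so App Engine's npm install picks them up):
{
  "name": "full-node",
  "version": "1.0.0",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "express": "^4.16.3"
  }
}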

Node.js spawn process not working inside Docker container

I have a server.js file in Node that calls python scripts as follows:
// call python scripts
var spawn = require("child_process").spawn;
var process = spawn('python', ["test.py", function_args]);
process.stdout.on('data', function (data) {
  res.json({
    "answer": from_python
  });
});
This works perfectly when simply running node as usual:
node server.js
But when I place everything inside a Docker container, the application never enters the process.stdout.on callback.
Everything else works perfectly: I can serve static files, call Express endpoints, etc. It is just the spawned process that is not getting called.
I have tried placing child_process inside my package.json file as a dependency.
Here is my Dockerfile:
FROM node:carbon
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
COPY package*.json ./
RUN npm install
# Bundle app source
COPY . .
EXPOSE 80
CMD ["npm","start"]
This is my package.json file:
{
  "name": "my_app_name",
  "version": "1.0.0",
  "description": "my_desc",
  "author": "my_name",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "express": "^4.16.2",
    "path": "^0.12.7",
    "fs": "^0.0.1-security",
    "child_process": "^1.0.2"
  }
}
Note that running server.js is not the problem. This file runs the other pieces, such as serving static files and exposing Express endpoints. The problem seems to be just that the child_process is not running. It is not that Python isn't getting called; it is that Node will not even enter the process part of server.js.
Also note that the Python script prints its output via stdout. I'm not sure if Docker has an issue registering standard output.
I think
FROM node:carbon
should be
FROM node:9.3
Carbon is the LTS line (currently still 8.x). You are using libraries from the latest release, v9.3.0.
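If it still fails after pinning the image, it can help to log the child's stderr and error events to see why the spawn dies inside the container; a minimal sketch (renaming the variable so it no longer shadows Node's global process):
var spawn = require("child_process").spawn;
var py = spawn('python', ["test.py"]);
// fires if the binary itself cannot be launched (e.g. python not on PATH)
py.on('error', function (err) {
  console.error('failed to start python:', err);
});
// surfaces Python exceptions and tracebacks
py.stderr.on('data', function (data) {
  console.error('python stderr: ' + data);
});
py.on('close', function (code) {
  console.log('python exited with code ' + code);
});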

Dockerfile RUN doesn't run in container context

I'm having an issue trying to write a Dockerfile for my Node.js app:
My dockerfile:
FROM node
WORKDIR /app
COPY . /app
RUN npm install
EXPOSE 3000
CMD ["node", "/app/index.js"]
The Node.js app (as part of npm install) needs grpc. When I try to run my app, I get the following error message:
Cannot find module '/app/node_modules/grpc/src/node/extension_binary/node-v57-linux-x64/grpc_node.node'
When I explore the app/node_modules/grpc/src/node/extension_binary/ folder, node-v48-win32-x64 is the only folder in there. My guess is that when npm install ran, it used the context of my host machine, where it detected Windows/x64 and downloaded that binary instead. I'd like to avoid running npm install at runtime. How do I fix this?
My package.json:
{
  "name": "microservice-test",
  "version": "1.0.0",
  "description": "A test microservice.",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "FrankerZ",
  "license": "ISC",
  "devDependencies": {
    "grpcc": "0.0.8",
    "gulp-livereload": "^3.8.1"
  },
  "dependencies": {
    "async": "^2.5.0",
    "grpc": "^1.6.0",
    "gulp": "^3.9.1",
    "gulp-run": "^1.7.1",
    "gulp-util": "^3.0.8",
    "protoc-plugin": "0.0.6"
  }
}
What I think is happening is that docker build is copying the local node_modules from your project into the container at COPY . /app. Thus you get the linux-x64 error: it copied all the machine-specific code from node_modules into a container that runs another OS. To fix this, ignore node_modules by making a .dockerignore file alongside your package.json and adding just one line:
node_modules
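After adding the .dockerignore, rebuild so npm install runs cleanly inside the Linux container and fetches the linux-x64 grpc binary; a minimal sketch (the image tag is an assumption):
docker build --no-cache -t microservice-test .
docker run -it microservice-test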
