In my current project, we are trying to implement our existing application functionality in Node-RED. The functionality is as follows: a FireState component receives two inputs, (1) a TemperatureSensor and (2) a SmokeDetector. Both sensors publish their data using MQTT publishers, and the FireState component receives that data through MQTT subscribers.
The FireState produces an output based on these two parameters, namely when temperatureValue > 70 and smokeValue == true. In view of this, my question is: does Node-RED support this two-input functionality? If yes, how can it be implemented? If no, can I conclude that two-input functionality cannot be implemented in Node-RED? As far as we have seen, Node-RED provides multiple outputs, but not multiple inputs.
You will need to use a function node, making use of node context to keep state between messages, and the message topic to determine which input a message came from.
Something like this:
// Retrieve the last known readings from node context (with defaults if none stored yet)
var temp = context.get('temp') || 0.0;
var smoke = context.get('smoke') || false;

// Update whichever value this message carries, based on its MQTT topic
if (msg.topic === 'smokeDetector') {
    smoke = msg.payload;
} else if (msg.topic === 'tempSensor') {
    temp = msg.payload;
}

// Store the latest readings for the next incoming message
context.set('temp', temp);
context.set('smoke', smoke);

// Only emit a message when both readings indicate a fire
if (temp >= 70.0 && smoke) {
    return {topic: 'fireState', payload: 'FIRE!'};
} else {
    return null;
}
More details can be found in the Node-RED documentation on writing functions (https://nodered.org/docs/user-guide/writing-functions).
You can wire in any number of inputs to any node -- just be aware that your node will only see one input msg at a time. There is no inherent msg aggregation simply because there are multiple input wires.
Instead, the task of aggregating multiple input msgs is handled by certain nodes -- some of which are built into the core Node-RED server, and some of which have been contributed by the community. Which one you should choose depends on the specific use case. For instance, should two objects be appended into an array, or merged into one big object? Only you know what you want -- Node-RED does not make any assumptions, but gives you different nodes to handle many common use cases. For any other use case, there is always the generic function node, in which you can use JavaScript to implement whatever behavior you need.
For your original question, you are looking for a way to merge 2 payloads from different sensors into a single object. The core join and change nodes can be used for that, as can the node-red-contrib-bool-gate and node-red-contrib-aggregator nodes, found on the flow library site.
Here's an example of combining two sensor inputs using the join node, and then using a switch node with the JSONata expression payload.temp > 70 and payload.smoke to determine whether to send the msg down the flow:
[
{
"id": "87df68f8.51ad58",
"type": "inject",
"z": "f9a2eec9.c2e26",
"name": "",
"topic": "smoke",
"payload": "true",
"payloadType": "bool",
"repeat": "",
"crontab": "",
"once": false,
"onceDelay": 0.1,
"x": 160,
"y": 1180,
"wires": [
[
"da4182a8.47939"
]
]
},
{
"id": "3ad419ec.1453a6",
"type": "inject",
"z": "f9a2eec9.c2e26",
"name": "",
"topic": "smoke",
"payload": "false",
"payloadType": "bool",
"repeat": "",
"crontab": "",
"once": false,
"onceDelay": 0.1,
"x": 170,
"y": 1140,
"wires": [
[
"da4182a8.47939"
]
]
},
{
"id": "a45b3cb0.f3312",
"type": "inject",
"z": "f9a2eec9.c2e26",
"name": "",
"topic": "temp",
"payload": "65",
"payloadType": "num",
"repeat": "",
"crontab": "",
"once": false,
"onceDelay": 0.1,
"x": 160,
"y": 1220,
"wires": [
[
"da4182a8.47939"
]
]
},
{
"id": "a3b07d81.e6b17",
"type": "inject",
"z": "f9a2eec9.c2e26",
"name": "",
"topic": "temp",
"payload": "75",
"payloadType": "num",
"repeat": "",
"crontab": "",
"once": false,
"onceDelay": 0.1,
"x": 160,
"y": 1260,
"wires": [
[
"da4182a8.47939"
]
]
},
{
"id": "da4182a8.47939",
"type": "join",
"z": "f9a2eec9.c2e26",
"name": "join payloads",
"mode": "custom",
"build": "object",
"property": "payload",
"propertyType": "msg",
"key": "topic",
"joiner": "\n",
"joinerType": "str",
"accumulate": true,
"timeout": "",
"count": "2",
"reduceRight": false,
"reduceExp": "",
"reduceInit": "",
"reduceInitType": "",
"reduceFixup": "",
"x": 430,
"y": 1200,
"wires": [
[
"315c9ce3.570d64",
"50f981b4.be654"
]
]
},
{
"id": "315c9ce3.570d64",
"type": "switch",
"z": "f9a2eec9.c2e26",
"name": "Trigger Alarm?",
"property": "payload.temp > 70 and payload.smoke",
"propertyType": "jsonata",
"rules": [
{
"t": "true"
}
],
"checkall": "true",
"repair": false,
"outputs": 1,
"x": 640,
"y": 1200,
"wires": [
[
"50f981b4.be654"
]
]
},
{
"id": "50f981b4.be654",
"type": "debug",
"z": "f9a2eec9.c2e26",
"name": "",
"active": true,
"tosidebar": true,
"console": false,
"tostatus": false,
"complete": "false",
"x": 690,
"y": 1260,
"wires": [
]
}
]
You can use a Join node configured in manual mode, combining a fixed number of messages (2). Once both inputs have been received, the Join node passes the combined message on; it can combine the payloads as either an array or a key/value object. A final function node can then check your condition and send the combined data on to MQTT.
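For reference, a minimal sketch of that final function node, assuming the Join node was set to build a key/value object keyed by topic (so the combined payload looks like { temp: 75, smoke: true }):

// Combined payload produced by the Join node (key/value object keyed by msg.topic)
var data = msg.payload;

// Forward the combined reading to the MQTT-out node only when the alarm condition holds
if (data.temp > 70 && data.smoke === true) {
    msg.topic = 'fireState';
    return msg;
}
return null;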
I'm new to Node-RED and I'm trying to create a simple flow that sends data to my Azure IoT Hub. But whenever I send the data I get the following error: "Could not connect: getaddrinfo ENOTFOUND xxx-device-hub.azure-devices.net". I tried different shared access keys with all permissions, but still the same. I am working over a proxy network. Please help, as I'm not able to work because of this.
You can import my Node-RED flow:
[
{
"id": "6f18f82cf1fb4430",
"type": "tab",
"label": "Flow 5",
"disabled": false,
"info": "",
"env": []
},
{
"id": "191eb7ca.b71a8",
"type": "azureiothub",
"z": "6f18f82cf1fb4430",
"name": "Azure IoT Hub",
"protocol": "mqtt",
"x": 640,
"y": 340,
"wires": [
[
"39c7854c.56d18a"
]
]
},
{
"id": "39c7854c.56d18a",
"type": "debug",
"z": "6f18f82cf1fb4430",
"name": "",
"active": true,
"console": "false",
"complete": "payload",
"x": 870,
"y": 340,
"wires": []
},
{
"id": "b73c20238ff65f0f",
"type": "inject",
"z": "6f18f82cf1fb4430",
"name": "",
"props": [
{
"p": "payload"
},
{
"p": "topic",
"vt": "str"
}
],
"repeat": "",
"crontab": "",
"once": false,
"onceDelay": 0.1,
"topic": "",
"payload": "",
"payloadType": "date",
"x": 160,
"y": 340,
"wires": [
[
"d9b5bad69079e9ea"
]
]
},
{
"id": "d9b5bad69079e9ea",
"type": "function",
"z": "6f18f82cf1fb4430",
"name": "",
"func": "msg.payload ={\"deviceID\":\"DC_Tower_Clock\",\"SAK\":\"<Shared-access-key>\",\"Protocol\":\"mqtt\",\"Data\":{\"DC_sensor1values\":[0],\"DC_sensor1timestamp\":[1651774945]}}\nreturn msg;",
"outputs": 1,
"noerr": 0,
"initialize": "",
"finalize": "",
"libs": [],
"x": 380,
"y": 340,
"wires": [
[
"191eb7ca.b71a8"
]
]
}
]
The problem is not with the access keys; it's with the hostname of your hub.
The ENOTFOUND error means that the OS on the system you are running cannot resolve xxx-device-hub.azure-devices.net to an IP address.
Make sure you have entered the right hostname and there are no typos.
You can test by trying to ping the address outside Node-RED first.
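If you prefer to check resolution from Node itself, here is a quick sketch using Node's built-in dns module (substitute your real hub hostname):

// check-dns.js -- verify that the hub hostname resolves from this machine
const dns = require('dns');

dns.lookup('xxx-device-hub.azure-devices.net', (err, address) => {
    if (err) {
        // ENOTFOUND here is the same DNS failure Node-RED is reporting
        console.error('Lookup failed:', err.code);
    } else {
        console.log('Resolved to', address);
    }
});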
There were a couple of issues with my flow:
Inside the payload I did not have the configuration right (deviceID instead of deviceId and SAK instead of key).
I had my protocol set to MQTT when it was actually HTTP.
And the main problem was actually the company proxy: I am able to send the messages from my personal laptop, but with the same flow on my work laptop I am not.
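For illustration, a hedged sketch of how the corrected function node body might look, based only on the fixes listed above (the remaining field names and the placeholder shared access key are carried over from the original flow, not verified against the node's documentation):

msg.payload = {
    "deviceId": "DC_Tower_Clock",        // lowercase "d": deviceId, not deviceID
    "key": "<Shared-access-key>",        // "key" rather than "SAK"
    "Protocol": "http",                  // the device was actually using HTTP, not MQTT
    "Data": {
        "DC_sensor1values": [0],
        "DC_sensor1timestamp": [1651774945]
    }
};
return msg;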
I am creating an npm package that reads JSON files and then validates the content against some predefined JSON schemas. Everything was working fine when I was testing against small files (~1 MB), but when I started reading files of 50 MB and larger, things started to fail:
when I reach the code that parses the file, "Allocation failed - JavaScript heap out of memory" is thrown, so I tried to increase Node's memory with node --max-old-space-size=4096, but now the parsing takes forever (I waited almost one hour and nothing happened).
So here is what the JSON file may look like:
[{
"numberVerification": [
{
"longNumber": 281474976710655
}
]
},
{
"metaData": [
{
"name": "nodes",
"elementCount": 155,
"idCounter": 155,
"version": "1.0",
},
{
"name": "edges",
"elementCount": 312,
"idCounter": 312,
"version": "1.0",
},
{
"name": "networkAttributes",
"elementCount": 14,
"idCounter": 14,
"version": "1.0",
},
{
"name": "nodeAttributes",
"elementCount": 330,
"idCounter": 330,
"version": "1.0",
},
{
"name": "edgeAttributes",
"elementCount": 3120,
"idCounter": 3120,
"version": "1.0",
},
{
"name": "cartesianLayout",
"elementCount": 155,
"idCounter": 156
},
]
},
{
"nodes": [
{
"#id": 0,
"n": "TYK2",
"r": "uniprot:P29597"
},
{
"#id": 1,
"n": "ISGF3 complex",
"r": "signor:SIGNOR-C124"
},
{...}
]
},
{
"edges": [
{
"#id": 0,
"s": 0,
"t": 1,
"i": "up-regulates activity"
},
{
"#id": 1,
"s": 2,
"t": 1,
"i": "up-regulates activity"
},
{...}
]
},
{
"nodeAttributes": [
{
"po": 0,
"n": "type",
"v": "protein"
},
{
"po": 0,
"n": "location",
"v": "cytoplasm"
},
{...}
]
},
{
"edgeAttributes": [
{
"po": 0,
"n": "citation",
"v": [
"pubmed:15120645"
],
"d": "list_of_string"
},
{
"po": 0,
"n": "mechanism",
"v": "phosphorylation"
},
{...}
]
},
{
"cartesianLayout": [
{
"node": 0,
"x": 97.73626669665249,
"y": -114.99468800778627
},
{
"node": 1,
"x": 307.72737757573987,
"y": 4.091777979752425
},
{...}
]
},
{
"status": [
{
"error": "",
"success": true
}
]
}]
As you can see, each object of the main array is of a different type.
I was using fs.readFileSync to read the JSON files and then parsing the whole file with JSON.parse(), but I found that in order to read big JSON files I need to use streams. Now, how am I supposed to validate against the JSON schemas? I am also doing some additional custom validation on the data, like checking that the #id property is unique, and validating that the node (in the cartesianLayout array) and the t and s (in the edges array) properties point to a real #id in the nodes array. After that I want to compute some statistics on the data.
So is there a way to read and parse the large JSON files, and if so, how can I keep the validation running? And for the statistics part, do I need to use any external providers?
The JSON files are generated by a third party, so I can't really change the structure of the data or how it is presented.
If anyone wants me to provide anything additional, let me know; I appreciate the help.
To work with JSON objects within streams, you can use the JSONStream npm library (see its docs for usage). You will have to use JSONPath-like syntax to reach the desired nodes (including the root node, if necessary) and perform a "rolling" validation according to your rules.
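A minimal sketch of that approach, assuming the file is a top-level array as shown above and that validateEntry is a placeholder wrapping your schema and custom checks (both the filename and the function are illustrative, not part of JSONStream):

// stream-validate.js -- parse a large top-level JSON array without loading it all into memory
const fs = require('fs');
const JSONStream = require('JSONStream');

// Placeholder for your json-schema and custom validation logic
function validateEntry(entry) {
    // e.g. run a schema validator against the matching schema, track seen #id values, etc.
    return true;
}

fs.createReadStream('big-file.json')
    .pipe(JSONStream.parse('*'))          // emits each element of the root array one at a time
    .on('data', (entry) => {
        if (!validateEntry(entry)) {
            console.error('Validation failed for entry', entry);
        }
    })
    .on('end', () => console.log('Done'))
    .on('error', (err) => console.error('Parse error:', err));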
I'm creating an interactive floor plan using Node-RED (with Dashboard and node-red-contrib-ui-svg).
My problem: I want a website popup in the Node-RED Dashboard when clicking on an information icon, but I don't know how to create the popup. I tried it with the "http in", "http request", "http response" and function nodes, but it didn't work. Maybe I just don't get the payload right?
By clicking on the SVG, an event is triggered and the SVG node sends a payload to its output.
Later there should be multiple events with different SVGs opening different URL popups.
Does someone know if it is possible to create a popup in the Node-RED Dashboard, and if yes, how I can do it?
Here is a small example flow of what I tried:
[
{
"id": "213370b.a1a7e9",
"type": "tab",
"label": "Floorplan",
"disabled": false,
"info": ""
},
{
"id": "3a8acfc1.2d033",
"type": "debug",
"z": "213370b.a1a7e9",
"name": "",
"active": true,
"tosidebar": true,
"console": false,
"tostatus": false,
"complete": "true",
"targetType": "full",
"x": 570,
"y": 100,
"wires": []
},
{
"id": "3d085e29.713452",
"type": "http in",
"z": "213370b.a1a7e9",
"name": "googl",
"url": "svg",
"method": "get",
"upload": false,
"swaggerDoc": "",
"x": 90,
"y": 100,
"wires": [
[
"3e9f0610.1b40da"
]
]
},
{
"id": "7e8c6b26.c6b194",
"type": "http response",
"z": "213370b.a1a7e9",
"name": "",
"statusCode": "",
"headers": {},
"x": 570,
"y": 60,
"wires": []
},
{
"id": "e6a43abb.2208c8",
"type": "function",
"z": "213370b.a1a7e9",
"name": "",
"func": "msg.responseUrl=msg.payload;\nmsg.payload=msg.payload\n\n\nreturn msg;",
"outputs": 1,
"noerr": 0,
"x": 450,
"y": 80,
"wires": [
[
"7e8c6b26.c6b194",
"3a8acfc1.2d033"
]
]
},
{
"id": "3e9f0610.1b40da",
"type": "http request",
"z": "213370b.a1a7e9",
"name": "",
"method": "GET",
"ret": "txt",
"paytoqs": false,
"url": "nodered.org",
"tls": "",
"persist": false,
"proxy": "",
"authType": "",
"x": 270,
"y": 100,
"wires": [
[
"e6a43abb.2208c8"
]
]
},
{
"id": "2360d5fd.e4dc9a",
"type": "ui_svg_graphics",
"z": "213370b.a1a7e9",
"group": "ff128f4a.e252",
"order": 1,
"width": 0,
"height": 0,
"svgString": "<svg xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\" x=\"0\" y=\"0\" height=\"100\" viewBox=\"0 0 100 100\" width=\"100\"><rect id=\"svgEditorBackground\" x=\"0\" y=\"0\" width=\"100\" height=\"100\" style=\"fill:none; stroke: none;\"/><defs id=\"svgEditorDefs\"><symbol id=\"2139\" viewBox=\"0 0 64 64\" preserveAspectRatio=\"xMidYMid meet\"><rect x=\"0\" y=\"0\" width=\"64\" height=\"64\" style=\"stroke:none;fill:none;\"/><g xmlns=\"http://www.w3.org/2000/svg\"><circle cx=\"32\" cy=\"32\" fill=\"#42ade2\" r=\"30\"/><g fill=\"#fff\"><path d=\"m36.51 25h-6.992c-2.633 0-5.145 1.05-5.584 2.333-.436 1.284.447 2.334 1.965 2.334s2.072 2.02 1.23 4.492l-4.889 14.349c-.844 2.471.619 4.492 3.252 4.492h6.992c2.633 0 5.143-1.051 5.582-2.333.436-1.283-.447-2.335-1.963-2.335-1.518 0-2.072-2.02-1.23-4.491l4.889-14.349c.843-2.47-.619-4.492-3.252-4.492\"/><path d=\"m36.29 11c-2.666 0-5.406 2.238-6.121 5-.717 2.761.869 4.999 3.533 4.999 2.668 0 5.408-2.238 6.123-4.999.717-2.763-.867-5-3.535-5\"/></g></g></symbol><polygon id=\"svgEditorIconDefs\" style=\"fill:rosybrown;\"/></defs><use xlink:href=\"#2139\" x=\"4.410\" y=\"4.552\" width=\"19.061\" height=\"19.061\" id=\"svg_i\" transform=\"matrix(1.87014 0 0 1.87014 -2.59955 -2.72311)\"/></svg>",
"clickableShapes": [
{
"targetId": "#svg_i",
"action": "click",
"payload": "http://www.google.com",
"payloadType": "str",
"topic": "#svg_i"
}
],
"smilAnimations": [],
"bindings": [],
"showCoordinates": false,
"autoFormatAfterEdit": false,
"showBrowserErrors": false,
"outputField": "payload",
"editorUrl": "//drawsvg.org/drawsvg.html",
"directory": "",
"panning": "disabled",
"zooming": "disabled",
"panOnlyWhenZoomed": false,
"doubleClickZoomEnabled": false,
"mouseWheelZoomEnabled": false,
"name": "",
"x": 120,
"y": 60,
"wires": [
[
"e6a43abb.2208c8"
]
]
},
{
"id": "ff128f4a.e252",
"type": "ui_group",
"z": "",
"name": "SVG",
"tab": "9f9846f6.57ce98",
"order": 1,
"disp": true,
"width": "23",
"collapse": false
},
{
"id": "9f9846f6.57ce98",
"type": "ui_tab",
"z": "",
"name": "Background",
"icon": "dashboard",
"disabled": false,
"hidden": false
}
]
If you are still searching for a solution, a "modal dialog" is what you are looking for. Try this:
https://discourse.nodered.org/t/how-to-show-modal-dialog-in-template-node/611/8
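In case the link goes stale: the idea is to put the popup markup into a ui_template node and open it whenever a msg arrives from the SVG node. A rough sketch, assuming the SVG node's payload is the URL to show (note that many sites, including google.com, refuse to be embedded in an iframe):

<dialog id="urlPopup" style="width:80%; height:80%;">
  <iframe id="popupFrame" style="width:100%; height:90%; border:none;"></iframe>
  <button onclick="document.getElementById('urlPopup').close()">Close</button>
</dialog>
<script>
(function(scope) {
  // ui_template exposes incoming messages on the Angular scope
  scope.$watch('msg', function(msg) {
    if (msg && msg.payload) {
      document.getElementById('popupFrame').src = msg.payload;  // payload carries the URL
      document.getElementById('urlPopup').showModal();
    }
  });
})(scope);
</script>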
I'm trying to use fcm-push to send a notification to Android. I'm just not sure how to build the flow.
I've successfully made Firebase send a notification to Android via the Firebase console.
I just need an example flow for fcm-push.
Thanks.
After much trial and error, I was able to come up with this. I copied and then modified another example; unfortunately, I don't know where the final example that I got working came from (sorry). Here is one that is working, which you can import into Node-RED. Remember to change the AuthKey to your own.
[
{
"id": "b765ca96.a1ef58",
"type": "tab",
"label": "Flow 1 _ send notifications to FCM",
"disabled": false,
"info": ""
},
{
"id": "6ecb45a9.373ddc",
"type": "inject",
"z": "b765ca96.a1ef58",
"name": "",
"topic": "",
"payload": "",
"payloadType": "date",
"repeat": "",
"crontab": "",
"once": false,
"x": 120,
"y": 40,
"wires": [
[
"aafc8341.84eae"
]
]
},
{
"id": "3aab90c5.6c2a4",
"type": "http request",
"z": "b765ca96.a1ef58",
"name": "http POST to Google FCM",
"method": "POST",
"ret": "txt",
"url": "https:\/\/fcm.googleapis.com\/fcm\/send",
"tls": "",
"x": 180,
"y": 220,
"wires": [
[
"8d1dc739.0b7ac8"
]
]
},
{
"id": "aafc8341.84eae",
"type": "function",
"z": "b765ca96.a1ef58",
"name": "set payload and headers",
"func": "msg.payload = {\"to\": \"\/topics\/news\",\"data\": {\"message\": \"This is a FCM Topic Message!\"},\"notification\":{\"body\" : \"some_message\"}};\nmsg.headers = {\"Authorization\": \"key=BDzaSyCN932874928374-d1Jz3Rkcvbcr4\"};\nreturn msg;\n",
"outputs": 1,
"noerr": 0,
"x": 170,
"y": 120,
"wires": [
[
"3aab90c5.6c2a4"
]
]
},
{
"id": "8d1dc739.0b7ac8",
"type": "debug",
"z": "b765ca96.a1ef58",
"name": "",
"active": true,
"tosidebar": true,
"console": false,
"complete": "false",
"x": 290,
"y": 340,
"wires": [
]
}
]
Say I have a product collection like this:
{
"_id": "5a74784a8145fa1368905373",
"name": "This is my first product",
"description": "This is the description of my first product",
"category": "34/73/80",
"condition": "New",
"images": [
{
"length": 1000,
"width": 1000,
"src": "products/images/firstproduct_image1.jpg"
},
...
],
"attributes": [
{
"name": "Material",
"value": "Synthetic"
},
...
],
"variation": {
"attributes": [
{
"name": "Color",
"values": ["Black", "White"]
},
{
"name": "Size",
"values": ["S", "M", "L"]
}
]
}
}
and a variation collection like this:
{
"_id": "5a748766f5eef50e10bc98a8",
"name": "color:black,size:s",
"productID": "5a74784a8145fa1368905373",
"condition": "New",
"price": 1000,
"sale": null,
"image": [
{
"length": 1000,
"width": 1000,
"src": "products/images/firstvariation_image1.jpg"
}
],
"attributes": [
{
"name": "Color",
"value": "Black"
},
{
"name": "Size",
"value": "S"
}
]
}
I want to keep the documents separate, and for the purpose of easy browsing, searching, and faceted-search implementation, I want to fetch all the data in a single query, but I don't want to do the join in my application code.
I know it's achievable using a third collection called summary that might look like this:
{
"_id": "5a74875fa1368905373",
"name": "This is my first product",
"category": "34/73/80",
"condition": "New",
"price": 1000,
"sale": null,
"description": "This is the description of my first product",
"images": [
{
"length": 1000,
"width": 1000,
"src": "products/images/firstproduct_image1.jpg"
},
...
],
"attributes": [
{
"name": "Material",
"value": "Synthetic"
},
...
],
"variations": [
{
"condition": "New",
"price": 1000,
"sale": null,
"image": [
{
"length": 1000,
"width": 1000,
"src": "products/images/firstvariation_image.jpg"
}
],
"attributes": [
"color=black",
"size=s"
]
},
...
]
}
The problem is, I don't know how to keep the summary collection in sync with the product and variation collections. I know it can be done using mongo-connector, but I'm not sure how to implement it.
Please help me, I'm still a beginner programmer.
You don't actually need to maintain a summary collection; it is redundant to store the product and variation summary in another collection.
Instead, you can use an aggregation pipeline with a $lookup stage to outer-join product and variation using productID.
Aggregation pipeline:
db.products.aggregate(
[
{
$lookup : {
from : "variation",
localField : "_id",
foreignField : "productID",
as : "variations"
}
}
]
).pretty()
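If you want the result to look closer to the summary document (only selected fields from each variation), a $project stage can be appended; a rough sketch, assuming the field names used in the documents above:

db.products.aggregate([
    {
        $lookup: {
            from: "variation",
            localField: "_id",
            foreignField: "productID",
            as: "variations"
        }
    },
    {
        // keep only the product fields and variation fields the summary needs
        $project: {
            name: 1,
            description: 1,
            category: 1,
            condition: 1,
            images: 1,
            attributes: 1,
            "variations.condition": 1,
            "variations.price": 1,
            "variations.sale": 1,
            "variations.image": 1,
            "variations.attributes": 1
        }
    }
]).pretty()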