Publishing multiple buffers in mqttjs - node.js

So I want to send some bytes from a Node.js script to my Mosquitto server over MQTT. For this I am using the client library MQTT.js. To send raw data, the library supports Buffers (as well as Strings).
All of it works just fine until I try to publish multiple bytes (buffers) in a row: this always results in publishing only the last buffer, but multiple times. When I log the same buffers to the console instead of publishing them with MQTT.js, I get the expected results (incrementing from 0xfa to 0xfe). Publishing normal Strings in a row, however, works without trouble. This behaviour is inexplicable to me.
I'm not quite sure whether this is a problem with MQTT.js or whether I'm just missing something. I would appreciate a short explanation of what I'm doing wrong, since I did not find any related issues.
var buf = Buffer.from([0xfa]);

client.on('connect', function () {
  client.subscribe('LED', function (err) {
    if (!err) {
      for (; buf[0] < 0xff; buf[0]++) {
        client.publish('LED', buf);
      }
    }
  });
});
Expected messages (result from logging to the console instead of publishing):
<Buffer fa>
<Buffer fb>
<Buffer fc>
<Buffer fd>
<Buffer fe>
Messages that Mosquitto receives:
<Buffer ff>
<Buffer ff>
<Buffer ff>
<Buffer ff>
<Buffer ff>
A funny discovery I made: if you put the publish before the loop (with buf[0] = 0xfa):
client.publish('LED', buf);
for (; buf[0] < 0xff; buf[0]++) {}
The output is still:
<Buffer ff>

JavaScript passes objects by reference, which means the publish method is working on the same Buffer instance the whole time: you are just handing the same underlying structure to every publish call.
In this case the for loop runs faster than the client can send the messages, so by the time the client gets around to actually packing the buffer's contents into a message, the value has already been updated to 0xff.
You will need to create a new buffer object (or clone the existing values) for each message you want to send.
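For example, a minimal sketch of the loop above that allocates a fresh one-byte buffer per publish (same topic and value range as the question):
client.on('connect', function () {
  client.subscribe('LED', function (err) {
    if (!err) {
      for (var i = 0xfa; i < 0xff; i++) {
        // Buffer.from allocates a new buffer on every iteration, so each
        // publish gets its own copy instead of a reference to shared state.
        client.publish('LED', Buffer.from([i]));
      }
    }
  });
});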

Related

Simple multiplexing node.js stream - chunk concatenated issue

I'm trying to implement a simple Node.js stream multiplexer/demultiplexer.
While implementing the multiplexing mechanism, I noticed that the output of the multiplexer gets concatenated into a single chunk.
const { PassThrough, Transform } = require("stream");

class Mux extends PassThrough {
  constructor(options) {
    super(options);
  }

  input(id, options) {
    let encode = new Transform({
      transform(chunk, encoding, cb) {
        let buf = Buffer.alloc(chunk.length + 1);
        buf.writeUInt8(id, 0);
        chunk.copy(buf, 1);
        cb(null, buf);
      },
      ...options
    });
    encode.pipe(this);
    return encode;
  }
}
const mux = new Mux();

mux.on("readable", () => {
  console.log("mux >", mux.read());
});

const in1 = mux.input(1);
const in2 = mux.input(2);

in1.write(Buffer.alloc(3).fill(255));
in2.write(Buffer.alloc(3).fill(127));
Output looks like this: mux > <Buffer 01 ff ff ff 02 7f 7f 7f>.
I would have thought that I would receive two console.log outputs.
Expected output:
mux > <Buffer 01 ff ff ff>
mux > <Buffer 02 7f 7f 7f>
Can someone explain why I only get one "readable" event and a concatenated chunk from both inputs?
Use the data event and read from the callback:
The 'data' event is emitted whenever the stream is relinquishing ownership of a chunk of data to a consumer.
mux.on("data", d => {
console.log("mux >", d)
});
This now yields:
mux > <Buffer 01 ff ff ff>
mux > <Buffer 02 7f 7f 7f>
Why readable is only emitted once is explained in the docs as well:
The 'readable' event will also be emitted once the end of the stream data has been reached but before the 'end' event is emitted.
data and readable behave differently. In your case readable is not emitted until the end of the stream data has been reached, at which point a single read() returns all the buffered data at once; data is emitted for each available chunk.
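If you do want to stay with 'readable', you have to drain the stream yourself by calling read() in a loop until it returns null. A sketch of that pattern, with the caveat that read() without a size argument returns everything currently buffered, so per-write chunk boundaries are still not preserved:
mux.on("readable", () => {
  let chunk;
  // Drain the internal buffer; read() returns null once it is empty.
  while ((chunk = mux.read()) !== null) {
    console.log("mux >", chunk);
  }
});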

Bleno throws an index-out-of-range error on binary data

My code, which provides a BLE service, runs under Node.js and uses the bleno package.
My service characteristic supports read and notify.
Here is my simplified implementation of the onSubscribe function:
function(maxValueSize, updateValueCallback) {
  var a = buildPacket();
  var buf = Buffer.from(a);
  console.log('Buffer: ', buf);
  updateValueCallback(this.RESULT_SUCCESS, buf);
}
The buildPacket function returns a Uint8Array of 20 bytes. The console statement shows that the buffer is 20 bytes long and holds the expected values.
However, the call to updateValueCallback throws an error:
RangeError: Index out of range
at checkInt (buffer.js:1187:11)
at Buffer.writeUInt8 (buffer.js:1233:5)
at Gatt.<anonymous> (/home/pi/Dev/MyTest/node_modules/bleno/lib/hci-socket/gatt.js:881:33)
Here is the relevant code from gatt.js:
878 if (useNotify) {
879 var notifyMessage = new Buffer(3 + dataLength);
880
881 notifyMessage.writeUInt8(ATT_OP_HANDLE_NOTIFY, 0);
882 notifyMessage.writeUInt16LE(valueHandle, 1);
Is there any step I am missing?
Most bleno examples I have read seem to send string data; however, I need to send binary data. Is there anything special required for sending binary data?
It turns out updateValueCallback takes only one parameter. I should have looked at the bleno examples a bit more carefully :-(
Just passing buf as the sole parameter fixed the problem.
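For reference, a minimal sketch of the fixed onSubscribe, based on the question's code:
function(maxValueSize, updateValueCallback) {
  var buf = Buffer.from(buildPacket());
  // updateValueCallback expects just the data buffer, no result code.
  updateValueCallback(buf);
}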

How can I parse JSON coming from an nrf24 buffer?

I am sending a valid JSON object constructed with ArduinoJson to a Raspberry Pi running Node.js with the library https://github.com/natevw/node-nrf over an nRF24 radio link. The Node.js server receives the data seemingly without problems, but for some reason I can't JSON.parse() the object (or buffer?) without getting a SyntaxError: Unexpected token in JSON at position ...
For some reason the node-nrf library receives the data backwards, so I need to reverse the order of the bytes with Array.prototype.reverse.call(d); after that, console.log(d.toString()) looks fine. In this case, the console shows Got data: [{"key":"a1","value":150}]. At this point the content of the buffer is: <Buffer 5b 7b 22 6b 65 79 22 3a 22 61 31 22 2c 22 76 61 6c 75 65 22 3a 31 35 30 7d 5d 00 00 00 00 00 00>. Those are the actual 32 bytes of the nRF24 payload, I guess.
But then, when the code gets to the JSON.parse() call, I get SyntaxError: Unexpected token in JSON at position 26. This is the position where my object data actually ends in the buffer.
I've also experimented with .toJSON() and JSON.stringify(), but can't actually get a proper object to use (i.e. obj.key, obj.value); it only returns undefined properties. It seems to me the parsing fails when it reaches the end of the object. I've also tried to match the buffer size to the actual size of the message just to see if the parsing would succeed, to no avail!
I am probably mixing up concepts of buffers, streams, pipes and objects... What am I doing wrong?
I need ideas (or fixes!)
Code running on the receiving end in Node.js:
var nrf = NRF24.connect(spiDev, cePin, irqPin);
nrf.printDetails();
nrf.channel(0x4c).transmitPower('PA_MIN').dataRate('1Mbps').crcBytes(2).autoRetransmit({count: 15, delay: 4000}).begin(function () {
  var rx = nrf.openPipe('rx', pipes[0]);
  rx.on('data', d => {
    let obj = Array.prototype.reverse.call(d);
    try {
      console.log("Got data: ", d.toString());
      console.log(obj);
      obj = JSON.parse(obj);
      console.log(obj);
    } catch (err) {
      console.error(err);
    }
  });
});
I don't think the problem lies in forming the JSON message, but for reference, this is the code running on the Arduino:
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>
#include <ArduinoJson.h>

const uint64_t addresses[5] = {0x65646f4e32LL, 0x65646f4e31LL};
RF24 radio(7, 8);
char output[32];

void setup()
{
  Serial.begin(115200);
  radio.begin();
  radio.setAutoAck(true);
  radio.setDataRate(RF24_1MBPS);
  radio.enableDynamicPayloads();
  radio.setCRCLength(RF24_CRC_16);
  radio.setChannel(0x4c);
  radio.setPALevel(RF24_PA_MAX);
  radio.openWritingPipe(addresses[0]);
}

void loop()
{
  const int capacity = JSON_ARRAY_SIZE(2) + 2 * JSON_OBJECT_SIZE(2);
  StaticJsonBuffer<capacity> jb;
  JsonArray& arr = jb.createArray();
  JsonObject& obj1 = jb.createObject();
  obj1["key"] = "a1";
  obj1["value"] = analogRead(A1);
  arr.add(obj1);
  arr.printTo(output);
  bool ok = radio.write(&output, sizeof(output));
  arr.printTo(Serial);
  Serial.print(ok);
  delay(1000);
}
Most likely you have NUL characters at the end of the string. JSON.parse will refuse to parse it.
let obj = '[{"key":"a1","value":150}]\x00\x00\x00\x00\x00\x00';
JSON.parse(obj);
Uncaught SyntaxError: Unexpected token  in JSON at position 26
If you remove the NUL characters, parsing succeeds:
let obj = '[{"key":"a1","value":150}]\x00\x00\x00\x00\x00\x00';
obj = obj.replace(/\0/g, "");
JSON.parse(obj);
Parse the buffer data into a string like this:
rx.on('data', d => {
  try {
    let obj = d.toString();
    console.log(obj);
    obj = JSON.parse(obj);
    console.log(obj);
  } catch (err) {
    console.error(err);
  }
});
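Putting both answers together, here is a sketch that keeps the byte reversal the question found necessary, strips the NUL padding before parsing, and then indexes into the resulting array (assuming the JSON text itself never contains a NUL byte):
rx.on('data', d => {
  try {
    // node-nrf delivers the payload byte-reversed, as noted in the question.
    Array.prototype.reverse.call(d);
    // The fixed 32-byte payload is padded with NUL bytes; cut at the first 0x00.
    const end = d.indexOf(0);
    const text = d.toString('utf8', 0, end === -1 ? d.length : end);
    const arr = JSON.parse(text);
    // The payload is a one-element array: [{"key":"a1","value":150}]
    console.log(arr[0].key, arr[0].value);
  } catch (err) {
    console.error(err);
  }
});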

Node.js Buffer from string not correct

To create a UTF-8 buffer from a string in JavaScript on the web, you do this:
var message = JSON.stringify('ping');
var buf = new TextEncoder().encode(message).buffer;
console.log('buf:', buf);
console.log('buf.byteLength:', buf.byteLength);
This logs:
buf: ArrayBuffer { byteLength: 6 }
buf.byteLength: 6
However in Node.js if I do this:
var nbuf = Buffer.from(message, 'utf8');
console.log('nbuf:', nbuf);
console.log('nbuf.buffer:', nbuf.buffer);
console.log('nbuf.buffer.byteLength:', nbuf.buffer.byteLength);
it logs this:
nbuf: <Buffer 22 70 69 6e 67 22>
nbuf.buffer: ArrayBuffer { byteLength: 8192 }
nbuf.buffer.byteLength: 8192
The byteLength is way too high. Am I doing something wrong here?
Thanks
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer
ArrayBuffer.prototype.byteLength Read only
The size, in bytes, of the array. This is established when the array is constructed and cannot be changed. Read only.
You should not assume the byteLength property of an ArrayBuffer to be equal to the actual byte length occupied by your data. In this case, Node.js allocates small Buffers out of a shared internal pool (8 KB by default, see Buffer.poolSize), so nbuf.buffer refers to that entire pooled ArrayBuffer rather than just the six bytes your string occupies.
In order to get the actual byte length, I suggest using Buffer.byteLength(string[, encoding])
Documentation: https://nodejs.org/api/buffer.html#buffer_class_method_buffer_bytelength_string_encoding
For example,
var message = JSON.stringify('ping');
console.log('byteLength: ', Buffer.byteLength(message));
correctly gives
byteLength: 6
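If you actually need an ArrayBuffer covering only the Buffer's own bytes, a minimal sketch is to slice the pooled ArrayBuffer using the Buffer's byteOffset and byteLength:
var message = JSON.stringify('ping');
var nbuf = Buffer.from(message, 'utf8');
// nbuf is a view onto the shared pool; slice out just the region
// this Buffer occupies to get a 6-byte ArrayBuffer.
var ab = nbuf.buffer.slice(nbuf.byteOffset, nbuf.byteOffset + nbuf.byteLength);
console.log(ab.byteLength); // 6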

Streams read(0) instruction

There is some code I found at https://github.com/substack/stream-handbook which reads 3 bytes at a time from a stream, and I do not understand how it works.
process.stdin.on('readable', function () {
  var buf = process.stdin.read(3);
  console.log(buf);
  process.stdin.read(0);
});
Being called like this:
(echo abc; sleep 1; echo def; sleep 1; echo ghi) | node consume.js
It returns:
<Buffer 61 62 63>
<Buffer 0a 64 65>
<Buffer 66 0a 67>
<Buffer 68 69 0a>
First of all, why do I need this .read(0) call? Doesn't the stream have a buffer where the rest of the data is stored until I request it with .read(size)? Without .read(0) it will print
<Buffer 61 62 63>
<Buffer 0a 64 65>
<Buffer 66 0a 67>
Why?
The second question concerns the sleep 1 instructions. If I call the script without them
(echo abc; echo def; echo ghi) | node consume.js
It'll print
<Buffer 61 62 63>
<Buffer 0a 64 65>
whether I use .read(0) or not. I don't completely understand this. What logic is used here to produce such a result?
I am not sure what exactly the author of https://github.com/substack/stream-handbook was trying to show with the read(0) approach, but IMHO this is the correct approach:
process.stdin.on('readable', function () {
  let buf;
  // Every time the stream becomes readable (which can happen many times),
  // read all available data from its internal buffer in chunks of any necessary size.
  while (null !== (buf = process.stdin.read(3))) {
    console.dir(buf);
  }
});
You can change the chunk size, pass the input either with sleep or without it...
I happened to be studying the Node.js stream module recently. Here are some comments inside the Readable.prototype.read function:
// if we're doing read(0) to trigger a readable event, but we
// already have a bunch of data in the buffer, then just trigger
// the 'readable' event and move on.
This means that after you call .read(0), the stream will simply trigger (via process.nextTick) another readable event if the stream has not ended.
function emitReadable(stream) {
// ...
process.nextTick(emitReadable_, stream);
// ...
}
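In other words, read(0) consumes nothing; it just re-arms the 'readable' event when data is still buffered, which is why the original snippet eventually prints the trailing <Buffer 68 69 0a>. A rough sketch of the same idea with an explicit guard (readableLength is a standard property of Readable streams):
process.stdin.on('readable', function () {
  const buf = process.stdin.read(3);
  if (buf !== null) console.log(buf);
  // read(0) consumes no data; if bytes remain buffered it merely
  // schedules another 'readable' event on the next tick.
  if (process.stdin.readableLength > 0) process.stdin.read(0);
});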
