I've got an Arduino sending very basic messages:
Serial.print('R');
Serial.println(1);
or
Serial.print('R');
Serial.println(2);
I'm trying to read each line using node.js and the SerialPort module but I get inconsistent results:
Data: <Buffer 52 31 0d 0a> R1
Data: <Buffer 52 32 0d 0a> R2
Data: <Buffer 52 31 0d 0a> R1
Data: <Buffer 52 32 0d 0a> R2
Data: <Buffer 52 31 0d 0a> R1
Data: <Buffer 52 32 0d 0a> R2
Data: <Buffer 52 31 0d 0a> R1
Data: <Buffer 52 32 0d 0a> R2
Data: <Buffer 52> R
Data: <Buffer 31 0d 0a> 1
Data: <Buffer 52 32 0d 0a> R2
And here's how I've tried to parse:
this.port = new SerialPort(portName, {
    baudRate: baudRate,
    autoOpen: false,
    flowControl: false,
    parser: new Readline("\r\n")
});

this.port.open(function (err) {
    if (err) {
        return console.log('Error opening port: ', err.message);
    }
    console.log("port open!");
});

this.port.on('error', function (err) {
    console.log('Error: ', err.message);
});

this.port.on('open', function () {
    console.log("open event called");
});

this.port.on('data', function (data) {
    console.log('Data:', data, data.toString('utf8'));
});
In short: I'm expecting R1, R2 messages coming in consistently, not split up like this:
Data: <Buffer 52> R
Data: <Buffer 31 0d 0a> 1
I'm passing ("\r\n" / 0x0d 0x0a) to Readline. What am I missing?
How can I get consistent newline parsing using SerialPort in Node?
I think the solution to your problem requires binding the event on the parser object, while you're currently listening on the port object. Data that arrives through the port is not always terminated by 0x0d 0x0a (*). Those two bytes are a line-terminator signal for the Readline parser only.
Thus, maybe you should write this listener in your code instead:
// I'm not actually sure where parser lives; I'm not
// in a position to try it myself...
this.port.parser.on('data', function (data) {
    console.log('Data:', data, data.toString('utf8'));
});
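For what it's worth, on serialport 5.x and later the parser constructor option was removed entirely and the parsers became plain Transform streams, so there the usual pattern is to pipe the port into the parser yourself and listen on the parser. A minimal sketch under that assumption (note that in those versions the Readline constructor takes an options object rather than a bare string):
const parser = this.port.pipe(new Readline({ delimiter: '\r\n' }));

// 'data' now fires once per complete line, with the trailing \r\n stripped
parser.on('data', function (line) {
    console.log('Line:', line);
});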
Unfortunately, I don't have any suggestion to make the syntax more elegant, and by my standards this solution is more elegant than creating a function that redirects bindings for you. It depends on your application, though, and at the moment I don't have enough information to suggest a better solution.
(*) In the first (wrong) comment, which I immediately deleted, I asked why you pass both bytes 0x0d 0x0a (\r\n) as the line terminator and not simply 0x0a (\n), but the Serial.println method actually writes both bytes by default.
Related
We have an IoT device for real-time communication (GPS location).
The device sends TCP packets to a Node.js server, but when we receive them the data is in Buffer form and appears stripped.
We used tcpdump to check what data arrives at the network/transport layer; the data is in hex format:
0x0000: 029e 280b b191 022c 1666 c6e4 0800 4500 ..(....,.f....E.
0x0010: 0061 0ae6 0000 6706 b735 6b54 5c96 ac1f .a....g..5kT\...
0x0020: 1d72 5478 0c08 bc9e fa71 4bff c417 5010 .rTx.....qK...P.
0x0030: 03c4 2571 0000 aa55 0035 8305 4572 0051 ..%q...U.5..Er.Q
0x0040: 7601 0101 0209 e55e cde4 eb5e cde4 eb18 v......^...^....
0x0050: fb48 6dcb c14a 0f00 0040 6c00 0000 1001 .Hm..J...#l.....
0x0060: 4d08 2001 9aff be8f 0d00 000a 0000 00 M..............
But the Node.js server receives the data in the following format:
Buffer aa 55 00 35 83 05 45 72 00 51 76 01 01 01 02 0a d8 5e ce 02 58 5e ce 02 58 18 fb 3e b3 cb c1 4d 3f 00 00 3f e7 00 00 00 10 00 fb 0c 20 01 9a ff bc 8f ...
So why are we not able to receive the full data packets? We need the full packets in order to send acknowledgments.
var net = require('net');

net.createServer(function (socket) {
    socket.on('data', function (data) {
        var text = data.toString('hex');
        console.log("data in hex", text);
    });
    socket.on('end', function () {
        console.log('end');
    });
    socket.on('close', function () {
        console.log('close');
    });
    socket.on('error', function (e) {
        console.log('error ', e);
    });
}).listen(3080, function () {
    console.log('TCP Server is listening on port 3080');
});
I came across the same issue: when a Buffer is too long and you log it, the console output is truncated to something like <Buffer ff ... 2 more bytes>.
I managed to get my entire payload back by converting the Buffer to hex before logging:
var net = require('net');

net.createServer(function (socket) {
    socket.on('data', function (data) {
        // toString('hex') renders every byte, so nothing is cut off in the log
        const hexData = data.toString('hex');
        console.log("data in hex", hexData);
    });
}).listen(3080);
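One more thing to keep in mind beyond the logging: TCP is a byte stream with no message boundaries, so a single device packet can also genuinely arrive split across several 'data' events (or several packets merged into one). If that matters for the acknowledgment logic, here is a minimal sketch of accumulating chunks before parsing, where the aa 55 marker is an assumption taken from the dump above:
var net = require('net');

net.createServer(function (socket) {
    var pending = Buffer.alloc(0); // bytes received but not yet parsed

    socket.on('data', function (chunk) {
        // join this chunk with whatever was left over from earlier events
        pending = Buffer.concat([pending, chunk]);
        console.log('accumulated:', pending.toString('hex'));
        // ...then locate complete packets in `pending` (e.g. via the aa 55
        // marker or a length field) and slice them off the front
    });
}).listen(3080);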
I'm on macOS with Node v12, using a child process (exec/spawn/execSync/spawnSync) to execute a shell command which can return more than 8192 characters. However, back in the Node.js method that invokes the child process, I only ever get up to 8192 characters and no more. (8192 seems to be the default pool size for a buffer.)
I've tried increasing the maxBuffer size in the options to values larger than 8192, but it does not affect anything.
I've also tried running the same command with exec, spawn, execSync and spawnSync and they all behave the same way. Same result.
When I run:
exec(shellCommand, { encoding: "buffer", maxBuffer: 16384 }, (error, stdout, stderr) => {
    console.log('stdout--> length: ', stdout.length, '<--->', stdout);
});
I get:
stdout--> length: 8192 <---> <Buffer 7b 22 72 65 73 75 6c 74 22 3a 5b 7b 22 70 72 6f 6a 65 63 74 4e 61 6d 65 22 3a 22 73 65 65 64 73 22 2c 22 74 65 6d 70 6c 61 74 65 4e 61 6d 65 22 3a 22 ... 8142 more bytes>
I know that the data coming back is larger than 8192 because when I run the shell command in a shell and check the length it is greater than 8192.
Also, and this is the puzzling bit, when I set the child process' stdio option to 'inherit' such as:
execSync(shellCommand, { encoding: "buffer", stdio:"inherit" });
(which says to use the parent's stdout; in my case that is the Node.js console)
I see the full response back in the console where NodeJS is running.
I have also read a similar issue on github but it hasn't really helped.
How do I go about executing a shell command in NodeJS and getting the full response back?
Try this:
const { spawn } = require('child_process');

const cmd = spawn('command', ['arg1', 'arg2']);

let bufferArray = [];

/* cmd.stdout.setEncoding('utf8'); sets the encoding;
   the default encoding is buffer */
cmd.stdout.on('data', (data) => {
    console.log(`stdout: ${data}`);
    bufferArray.push(data);
});

cmd.stderr.on('data', (data) => {
    console.error(`stderr: ${data}`);
});

cmd.on('close', (code) => {
    console.log(`child process exited with code ${code}`);
    // join all stdout chunks once the stream has ended
    let dataBuffer = Buffer.concat(bufferArray);
    console.log(dataBuffer.toString());
});
This could be useful: Node.js spawn child process and get terminal output live
Turns out the shell command had a process.exit() statement that was being called before the stdout buffer was fully flushed.
So stdout sends the first 8192 characters, and since the writes are asynchronous the process moves on to the next statement; one of those is process.exit(), which kills the process before the rest of the stdout buffer is flushed.
TL;DR: exec/spawn work correctly; the shell command exits before stdout is fully flushed.
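For completeness: if the command being run is itself a Node script, one way to avoid the race (a sketch, with bigOutput standing in for the real data) is to exit only after stdout reports the write has been flushed, or to set process.exitCode and let the process end naturally:
// Instead of: process.stdout.write(bigOutput); process.exit(0);
process.stdout.write(bigOutput, () => process.exit(0)); // exit after the flush

// ...or set the exit code and let Node exit once stdout has drained:
process.stdout.write(bigOutput);
process.exitCode = 0;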
The default buffer size for child_process.exec is 1 MB (on Node 12+), so try not passing a maxBuffer at all; however, it would be much better to use child_process.spawn so that you get the output as a stream.
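If you do stay with exec, remember that maxBuffer is specified in bytes and the child is terminated once the cap is exceeded. A quick sketch of simply raising it well above the expected output (the 64 MB figure is arbitrary; shellCommand is the command from the question):
const { exec } = require('child_process');

// maxBuffer caps how much stdout/stderr exec will collect before killing the child
exec(shellCommand, { maxBuffer: 64 * 1024 * 1024 }, (error, stdout, stderr) => {
    console.log('stdout length:', stdout.length);
});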
I'm new to this topic but still want to know how to approach it.
I want to build a system that uses messaging to perform CRUD operations on a Node.js server.
I know about REST, but I can't figure out how to translate it to messaging with RabbitMQ.
Edit:
I think I have to make my question a bit clearer:
What I want to do is send a message, produced by my Java client using AMQP and RabbitMQ, to a Node.js server. The message contains a JSON object.
Some of the data should be written to the database (MySQL).
My code looks something like this (Java producer):
JSONObject obj = new JSONObject();
obj.put("fuellstand", behaelter_1.getFuellstand());
obj.put("behaelter", behaelter_1.getId());
obj.put("timestamp", currentTimestamp);
//String message = behaelter_1.getFuellstand()+" "+ behaelter_1.getId()+" "+currentTimestamp;
String message = obj.toJSONString();
channel.basicPublish("", QUEUE_NAME, null, message.getBytes("UTF-8"));
//channel.basicPublish("",QUEUE_NAME , , arg3);
System.out.println(message+" "+message.getBytes("UTF-8"));
And this is how my Node.js server tries to consume it:
amqp.connect('amqp://localhost', function (err, conn) {
    if (err) {
        console.log("error with the amqp host!");
        throw err;
    } else {
        conn.createChannel(function (err, ch) {
            if (err) {
                console.log("failed to create channel");
                throw err;
            } else {
                var q = 'alerts';
                ch.assertQueue(q, { durable: false });
                console.log(" [*] Waiting for something in %s. CTRL+C to end", q);
                ch.consume(q, function (msg) {
                    console.log(msg);
                }, { noAck: true });
            }
        });
    }
});
The console prints the following:
{ fields: { consumerTag: 'amq.ctag-G3vsZRIGRZJT1qntZ1hTuw',
deliveryTag: 1,
redelivered: false,
exchange: '',
routingKey: 'alerts' },properties: {},content: <Buffer 7b 22 66 75 65 6c 6c 73 74 61 6e 64 22 3a 32 32 2c 22 62 65 68 61 65 6c 74 65 72 22 3a 31 2c 22 74 69 6d 65 73 74 61 6d 70 22 3a 32 30 31 36 2d 31 32 ... > }
My only problem at this point is decoding the JSON I built. I don't get why I can't decode the buffer. Or am I getting something wrong?
As it turns out, I had to use the following to access the content of the message:
msg.content.toString()
Now I only need to parse it to access the individual JSON attributes.
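Putting both steps together, a minimal sketch of the consume callback (the field names come from the Java producer above):
ch.consume(q, function (msg) {
    // msg.content is a Buffer; decode it to a string, then parse the JSON
    var obj = JSON.parse(msg.content.toString());
    console.log(obj.fuellstand, obj.behaelter, obj.timestamp);
}, { noAck: true });
One caveat: in the buffer dump above, the timestamp value appears to be serialized without quotes ("timestamp":2016-12...), which JSON.parse will reject; putting currentTimestamp.toString() into the JSONObject on the Java side should produce valid JSON.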
RabbitMQ is not a database and does not support CRUD operations.
The following is an excerpt from the stream-handbook by James Halliday (aka substack):
Here's an example of using .read(n) to buffer stdin into 3-byte
chunks:
process.stdin.on('readable', function () {
    var buf = process.stdin.read(3);
    console.dir(buf);
});
Running this example gives us incomplete data!
$ (echo abc; sleep 1; echo def; sleep 1; echo ghi) | node consume1.js
<Buffer 61 62 63>
<Buffer 0a 64 65>
<Buffer 66 0a 67>
This is because there is extra data left in internal buffers and we
need to give node a "kick" to tell it that we are interested in more
data past the 3 bytes that we've already read. A simple .read(0) will
do this:
process.stdin.on('readable', function () {
    var buf = process.stdin.read(3);
    console.dir(buf);
    process.stdin.read(0);
});
Now our code works as expected in 3-byte chunks!
$ (echo abc; sleep 1; echo def; sleep 1; echo ghi) | node consume2.js
<Buffer 61 62 63>
<Buffer 0a 64 65>
<Buffer 66 0a 67>
<Buffer 68 69 0a>
When I change the example to 2-byte read chunks, it breaks, presumably because the internal buffer still has data queued up. But that wouldn't happen if read(0) kicked off a 'readable' event each time it was called; it looks like that only happens after all the input is finished.
process.stdin.on('readable', function () {
    var buf = process.stdin.read(2);
    console.dir(buf);
    process.stdin.read(0);
});
What does this code do under the hood? It seems like read(0) queues another 'readable' event, but only at the end of input. I tried reading through the readable-stream source, but it's pretty heavy lifting. Does anyone know how this example works?
There is some code I found at https://github.com/substack/stream-handbook which reads 3 bytes from a stream, and I do not understand how it works.
process.stdin.on('readable', function () {
    var buf = process.stdin.read(3);
    console.log(buf);
    process.stdin.read(0);
});
Called like this:
(echo abc; sleep 1; echo def; sleep 1; echo ghi) | node consume.js
It returns:
<Buffer 61 62 63>
<Buffer 0a 64 65>
<Buffer 66 0a 67>
<Buffer 68 69 0a>
First of all, why do I need this .read(0) thing? Doesn't the stream have a buffer where the rest of the data is stored until I request it with .read(size)? Without .read(0) it'll print
<Buffer 61 62 63>
<Buffer 0a 64 65>
<Buffer 66 0a 67>
Why?
The second thing is these sleep 1 instructions. If I call the script without them
(echo abc; echo def; echo ghi) | node consume.js
It'll print
<Buffer 61 62 63>
<Buffer 0a 64 65>
no matter whether I use .read(0) or not. I don't completely understand this. What logic is used here to produce such a result?
I am not sure what exactly the author of https://github.com/substack/stream-handbook was trying to show with the read(0) approach, but IMHO this is the correct one:
process.stdin.on('readable', function () {
    let buf;
    // Every time the stream becomes readable (which can happen many times),
    // read all available data from its internal buffer in chunks of whatever size is needed.
    while (null !== (buf = process.stdin.read(3))) {
        console.dir(buf);
    }
});
You can change the chunk size and pass the input with or without sleep; the while loop drains the internal buffer completely on every 'readable' event, so no read(0) kick is needed.
I happened to be studying the Node.js stream module recently. Here are some comments inside the Readable.prototype.read function:
// if we're doing read(0) to trigger a readable event, but we
// already have a bunch of data in the buffer, then just trigger
// the 'readable' event and move on.
This says that after .read(0) is called, the stream will simply trigger (via process.nextTick) another 'readable' event, as long as the stream has not ended.
function emitReadable(stream) {
    // ...
    process.nextTick(emitReadable_, stream);
    // ...
}
So read(0) consumes nothing; it just schedules another 'readable' event on the next tick, which gives your handler one more chance to drain the internal buffer.