Node.js: Race condition when receiving data on TCP socket

I'm using the net library of Node.js to connect to a server that is publishing data, so I'm listening for 'data' events on the client side. When the data event fires, I append the received data to my rx-buffer and check whether I have a complete message by reading some bytes. If I have a valid message, I remove it from the buffer and process it. The source code looks like this:
rxBuffer = ''

client.on('data', (data) => {
  rxBuffer += data
  // for example... 10 stores the message length...
  while (rxBuffer.length > 10 && rxBuffer.length >= (10 + rxBuffer[10])) {
    const msg = rxBuffer.slice(0, 10 + rxBuffer[10])
    rxBuffer = rxBuffer.slice(msg.length) // remove message from buffer
    processMsg(msg) // process message..
  }
})
As far as I know, that's the typical way. But... what happens if the data event fires multiple times? Imagine I get a data event, and while I'm appending the data to my rx-buffer the next data event arrives. The "new" data event would also append its data to rxBuffer and start my while-loop, so I'd have two handlers processing the same messages because they share the same rx-buffer. Is this correct?
How can I handle this? In other languages I'd use something like a mutex to prevent concurrent access to the rx-buffer... but what's the solution in JS? Or maybe I'm wrong, and I never get multiple data events while one handler is still running? Any ideas?

JavaScript is single threaded. The second event will not run until the first one either completes or yields, the latter of which could presumably happen if your processMsg() does asynchronous work. If that's the case, multiple executions of processMsg() could be interleaved. As long as they aren't changing any shared data (rxBuffer included), you shouldn't have a problem.
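Since each 'data' handler runs to completion before the next one starts, no mutex is needed; the usual refinement is simply to accumulate into a Buffer instead of a string. A minimal sketch, keeping the question's assumption that byte 10 holds the message length:

let rxBuffer = Buffer.alloc(0)

client.on('data', (data) => {
  rxBuffer = Buffer.concat([rxBuffer, data])
  // same framing as above: 10-byte header, byte 10 holds the length
  while (rxBuffer.length > 10 && rxBuffer.length >= (10 + rxBuffer[10])) {
    const msg = rxBuffer.slice(0, 10 + rxBuffer[10])
    rxBuffer = rxBuffer.slice(msg.length) // drop the consumed message
    processMsg(msg)
  }
})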

Related

Is there a way to keep node js cluster send() messages in order

Using some added sequence checks below, I see that messages are sometimes arriving out of order and that breaks the code.
I am thinking I must queue up out-of-order messages on receipt to make sure things get processed in order.
Is this just the nature of Node.js?
// In the master process:
msg.sequence = next_sequence[i]++;
worker[i].send(msg);

// In worker(s):
process.on("message", handler);
....
var last_sequence = 0;
function handler(msg) {
  if (last_sequence + 1 != msg.sequence) console.log(...);
  last_sequence = msg.sequence;
}
After using send(JSON.stringify(msg)) and JSON.parse when receiving, the behavior seems more deterministic and message sequence numbers are in order.
So it seems that send() does not immediately copy the data, and it can still be changed for a short while after calling send().
Can anyone confirm this?
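If lazy serialization is indeed the cause, a defensive workaround is to snapshot the message before handing it to send(), so later mutations can't leak into what the worker receives. A minimal sketch, reusing the names from the question:

// take a deep copy so mutating msg afterwards cannot affect the sent message
// (structuredClone needs a recent Node version; the JSON round trip works everywhere)
var payload = JSON.parse(JSON.stringify(msg));
payload.sequence = next_sequence[i]++;
worker[i].send(payload);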

readable.on('end',...) is never fired

I am trying to stream some audio to my server and then stream it on to a service specified by the user. The user provides me with someHostName, which can sometimes not support that type of request.
My problem is that when that happens, clientRequest.on('end',..) is never fired. I think it's because clientRequest is being piped to someHostReq, which gets messed up when someHostName is "wrong".
My question is:
Is there any way that I can still have clientRequest.on('end',..) fired even when the stream clientRequest pipes to has something wrong with it?
If not: how do I detect that something went wrong with someHostReq "immediately"? someHostReq.on('error') doesn't fire until after some time.
code:
someHostName = 'somexample.com'

function checkIfPaused(request) { // every 1 second check .isPaused
  console.log(request.isPaused() + '>>>>');
  setTimeout(function () { checkIfPaused(request); }, 1000);
}

router.post('/', function (clientRequest, clientResponse) {
  clientRequest.on('data', function (chunk) {
    console.log('pushing data');
  });

  clientRequest.on('end', function () { // when done streaming audio
    console.log('im at the end');
  }); // end clientRequest.on('end',)

  var options = {
    hostname: someHostName, method: 'POST', headers: { 'Transfer-Encoding': 'chunked' }
  };

  var someHostReq = http.request(options, function (res) {
    var data = '';
    someHostReq.on('data', function (chunk) { data += chunk; });
    someHostReq.on('end', function () {
      console.log('someHostReq.end is called');
    });
  });

  clientRequest.pipe(someHostReq);
  checkIfPaused(clientRequest);
});
output:
in the case of a correct hostname:
pushing data
.
.
pushing data
false>>>
pushing data
.
.
pushing data
pushing data
false>>>
pushing data
.
.
pushing data
im at the end
true>>>
//continues to be true, that's fine
in the case of a wrong host name:
pushing data
.
.
pushing data
false>>>>
pushing data
.
.
pushing data
pushing data
false>>>>
pushing data
.
.
pushing data
true>>>>
true>>>>
true>>>>
//it stays true and clientRequest.on('end') is never called
//even though the client is still streaming data, no more "pushing data" appears
If you think my question is a duplicate:
it's not the same as this: node.js http.request event flow - where did my END event go? , the OP was just making a GET instead of a POST
it's not the same as this: My http.createserver in node.js doesn't work? , the stream was in paused mode because none of the following happened:
You can switch to flowing mode by doing any of the following:
Adding a 'data' event handler to listen for data.
Calling the resume() method to explicitly open the flow.
Calling the pipe() method to send the data to a Writable.
source: https://nodejs.org/api/stream.html#stream_class_stream_readable
it's not the same as this: Node.js response from http request not calling 'end' event without including 'data' event , he just forgot to add the .on('data',..)
The behaviour in the case of a wrong host name looks like a buffering problem: if the destination stream's buffer fills up (because someHost is not consuming the chunks being sent), pipe will stop reading the origin stream, since pipe manages flow control automatically. And because pipe is no longer reading the origin stream, you never reach the 'end' event.
Is there any way that I can still have clientRequest.on('end',..) fired
even when the stream clientRequest pipes to has something wrong with
it?
The 'end' event will not fire until the data is completely consumed. To get 'end' fired on a paused stream you need to call resume() (unpiping from the wrong hostname first, or you will get stuck on the full buffer again) to put the stream back into flowing mode, or read() it to the end.
But how to detect when I should do any of the above?
someHostReq.on('error') is the natural place, but if it takes too long to fire:
First, try setting a low request timeout (lower than the time someHostReq.on('error') takes to trigger, which seems too long for you) with request.setTimeout(timeout[, callback]), and check that it doesn't fail with a correct hostname. If that works, just use the callback or the 'timeout' event to detect when the server times out, and use one of the techniques above to reach the end.
If the timeout solution fails or doesn't fit your requirements, you have to play with flags in clientRequest.on('data'), clientRequest.on('end') and/or clientRequest.isPaused() to guess when you are stuck by the buffer. When you think you are stuck, apply one of the techniques above to reach the end of the stream. Luckily, detecting a stuck buffer takes less time than waiting for someHostReq.on('error') (maybe two request.isPaused() === true readings without reaching a 'data' event is enough to decide you are stuck).
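A minimal sketch of the timeout approach, reusing the names from the question (the 2-second value is only illustrative):

someHostReq.setTimeout(2000, function () {
  clientRequest.unpipe(someHostReq); // stop piping into the stalled request
  someHostReq.abort();               // give up on the unreachable host
  clientRequest.resume();            // let clientRequest flow again so 'end' can fire
});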
How do I detect that something wrong happened with someHostReq
"immediately"? someHostReq.on('error') doesn't fire until after
some time.
Errors trigger when they trigger; you cannot detect them "immediately". Why not just send a probe request to check that the service is supported before piping the streams? Something like:
"Checking the service specified by the user..." If OK -> pipe the user's request stream to the service, OR on FAIL -> notify the user about the wrong service.

How to iterate on each record of a Model.stream waterline query?

I need to do something like:
Lineup.stream({foo: "bar"}).exec(function(err, lineup) {
  // Do something with each record
});
Lineup is a collection with over 18000 records, so I think using find is not a good option. What's the correct way to do this? I can't figure it out from the docs.
The .stream() method returns a node stream interface (a read stream) that emits events as data is read. Your options here are either to .pipe() to something else that can take "stream" input, such as the response object of the server, or to attach an event listener to the events emitted from the stream. i.e:
Piped to response
Lineup.stream({foo:"bar"}).pipe(res);
Setup event listeners
var stream = Lineup.stream({foo: "bar"});

stream.on("data", function(data) {
  stream.pause();  // stop emitting events for a moment
  /*
   * Do things
   */
  stream.resume(); // resume events
});

stream.on("error", function(err) {
  // handle any errors that will throw in reading here
});
The .pause() and .resume() are quite important, as otherwise the handler just keeps responding to emitted events before the processing of the previous one is complete. While that is fine for small cases, it is not desirable for the larger "streams" this interface is meant for.
Additionally, if you are calling any "asynchronous" actions inside the event handler like this, then you need to take care to call .resume() within the callback or promise resolution, thus waiting for that "async" action to complete itself.
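For example, with an asynchronous action inside the handler (doSomethingAsync is a hypothetical helper, not part of Waterline):

stream.on("data", function(record) {
  stream.pause();                          // stop emitting while the async work runs
  doSomethingAsync(record, function(err) {
    if (err) { /* handle the error */ }
    stream.resume();                       // only resume once the async work is done
  });
});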
But look at the "node documentation" linked earlier for more in-depth information on "stream".
P.S. I believe the following syntax should also be supported if it suits your sensibilities better:
var stream = Lineup.find({foo:"bar"}).stream();

Handling chunked responses from process.stdout 'data' event

I have some code which I can't seem to fix. It looks as follows:
var childProcess = require('child_process');
var spawn = childProcess.spawn;

var child = spawn('./simulator', []);

child.stdout.on('data', function(data) {
  console.log(data);
});
This is all at the backend of my web application, which is running a specific type of simulation. The simulator executable is a C program which runs a loop waiting to be passed data (via its standard input). When the inputs for the simulation come in (i.e. from the client), I parse the input and then write data to the child process's stdin as follows:
child.stdin.write(INPUTS);
Now the data coming back is 40,000 bytes, give or take, but it seems to be getting broken into chunks of 8192 bytes. I've tried fixing the standard output buffer of the C program but it doesn't fix it. Is there a limit on the size of a 'data' event imposed by Node.js? I need it to come back as one chunk.
The buffer chunk sizes are applied in node. Nothing you do outside of node will solve the problem, and there is no way to get what you want from node without a little extra work in your messaging protocol. Any message larger than the chunk size will be chunked. There are two ways you can handle this issue.
If you know the total output size before you start to stream out of C, prepend the message length to the data so the node process knows how much data to pull before treating the message as complete.
Determine a special character you can append to each message you send from the C program. When node sees that character, treat it as the end of that message.
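A rough sketch of the second option, assuming the C program terminates every message with a '\n' it never emits elsewhere (handleMessage is a hypothetical handler):

var buffered = '';

child.stdout.on('data', function(chunk) {
  buffered += chunk;
  var boundary;
  while ((boundary = buffered.indexOf('\n')) !== -1) {
    var message = buffered.slice(0, boundary); // one complete message
    buffered = buffered.slice(boundary + 1);   // keep the remainder for later chunks
    handleMessage(message);                    // process the full message
  }
});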
If you are dealing with IO in a web application, you really want to stick with the async methods. You need something like the following (untested). There is a good sample of how to consume the Stream API in the docs.
var data = '';

child.stdout.on('data', function(chunk) {
  data += chunk;
});

child.stdout.on('end', function() {
  // do something with var data
});
I ran into the same problem. I tried many different things and was starting to get annoyed. I tried prepending and appending with special characters. Maybe I was stupid but I just couldn't get it right.
I ran into a module called linerstream, which basically buffers every chunk until it sees a line ending. You can use it like this:
var Linerstream = require('linerstream');

child.stdout.pipe(new Linerstream()).on('data', (data) => {
  // data here is complete and not chunked
});
The important part is that you do have to write each message to stdout as a line that ends with a newline. Otherwise it doesn't know where the message ends.
I can say this worked for me. Hopefully it helps other people.
ppejovic's solution works, but I prefer concat-stream.
var concat = require('concat-stream');

child.stdout.pipe(concat(function(data) {
  // all your data ready to be used.
}));
There are a number of good stream helpers worth looking into based on your problem area. Take a look at substack's stream-handbook.

ZeroMQ push/pull and nodejs read stream

I'm trying to read a file by opening a read stream and sending chunks of the file through ZMQ to another process to consume them. The stream is working like it should; however, when I start the worker, it doesn't see the data that's been sent.
I tried sending data through the socket every 500 ms, not in a callback, and when I start the worker it collects all the previous chunks of data:
sender = zmq.socket('push')

setInterval(() ->
  console.log('sending work')
  sender.send('some work')
, 500)

receiver = zmq.socket("pull")

receiver.on "message", (msg) ->
  console.log('work is here: %s', msg.toString())
Outputs:
sending work
sending work
sending work
sending work
sending work
// here I start the worker
sending work
work is here: some work
work is here: some work
work is here: some work
work is here: some work
work is here: some work
work is here: some work
sending work
work is here: some work
sending work
work is here: some work
sending work
work is here: some work
So, when the worker starts, it begins by pulling all the previous data and then pulls new data every time something new comes in. But this does not happen when I do this:
readStream = fs.createReadStream("./data/pg2701.txt", {'bufferSize': 100 * 1024})

readStream.on "data", (data) ->
  console.log('sending work')
  sender.send('some work') # I'd send 'data' if it worked..
In this scenario, the worker doesn't pull any data at all.
Are these kinds of sockets supposed to create a queue or not? What am I missing here?
Yes, a push socket queues messages until the HWM is reached when there's nobody to send to.
Maybe the sender hasn't bound yet; try something like this:

sender.bind('address', function(err) {
  if (err) throw err;
  console.log('sender bound!');
  // the readStream code.
});
Also, a connect is missing from your code example; I bet it's there, but maybe you forgot to include it.
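Putting both pieces together, a rough sketch in plain JavaScript (the endpoint address is an assumption; use whatever address your sender actually binds to):

// sender: bind first, then start streaming the file
sender.bind('tcp://127.0.0.1:5555', function (err) {
  if (err) throw err;
  var readStream = fs.createReadStream('./data/pg2701.txt');
  readStream.on('data', function (chunk) {
    sender.send(chunk);
  });
});

// worker: connect to the same endpoint before expecting messages
receiver.connect('tcp://127.0.0.1:5555');
receiver.on('message', function (msg) {
  console.log('work is here: %s', msg.toString());
});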
