How to parse a continuous multipart response in Node.js

I'm creating a tool that needs to consume a Security System API. I can't give much information about it but as the API documentation explains:
A stream of events data will be sent using HTTP Multipart x-mixed-replace transmission. The response data stream is continuous.
Each event is separated by the multipart boundary --DummyBoundary.
From what I can tell with some network-sniffing tools, the stream emits data related to each event as it happens, but how can I read this data into variables?
Thanks.

You can try the multer library.
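Since the boundary (--DummyBoundary) is known from the documentation, another option is to buffer the incoming chunks yourself and split events on the boundary. Below is a minimal sketch, not a full multipart implementation: per-part header parsing is omitted, and the endpoint URL and event format are assumptions.

```javascript
// Accumulate raw chunks and emit one event per multipart part.
// Boundary handling is simplified: everything between two boundary
// markers is treated as one event body.
function createMultipartParser(boundary, onEvent) {
  let buffer = Buffer.alloc(0);
  return function write(chunk) {
    buffer = Buffer.concat([buffer, chunk]);
    let idx;
    while ((idx = buffer.indexOf(boundary)) !== -1) {
      const part = buffer.slice(0, idx);               // bytes before the boundary
      buffer = buffer.slice(idx + boundary.length);    // keep the rest for later chunks
      const body = part.toString('utf8').trim();
      if (body) onEvent(body);                         // skip the empty part before the first boundary
    }
  };
}

// Hypothetical usage against the security system's stream:
// const http = require('http');
// http.get('http://camera.local/events', res => {
//   const write = createMultipartParser('--DummyBoundary',
//     body => console.log('event:', body));
//   res.on('data', write);
// });
```

Because the parser keeps leftover bytes between calls, it still works when a boundary marker is split across two network chunks.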

Related

what payload data exactly means in nodejs

Can someone explain the meaning of payload data in Node.js? I am a bit confused about the term.
I am using payload.somevariable, so I thought I was getting the payload data from the HTTP request.
Payload data in Node.js is just the packets or chunks of data sent to the server, and it cannot be accessed as a whole directly. You can assemble it by listening to the request stream's data and end events and decoding each chunk, for example with the string_decoder module.
In Express.js, a Node.js framework, the body-parser module does this for you: it collects the chunks and exposes the parsed body so you can read and modify it.

Continuously send a response to client

Is there a way, I could continuously send a response to the client before finally calling end on the response object?
There is a huge file and I want to read, parse, and then send the analysis back to the client. Is it possible to keep sending a response as soon as I read a line?
response.send also calls end, so I cannot use it.
You want to use res.write, which sends response body content without terminating the response:
This sends a chunk of the response body. This method may be called multiple times to provide successive parts of the body.
Note that write is not included in the Express documentation because it is not an Express-specific function, but is inherited from Node's core http.ServerResponse, which Express's response object extends:
The res object is an enhanced version of Node's own response object and supports all built-in fields and methods.
However, working with streaming data is always a little tricky (long-lived responses are prone to unexpected timeouts), so it may be easier to restructure your application to send streaming data via a WebSocket, or socket.io for compatibility with browsers that don't support WebSockets.
What kind of server-side application are you working with? You could use Node.js with Socket.IO to do something like this. Here are some useful links:
First Node.js application
Node.js readline API

Streaming an API using request module

The API endpoint I need to access provides a live streaming option only, but what I need is a regular, non-streaming API. Can I achieve this using the request node module?
You can hook up to the stream on your server and store data that arrives in the stream locally on the server in a database and then when a REST request comes in for some data, you look in your local database and satisfy the request from that database (the traditional, non-streaming way).
Other than that, I can't figure out what else you might be trying to do. You cannot "turn a streaming API into a non-streaming one". They just aren't even close to the same thing. A streaming API is like subscribing to a feed of information. You don't make a request, new data is just sent to you when it's available. A typical non-streaming API is that a client makes a specific request and the server responds with data for that specific request.
Here's a discussion of the Twitter streaming API that might be helpful: https://dev.twitter.com/streaming/overview
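The store-and-serve idea above can be sketched with an in-memory cache standing in for the database. Everything here is an assumption: the stream is any EventEmitter that emits 'data', and createStreamCache is a made-up helper name; a production setup would persist to a real database.

```javascript
// Subscribe to the streaming API once, keep the newest items locally,
// and answer ordinary REST requests from that local store.
function createStreamCache(stream, maxItems = 100) {
  const items = [];
  stream.on('data', item => {
    items.push(item);
    if (items.length > maxItems) items.shift(); // evict the oldest entry
  });
  return {
    latest: n => items.slice(-n) // the n most recent items
  };
}

// Hypothetical Express endpoint served from the cache:
// const cache = createStreamCache(apiStream, 1000);
// app.get('/data', (req, res) => res.json(cache.latest(10)));
```

Requests are then satisfied instantly from local state, which is exactly the "traditional, non-streaming way" the answer describes.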

Write then wait for response serialport nodejs

I'm trying to use serialport.js and I wondered how it is possible to write and then wait for a response from the other side, since everything is asynchronous.
My problem is that I must know which response is associated with which sent message.
I think that also involves handling the case where the other side never sends a response.
Should I use a queue system? Is there an example or a library for this?
Thanks in advance.
You need to write something to parse the stream. I do this by implementing a writable stream which fires events for the parts of the stream data that I want.
How you do this specifically depends on the protocol you're trying to implement.
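A queue is a common answer to the matching problem: allow only one command in flight at a time, so each response can be attributed to the message that triggered it, and use a timeout for the case where the other side never replies. A sketch under stated assumptions: the port only needs a write() method and a 'data' event (e.g. a serialport instance behind a line parser), and createCommandQueue is a made-up name.

```javascript
// One command in flight at a time; each resolves with its own response
// or rejects on timeout, then the next queued command is sent.
function createCommandQueue(port, timeoutMs = 1000) {
  const queue = [];
  let current = null;

  function next() {
    if (current || queue.length === 0) return;
    current = queue.shift();
    current.timer = setTimeout(() => {
      const pending = current;
      current = null;
      pending.reject(new Error('response timeout')); // other side never answered
      next();
    }, timeoutMs);
    port.write(current.cmd);
  }

  port.on('data', response => {
    if (!current) return;          // unsolicited data, nothing in flight
    clearTimeout(current.timer);
    const pending = current;
    current = null;
    pending.resolve(response);     // this response belongs to pending.cmd
    next();
  });

  return cmd => new Promise((resolve, reject) => {
    queue.push({ cmd, resolve, reject });
    next();
  });
}
```

Usage is then simply `const send = createCommandQueue(port); const reply = await send('STATUS?');`, with the protocol-specific parsing layered underneath as the answer suggests.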

How to take Node.js stream and send it to form endpoint (dropbox)

I have a readable stream (S3) that I would like to pipe to Dropbox's put endpoint, however this endpoint does not support the Transfer-Encoding: chunked header required for streaming data.
I see two possible solutions:
read the stream into a variable of some sort and then send it up, but memory can become a problem
write the stream to disk, then read it back and upload, which feels dirty and will be slow
What is the best solution to this problem?
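The first option can be sketched like this: buffer the whole stream, then upload with a known Content-Length so chunked encoding is never needed. This is fine for files that fit in memory; spill to disk (the second option) for anything larger. The bufferStream name and the commented request details are assumptions.

```javascript
// Drain a readable stream into a single Buffer so the total
// length is known before the upload request is made.
function bufferStream(stream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', c => chunks.push(Buffer.isBuffer(c) ? c : Buffer.from(c)));
    stream.on('end', () => resolve(Buffer.concat(chunks)));
    stream.on('error', reject);
  });
}

// Hypothetical upload with an explicit Content-Length:
// const https = require('https');
// const body = await bufferStream(s3Stream);
// const req = https.request({
//   method: 'PUT',
//   host: 'upload.example.com',                    // placeholder target
//   headers: { 'Content-Length': body.length },    // known length, no chunked encoding
// });
// req.end(body);
```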