How to stream a large XML response from Express - node.js

I am trying to stream a large XML file from Express to a client, but I have not yet figured out how to get anything delivered before the file is done processing on the server and res.end() is called.
The XML file is built with xmlbuilder-js. It provides a callback that receives document chunks, which I am trying to send using res.write(chunk).
res.writeHead(200, {
  'Content-Type': 'text/xml',
  'Transfer-Encoding': 'chunked'
})

const xmlToReturn = xmlBuilder.begin({
  writer: {
    pretty: true
  }
}, function(chunk) {
  res.write(chunk)
}).dec('1.0', 'UTF-8', true)
...
res.end()
The callback works as expected, it shows the data chunks coming through.
I have tried:
- changing the Content-Type on the response to, for example, 'application/octet-stream'
- using res.flush() after calling res.write(), or doing that periodically
- experimenting with other headers
In all cases, when I can get the response to send at all, the client never receives the start of it until res.end() is called. What do I need to do so that Express starts delivering the content as it flows through the callback?
I've explored questions and posts like this one, which suggest my approach is correct but that I am doing something wrong, or that streaming may not work in Express at all, possibly due to other modules or middleware.
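For reference, Express itself does not buffer res.write() calls; when delivery stalls like this, the usual culprits are a buffering middleware (most often compression, which is also what provides the res.flush() mentioned above) or a client that buffers before displaying. A minimal self-contained route that demonstrably streams, useful for isolating the problem (the route path and timings are made up for illustration):

const express = require('express');
const app = express();

app.get('/stream-test', function(req, res) {
  res.writeHead(200, { 'Content-Type': 'text/xml' });
  res.write('<root>');
  let n = 0;
  // Emit one chunk per second so buffering anywhere in the chain is obvious.
  const timer = setInterval(function() {
    res.write('<item n="' + n + '"/>');
    if (++n === 5) {
      clearInterval(timer);
      res.write('</root>');
      res.end();
    }
  }, 1000);
});

app.listen(3000);

Hitting this with curl -N http://localhost:3000/stream-test (the -N flag disables curl's own buffering) should print one item per second; if it does, the buffering is happening in middleware or the client rather than in Express.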

Related

Axios request with chunked response stream from Node

I have a client app in React, and a server in Node (with Express).
At the server side, I have an endpoint like the following (it's not the real endpoint, just an idea of what I'm doing):
function endpoint(req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Transfer-Encoding': 'chunked'
  });
  for (let x = 0; x < 1000; x++) {
    res.write(some_string + '\n');
    wait(a_couple_of_seconds); // just to make the process slower for testing purposes
  }
  res.end();
}
This works perfectly; when I call this endpoint, I receive the whole stream with all 1,000 rows.
The thing is that I cannot manage to receive this data in chunks (one per 'write', or per bunch of 'writes') in order to show it on the frontend as soon as it arrives (think of a table that shows the rows as soon as I get them from the endpoint call).
In the frontend I'm using Axios to call the API with the following code:
async function getDataFromStream(_data): Promise<any> {
  const { data, headers } = await Axios({
    url: `http://the.api.url/endpoint`,
    method: 'GET',
    responseType: 'stream',
    timeout: 0,
  });
  // this next line doesn't work. it says that 'on' is not a function
  data.on('data', chunk => console.log('chunk', chunk));
  // data has actually the whole response data (all the rows)
  return Promise.resolve();
}
The thing is that the Axios call returns the whole data object only after res.end() is called on the server, but I need to get the data as soon as the server starts sending chunks of rows (on each res.write, or whenever the server decides it is ready to send a bunch of chunks).
I have also tried not using await and getting the value of the promise in the then() of the Axios call, but the behavior is the same: the 'data' value arrives with all the 'writes' together once the server calls res.end().
So, what am I doing wrong here? Maybe this is not possible with Axios or Node and I should use something like WebSockets to solve it.
Any help will be much appreciated, because I have read a lot but couldn't get a working solution yet.
For anyone interested in this, what I ended up doing is the following:
On the client side, I used the Axios onDownloadProgress handler, which allows handling of progress events for downloads.
So, I implemented something like this:
function getDataFromStream(_data): Promise<any> {
  return Axios({
    url: `http://the.api.url/endpoint`,
    method: 'GET',
    onDownloadProgress: progressEvent => {
      const dataChunk = progressEvent.currentTarget.response;
      // dataChunk contains the data that has been obtained so far (the whole data so far)..
      // So here we do whatever we want with this partial data..
      // In my case I'm storing that on a redux store that is used to
      // render a table, so now, table rows are rendered as soon as
      // they are obtained from the endpoint.
    }
  }).then(({ data }) => Promise.resolve(data));
}
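A note on this design choice: onDownloadProgress hands you the full body accumulated so far, not just the new bytes, so you have to diff or re-render on each event. In browsers that support the Streams API, fetch can deliver true incremental chunks instead; a minimal sketch (the URL is the question's placeholder):

async function getDataFromFetchStream() {
  const response = await fetch('http://the.api.url/endpoint');
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each iteration yields only the newly received bytes.
    console.log('chunk', decoder.decode(value, { stream: true }));
  }
}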

Why is my node.js server chopping off the response body?

My Node server behaves strangely when it comes to a GET endpoint that replies with a big JSON body (30-35MB).
I am not using any npm package. Just the core API.
The unexpected behaviour only happens when the server is queried from the Internet; it behaves fine when queried from the local network.
The problem is that the server stops writing to the response after it writes the first 1260 bytes of the content body. It does not close the connection, nor does it throw an error. Insomnia (the REST client I use for testing) just states that it received a 1260B chunk. If I query the same endpoint from a local machine, it says that it received more and bigger chunks (a few KB each).
I don't even think the problem is caused by Node, but since I am on a clean Raspberry Pi (I installed Raspbian and then just Node v13.0.1) and the only process I run is node.js, I don't know how to find the source of the problem; there is no load balancer or web server to blame. The public IP also seems fine: every other endpoint works correctly (they all reply with less than 1260B per request).
The code for that endpoint looks like this
const text = url.parse(req.url, true).query.text;
if (text.length > 4) {
  let results = await models.fullTextSearch(text);
  results = results.map(async result => {
    result.Data = await models.FindData(result.ProductID, 30);
    return result;
  });
  results = await Promise.all(results);
  results = JSON.stringify(results);
  res.writeHead(200, {
    'Content-Type': 'application/json',
    'Transfer-Encoding': 'chunked',
    'Access-Control-Allow-Origin': '*',
    'Cache-Control': 'max-age=600'
  });
  res.write(results);
  res.end();
  break;
}
res.writeHead(403, {'Content-Type': 'text/plain', 'Access-Control-Allow-Origin': '*'});
res.write("You made an invalid request!");
break;
Here are a number of things to do in order to debug this:
- Add console.log(results.length) to make sure the length of the data is what you expect it to be.
- Add a callback in res.end(function() { console.log('finished sending response') }) to see if the http library thinks it is done sending the response.
- Check the return value from res.write(). If it is false (indicating that not all data has yet been sent), add a handler for the drain event and see if it gets called; see the sketch after this list.
- Try increasing the send timeout with res.setTimeout() in case it's just taking too long to send all the data.
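A minimal sketch of the third check, using only core http APIs (the variable names mirror the question's code):

// Inside the request handler, after building the JSON string:
const ok = res.write(results);
if (!ok) {
  // The socket buffer is full; wait for it to drain before finishing.
  res.once('drain', () => {
    console.log('drain fired: the socket accepted the buffered data');
    res.end();
  });
} else {
  res.end();
}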

How to send a response using two or more callbacks in one request

I'm using Express, and the POST request handler looks like this:
router.post('/', function(req, res, next){
  var data = req.body;
  getRandom(data, function(value){
    res.json({value: value});
  });
});
The POST is sent through Ajax, and the textarea is then updated with the new data.
$.ajax({
  type: "POST",
  url: "/",
  data: JSON.stringify(datareq),
  dataType: 'json',
  contentType: 'application/json',
  success: function(x){
    $.each(x, function(index, value) {
      $('.textarea').append(value + '\n');
    });
  },
  error: function(x) {
    console.log(x + 'error');
  }
});
How can I send this using one POST and several responses? The user should receive one piece of data in the textarea when the callback finishes, then another piece, and so on until the end.
<textarea>
data 1 - 1 sec
data 2 - 2 sec later
data 3 - 3 sec later
...
</textarea>
I added the times (1 sec, ...) only to show that the callback has a lot of work to do before it can send the next piece of data.
Of course this does not work, because res.send() closes the connection, and I receive an error.
So how can I achieve my idea of sending data continuously after the POST request? I want to give the user data very quickly, then another piece when it is ready, not waiting for everything before sending the response.
You can't
Reason:
HTTP closes the connection after the response is sent. You cannot keep it open and send multiple responses to the client; HTTP doesn't support that.
Solution 1:
Simply put a timer on the client side and request periodically (polling).
Solution 2 (Recommended):
Use a socket and pass data through it. socket.io is a socket library for Node.js applications. It is very easy to use: set up a connection, keep sending data from the server, and receive it on the client side. A minimal sketch follows.
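A minimal sketch of Solution 2, assuming socket.io (v2-style API) is installed; the 'start' and 'row' event names are made up for illustration, and getRandom/datareq come from the question:

// server.js
const http = require('http').createServer();
const io = require('socket.io')(http);

io.on('connection', socket => {
  socket.on('start', datareq => {
    // Emit each piece as soon as it is ready instead of
    // waiting for everything before responding.
    getRandom(datareq, value => socket.emit('row', value));
  });
});

http.listen(3001);

// client.js
var socket = io('http://localhost:3001');
socket.emit('start', datareq);
socket.on('row', function(value) {
  $('.textarea').append(value + '\n');
});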
Just to add to the answer: this answer explains why res.send closes the connection.

Sending result set back to client in node.js

I am having an issue getting the result set back to the client using Node.js. I am new to it, and I'm using it for a project, but I am stuck and not sure why. Here's the situation: I have a webpage, a server, a request handler and a database interface. I am able to send data back and forth between the client and server without any issue. The only time it doesn't work is when I try to send the result of my query back to the client.
function doSomething(response)
{
  var data = {
    'name': 'doSomething'
  };
  response.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
  response.end(JSON.stringify(data));
}
This works fine, as I can read the name from the object on the client side, but
function fetchAllIDs(response)
{
  dbInterface.fetchAllIDs(function(data) {
    // console.log(data) prints the correct information here
    response.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
    response.end(data);
    // console.log(data) on the client side is just blank
  });
}
I believe the issue is the way I handle my callback and response because without trying to use mysql the rest of my code works fine. Thanks!
EDIT: I removed a piece of code that seemed to confuse people. It was only there to show that if I have the response code outside the callback, I am able to get data back to the client. In my actual code, I do not have the two response statements together. I just can't get the rows from the fetchAllIDs function back to the client.
The way you have it written means the
response.end('This string is received by the client');
line is always called before the callback function, meaning the response has already ended.
Under JS, all of the current code will finish before the next event (your callback function) is taken off the queue. Comment out the above line (//) and test it.
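In other words, the fix is to send the response only inside the database callback, once the rows are available. A minimal sketch, assuming dbInterface.fetchAllIDs passes back an array of rows (serialized explicitly, since response.end expects a string or Buffer):

function fetchAllIDs(response)
{
  dbInterface.fetchAllIDs(function(rows) {
    response.writeHead(200, {'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*'});
    // rows is a JavaScript array, so serialize it before sending
    response.end(JSON.stringify(rows));
  });
}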

Pipe an MJPEG stream through a Node.js proxy

Using Motion on linux, every webcam is served up as a stream on its own port.
I now want to serve up those streams, all on the same port, using Node.js.
Edit: This solution now works. I needed to get the boundary string from the original MJPEG stream (which was "BoundaryString" in my Motion config).
app.get('/motion', function(req, res) {
  var boundary = "BoundaryString";

  var options = {
    // host to forward to
    host: '192.168.1.2',
    // port to forward to
    port: 8302,
    // path to forward to
    path: '/',
    // request method
    method: 'GET',
    // headers to send
    headers: req.headers
  };

  var creq = http.request(options, function(cres) {
    res.setHeader('Content-Type', 'multipart/x-mixed-replace;boundary="' + boundary + '"');
    res.setHeader('Connection', 'close');
    res.setHeader('Pragma', 'no-cache');
    res.setHeader('Cache-Control', 'no-cache, private');
    res.setHeader('Expires', 0);
    res.setHeader('Max-Age', 0);

    // wait for data
    cres.on('data', function(chunk){
      res.write(chunk);
    });

    cres.on('close', function(){
      // closed, let's end the client request as well
      res.writeHead(cres.statusCode);
      res.end();
    });
  }).on('error', function(e) {
    // we got an error, return a 500 error to the client and log the error
    console.log(e.message);
    res.writeHead(500);
    res.end();
  });

  creq.end();
});
I would have thought this serves up the MJPEG stream at 192.168.1.2:8302 as /motion, but it does not. Maybe that's because it never ends, and this proxy example wasn't really a streaming example?
Streaming over HTTP isn't the issue. I do that with Node regularly. I think the problem you're having is that you aren't sending a content type header to the client. You go right to writing data without sending any response headers, actually.
Be sure to send the right content type header back to the client making the request, before sending any actual content data.
You may need to handle multipart responses, if Node's HTTP client doesn't already do it for you.
Also, I recommend debugging this with Wireshark so you can see exactly what is being sent and received. That will help you narrow down problems like this quickly.
I should also note that some clients have a problem with chunked encoding, which is what Node will send if you don't specify a content length (which you can't, because it's indefinite). If you need to disable chunked encoding, see my answer here: https://stackoverflow.com/a/11589937/362536. Basically, you just need to disable it: response.useChunkedEncodingByDefault = false;. Don't do this unless you need to, though! And make sure to send Connection: close in your headers with it!
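As a sketch of that advice, the proxy can copy the upstream content type (which carries the multipart boundary) into the response headers before relaying any data, and then simply pipe the bytes through. This is an illustrative variant of the question's code, not a tested Motion setup:

var creq = http.request(options, function(cres) {
  // Relay the upstream content type (including its boundary string)
  // before any body data is written.
  res.writeHead(cres.statusCode, {
    'Content-Type': cres.headers['content-type'],
    'Cache-Control': 'no-cache, private',
    'Pragma': 'no-cache'
  });
  // Stream the MJPEG bytes straight through to the client.
  cres.pipe(res);
});
creq.end();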
What you need to do is request the MJPEG stream only when necessary, in a single upstream connection, and respond to each client with MJPEG or JPEG (if you need IE support).
