SSE events are not reaching client until res.end() is called - node.js

The same code used to work before, but I don't know why it is not working now. Please help.
My problem is that when I use SSE for real-time data sharing, the data object that should be sent with res.write(`data:${JSON.stringify(dataObject)}\n\n`) is not being sent until res.end() is called, at which point all the event streams are sent at once.
response.writeHead(200, {
    "Connection": "keep-alive",
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    "Access-Control-Allow-Origin": "*",
    "Content-Encoding": "identity" // tried both with and without this
});
let syncher = setInterval(() => {
    if (response.finished) { // if the response has ended, clear the interval
        clearInterval(syncher);
        return;
    } else {
        let dataToSend = getEventData(user, event);
        if (!dataToSend) {
            response.write(`data:${JSON.stringify({ close: true })}\n\n`);
            clearInterval(syncher);
            return;
        }
        response.write(`data:${JSON.stringify(dataToSend)}\n\n`);
        response.flushHeaders(); // Also tried with response.flush()
        if (dataToSend.close) {
            delEventData(user, event);
            response.end();
        }
    }
}, 500);
The above code is on the server side; it also has an on-close listener that closes the connection.
const ev = new EventSource(conf.apiUrl + '/getStatus/' + (userData.id || ''));
let data = '';
ev.onmessage = eventData => {
    data = JSON.parse(eventData.data);
    if (!data.close) {
        setState('progress ' + data.completedSoFar);
        return;
    }
    if (data.success) {
        console.log('Done Successfully');
        ev.close();
    }
};
This is my client-side code.
I don't know why the event listener is not receiving the data stream. Searching the internet, I only found that this issue occurs when compression middleware is used, but I don't use any compression middleware in my app. I am using Node.js v11.4.0. My guess is that when I make the EventSource request, Chrome adds gzip to the Accept-Encoding header by default and Node uses that to set the response's Content-Encoding header to gzip, which is causing this issue. I tried to delete and replace that header, but it didn't work.
Sorry for my grammar if I made any mistakes.
Thanks for the help. Cheers!

Next.js compresses your data by default to make it transmit faster. Unfortunately, this means the client can't see the data until the buffer is flushed (my guess is that render behavior has changed because compression has changed the content). You can disable compression entirely by including compress: false in your next.config.js.
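For reference, a minimal next.config.js with compression disabled (compress is a documented Next.js option):

// next.config.js -- turn off Next.js's built-in gzip compression
module.exports = {
    compress: false,
};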
I also found that including the following header sidesteps the compression for a specific endpoint:
res.setHeader("Cache-Control", "no-cache, no-transform");
Caution: This will increase bandwidth/resource usage! HTTP compression can reduce the size of your data by 70%.
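To illustrate the per-endpoint approach, here is a minimal sketch of an SSE handler carrying the no-transform directive; the compression middleware's default behaviour is to skip responses whose Cache-Control includes it (the route and payload below are illustrative, not from the question):

app.get('/getStatus/:id', (req, res) => {
    res.setHeader('Content-Type', 'text/event-stream');
    res.setHeader('Cache-Control', 'no-cache, no-transform'); // opts this response out of compression
    res.setHeader('Connection', 'keep-alive');
    res.flushHeaders();
    res.write(`data: ${JSON.stringify({ ready: true })}\n\n`); // reaches the client immediately
});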

After a lot of debugging and research, the problem turned out to be webpack-dev-server, which compressed my responses. For more info refer to https://github.com/facebook/create-react-app/issues/966
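For anyone hitting the same thing in a plain webpack setup, a minimal sketch of turning that compression off (devServer.compress is a documented webpack-dev-server option; with Create React App you would need to eject or rewire to reach it):

// webpack.config.js
module.exports = {
    // ...the rest of your config
    devServer: {
        compress: false, // stop webpack-dev-server from gzipping (and buffering) responses
    },
};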

Related

Multiple http requests while using SSE with NodeJs

I'm trying to implement an application, and one of the things I need to do is use Server-Sent Events to send data from the server to the client. The basis of SSE is to have one connection over which data is transferred without the connection being closed. The problem I'm having right now is that every time I make an HTTP request from the client using EventSource(), multiple requests are being made.
Client:
const eventSource = new EventSource('http://localhost:8000/update?nick=' + username + '&game=' + gameId)
eventSource.onmessage = function(event) {
    const data = JSON.parse(event.data)
    console.log(data)
}
Server (Node.Js):
case '/update':
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    })
    res.write('data: 1')
    res.write('\n\n')
    res.end('{}')
    break
This is what I see in the Chrome dev tools: when the client tries to connect using SSE, it makes multiple requests to the server, even though only one request was supposed to be made.
Do any of you know how to fix this? Thank you in advance.
The way to do this is to not call res.end(), since the connection has to be kept alive; EventSource automatically reconnects whenever the server closes the response, which is what produces the repeated requests. On top of that, I had to keep track of the responses to the HTTP requests made by the users, so I created a separate module with the following methods:
let responses = []

module.exports.remember = function(res){
    responses.push(res)
}

module.exports.forget = function(res){
    let pos = responses.findIndex((response) => response === res)
    if(pos > -1){
        responses.splice(pos, 1)
    }
}

module.exports.update = function(data){
    for(let response of responses){
        response.write(`data: ${data} \n\n`)
    }
}
This way one can access the response objects and use the function update() to send data to the connected clients.
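A hypothetical sketch of wiring this module into the /update route from the question (the filename connections.js and the surrounding switch-based routing are assumptions):

// at the top of the server file
const connections = require('./connections')

// inside the request handler's switch
case '/update':
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    })
    connections.remember(res)
    // no res.end() here: the stream must stay open
    req.on('close', () => connections.forget(res)) // drop clients that disconnect
    break

Elsewhere, any code that produces new state can call connections.update(JSON.stringify(state)) to broadcast it to every open stream.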

Why is my node.js server chopping off the response body?

My Node server behaves strangely with a GET endpoint that replies with a big JSON payload (30-35 MB).
I am not using any npm package. Just the core API.
The unexpected behaviour only happens when the server is queried from the Internet; it behaves fine when queried from the local network.
The problem is that the server stops writing to the response after the first 1260 bytes of the body. It does not close the connection or throw an error. Insomnia (the REST client I use for testing) just states that it received a 1260 B chunk. If I query the same endpoint from a local machine, it says it received more and bigger chunks (a few KB each).
I don't even think the problem is caused by Node, but since I am on a clean Raspberry Pi (I installed Raspbian and then just Node v13.0.1) and the only process I run is Node.js, I don't know how to find the source of the problem; there is no load balancer or web server to blame. The public IP also seems fine, and every other endpoint works (they all reply with less than 1260 B per request).
The code for that endpoint looks like this:
const text = url.parse(req.url, true).query.text;
if (text.length > 4) {
    let results = await models.fullTextSearch(text);
    results = results.map(async result => {
        result.Data = await models.FindData(result.ProductID, 30);
        return result;
    });
    results = await Promise.all(results);
    results = JSON.stringify(results);
    res.writeHead(200, {'Content-Type': 'application/json', 'Transfer-Encoding': 'chunked', 'Access-Control-Allow-Origin': '*', 'Cache-Control': 'max-age=600'});
    res.write(results);
    res.end();
    break;
}
res.writeHead(403, {'Content-Type': 'text/plain', 'Access-Control-Allow-Origin': '*'});
res.write("You made an invalid request!");
break;
Here are a number of things to do in order to debug this:

1. Add console.log(results.length) to make sure the length of the data is what you expect it to be.
2. Add a callback to res.end(function() { console.log('finished sending response') }) to see if the http library thinks it is done sending the response.
3. Check the return value of res.write(). If it is false (indicating that not all data has yet been sent), add a handler for the drain event and see if it gets called (see the sketch after this list).
4. Try increasing the send timeout with res.setTimeout() in case it's just taking too long to send all the data.
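A minimal sketch combining those checks, assuming the res and results variables from the handler above:

console.log('payload length:', results.length);

const flushed = res.write(results);
if (!flushed) {
    // the socket buffer is full; wait for it to drain
    res.once('drain', () => console.log('socket drained, data flushed'));
}

res.end(() => console.log('finished sending response'));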

how to stream a large xml response from express

I am trying to stream a large XML file from Express to a client, but nothing is sent until the file is done processing on the server and res.end() is called.
The XML file is built with xmlbuilder-js. It has a callback that receives document chunks, and I am trying to send each chunk using response.write(chunk).
res.writeHead(200, {
    'Content-Type': 'text/xml',
    'Transfer-Encoding': 'chunked'
})

xmlToReturn = xmlBuilder.begin({
    writer: {
        pretty: true,
    }
}, function(chunk) {
    res.write(chunk)
}).dec('1.0', 'UTF-8', true)
...
res.end()
The callback works as expected; it shows the data chunks coming through.
I have tried:

- changing the Content-Type on the response to, for example, 'application/octet-stream'
- using res.flush() after calling res.write(), or doing that periodically
- experimenting with other headers
In all cases, if I can get the response to send at all, the client never receives the start of it until res.end() is called. What do I need to do so that Express starts delivering the content as it flows through the callback?
I've explored similar questions and posts, which suggest my approach is correct but that I am doing something wrong, or that streaming is not working in Express, possibly due to other modules or middleware.
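One common culprit, consistent with the compression findings above, is compression middleware buffering the output. If the compression npm package is anywhere in the chain, its documented res.flush() forces each chunk out; a minimal sketch under that assumption (the route name is illustrative):

const express = require('express')
const compression = require('compression')

const app = express()
app.use(compression())

app.get('/stream.xml', (req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/xml' })
    res.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    res.flush() // added by the compression middleware; pushes the buffered chunk now
    // ...write further chunks as the builder produces them, flushing after each
    res.end()
})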

node.js: browser image caching with correct headers

I'm developing a web application that manages a large number of images, storing and resizing them.
A request for an image looks like:
domain:port/image_id/size
The server takes the image_id and, if there isn't yet an image of that size, creates it and stores it on the filesystem.
Everything works and the server is running, but I need the browser to cache those images for at least one day to reduce the server's bandwidth consumption.
I did several tests but nothing seems to work.
Here is the code I use to build the response headers:
response.writeHead(304, {
    "Pragma": "public",
    "Cache-Control": "max-age=86400",
    "Expires": new Date(Date.now() + 86400000).toUTCString(),
    "Content-Type": contentType
});
response.write(data);
response.end();
I also tried with response status 200.
contentType is always a MIME type like "image/jpg" or "image/png", and data is the byte buffer of the image.
Any advice?
Thanks a lot.
live long and prosper,
d.
I did a lot of tests and came up with a solution that seems to handle this caching problem pretty well.
Basically, I take the request and check its "if-modified-since" header.
If it is present and its value (a date) matches the modified date of the file, the response is a 304 status with no content.
If the value is missing or differs from the modified date of the file, I send the complete response with status 200 and the header parameters for further access by the browser.
Here is the complete code of the working test I did (by "working" I mean that the first request gets the file from the server, while subsequent requests get a 304 response with no content, so the browser loads it from its local cache):
var http = require("http");
var url = require("url");
var fs = require("fs");

function onRequest(request, response) {
    var pathName = url.parse(request.url).pathname;
    if (pathName != "/favicon.ico") {
        responseAction(pathName, request, response);
    } else {
        response.end();
    }
}

function responseAction(pathName, request, response) {
    console.log(pathName);
    // Get the image from the filesystem
    var img = fs.readFileSync("/var/www/radar.jpg");
    // Get some info about the file
    var stats = fs.statSync("/var/www/radar.jpg");
    var mtime = stats.mtime;
    var size = stats.size;
    // Get the if-modified-since header from the request
    var reqModDate = request.headers["if-modified-since"];
    // Check whether the if-modified-since header matches the mtime of the file
    if (reqModDate != null) {
        reqModDate = new Date(reqModDate);
        if (reqModDate.getTime() == mtime.getTime()) {
            // Yes: send a 304 without image data (the browser loads it from cache)
            console.log("load from cache");
            response.writeHead(304, {
                "Last-Modified": mtime.toUTCString()
            });
            response.end();
            return true;
        }
    }
    // No header, or the file has changed: send the headers and the image
    console.log("no cache");
    response.writeHead(200, {
        "Content-Type": "image/jpg",
        "Last-Modified": mtime.toUTCString(),
        "Content-Length": size
    });
    response.write(img);
    response.end();
    return true;
}

http.createServer(onRequest).listen(8889);
console.log("Server has started.");
Of course, I don't want to reinvent the wheel; this is a benchmark for a more complex server previously developed in PHP, and this script is a sort of porting of this PHP code:
http://us.php.net/manual/en/function.header.php#61903
I hope this will help!
Please, if you find any errors or anything that could be improved, let me know!
Thanks a lot,
Daniele

Pipe an MJPEG stream through a Node.js proxy

Using Motion on Linux, every webcam is served up as a stream on its own port.
I now want to serve up those streams, all on the same port, using Node.js.
Edit: this solution now works. I needed to get the boundary string from the original MJPEG stream (which was "BoundaryString" in my Motion config).
app.get('/motion', function(req, res) {
    var boundary = "BoundaryString";
    var options = {
        // host to forward to
        host: '192.168.1.2',
        // port to forward to
        port: 8302,
        // path to forward to
        path: '/',
        // request method
        method: 'GET',
        // headers to send
        headers: req.headers
    };
    var creq = http.request(options, function(cres) {
        res.setHeader('Content-Type', 'multipart/x-mixed-replace;boundary="' + boundary + '"');
        res.setHeader('Connection', 'close');
        res.setHeader('Pragma', 'no-cache');
        res.setHeader('Cache-Control', 'no-cache, private');
        res.setHeader('Expires', 0);
        res.setHeader('Max-Age', 0);
        // wait for data
        cres.on('data', function(chunk){
            res.write(chunk);
        });
        cres.on('close', function(){
            // upstream closed; end the client response as well
            // (headers were already sent with the first chunk, so no writeHead here)
            res.end();
        });
    }).on('error', function(e) {
        // we got an error; return a 500 to the client and log it
        console.log(e.message);
        res.writeHead(500);
        res.end();
    });
    creq.end();
});
I would have thought this serves up the MJPEG stream at 192.168.1.2:8302 as /motion, but it does not.
Maybe that's because it never ends, and this proxy example wasn't really a streaming example?
Streaming over HTTP isn't the issue; I do that with Node regularly. I think the problem is that you aren't sending a content type header to the client. In fact, you go right to writing data without sending any response headers.
Be sure to send the right content type header back to the client making the request, before sending any actual content data.
You may need to handle multipart responses, if Node's HTTP client doesn't already do it for you.
Also, I recommend debugging this with Wireshark so you can see exactly what is being sent and received. That will help you narrow down problems like this quickly.
I should also note that some clients have a problem with chunked encoding, which is what Node will send if you don't specify a content length (which you can't, because it's indefinite). If you need to disable chunked encoding, see my answer here: https://stackoverflow.com/a/11589937/362536. Basically, you just need to disable it: response.useChunkedEncodingByDefault = false;. Don't do this unless you need to, though! And make sure to send Connection: close in your headers with it!
What you need to do is request the MJPEG stream only once, over a single upstream connection, and respond to each client with MJPEG (or JPEG, if you need IE support).
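A hypothetical sketch of that single-upstream fan-out idea (the host, port, and boundary are placeholders, not values confirmed by the question):

const http = require('http');

const clients = [];
const boundary = 'BoundaryString';

// one upstream request to the camera, shared by every client
http.get({ host: '192.168.1.2', port: 8302, path: '/' }, function(camRes) {
    camRes.on('data', function(chunk) {
        // fan each MJPEG chunk out to all connected clients
        for (const res of clients) res.write(chunk);
    });
});

http.createServer(function(req, res) {
    res.writeHead(200, {
        'Content-Type': 'multipart/x-mixed-replace;boundary="' + boundary + '"',
        'Cache-Control': 'no-cache, private'
    });
    clients.push(res);
    // stop writing to clients that disconnect
    req.on('close', function() {
        const i = clients.indexOf(res);
        if (i > -1) clients.splice(i, 1);
    });
}).listen(8080);

Note that a client connecting mid-stream will start receiving data mid-frame; a fuller version would buffer until the next boundary marker before writing to a new client.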
