My Node server shows strange behaviour on a GET endpoint that replies with a big JSON body (30-35 MB).
I am not using any npm package. Just the core API.
The unexpected behaviour only happens when the server is queried from the Internet; it behaves fine when queried from the local network.
The problem is that the server stops writing to the response after it writes the first 1260 bytes of the content body. It does not close the connection nor throw an error. Insomnia (the REST client I use for testing) just states that it received a 1260B chunk. If I query the same endpoint from a local machine it says that it received more and bigger chunks (a few KB each).
I don't even think the problem is caused by Node, but since I am on a clean Raspberry Pi (I installed Raspbian and then just Node v13.0.1) and the only process I run is Node.js, I don't know how to find the source of the problem; there is no load balancer or web server to blame. The public IP also seems fine, and every other endpoint works correctly (they all reply with less than 1260 B per request).
The code for that endpoint looks like this:
const text = url.parse(req.url, true).query.text;
if (text.length > 4) {
    let results = await models.fullTextSearch(text);
    results = await results.map(async result => {
        result.Data = await models.FindData(result.ProductID, 30);
        return result;
    });
    results = await Promise.all(results);
    results = JSON.stringify(results);
    res.writeHead(200, {
        'Content-Type': 'application/json',
        'Transfer-Encoding': 'chunked',
        'Access-Control-Allow-Origin': '*',
        'Cache-Control': 'max-age=600'
    });
    res.write(results);
    res.end();
    break;
}
res.writeHead(403, {'Content-Type': 'text/plain', 'Access-Control-Allow-Origin': '*'});
res.write("You made an invalid request!");
break;
Here are a number of things to do in order to debug this:
Add console.log(results.length) to make sure the length of the data is what you expect it to be.
Add a callback to res.end(function() { console.log('finished sending response')}) to see if the http library thinks it is done sending the response.
Check the return value from res.write(). If it is false (indicating that not all data has yet been sent), add a handler for the drain event and see if it gets called (see the sketch after this list).
Try increasing the sending timeout with res.setTimeout() in case it's just taking too long to send all the data.
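Here is a minimal sketch combining the res.end() callback with the res.write()/drain check. It is not the original endpoint; sendLargeBody and the log messages are purely illustrative.

// Sketch: observe whether write() flushed immediately and whether 'drain' fires.
function sendLargeBody(res, body) {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  const flushed = res.write(body); // false means the data was buffered and the socket is backed up
  console.log('write flushed immediately:', flushed);
  const finish = () => res.end(() => console.log('finished sending response'));
  if (!flushed) {
    res.once('drain', () => {
      console.log('drain fired: it is safe to write again');
      finish();
    });
  } else {
    finish();
  }
}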
Related
The same code used to work before, but I don't know why it is not working now. Please help.
My problem is that when I use SSE for real-time data sharing, the data object that should be sent with res.write(`data:${JSON.stringify(dataObject)}\n\n`) is not being sent until res.end() is called, and then all the event-stream data arrives at once.
response.writeHead(200, {
Connection: "keep-alive",
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache",
"Access-Control-Allow-Origin": '*',
'Content-Encoding':'identity' // tried with and without this header
});
let syncher= setInterval(() => {
if(response.finished){ // If response ended the interval is cleared
clearInterval(syncher);
return;
}else{
let dataToSend = getEventData(user,event);
if(! dataToSend ){
response.write('data:{close:true}');
clearInterval(syncher);
return;
}
response.write(`data:${JSON.stringify(dataToSend)}\n\n`);
response.flushHeaders(); // Also tried with response.flush()
if(dataToSend.close){
delEventData(user,event);
response.end();
}
}
}, 500);
The above code is on the server side; it also has an on-close listener to close the connection.
const ev = new EventSource(conf.apiUrl+'/getStatus/'+ (userData.id || '') );
let data = '';
ev.onmessage = eventData=>{
data = JSON.parse(eventData.data);
if(!data){
setState('progress '+data.completedSoFar)
return;
}
if(!data.close){
}else{
if(data.success){
console.log('Done Successfully')
ev.close();
}
    }
}
This is my client-side code.
I don't know why the event listener is not getting the data stream. When I searched the Internet about this issue, I only found that it occurs when compression middleware is used, but I don't use any compression middleware in my app. I am using Node.js v11.4.0. My guess is that when I make the EventSource request, Chrome adds gzip to the Accept-Encoding header by default, Node uses that to set the response encoding to gzip, and that is what causes the issue. I tried to delete and replace that header, but it didn't work.
Here are the request and response headers for my EventSource request.
Sorry for my grammar if I made any mistakes.
Thanks for the help. Cheers!
Next.js is basically compressing your data to make it transmit faster. Unfortunately, this means the client can't see the data until the buffered response is flushed (my guess is that render behaviour changed because compression changed the content). You can disable compression entirely by including compress: false in your next.config.js, as in the sketch below.
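For reference, a minimal next.config.js sketch (assuming you control the config file):

// next.config.js
module.exports = {
  compress: false, // disable Next.js's built-in gzip compression for all responses
};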
I found here that including the following header sidesteps the compression for a specific endpoint:
res.setHeader("Cache-Control", "no-cache, no-transform");
Caution: This will increase bandwidth/resource usage! HTTP compression can reduce the size of your data by 70%.
After a lot of debugging and research, the problem turned out to be webpack-dev-server, which compressed my responses. For more info, refer to https://github.com/facebook/create-react-app/issues/966
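If you control the webpack config yourself (with create-react-app you would first need to eject or rewire it), a sketch of turning that compression off could look like this; the surrounding config is assumed, not taken from the original project.

// webpack.config.js (sketch) - webpack-dev-server gzips responses when
// devServer.compress is enabled, which buffers streamed/SSE responses in development.
module.exports = {
  // ...rest of the existing config...
  devServer: {
    compress: false,
  },
};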
I'm trying to implement an application, and one of the things I need to do is use Server-Sent Events to send data from the server to the client. The basis of SSE is to have one connection over which data is transferred without the connection being closed. The problem I'm having right now is that every time I make an HTTP request from the client using EventSource(), multiple requests are being made.
Client:
const eventSource = new EventSource('http://localhost:8000/update?nick='+username+'&game='+gameId)
eventSource.onmessage = function(event) {
const data = JSON.parse(event.data)
console.log(data)
}
Server (Node.Js):
case '/update':
res.writeHead(200,{
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive'
})
res.write('data: 1')
res.write('\n\n')
res.end('{}')
break
This is what I see in the Chrome dev tools: when the client tries to connect using SSE, it makes multiple requests to the server, even though only one request was supposed to be made.
Do any of you know how to fix this? Thank you in advance.
The way to do that is to not include the res.end(), since the connection has to be kept alive. On top of that, I had to keep track of the response objects from the HTTP requests made by the users, so I created a separate module with the following methods:
let responses = []

// Remember the response object of a newly connected client.
module.exports.remember = function(res){
    responses.push(res)
}

// Forget a response object once its client has disconnected.
module.exports.forget = function(res){
    let pos = responses.findIndex((response)=>response===res)
    if(pos>-1){
        responses.splice(pos, 1)
    }
}

// Write an SSE message to every remembered (connected) client.
module.exports.update = function(data){
    for(let response of responses){
        response.write(`data: ${data} \n\n`)
    }
}
This way one can access the response objects and use the function update() to send data to the connected clients.
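For illustration, here is a hypothetical usage of that module; the './subscribers' filename and the handleUpdate function are my own names, not part of the original code.

const subscribers = require('./subscribers'); // the module shown above

function handleUpdate(req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  subscribers.remember(res);
  // Stop writing to this client once it disconnects.
  req.on('close', () => subscribers.forget(res));
}

// Elsewhere, whenever there is new data for all connected clients:
// subscribers.update(JSON.stringify(latestGameState));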
I have a client app in React, and a server in Node (with Express).
On the server side, I have an endpoint like the following (it's not the real endpoint, just an idea of what I'm doing):
function endpoint(req, res) {
res.writeHead(200, {
'Content-Type': 'text/plain',
'Transfer-Encoding': 'chunked'
});
  for (let x = 0; x < 1000; x++) {
    res.write(some_string + '\n');
    wait(a_couple_of_seconds); // placeholder delay, just to make the process slower for testing purposes
}
res.end();
}
This works perfectly; I mean, when I call this endpoint, I receive the whole stream with all 1,000 rows.
The thing is that I cannot manage to get this data in chunks (for each write, or for a bunch of writes) in order to show it on the frontend as soon as I receive it (think of a table that shows the rows as soon as I get them from the endpoint call).
In the frontend I'm using Axios to call the API with the following code:
async function getDataFromStream(_data): Promise<any> {
const { data, headers } = await Axios({
url: `http://the.api.url/endpoint`,
method: 'GET',
responseType: 'stream',
timeout: 0,
});
// this next line doesn't work. it says that 'on' is not a function
data.on('data', chunk => console.log('chunk', chunk));
// data has actually the whole response data (all the rows)
return Promise.resolve();
}
The thing is that the Axios call returns the whole data object only after res.end() is called on the server, but I need to get the data as soon as the server starts sending chunks (on each res.write, or whenever the server decides to send a bunch of chunks).
I have also tried not using await and reading the value of the promise in the then() of the Axios call, but the behaviour is the same: the data value arrives with all the writes together once the server calls res.end().
So, what am I doing wrong here? Maybe this is not possible with Axios or Node and I should use something like WebSockets to solve it.
Any help would be very much appreciated; I have read a lot but couldn't find a working solution yet.
For anyone interested in this, what I ended up doing is the following:
On the client side, I used the Axios onDownloadProgress handler, which allows handling progress events for downloads.
So, I implemented something like this:
function getDataFromStream(_data): Promise<any> {
return Axios({
url: `http://the.api.url/endpoint`,
method: 'GET',
onDownloadProgress: progressEvent => {
const dataChunk = progressEvent.currentTarget.response;
// dataChunk contains the data that have been obtained so far (the whole data so far)..
// So here we do whatever we want with this partial data..
// In my case I'm storing that on a redux store that is used to
// render a table, so now, table rows are rendered as soon as
// they are obtained from the endpoint.
}
}).then(({ data }) => Promise.resolve(data));
}
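One thing to keep in mind: progressEvent.currentTarget.response holds everything received so far, so the handler sees old rows again on every event. Here is a small sketch of extracting only the newly completed rows; it assumes each row ends with '\n' as in the server example, and extractNewRows is an illustrative helper, not part of the original code.

let processedChars = 0;

function extractNewRows(cumulativeResponse) {
  const fresh = cumulativeResponse.slice(processedChars);
  const lastNewline = fresh.lastIndexOf('\n');
  if (lastNewline === -1) return []; // no complete new row yet
  processedChars += lastNewline + 1;
  return fresh.slice(0, lastNewline).split('\n');
}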
I am trying to stream a large XML file from Express to a client, but I have not yet managed to send anything until the file is done processing on the server and res.end() is called.
The XML file is built with xmlbuilder-js. It has a callback that receives document chunks, which I am trying to send using response.write(chunk).
res.writeHead(200, {
'Content-Type': 'text/xml',
'Transfer-Encoding': 'chunked'
})
xmlToReturn = xmlBuilder.begin({
writer: {
pretty: true,
}
}, function(chunk) {
res.write(chunk)
}).dec('1.0', 'UTF-8', true)
...
res.end()
The callback works as expected, it shows the data chunks coming through.
I have tried:
changing the Content-Type on the response to, for example, 'application/octet-stream'
using res.flush() after calling res.write(), or doing that periodically
experimenting with other headers
In all cases, even when I can get the response to send, the client never receives the start of it until res.end() is called. What do I need to do so that Express starts delivering the content as it flows through the callback?
I've explored questions and posts like this, which suggest my approach is correct but that I am doing something wrong, or that streaming is not working in Express, possibly due to other modules or middleware.
I am having an issue getting the result set back to the client using Node.js. I am new to it and using it for a project but I am stuck and not sure why. Here's the situation: I have a webpage, a server, a request handler and a database interface. I am able to send data back and forth the client and server without any issue. The only time it doesn't work is when I try to send the result from my query back to the client.
function doSomething(response)
{
var data = {
'name': 'doSomething'
};
response.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
response.end(JSON.stringify(data));
}
This works fine as I can read the name from the object on the client side, but
function fetchAllIDs(response)
{
dbInterface.fetchAllIDs(function(data) {
// console.log(data) prints the correct information here
response.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
response.end(data);
// console.log(data) from the client side is just blank
});
}
I believe the issue is the way I handle my callback and response, because without using MySQL the rest of my code works fine. Thanks!
EDIT: I removed a piece of code that seems to confuse people. It was just to show that if I have the response code outside the callback, then I am able to get data back to the client. In my actual code, I do not have the two response statements together. I just can't get the rows from the fetchAllIDs function back to the client.
The way you have it written means the
response.end('This string is received by the client');
line is always called before the callback function, meaning the response has already ended.
Under JS, all the current code finishes running before the next event (your callback function) is taken off the queue. Comment out the above line (prefix it with //) and test it.
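In other words, everything that touches the response has to happen inside the database callback. A minimal sketch of that idea follows; the JSON.stringify call is an assumption about the shape of the rows, so adapt it to whatever dbInterface actually returns.

function fetchAllIDs(response) {
  dbInterface.fetchAllIDs(function(data) {
    // Only respond once the query result is actually available.
    response.writeHead(200, {'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*'});
    // Serialize the rows if they are not already a string (assumption about the data shape).
    response.end(typeof data === 'string' ? data : JSON.stringify(data));
  });
}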