NodeJS Server crash when request/response.aborted - node.js

When aborting an XMLHttpRequest sent to a Node.js/Express server, the server crashes if the request has not yet finished processing, or if the response can't be sent because of the aborted request.
I use a connected flag to make sure the response is only sent while the connection is up.
I tried to catch these exceptions, but they don't handle the request-aborted event:
var connected = true;

function onClose() {
    connected = false;
    // code to handle connection abort
}

function onError(err) {
    console.log("response couldn't be sent.");
    connected = false;
}

req.connection.on('close', onClose);
res.on('error', onError);

if (connected)
    res.send(...);

req.connection.removeListener('close', onClose);
res.removeListener('error', onError);
Are there any events I can listen to in order to take care of the "Error: Request aborted" exception, which causes the server to crash?

According to the W3C spec, XMLHttpRequest emits an "abort" event:
http://www.w3.org/TR/XMLHttpRequest/#event-handlers
So basically, you can listen for that event on the client to handle the error, I guess.

Actually, request.on('aborted', fn) should work fine for detecting an aborted HTTP request in node.js: on the server side, the incoming request emits 'aborted' (not 'abort') when the client aborts.

Cannot set headers after they are sent to the client when adding data to firebase

I've been getting this error:
Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
    at ServerResponse.setHeader (_http_outgoing.js:533:11)
after I added a Firebase update call inside a socket.io handler:
io.on('connection', socket => {
    ...
    socket.once('myEvent', async (dataToSend) => {
        try {
            await db
                .collection('myCollection')
                .doc('myDoc')
                .update({ keyToUpdate: dataToSend })
        } catch (err) {
            console.log(err)
        }
    })
})
I thought the problem was with the socket, so I tried the same logic in a POST route outside the socket and got the same error; adding a return didn't help either.
I suspect it could be because this function returns a promise and I'm not handling the asynchronous behaviour well, or because the server is acting as a stream thanks to socket.io and this Firebase function doesn't like that.
I have other Firebase calls in the code, but they are read requests (.onSnapshot()); the error only happens with the data-writing Firebase functions (.set(), .add(), .update()).
Would appreciate some help here, thanks.
Fixed. I didn't know exactly how onSnapshot works: every time an update was made it triggered its callback, which sent data again and triggered the error.
Changing to .get().then() fixed it.

Terminate EventSource event listener?

I'm trying to work around a problem to do with REST streaming between the Nest API and a service (ST) that does not support streaming.
To get around this, I have built a service on Sails which takes a post request from ST containing the Nest Token, and then triggers an EventSource event listener that sends the data back to ST.
It is heavily based off the Nest rest-streaming example here:
https://github.com/nestlabs/rest-streaming and my code is as follows:
startStream: function(req, res) {
    var nestToken = req.body.nestToken,
        stToken = req.body.stToken,
        endpointURL = req.body.endpointURL,
        source = new EventSource(sails.config.nest.nest_api_url + '?auth=' + nestToken);

    source.addEventListener('put', function(e) {
        var d = JSON.parse(e.data);
        var data = { devices: d.data.devices, structures: d.data.structures },
            config = { headers: { 'Authorization': 'Bearer ' + stToken } };
        sendData(endpointURL, data, config);
    });

    source.addEventListener('open', function(e) {
        console.log("Connection opened");
    });

    source.addEventListener('auth_revoked', function(e) {
        console.log("Auth token revoked");
    });

    source.addEventListener('error', function(e) {
        if (e.readyState == EventSource.CLOSED) {
            console.error('Connection was closed! ', e);
        } else {
            console.error('An unknown error occurred: ', e);
        }
    }, false);
}
The problem I foresee, though, is that once a request is received by the node server it starts the event listener, and I cannot for the life of me figure out how to kill it.
If I can't figure out a way to stop it, every EventSource will run indefinitely, which is obviously not suitable.
Has anyone got any suggestions on how to overcome this?
Each SSE client connection is a dedicated socket.
If a particular client doesn't want event streaming, don't make the connection. If they start event streaming but want to turn it off, call source.close(); source = null;
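For the route in the question, one way to make source.close() reachable from a later request is to keep a registry of streams keyed by token. This is a sketch under that assumption; registerStream and stopStream are hypothetical names, not part of Sails or the Nest sample:

```javascript
// Hypothetical registry: keep one EventSource handle per token so a
// later "stop" request can shut the stream down with source.close().
const streams = new Map();

function registerStream(token, source) {
  stopStream(token); // replace any existing stream for this token
  streams.set(token, source);
}

function stopStream(token) {
  const source = streams.get(token);
  if (!source) return false;
  source.close(); // EventSource#close() stops the listener for good
  streams.delete(token);
  return true;
}
```

In the startStream action you would call registerStream(nestToken, source) after creating the EventSource, and expose a second action that calls stopStream(token).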
If you want to stop sending messages from the server side, close the socket.
You didn't show the server-side code, but if it runs a dedicated process per SSE client, you just exit the process. If you maintain a list of sockets, one per connected client, close the relevant socket. On node.js you might be running a function on setInterval; to close the connection, call clearInterval() and response.end().
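The setInterval pattern just described can be sketched like this (startStream and stop are illustrative names; res stands for a Node http.ServerResponse):

```javascript
// startStream() begins writing SSE frames on a timer and returns a
// function that stops them by calling clearInterval() and res.end().
function startStream(res, intervalMs) {
  const timer = setInterval(() => {
    res.write('data: ping\n\n'); // one SSE frame
  }, intervalMs);
  return function stop() {
    clearInterval(timer); // stop producing frames
    res.end();            // close this client's connection
  };
}

// With Node's http module you would use it like:
//   const stop = startStream(res, 1000);
//   req.on('close', stop); // client went away
//   // ...or call stop() yourself to end the stream server-side.
```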

Timeout in node.js request

I wonder how the node.js request module works with regard to the timeout parameter.
What happens after the timeout period has passed? I.e.:
var request = require('request');
var options = {
    url: Theurl,
    timeout: 300000
};
request(options, function(error, resp, body) {...
What happens after 300000 ms? Does request try to request the URL again or not?
I also found that the Linux kernel has a default 20-second TCP socket connection timeout. (http://www.sekuda.com/overriding_the_default_linux_kernel_20_second_tcp_socket_connect_timeout)
Does that mean the timeout option in request will be at most 20 seconds (if I don't change the kernel timeout), regardless of what I set in options?
I use Ubuntu.
From the readme of the request package:
Note that if the underlying TCP connection cannot be established,
the OS-wide TCP connection timeout will overrule the timeout option
So in your case, the request will be aborted after 20 sec. The request won't try to request the url again (even if the timeout is set to a lower value than 20000). You would have to write your own logic for this or use another package, such as requestretry.
Example:
var options = {
    url: 'http://www.gooooerererere.com/',
    timeout: 5000
}

var maxRequests = 5;

function requestWithTimeout(attempt) {
    request(options, function(error, response, body) {
        if (error) {
            console.log(error);
            if (attempt == maxRequests)
                return;
            else
                requestWithTimeout(attempt + 1);
        } else {
            //do something with result
        }
    });
}

requestWithTimeout(1);
You can also check for a specific error code, such as ETIMEDOUT, with
if (error.code == [ERROR_MESSAGE])
request returns an error with the code set as stated in the request readme (timeout section).
Take a look at the TIME_WAIT details.
But yes, the kernel will cut the connection off based on its configuration. As stated in your link, you can change this by changing tcp_syn_retries.
If a timeout happens, your callback function will be executed with error set to 'Error: ETIMEDOUT'.
This little project https://github.com/FGRibreau/node-request-retry provides a ready-to-use, configurable wrapper for making retries triggered by many connection error codes, timeout included.

How to close a http.ServerResponse prematurely?

Say that an error occurs when I'm in the middle of sending a chunked response from my http server that I'm writing in Node.js. There's no way to send an error message to the client at this point, and I figure that this answer is correct on what to do in this situation:
All you can do is close the connection. Either the client does not receive all of the headers, or it does not receive the terminating 0-length chunk at the end of the response. Either way is enough for the client to know that the server encountered an error during sending.
So the question is, how do I do this on my http.ServerResponse object? I can't call end, because then the client will think everything went well, and there is no close method. There is a 'close' event, but I get the feeling that's something I'm supposed to listen for in this context, not emit myself, right?
I do it in the following manner:
function respDestroy()
{
    this._socket.destroy();
}

function respReply(message, close)
{
    if (!close)
        this.end(message);
    else
        this.end(message, function(){ this.destroy(); });
}

server.on('request',
    function(req, resp)
    {
        resp._socket = resp.socket; // `socket` field is nulled after each `end()`
        resp.destroy = respDestroy;
        resp.reply = respReply;
        ...
    });
You can modify respReply to accept status code and status message as well.
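The same idea can be wrapped in a single helper (the name abortResponse is mine): destroy the response's underlying socket directly, so the client never receives the terminating zero-length chunk and can tell the response was truncated. The guard also tolerates the nulled-socket case the comment above mentions:

```javascript
// Tear down the socket mid-stream to signal an error to the client.
// `res.socket` is the underlying net.Socket of a ServerResponse.
function abortResponse(res) {
  if (res.socket && !res.socket.destroyed) {
    res.socket.destroy();
  }
}
```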

twitter-node detecting connection end

I'm using the twitter-node library for node.js and it works well, but I'm having some minor difficulty handling disconnects.
When twitter disconnects me (I'm connecting a second time from the same server to force a disconnect, so I can make sure I'm handling these sorts of issues), it doesn't produce an error or an end event. I thought the following would handle it:
var twitter = new TwitterNode({
    user: opts.account,
    password: opts.password,
    track: opts.hashtags,
    follow: opts.follow
});

// omitted handlers for receiving tweets/deletes/limit info, but it's there

twitter.addListener('error', function(error) {
    console.log('error occurred: ' + error.message);
}).addListener('end', function(resp) {
    sys.puts("wave goodbye... " + resp.statusCode);
}).stream();
However, I don't get either the message from 'end' or 'error' when I'm disconnected. Anyone familiar with this issue?
For anyone having this same issue:
There's no notification from twitter-node because it doesn't handle the https library's close event. Going into the source and adding:
    response.on('close', function() { twit.emit('close', this); });
makes the library emit a close event when the connection is closed by the remote server (twitter), and you can handle it with a listener in your code like this:
twitterStreamReader = new TwitterNode({...});
twitterStreamReader.addListener('close', function(resp) {
    sys.puts('The server connection has been closed. You may want to do something about that.');
});