How to close an http.ServerResponse prematurely? - node.js

Say that an error occurs when I'm in the middle of sending a chunked response from my http server that I'm writing in Node.js. There's no way to send an error message to the client at this point, and I figure this answer is correct about what to do in this situation:
All you can do is close the connection. Either the client does not receive all of the headers, or it does not receive the terminating 0-length chunk at the end of the response. Either way is enough for the client to know that the server encountered an error during sending.
So the question is, how do I do this on my http.ServerResponse object? I can't call end, because then the client will think everything went well, and there is no close method. There is a 'close' event, but I get the feeling that's something I'm supposed to listen for in this context, not emit myself, right?

I do it in the following manner:
function respDestroy()
{
    // use the saved socket reference; `resp.socket` is nulled after end()
    this._socket.destroy();
}

function respReply(message, close)
{
    if (!close)
        this.end(message);
    else
    {
        var self = this; // capture the response for the end() callback
        this.end(message, function() { self.destroy(); });
    }
}

server.on('request', function(req, resp)
{
    resp._socket = resp.socket; // `socket` field is nulled after each `end()`
    resp.destroy = respDestroy;
    resp.reply = respReply;
    ...
});
You can modify respReply to accept a status code and status message as well.
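For example, a minimal sketch of that variation (the function name and the example status values are assumptions, not part of the original answer):

function respReplyWithStatus(statusCode, statusMessage, message, close)
{
    // writeHead must be called before any of the body is written,
    // e.g. respReplyWithStatus.call(resp, 500, 'Internal Server Error', ...)
    this.writeHead(statusCode, statusMessage);
    if (!close)
        this.end(message);
    else
    {
        var self = this;
        this.end(message, function() { self.destroy(); });
    }
}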

Related

server response to webform: how to answer duplicates?

I'm running a small server that needs to receive webforms. The server checks the request and sends back "success" or "fail" which is then displayed on the form (client screen).
Now, checking the form may take a few seconds, so the user may be tempted to send the form again.
What is the correct way to ignore the second request?
So far I have come up with these solutions if the form is a duplicate of the previous one:
1. Don't check it and send some server error back (like 429, or 102, or some other one).
2. Directly close the connection: req.destroy(); res.destroy();
3. Ignore the request and exit from the requestListener function.
With solutions 1 and 2 the form (on the client's browser) displays an error message (even if the first request they sent was correct, and so were the duplicates). So they're not good options.
Solution 3 gives the desired outcome... but I'm not sure if it is the right way to go about it... basically leaving req and res untouched instead of destroying them. Could this cause issues, or slow down the server? (like... do they stack up?). Of course the first request, once it has been checked, will be answered with the outcome code. My concern is with the duplicate requests, which I neither destroy nor answer...
Some details on the setup: a Node.js application using the default code from the http module.
const http = require("http");

const requestListener = function (req, res) {
    var requestBody = '';
    req.on('data', (data) => {
        requestBody += data;
    });
    req.on('end', () => {
        if (isduplicate(requestBody))
            return;
        else
            evalRequest(requestBody, res);
    });
};
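For what it's worth, here is a minimal sketch of a fourth option: park duplicate responses and answer every copy with the first request's outcome, so no client is left hanging. The pending map and the callback-style evalRequest signature are assumptions, not part of the question:

const http = require("http");

const pending = new Map(); // requestBody -> responses waiting for the outcome

const requestListener = function (req, res) {
    var requestBody = '';
    req.on('data', (data) => {
        requestBody += data;
    });
    req.on('end', () => {
        if (pending.has(requestBody)) {
            pending.get(requestBody).push(res); // duplicate: queue it, answer later
            return;
        }
        pending.set(requestBody, [res]);
        // assumes evalRequest is reworked to report its outcome via a callback
        evalRequest(requestBody, (statusCode, body) => {
            // send the same outcome to the original and to every duplicate
            for (const waiting of pending.get(requestBody)) {
                waiting.writeHead(statusCode);
                waiting.end(body);
            }
            pending.delete(requestBody);
        });
    });
};

Note that requests you simply never answer (solution 3) do stack up for a while: each one holds its socket open until the client or the server times the connection out.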

what's the difference between request.on('error') and response.on('error')

When making an http.request there are 2 events that produce errors: request.on('error') and response.on('error').
I can't see a difference, because both errors come from the web server.
What's the difference between thisError and thatError?
var http = require('http');

var request = http.request({ hostname: "example.com" }, function (response) {
    response.on('error', function (thisError) {
        // what's the difference between thisError <<<<<<
    });
});
request.on('error', function (thatError) {
    // and thatError <<<<<
});
request.end(); // the request is not actually sent until end() is called
During a request you resolve a name, establish a connection, and send a bunch of data, and each of those tasks could result in an error.
When you receive data through a response object, the other end could, for example, close the connection unexpectedly.
Those errors are different, and each belongs to the object it concerns: the request and the response, respectively.
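A sketch that makes the split concrete; the hostname and the error codes in the comments are illustrative assumptions:

var http = require('http');

var request = http.request({ hostname: 'no-such-host.invalid' }, function (response) {
    response.on('error', function (err) {
        // fires when the response stream itself fails, e.g. the server
        // resets the connection in the middle of the body
        console.error('response error:', err.code);
    });
    response.resume(); // drain the body so the socket can be reused
});

request.on('error', function (err) {
    // fires for failures before or while sending the request, e.g. a DNS
    // lookup failure (ENOTFOUND) or a refused connection (ECONNREFUSED)
    console.error('request error:', err.code);
});

request.end();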

NodeJS Server crash when request/response.aborted

When aborting an XMLHttpRequest sent to a Node.js/Express server, the server crashes if the request has not yet finished processing or the response cannot be sent, due to the aborted request.
I use a connected flag to make sure the response is only sent while the connection is up.
I tried to catch these exceptions, but they don't handle the request-aborted event:
var connected = true;

req.connection.on('close', function () {
    connected = false;
    // code to handle connection abort
});

res.on('error', function (err) {
    console.log("response couldn't be sent.");
    connected = false;
});

if (connected)
    res.send(...);

req.connection.removeListener('close', removeCallback);
res.removeListener('error', removeCallback);
Are there any events I can look at to take care of the "Error: Request aborted" exception, which causes the server to crash?
According to the W3C spec, XMLHttpRequest emits an "abort" event.
http://www.w3.org/TR/XMLHttpRequest/#event-handlers
So basically, you can listen to that event to handle the error, I guess.
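On the client side that looks something like this (a sketch; the URL is made up):

var xhr = new XMLHttpRequest();
xhr.addEventListener('abort', function () {
    // runs when xhr.abort() is called before the response completes
    console.log('request was aborted');
});
xhr.open('GET', '/slow-endpoint');
xhr.send();
// later, e.g. when the user navigates away:
xhr.abort();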
Actually, on the server side the incoming request (http.IncomingMessage) emits 'aborted', so request.on('aborted', fn) should work fine for detecting an aborted HTTP request in node.js.
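A sketch of how that might look in the Express handler from the question; the route, the flag name, and the slow-work placeholder are assumptions:

var express = require('express');
var app = express();

app.get('/slow', function (req, res) {
    var aborted = false;

    req.on('aborted', function () {
        aborted = true; // the client gave up while we were still working
    });

    slowWork(function (result) { // slowWork is a stand-in for the real processing
        if (aborted)
            return; // the connection is gone; don't try to write to it
        res.send(result);
    });
});

app.listen(8080);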

Node.js - How can I wait for something to be POSTed before I reply to a GET

I have 2 clients and one node.js server URL - localhost:8888/ServerRequest. The first client GETs from this URL and waits up to 20 seconds to see whether the second client POSTs some data for it within that timeout period. If the second client did POST before the timeout, that value is returned to the GET request; otherwise a default value is returned. I am not sure what the best way to implement this is. I am trying something like this, but it is not working as desired -
function ServerRequest(response, postData, request)
{
    // NOTE: `id` is a fresh local on every call, so the POST branch
    // never sees the timer started by the GET branch
    var id;

    if (request.method == "GET")
    {
        id = setTimeout(function ()
        {
            // handle timeout here
            console.log("Got a timeout, sending default value");
            cmd = "DefaultVal";
            response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>");
            response.end();
        }, 20000);
    }
    else if (request.method == "POST")
    {
        console.log("Received POST, sending POSTed value");
        cmd = postData;
        // Cancel timeout
        clearTimeout(id);
        console.log("\n Received POST");
        response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>");
        response.end();
    }
}
Another approach I had in mind was to use 2 separate URLs - one for the GET request (/ServerRequest) and the other for the POST request (/PostData). But then how will I pass the POSTed data from one handler to the other if it is received before the timeout?
EDIT: I think I now know exactly what I need: a long poll. The client sends a GET request and the server holds it open for up to 20 seconds (the data might not be immediately available to consume, so it waits for some other client to POST something for the first client to consume). If the timeout occurs first, a default value is returned in response to the GET. I'm working from the longpoll implementation I found here; I'll update if I succeed. If someone can point me to or provide a better example, it would be helpful.
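A bare-bones sketch of that two-URL long poll; the port, URLs, and XML payload follow the question, but the shared `waiting` variable is an assumption and it handles only a single waiting client:

var http = require('http');

var waiting = null; // the parked GET response, if any
var timer = null;

function replyWith(response, cmd) {
    response.writeHead(200, { 'Content-Type': 'text/xml' });
    response.end('<?xml version="1.0" encoding="UTF-8"?><list id="20101001"><com type="' + cmd + '"></com></list>');
}

http.createServer(function (request, response) {
    if (request.method === 'GET' && request.url === '/ServerRequest') {
        waiting = response; // park the response until a POST arrives or we time out
        timer = setTimeout(function () {
            replyWith(waiting, 'DefaultVal');
            waiting = null;
        }, 20000);
    } else if (request.method === 'POST' && request.url === '/PostData') {
        var postData = '';
        request.on('data', function (chunk) { postData += chunk; });
        request.on('end', function () {
            if (waiting) {
                clearTimeout(timer); // cancel the default reply
                replyWith(waiting, postData);
                waiting = null;
            }
            response.end('OK'); // acknowledge the POSTer separately
        });
    }
}).listen(8888);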
Edit: removed my original code after a more careful reading of the question.
The best solution would probably be websockets; otherwise the browser will appear to hang while it waits for 20 seconds.
Using a library like socket.io you can do this:
var io = require('socket.io').listen(8888);

function postHandler(req, data, res) {
    io.sockets.emit("response", data);
}
then on the client side:
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect('http://localhost:8888');
    socket.on('response', function (data) {
        console.log(data);
    });
</script>

Sending result set back to client in node.js

I am having an issue getting the result set back to the client using Node.js. I am new to it and using it for a project, but I am stuck and not sure why. Here's the situation: I have a webpage, a server, a request handler and a database interface. I am able to send data back and forth between the client and server without any issue. The only time it doesn't work is when I try to send the result from my query back to the client.
function doSomething(response)
{
    var data = {
        'name': 'doSomething'
    };
    response.writeHead(200, { 'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*' });
    response.end(JSON.stringify(data));
}
This works fine as I can read the name from the object on the client side, but
function fetchAllIDs(response)
{
    dbInterface.fetchAllIDs(function (data) {
        // console.log(data) prints the correct information here
        response.writeHead(200, { 'Content-Type': 'text/html', 'Access-Control-Allow-Origin': '*' });
        response.end(data);
        // console.log(data) from the client side is just blank
    });
}
I believe the issue is the way I handle my callback and response, because without the mysql part the rest of my code works fine. Thanks!
EDIT: I removed a piece of code that seemed to confuse people. It was just to show that if I have the response code outside the callback then I am able to get any data back to the client. In my actual code, I do not have the two response statements together. I just can't get the rows from the fetchAllIDs function back to the client.
The way you have it written means the
response.end('This string is received by the client');
line is always called before the callback function, meaning the response has already ended.
Under JS, all the current code will finish before the next event (your callback function) is taken off the queue. Comment out the above line (//) and test it.
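In other words, keep every response call inside the database callback, and serialize the rows before sending them, since response.end() expects a string or Buffer. A sketch of the corrected handler; the application/json content type is my substitution for text/html:

function fetchAllIDs(response)
{
    dbInterface.fetchAllIDs(function (data) {
        response.writeHead(200, {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': '*'
        });
        // end() expects a string or Buffer; an array of rows must be
        // serialized first, otherwise the body arrives empty or invalid
        response.end(JSON.stringify(data));
    });
}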
