NodeJS - "socket hang up" after a large number of requests are closed

I have been stuck on this "socket hang up" error for a couple days now, and I was hoping someone could help me.
I currently have two Node programs set up:

1. An HTTP server in Node that responds with the same data for every request.
2. An HTTP server which responds with data from HTTP server 1 for every request.

My code for HTTP server 2 is below.
var http = require('http');
var i = 0;
var server = http.createServer(handleRequest).listen(801);

// `do` is a reserved word in JavaScript, so the handler needs another name.
function handleRequest(req, res) {
    var getReq = http.get('http://127.0.0.1', function (getRes) {
        setTimeout(function () { getRes.pipe(res); }, 1000);
    });
    req.on('close', function () {
        // setTimeout(function () { getReq.abort(); }, 20);
        console.log(++i + ': req closed');
        getReq.abort();
    });
}
The problem occurs when I send a request to HTTP server 2 and close it before a response is sent to my browser (I've set a timeout to give me time to abort). If I continually hold down the refresh button, I receive the "socket hang up" error after a certain number of requests, and I don't really understand how to fix it. If I set a timer before executing getReq.abort(), the problem happens less often, and if I set the timer to a large value (above 100 ms), there is no issue at all.
I can consistently replicate the error by executing getReq.abort() right after creating the request, so I believe that it has something to do with aborting between the time the socket is assigned to the request and before the response is sent.
What is wrong with my code, and how do I prevent this from happening?
Thanks

Related

server response to webform: how to answer duplicates?

I'm running a small server that needs to receive webforms. The server checks the request and sends back "success" or "fail" which is then displayed on the form (client screen).
Now, checking the form may take a few seconds, so the user may be tempted to send the form again.
What is the correct way to ignore the second request?
So far I have come up with these solutions for when the form is a duplicate of the previous one:

1. Don't check it and send some error back (like 429, or 102, or some other status code).
2. Close the connection directly: req.destroy(); res.destroy();
3. Ignore the request and return from the requestListener function.
With solutions 1 and 2, the form (in the client's browser) displays an error message, even though the first request they sent was correct and the duplicates matched it. So those are not good options.
Solution 3 gives the desired outcome... but I'm not sure it is the right way to go about it: basically leaving req and res untouched instead of destroying them. Could this cause issues, or slow down the server? (Do they stack up?) Of course the first request, once it has been checked, will be answered with the outcome code. My concern is with the duplicate requests, which I neither destroy nor answer.
Some details on the setup: Nodejs application using the very default code by the http module.
const http = require("http");

const requestListener = function (req, res) {
    var requestBody = '';
    req.on('data', (data) => {
        requestBody += data;
    });
    req.on('end', () => {
        if (isduplicate(requestBody))
            return;
        else
            evalRequest(requestBody, res);
    });
};
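One alternative worth sketching: instead of leaving duplicates unanswered, remember the first outcome and repeat it, so the user's repeated submission still shows "success" without re-running the check. This is a minimal sketch, not the asker's code; the `evalRequest` here is a stand-in that returns the outcome rather than writing to `res`, and the deduplication key (the raw body) is an assumption.

```javascript
// Stand-in for the question's evalRequest: returns "success"/"fail"
// instead of writing to the response itself.
function evalRequest(body) {
  return 'success';
}

const seen = new Map(); // requestBody -> outcome of the first evaluation

const requestListener = function (req, res) {
  let requestBody = '';
  req.on('data', (data) => { requestBody += data; });
  req.on('end', () => {
    if (seen.has(requestBody)) {
      // Duplicate: answer it with the first outcome instead of
      // leaving the connection hanging until the browser times out.
      res.end(seen.get(requestBody));
      return;
    }
    const result = evalRequest(requestBody);
    seen.set(requestBody, result);
    res.end(result);
  });
};
```

This avoids the open question about unanswered connections stacking up: every request gets a response, and duplicates cost one Map lookup.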

Node HTTP Server does not respond to first request within Electron

I'm trying to start a local server in Electron to capture Google's OAuth callback like so:
this.server = http.createServer((request, response) => {
const { query } = url.parse(request.url, true);
if (query.code) {
this.onCodeReceived(query.code);
// do something nicer here eventually
response.end();
} else {
response.end('Error');
this.authStatus = 'error';
}
}).listen(this.LOCAL_SERVER_PORT);
The issue I'm having is that when I finish authenticating with Google the window just sits at "Waiting for 127.0.0.1..." and never actually finishes. I've found using console.log in the request handler that the handler is never actually called, so I'm stumped as to why the request isn't going through.
I've verified that the callback URL has the same port the server is listening to, and that the server actually begins listening. Weirdly if I open a new tab and go to the URI I get the expected Error response.
For reference the callback URI is set as http://127.0.0.1:18363 and this.LOCAL_SERVER_PORT = 18363.
If anyone has any ideas I'd greatly appreciate it. Thank you!

Server sending multiple responses after AJAX request - using Socket.IO and ExpressJS

I'm trying to build a real-time program where users can set a marker on a Google Map and others who are connected can get that same marker. Everything seems to work fine except that after a few minutes, the server side is submitting the data a second time.
To clarify: a client sets a marker on the map, and the marker is sent to the server (Node.js with Express) in JSON format. The server relays the data to all connected clients. Minutes later, the server sends the same data it received once more, causing an "ERR_EMPTY_RESPONSE" on the client side at the last line of the "Client code" example.
Client code:
var data = new Array();
data.push({lat: Gmap.markers[0].lat, lng: Gmap.markers[0].lng});
var xhttp = new XMLHttpRequest();
xhttp.open("POST", "/marker", true);
xhttp.setRequestHeader('Content-type', 'application/json; charset=UTF-8');
xhttp.send(JSON.stringify(data));
Server-side:
var app = express();
app.post('/marker', function(req,res){
io.emit('marker', req.body);
})
Does anyone have any idea what's going on?
You need to send a response to the http request. If you don't, the browser will time it out and may attempt to retry.
var app = express();
app.post('/marker', function(req,res){
io.emit('marker', req.body);
res.send("ok"); // <== Send a response to the http request here
})

How to close a http.ServerResponse prematurely?

Say that an error occurs when I'm in the middle of sending a chunked response from my http server that I'm writing in Node.js. There's no way to send an error message to the client at this point, and I figure that this answer is correct on what to do in this situation:
All you can do is close the connection. Either the client does not receive all of the headers, or it does not receive the terminating 0-length chunk at the end of the response. Either way is enough for the client to know that the server encountered an error during sending.
So the question is, how do I do this on my http.ServerResponse object? I can't call end, because then the client will think everything went well, and there is no close method. There is a 'close' event, but I get the feeling that's something I'm supposed to listen for in this context, not emit myself, right?
I do it in the following manner:
function respDestroy()
{
this._socket.destroy();
}
function respReply(message, close)
{
if (!close)
this.end(message);
else
this.end(message, function(){ this.destroy(); });
}
server.on('request',
function(req, resp)
{
resp._socket = resp.socket; // `socket` field is nulled after each `end()`
resp.destroy = respDestroy;
resp.reply = respReply;
...
});
You can modify respReply to accept status code and status message as well.

Node.js - How can I wait for something to be POSTed before I reply to a GET

I have two clients and one node.js server URL - localhost:8888/ServerRequest. The first client GETs from this URL and waits 20 seconds to see whether the second client POSTs some data for it within that timeout period. If the second client POSTed before the timeout, that value is returned to the GET request; otherwise a default value is returned. I am not sure of the best way to implement this. I am trying something like this, but it is not working as desired -
function ServerRequest(response, postData, request)
{
    var id;
    if (request.method == "GET")
    {
        id = setTimeout(function ()
        {
            // handle timeout here
            console.log("Got a timeout, sending default value");
            cmd = "DefaultVal";
            response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>");
            response.end();
        }, 20000);
    }
    else if (request.method == "POST")
    {
        console.log("Received POST, sending POSTed value");
        cmd = postData;
        // Cancel timeout (note: `id` is local to each call, so this
        // never sees the timer set during the earlier GET request)
        clearTimeout(id);
        console.log("\n Received POST");
        response.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?><list id=\"20101001\"><com type=\"" + cmd + "\"></com></list>");
        response.end();
    }
}
Another approach in my mind was to use 2 separate URLs - One for GET Request (/ServerRequest) and the other for POST Request (/PostData). But then how will I pass the POSTed data from one URL to the other if received before the timeout?
EDIT: I think I know now what I exactly need. I need to implement a longpoll, where a client sends a GET request, and waits for a timeout period (the data might not be immediately available to consume, so it waits for 20 seconds for some other client to POST some data for the first client to consume). In case timeout occurs, a default value is returned in response to the GET request from the first client. I'm working on the longpoll implementation I found here, I'll update if I am able to succeed in what I'm trying. If someone can point me or provide me with a better example, it will be helpful.
Edit: removed my original code after a more careful reading of the question.
The best solution would probably be WebSockets; otherwise the browser will appear to hang while it waits out the 20 seconds.
Using a library like socket.io you can do this
var io = require('socket.io').listen(8888);
function postHandler(req, data, res){
io.sockets.emit("response" , data)
}
then client side
<script src="/socket.io/socket.io.js"></script>
<script>
var socket = io.connect('http://localhost:8888');
socket.on('response', function (data) {
console.log(data);
});
</script>
