Receiving 2 HTTP requests on the server when only 1 sent - node.js

I am creating an app in the http://c9.io environment. It is a Node.js app which provides some REST endpoints for the client-side application to query. Until now everything was running fine, but today I observe that for one call sent by the browser to the REST API, two requests show up as received, and the request handler is called twice. This has slowed the response time for a single request.
In Chrome developer tools only one request is shown as sent; however, I am using app.use() to log incoming requests in Express, and it prints the same line twice for each request. The handler is also called twice.
This happens intermittently, not every time. I am behind a corporate network. Since I have sent a lot of requests during the day for testing, is there any chance that a monitoring program finds them suspicious and replays them? I have not edited the code that handles the requests.
Edit: Adding the code for handlers as suggested.
app.get('/suggestions/:keyword', function(r, s) {
    sug_db.retrieveSuggestions(r.params.keyword, function(data) {
        s.writeHead(200, {'content-type': 'text/html'});
        s.write(renderSugg({data: data}));
        s.end();
    });
});
app.get('/search/:query', function(r, s) {
    esc_db.search(r.params.query, function(data) {
        s.send(renderResults({query: r.params.query, results: data}));
    });
});
As you can see, they do nothing but fetch some data from a database and return the result as the HTTP response. The templating engine I am using is Pug (formerly Jade).

It doesn't look like the code you included in the question can be guilty of running twice, but maybe some code in sug_db.retrieveSuggestions or esc_db.search does.
What I would do is this:
Add some logging inside the code that you provided, both before calling the functions and inside the callbacks:
app.get('/suggestions/:keyword', function(r, s) {
    console.log('*** GET /suggestions/:keyword handler');
    sug_db.retrieveSuggestions(r.params.keyword, function(data) {
        console.log('GET /suggestions/:keyword callback');
        s.writeHead(200, {'content-type': 'text/html'});
        s.write(renderSugg({data: data}));
        s.end();
    });
});
app.get('/search/:query', function(r, s) {
    console.log('*** GET /search/:query handler');
    esc_db.search(r.params.query, function(data) {
        console.log('GET /search/:query callback');
        s.send(renderResults({query: r.params.query, results: data}));
    });
});
(or change console.log to whatever method of logging you use).
I would check what is actually called twice: the handlers themselves, the callbacks, or neither. The next step would be to examine the functions that are actually called by the handlers:
sug_db.retrieveSuggestions()
esc_db.search()
renderSugg()
renderResults()
It's important to see what is actually called twice and then examine why that might be happening. It can happen if, for example, you do something like this:
function badFunction(data, callback) {
    if (something) {
        callback('error');
    }
    callback('ok');
}
instead of:
function goodFunction(data, callback) {
    if (something) {
        callback('error');
    } else {
        callback('ok');
    }
}
I would expect that one of the functions called from the handlers does something like that to invoke the callback twice; maybe the condition or error it is checking didn't occur before but occurs now, causing the change in behavior.
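If it turns out that the callbacks are the ones firing twice, a small guard can both confirm and contain the problem while you track down the source. This is a hypothetical debugging helper, not part of Express or of the code above:

// Wraps a callback so that a second invocation is logged and swallowed
// instead of running the handler body twice.
function callOnce(callback) {
    var called = false;
    return function () {
        if (called) {
            console.warn('callback invoked more than once!', new Error().stack);
            return;
        }
        called = true;
        callback.apply(null, arguments);
    };
}

// Usage while debugging:
// sug_db.retrieveSuggestions(r.params.keyword, callOnce(function(data) { ... }));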

Related

Getting "Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client" while using Axios

I am trying to use different Axios calls to get some data from a remote server. One by one the calls work, but as soon as I call them directly after each other, it throws the error message about the headers. I did some research already, and I guess it has something to do with the headers of the first call getting in the way of the second call. That is probably a very simplistic description of the problem, but I am new to Node.js and to the way those Axios calls work.
This is an example of one of my Api calls:
app.get('/api/ssh/feedback', function(req, res){
    conn.on('ready', function(){
        try {
            let allData = {}
            var command = 'docker ps --filter status=running --format "{{.Names}}"'
            conn.exec(command, function(err, stream){
                if (err) throw console.log(err)
                stream.on('data', function(data){
                    allData = data.toString('utf8').split('\n').filter(e => e)
                    return res.json({status: true, info: allData})
                })
                stream.on('close', function(code){
                    console.log('Process closed with: ' + code)
                    conn.end()
                })
                stream.on('error', function(err){
                    console.log('Error: ' + err)
                    conn.end()
                })
            })
        } catch (err) {
            console.error('failed with: ' + err)
        }
    }).connect(connSet)
})
I am using Express as middleware and the ssh2 package to establish the connection to the remote server. As I mentioned before, the call works, but it crashes if it is not the first call. I am able to use the API again after I restart the Express server.
This is how I am calling the API through Axios in my Node.js frontend:
getNetworkStatus(e){
    e.preventDefault()
    axios.get('/api/ssh/network').then(res => {
        if (res.data.status) {
            this.setState({network_info: 'Running'})
            this.setState({network: res.data.info})
        } else {
            this.setState({network_info: 'No Network Running'})
            this.setState({network: 'No Network detected'})
        }
    }).catch(err => {
        alert(err)
    })
}
I would be really grateful for any help or advice on how to solve this problem. Thanks to everyone who spends some time helping me out.
There are two issues in the code you've provided:
You are making assumptions about 'data' events. In general, never assume the size of the chunks you receive in 'data' events: you might get one byte or you might get 1000 bytes. The event can fire multiple times as chunks are received, and this is most likely what is causing the error (the first 'data' event sends the response with res.json(), and a later 'data' event then tries to send it again, which is exactly the "headers already sent" situation). Side note: if the command only outputs text, you are better off using stream.setEncoding('utf8') (instead of manually calling data.toString('utf8')), as it takes care of multi-byte characters that may be split across chunks.
You are reusing the same connection object. This is a problem because you keep adding more and more event handlers every time that HTTP endpoint is reached. Move your const conn = ... inside the endpoint handler instead. This could also be causing the error you're getting. A sketch combining both fixes is shown below.
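Putting both fixes together, a sketch of the endpoint might look like this (assuming Client comes from the ssh2 package and connSet is your existing connection config):

const { Client } = require('ssh2');

app.get('/api/ssh/feedback', function (req, res) {
    const conn = new Client(); // a fresh connection per request: no piled-up listeners
    conn.on('ready', function () {
        var command = 'docker ps --filter status=running --format "{{.Names}}"';
        conn.exec(command, function (err, stream) {
            if (err) {
                conn.end();
                return res.status(500).json({ status: false, error: err.message });
            }
            var buffered = '';
            stream.setEncoding('utf8');          // handles multi-byte characters split across chunks
            stream.on('data', function (chunk) { // may fire many times; just accumulate
                buffered += chunk;
            });
            stream.on('close', function () {     // all output received: respond exactly once
                conn.end();
                var allData = buffered.split('\n').filter(e => e);
                res.json({ status: true, info: allData });
            });
        });
    }).connect(connSet);
});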

Recommended pattern to page through API response until exhausted?

I'm new to Node and the async programming model. I'm having problems dealing with a simple requirement that seems pretty basic in synchronous environments: paging through an API response until the response is empty.
More specifically, the API, on a successful call, will return data and a status of 200 or 206 (partial content). If I see the 206 response, I need to keep making calls to the API (also sending a page query param that I increment each time) until I see the 200 response.
In a synchronous language, the task will be a piece of cake:
// pseudocode
data = []
page = 1
do {
response = api.call(page)
data.append(response.data)
page++
} while (response != 200)
return data
Now, in Node, for a single api call, code like this will work:
// fire when '/' has a GET request
app.get('/', (req, res) => {
axios.get('https://api.com/v1/cats')
.then(response => {
// now what??
});
});
});
See the // now what?? comment? That's the point where I'm wondering how to proceed. I came across this somewhat-relevant post but am not able to convert it into a form that works for me in Node and Axios.
Is it enough to just wrap the axios code in a separate function? I don't think so, because if I do this:
function getData(pageNum) {
    axios.get('https://api.com/v1/cats')
        .then(response => {
            // now what??
        });
}
I can't rely on a return value because as soon as axios.get() is executed, the function is over. I can call getData() again after I get the first response, but then, suppose I want to return all the data from these multiple calls as the HTTP response from my Express server . . . how do I do that?
I hope I will not get downvoted for laziness or something. I've really looked around but not found anything relevant.
First, a counter-question: is the data set so big that you need to worry about using up all the memory? Because if so, it will take more work to structure your code in a way that streams the data all the way through. (In fact, I'm not even sure whether Express allows streaming... you are using Express, aren't you?)
From the axios documentation, it looks like response is a readable stream that provides the response body, so reading it is also an asynchronous task, and you should write a function that does that. See the "Stream" page of the Node.js docs for more details. (Or I could be persuaded to help with that too, time permitting.) For now, I'll assume you have a function readResponse, which takes an axios response object as an argument and returns a promise that resolves to an object such as { statusCode: 206, result: ['thing1', 'thing2'] }. I'll also assume that your goal is to get all the result arrays and concatenate them together to get e.g. ['thing1', 'thing2', 'thing3', 'thing4', 'thing5', 'thing6'].
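For reference, a readResponse along those lines might look like the sketch below. It assumes the request was made with axios's { responseType: 'stream' } option, so that response.data is a readable stream, and that the body is JSON:

function readResponse(response) {
    return new Promise(function (resolve, reject) {
        var body = '';
        response.data.setEncoding('utf8');
        response.data.on('data', function (chunk) { body += chunk; });
        response.data.on('end', function () {
            try {
                resolve({ statusCode: response.status, result: JSON.parse(body) });
            } catch (err) {
                reject(err); // the body wasn't valid JSON
            }
        });
        response.data.on('error', reject);
    });
}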
You could write a self-calling version of your getData function. This will retrieve all data from a given page onwards (not just the page itself):
function getData(pageNum) {
    // Note the `return`: the caller needs the promise chain back.
    return axios.get('https://api.com/v1/cats' + (pageNum ? '?page=' + pageNum : ''))
        .then(readResponse)
        .then(function(parsedResponse) {
            if (parsedResponse.statusCode == 200) {
                return parsedResponse.result;
            } else if (parsedResponse.statusCode == 206) {
                return getData(pageNum + 1).then(function(laterData) {
                    return parsedResponse.result.concat(laterData);
                });
            } else {
                // error handling here: throw an exception or return a failing promise.
            }
        });
}
Then, to get all data, just call this function with pageNum = 0:
// fire when '/' has a GET request
app.get('/', (req, res) => {
    getData(0)
        .then(function(results) {
            // results is now the array you want.
            var response = JSON.stringify(results); // or whatever you're doing to serialise your data
            res.send(response);
        });
});
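As an aside, on newer Node versions the same accumulation can be written as a flat loop with async/await, which maps almost one-to-one onto the synchronous pseudocode from the question. This sketch assumes response.data is already the parsed result array:

async function getAllData() {
    var data = [];
    var page = 0;
    var status;
    do {
        var response = await axios.get('https://api.com/v1/cats' + (page ? '?page=' + page : ''));
        data = data.concat(response.data); // append this page's results
        status = response.status;
        page++;
    } while (status === 206); // 206 means partial content: keep paging
    return data;
}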

Responding to a callback from a child message in node.js

I've run into a problem with node.js and can't figure out the correct way to handle this situation.
I have a worker process that handles all the data for a leaderboard. When a request comes in for the leaderboard, I send the request to the worker to handle, and the worker sends back the response via the child_process messaging.
My problem is how to efficiently get the response to the callback. This was my first attempt, but it won't work, as I'm always rebinding the 'message' event to a different callback.
Manager.setup_worker = function () {
    Manager.worker = require('child_process').fork("./workers/leaderboard");
}

Manager.process_request = function (request, callback) {
    Manager.worker.on("message", function (response) {
        callback(response);
    });
    Manager.worker.send(request);
}
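One common way around this (a sketch, not the only option) is to bind a single 'message' listener once in setup_worker and correlate responses to callbacks with a request id. This assumes you can change the worker so it echoes the id back with each response:

Manager.setup_worker = function () {
    Manager.worker = require('child_process').fork("./workers/leaderboard");
    Manager.pending = {};
    Manager.next_id = 0;
    // One listener, bound once; responses are dispatched by id.
    Manager.worker.on("message", function (response) {
        var callback = Manager.pending[response.id];
        if (callback) {
            delete Manager.pending[response.id];
            callback(response.data);
        }
    });
}

Manager.process_request = function (request, callback) {
    var id = Manager.next_id++;
    Manager.pending[id] = callback;
    // The worker must echo the id back: process.send({ id: msg.id, data: ... })
    Manager.worker.send({ id: id, request: request });
}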

socket.io data seems to be sent multiple times(nodejs and craftyjs)

I am following this tutorial on making HTML5 games. I wanted to try to mix Node in to make it multiplayer. I am using node.js (v0.10.4) on the server and crafty.js on the front end.
I am using socket.io to send and receive messages. For now it's just me (not multiple clients). The weird thing that happens is that the message that comes from the server seems to be sent multiple times. I turned on debug mode in socket.io, but it only seems to be sending the data once; yet on the front end the data seems to be coming in in multiples. I set an incrementor on the data, and it looks as if the incrementor is not incrementing multiple times; instead I am getting multiple copies of the same data.
Here's the Node code:
var http = require('http').createServer(handler),
    static = require('node-static'),
    io = require('socket.io').listen(http);

io.set('log level', 3);
http.listen(80);

// attach the socket to our server
var file = new static.Server(); // Create a file object so we can serve the files in the correct folder

function handler(req, res) {
    req.addListener('end', function() {
        file.serve(req, res);
    }).resume();
}

io.sockets.on('connection', function (socket) { // listen for any sockets that will come from the client
    socket.on('collected', function(data) {
        /**** here's where the data is being sent back to the client *****/
        socket.emit('messageFromServer', { data: data.number });
    });
});
And here's the front-end code:
// messenger entity
Crafty.c('SendRecieveMessages', {
    count: 0,
    sendMessageToServer : function() {
        console.log('got a village');
        /**** Here's where we send the message to the server ****/
        socket.emit('collected', { village : "The message went to the server and back. it's collected", number : this.count });
        this.count++;
    },
    recieveMessageFromServer : function() {
        socket.on('messageFromServer', function(data) {
            /*** This data seems to be coming back or logging multiple times? ***/
            console.log(data);
        });
    }
});
Lastly, here's a screenshot of the debug in progress. As you can see, the number is not always incrementing; it almost looks like the data is getting stored. Thanks!
http://cl.ly/image/0i3H0q2P1X0S
It looks like every time you call Crafty.c, recieveMessageFromServer() is getting called too. Every time recieveMessageFromServer is invoked, it attaches an additional event listener on the socket. That's why the first time data comes back you get one copy, then the second time you get two, the third time you get three, and so on.
You either need to prevent recieveMessageFromServer from being called multiple times, or use removeListener or removeAllListeners to remove the previously attached listeners.
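For example, a sketch of the removeListener approach applied to the component above: keep a named handler, and detach any previously bound copy before attaching, so at most one listener is ever bound.

// Named handler, so the same function reference can be removed later.
function onMessageFromServer(data) {
    console.log(data);
}

// inside the component definition:
recieveMessageFromServer : function() {
    socket.removeListener('messageFromServer', onMessageFromServer);
    socket.on('messageFromServer', onMessageFromServer);
}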
Thanks to @Bret Copeland for helping me figure this one out. As he pointed out, every time socket.on() is called, it adds another listener. To prevent this, I declared a variable as a property of my Game object (in crafty.js; use whatever fits your setup):
Game = {
    // lots of other code here...
    // need this to use later for socket.io
    send_message : true
}
Then I edited my recieveMessageFromServer() function to check whether it's OK to attach the listener or not:
recieveMessageFromServer : function() {
    console.log('does this show up multiple times?');
    /* Check whether send_message is true before attaching the listener */
    if (Game.send_message) {
        socket.on('messageFromServer', function(data) {
            console.log(data);
            Game.send_message = false;
        });
    }
}

async parallel request - partial render

What is the proper way to partially render a view following an async parallel request?
Currently I am doing the following
// an example using an object instead of an array
async.parallel({
    one: function(callback) {
        setTimeout(function() {
            callback(null, 1);
            // can I partially merge the results and render here?
        }, 200);
    },
    two: function(callback) {
        setTimeout(function() {
            callback(null, 2);
            // can I partially merge the results and render here?
        }, 100);
    }
},
function(err, results) {
    // results is now equal to: {one: 1, two: 2}
    // merge the results and render a view
    res.render('mypage.ejs', { title: 'Results' });
});
It is basically working fine, but if I have function1, function2, ..., functionN, the view will be rendered only when the slowest function has completed.
I would like to find the proper way to render the view as soon as the first function returns, to minimise the delay for the user, and to add the results of the other functions as soon as they become available.
What you want is Facebook's BigPipe: https://www.facebook.com/note.php?note_id=389414033919. Fortunately, this is easy with Node.js because streaming is built in. Unfortunately, template systems are bad at this because async templates are a pain in the butt. However, this is much better than making additional AJAX requests.
The basic idea is that you first send a layout:
res.render('layout.ejs', function (err, html) {
    if (err) return next(err)
    res.setHeader('Content-Type', 'text/html; charset=utf-8')
    res.write(html.replace('</body></html>', ''))
    // Ends the response.
    // `writePartials` should not return anything in the callback!
    writePartials(res.end.bind(res, '</body></html>'))
})
You can't send </body></html> yet because your document isn't finished. writePartials would then be a bunch of async functions (partials or pagelets) executed in parallel:
function writePartials(callback) {
    async.parallel([partial1, partial2, partial3], callback)
}
Note: since you've already written a response, there's not much you can do with errors except log them.
What each partial does is send inline JavaScript to the client. For example, the layout can have a .stream element, and the pagelet replaces .stream's innerHTML upon arrival, that is, when the callback finishes:
function partialStream(callback) {
    res.render('stream.partial.ejs', function (err, html) {
        // Don't return the error in the callback!
        // You may want to display an error message or something instead.
        if (err) {
            console.error(err.stack)
            callback()
            return
        }
        res.write('<script>document.querySelector(".stream").innerHTML = ' +
            JSON.stringify(html) + ';</script>')
        callback()
    })
}
Personally, I have a .stream.placeholder and replace it with a new .stream element. The reason is that I basically do .placeholder, .placeholder ~ * {display: none} so things don't jump around the page. However, this requires a DIY front-end framework, since suddenly the JS gets more complicated.
There, your response is now streaming. The only requirement is that the client supports JavaScript.
I don't think you can do it just on the backend.
To minimise users' delay you need to send a minimal page to the browser and then request the rest of the information from the browser via AJAX. Another approach to minimising delays is to send all templates to the browser on the first page load, together with the rendered page, and render all subsequent pages in the browser based on the data you request from the server. That's the way I do it. The beauty of Node.js is that you can use the same templating engine on both the backend and the frontend, and also share modules between them.
If your page is composed in such a way that the slow information sits further down the HTML than the fast information, you can write the response partially without using res.render (which renders the complete page) and use res.write instead; a sketch of this follows. I don't think this approach deserves serious attention, though, as you would get stuck with it sooner than you notice...
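A minimal sketch of that res.write idea, with hypothetical fastQuery and slowQuery helpers standing in for the fast and slow data sources (both assumed to call back with ready-made HTML fragments):

app.get('/', function (req, res) {
    res.setHeader('Content-Type', 'text/html; charset=utf-8');
    fastQuery(function (fastHtml) {
        res.write(fastHtml); // the fast part of the page goes out immediately
        slowQuery(function (slowHtml) {
            res.write(slowHtml); // the slow part is appended once it arrives
            res.end();
        });
    });
});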
