send error message back to browser in nodejs

I have a node API that is working fine when tested using postman.
But when I use this API in my Angular project, an error occurs and the browser never gets a response; it just keeps waiting. When I go to the console I can see the error message.
How can I make that error message get sent back to the browser with the full stack trace?

In general, you will need to catch that error, then populate the HTTP response object with it, just as if you were sending successful response data back to the requestor.
Synchronous processing:
try {
  // do my requested stuff
  res.status(200).json({something: "returned"});
} catch (ex) {
  res.status(500).json(ex);
}
Promises:
Promise.resolve()
  .then(() => {
    // do my requested stuff
    // return my results to the client
    res.status(200).json({something: "returned"});
  })
  .catch((ex) => {
    // return 500 error and exception data to the client
    res.status(500).json(ex);
  });
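With async/await the same pattern reads like the synchronous version. A minimal sketch, assuming an Express handler (the name myHandler is just illustrative):
async function myHandler(req, res) {
  try {
    // do my requested stuff
    res.status(200).json({something: "returned"});
  } catch (ex) {
    // return 500 error and exception data to the client
    res.status(500).json(ex);
  }
}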
Also, as standard practice you should catch all errors and, at the very least, return a 500 to the browser (res.status(500)) so you don't leave it hanging when unexpected issues arise.
And, of course, you can return HTML rather than JSON, and/or include more info in the response.
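One caveat for the full stack trace the question asks for: res.status(500).json(ex) will usually send back {}, because an Error's message and stack properties are non-enumerable and JSON.stringify skips them. Pick them out explicitly, e.g.:
res.status(500).json({
  message: ex.message,
  stack: ex.stack // consider omitting the stack in production
});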
Good luck.

Related

Cannot set headers after they are sent to the client when adding data to firebase

I've been getting this error
Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
at ServerResponse.setHeader (_http_outgoing.js:533:11)
after I added a Firebase update call inside a socket.io handler:
io.on('connection', socket => {
  ...
  socket.once('myEvent', async (dataToSend) => {
    try {
      await db
        .collection('myCollection')
        .doc('myDoc')
        .update({keyToUpdate: dataToSend})
    } catch (err) {
      console.log(err)
    }
  })
})
I thought the problem was with the socket, so I tried the same logic in a POST route outside the socket and got the same error. I tried adding a return, but that didn't work either.
I suspect it could be because this function is a promise and I'm not handling asynchronous behaviour well, or because the server is acting as a stream thanks to socket.io and this firebase function doesn't like that.
I have other Firebase calls in the code, but they are read requests (.onSnapshot()); the error only happens with data-writing Firebase functions (.set(), .add(), .update()).
Would appreciate some help here, thanks.
Fixed it. I didn't know exactly how onSnapshot works: every time an update was made it triggered its callback, which sent data again and raised the error.
Changing to .get().then() fixed it.
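For anyone hitting the same thing, the difference looks roughly like this (a sketch assuming an Express-style res in scope and the question's db handle):
// onSnapshot registers a persistent listener: the callback fires on every
// change to the document, so the second change tries to respond twice.
db.collection('myCollection').doc('myDoc').onSnapshot(snap => {
  res.send(snap.data()) // second invocation -> ERR_HTTP_HEADERS_SENT
})

// get() is a one-shot read: the promise resolves exactly once.
db.collection('myCollection').doc('myDoc').get().then(snap => {
  res.send(snap.data())
})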

Asynchronous processing of data in Expressjs

I have an Express route which receives some data, processes it, then inserts it into Mongo (using Mongoose).
This is working well if I return a response after the following steps are done:
Receive request
Process the request data
Insert the processed data into Mongo
Return 204 response
But the client will be calling this API concurrently for millions of records, so the requirement is not to block the client while processing the data. Hence I made a small change in the code:
Receive request
Return response immediately with 204
Process the requested data
Insert the processed data into Mongo
The above works fine for the first few requests (say, thousands); after that the client gets a socket exception: connection reset by peer. I guess the server is refusing connections because no ports are free, and at some point I notice my Node.js process throws an out-of-memory error.
Sample code is as follows:
async function enqueue(data) {
  // 1. Process the data
  // 2. Insert the data in mongo
}

async function expressController(request, response) {
  logger.info('received request')
  response.status(204).send()
  try {
    await enqueue(request.body)
  } catch (err) {
    throw new Error(err)
  }
}
Am I doing something wrong here?
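One thing that stands out: nothing bounds how many enqueue() calls are in flight at once, so under sustained load the pending work piles up until memory runs out. A minimal sketch of back-pressure using a simple in-process counter (the limit and the 503 response are illustrative assumptions, not a prescribed fix):
const MAX_IN_FLIGHT = 100 // illustrative limit
let inFlight = 0

async function expressController(request, response) {
  if (inFlight >= MAX_IN_FLIGHT) {
    // shed load instead of buffering without bound
    response.set('Retry-After', '1')
    return response.status(503).send()
  }
  inFlight++
  response.status(204).send()
  try {
    await enqueue(request.body)
  } catch (err) {
    logger.error(err) // the response is already sent, so just log
  } finally {
    inFlight--
  }
}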

Handling requests simultaneously in nodejs and passing the response using only one res.send

I am developing an API which takes XML input containing media IDs and returns XML output with details for those IDs. I am facing a problem when sending the response to a second simultaneous request: the second request hangs, showing "loading" in Postman.
What I am doing is calling a function in app.post which parses the media, returns the output in a callback, and sends it using res.send, but it only works for a single request.
With parallel requests to the same API, it either hangs or gives "can't set headers after they are sent" because I am using res.send, but res.send is the only way I can send the response (even next doesn't work).
var getCompositeData = function(req, res, next){
  abc.getData(req.body, function(err, xmlOutput){
    if(err){
      console.log("error");
    } else {
      xmlData = xmlOutput
      return next()
    }
  })
}

app.post(apiUrl, [
  rawBodyParser({
    type: 'application/xml'
  }),
  app.oauth.authorise()
], getCompositeData, function (req, res) {
  res.setHeader('Content-Type', 'application/xml');
  res.send(xmlData);
});
There are several issues with your code:
if (err) {
  console.log("error");
}
If an error occurs, you still need to make sure a response will be sent back, otherwise the request will stall until a timeout happens. You can pass an error to next, and Express will handle it:
if (err) {
  return next(err);
}
Next problem:
xmlData = xmlOutput
xmlData is an undeclared variable, which gets overwritten with each request. If two requests happen at (almost) the same time, it's likely that one client gets back an incorrect response (remember, Node.js runs JS code in a single thread; there is no thread-local storage, so xmlData is shared between all requests).
A good place to "store" this sort of data is in res.locals:
res.locals.xmlData = xmlOutput;
return next();
// and later:
res.send(res.locals.xmlData);
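Putting both fixes together, the route could look like this (same middleware as in the question):
var getCompositeData = function (req, res, next) {
  abc.getData(req.body, function (err, xmlOutput) {
    if (err) {
      return next(err); // let Express's error handling respond
    }
    res.locals.xmlData = xmlOutput; // per-request storage
    return next();
  });
};

app.post(apiUrl, [
  rawBodyParser({
    type: 'application/xml'
  }),
  app.oauth.authorise()
], getCompositeData, function (req, res) {
  res.setHeader('Content-Type', 'application/xml');
  res.send(res.locals.xmlData);
});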

Sometimes not receiving success or error response when saving Backbone model

When saving a model to a Node.js endpoint I'm not getting a success or error response every time, particularly on the first save and then sometimes on other attempts. The Node.js server is sending a success response every time, and if I use a Chrome REST client it works every time.
var mailchimpModel = new MailchimpModel();
var data = {
  "email": $('#email').val()
}
mailchimpModel.save(data, {
  success: function(model, response) {
    console.log("success");
    console.log(response);
  },
  error: function(model, response) {
    console.log("error");
  }
});
What I have found is that the Node.js server receives 2 requests when it fails:
OPTIONS /api/mailchimp 200
POST /api/mailchimp 200
and I only get a success response if I submit the request again straight afterwards.
It's possible your model is failing client-side validation. To check, try:
console.log(mailchimpModel.save(data));
If the value is false then your model is failing client-side validation (usually defined in a validate function in the model). You can check the errors with
console.log(mailchimpModel.validationError);
OK, found that I need to handle the OPTIONS method on the server; using the solution in this post worked for me:
https://stackoverflow.com/a/13148080/10644
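The gist is answering the CORS preflight that the browser sends before a cross-origin POST. A minimal sketch for Express (the origin and headers are placeholder assumptions; tighten them for a real deployment):
app.use(function (req, res, next) {
  res.header('Access-Control-Allow-Origin', '*');
  res.header('Access-Control-Allow-Methods', 'GET,POST,PUT,DELETE,OPTIONS');
  res.header('Access-Control-Allow-Headers', 'Content-Type');
  if (req.method === 'OPTIONS') {
    // answer the preflight; the browser then sends the real POST
    return res.status(200).end();
  }
  next();
});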

Node.js response.write buffer limit restrictions

I am using Node.js with some additional modules to do web page scraping and media item identification from a set of websites.
The node server basically throws back a JSON markup of all the items identified on the page and their associated metadata. The JSON data is generated correctly, as I can see it in the server logs; however, when I write it to the client, the JSON response is truncated for some reason.
I tested this with all browsers and with REST clients, and it seems to point to an issue with response.write(response, 'utf-8'), which may not be sending the whole data, or the connection gets closed for some reason.
I verified that there is no chunking involved for my test cases, so there is no question of the connection being aggressively closed by the client while it's still waiting for the next chunk of data; i.e. response.write in this case returns true, which implies that all the data has been written to the client.
Any pointers as to what could be causing the connection to be terminated or the response to be truncated? For JSON responses of smaller sizes the response is received correctly by the client.
Code:
return parseDOM(page, url, function(err, response){
  if(err){
    res.writeHeader(200, {'Content-Type':'application/json'});
    res.end('Error Parsing DOM from ' + url);
    e.message = 'Error Parsing DOM';
    callback(e, req, res, targetUrl);
    return;
  }
  else {
    if(response){
      res.writeHeader(200, {'Content-Type':'application/json', 'Content-Length':response.length});
      console.log(response);
      res.write(response, 'UTF-8');
      res.end();
      callback(null, req, res, targetUrl);
      return;
    }
  }
});
Sorry. My bad. I see that the content length is wrong. Identified the solution via this issue:
Node.js cuts off files when serving over HTTPS
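For the record, the fix boils down to setting Content-Length to the byte length of the body rather than its character count. A sketch based on the code above:
// response.length counts UTF-16 code units; multi-byte characters make the
// actual UTF-8 body longer, so the client stops reading too early.
res.writeHead(200, {
  'Content-Type': 'application/json',
  'Content-Length': Buffer.byteLength(response, 'utf8')
});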
