Promises: is asynchronous handling OK? - node.js

My question: Is it safe to detach the handling from its Promise object?
If I do this ...
var promise1 = new Promise(function(resolve, reject) {
  var json = { "counter": 0 };
  console.log('execute now, worry later ...');
  json.counter++;
  console.log(json.counter);
  resolve(json);
});

var then = function() {
  promise1.then(function(value) { console.log('value: ' + value.counter) });
};
setTimeout(then, 3000);

var promise2 = new Promise(function(resolve, reject) {
  console.log('error thrown here ...');
  throw new Error('will it behave the same as with then?');
});

var catchFunc = function() {
  promise2.then().catch(function(error) { console.log('error: ' + error.message) });
};
setTimeout(catchFunc, 3000);
Then I get warnings that make sense ...
execute now, worry later ...
1
error thrown here ...
(node:9748) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): Error: will it behave the same as with then?
(node:9748) PromiseRejectionHandledWarning: Promise rejection was handled asynchronously (rejection id: 1)
value: 1
error: will it behave the same as with then?
Background: In some cases, I want my promises to run concurrently; in other cases, I want them to run sequentially. The simplest implementation I've found is to wrap both variants in a function, push it into a map, and handle the then/catch in a reduce. The result is that, in the first (concurrent) variant, my handling happens at wrapping time and is detached, as in my sample above.
If it is safe, how can I remove the warnings from my log?

A rejected promise should be chained with catch on the same tick. If this doesn't happen, an UnhandledPromiseRejectionWarning appears. This is expected to become a fatal error in future Node versions, so it should be avoided.
Once promise control flow is introduced, it's beneficial to stay with promises throughout. setTimeout stands out here because it provides no error handling for promises.
If promises are processed concurrently, they are usually handled with Promise.all(...).catch(...). In this case the resulting promise is chained with catch on the same tick (a similar problem is addressed in this answer).
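For example, here is a minimal sketch of that concurrent case (using simplified stand-ins for promise1 and promise2 from the question); because the catch is chained on the same tick, no warning is logged:
var promise1 = Promise.resolve({ counter: 1 });
var promise2 = Promise.reject(new Error('will it behave the same as with then?'));

// The catch is attached immediately, so neither rejection is ever left unhandled.
Promise.all([promise1, promise2])
  .then(function(values) { console.log('values:', values); })
  .catch(function(error) { console.log('error: ' + error.message); });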

Related

How to bubble up Exceptions from inside async/await functions in NodeJs?

Assume an Express route that makes a call to Mongoose and has to be async so it can await on the mongoose.find(). Also assume we are receiving XML but we have to change it to JSON, and that also needs to be async so I can call await inside of it.
If I do this:
app.post('/ams', async (req, res) => {
  try {
    xml2js.parseString(xml, async (err, json) => {
      if (err) {
        throw new XMLException();
      }
      // assume many more clauses here that can throw exceptions
      res.status(200);
      res.send("Data saved");
    });
  } catch (err) {
    if (err instanceof XML2JSException) {
      res.status(400);
      message = "Malformed XML error: " + err;
      res.send(message);
    }
  }
});
The server hangs forever. I'm assuming the async/await means that the server hits a timeout before something concludes.
If I put this:
res.status(200);
res.send("Data saved")
on the line before the catch(), then that is returned, but it is the only thing ever returned. The client gets a 200, even if an XMLException is thrown.
I can see the XMLException thrown in the console, but I cannot get a 400 to send back. I cannot get anything in that catch block to execute in a way that communicates the response to the client.
Is there a way to do this?
In a nutshell, there is no way to propagate an error from the xml2js.parseString() callback up to the higher code because that parent function has already exited and returned. This is how plain callbacks work with asynchronous code.
To understand the problem here, you have to follow the code flow for xml2js.parseString() in your function. If you instrumented it like this:
app.post('/ams', async (req, res) => {
  try {
    console.log("1");
    xml2js.parseString(xml, async (err, json) => {
      console.log("2");
      if (err) {
        throw new XMLException();
      }
      // assume many more clauses here that can throw exceptions
      res.status(200);
      res.send("Data saved");
    });
    console.log("3");
  } catch (err) {
    if (err instanceof XML2JSException) {
      res.status(400);
      message = "Malformed XML error: " + err;
      res.send(message);
    }
  }
  console.log("4");
});
Then, you would see this in the logs:
1 // about to call xml2js.parseString()
3 // after the call to xml2js.parseString()
4 // function about to exit
2 // callback called last after function returned
The outer function has finished and returned BEFORE your callback has been called. This is because xml2js.parseString() is asynchronous and non-blocking. That means that calling it just initiates the operation and then it immediately returns and the rest of your function continues to execute. It works in the background and some time later, it posts an event to the Javascript event queue and when the interpreter is done with whatever else it was doing, it will pick up that event and call the callback.
The callback will get called with an almost empty call stack. So, you can't use traditional try/catch exceptions with these plain, asynchronous callbacks. Instead, you must either handle the error inside the callback or call some function from within the callback to handle the error for you.
When you try to throw inside that plain, asynchronous callback, the exception just goes back into the event handler that triggered the completion of the asynchronous operation and no further, because there's nothing else on the call stack. The try/catch you show in your code cannot catch that exception. In fact, no other code can catch that exception - only code within the callback.
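For example, a minimal sketch of handling the error entirely inside the callback (same app, xml2js, and xml as in the question) might look like this:
app.post('/ams', (req, res) => {
  xml2js.parseString(xml, (err, json) => {
    if (err) {
      // handle the error right here, inside the callback where it is reachable
      res.status(400).send("Malformed XML error: " + err);
      return;
    }
    // ... use json, then respond ...
    res.status(200).send("Data saved");
  });
});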
This is not a great way to write code, but nodejs survived with it for many years (by not using throw in these circumstances). However, this is why promises were invented and when used with the newer language features async/await, they provide a cleaner way to do things.
And, fortunately in this circumstance xml2js.parseString() has a promise interface already.
So, you can do this:
app.post('/ams', async (req, res) => {
  try {
    // get the xml data from somewhere
    const json = await xml2js.parseString(xml);
    // do something with json here
    res.send("Data saved");
  } catch (err) {
    console.log(err);
    res.status(400).send("Malformed XML error: " + err.message);
  }
});
With the xml2js.parseString() interface, if you do NOT pass it a callback, it will return a promise instead that resolves to the final value or rejects with an error. Not every asynchronous interface can do this, but it is fairly common these days when an interface originally had the older callback style and later added promise support. Newer interfaces are generally built with only promise-based interfaces. Anyway, per the doc, this interface will return a promise if you don't pass a callback.
You can then use await with the promise that the function returns. If the promise resolves, the await will retrieve the resolved value of the promise. If the promise rejects, then because you are awaiting it, the rejection will be caught by the try/catch. FYI, you can also use .then() and .catch() with the promise, but in many cases async/await is simpler, so that's what I've shown here.
So, in this code, if there is invalid XML, then the promise that xml2js.parseString() returns will reject and control flow will go to the catch block where you can handle the error.
If you want to capture only the xml2js.parseString() error separately from other exceptions that could occur elsewhere in your code, you can put a try/catch around just it (though this code didn't show anything else that would likely throw an exception so I didn't add another try/catch). In fact, this form of try/catch can be used pretty much like you would normally use it with synchronous code. You can throw up to a higher level of try/catch too.
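A rough sketch of that narrower try/catch, reusing the promise-returning call from the example above (anything thrown after the parse still reaches the outer catch):
app.post('/ams', async (req, res) => {
  try {
    let json;
    try {
      // only the parse error is handled here
      json = await xml2js.parseString(xml);
    } catch (err) {
      res.status(400).send("Malformed XML error: " + err.message);
      return;
    }
    // other work with json that might throw is still caught below
    res.send("Data saved");
  } catch (err) {
    res.status(500).send("Server error: " + err.message);
  }
});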
A few other notes, many people who first start programming with asynchronous operations try to just put await in front of anything asynchronous and hope that it solves their problem. await only does anything useful when you await a promise so your asynchronous function must return a promise that resolves/rejects when the asynchronous operation is complete for the await to do anything useful.
It is also possible to take a plain callback asynchronous function that does not have a promise interface and wrap a promise interface around it. You pretty much never want to mix promise interface functions with plain callback asynchronous operations because error handling and propagation is a nightmare with a mixed model. So, sometimes you have to "promisify" an older interface so you can use promises with it. In most cases, you can do that with util.promisify() built into the util library in nodejs. Fortunately, since promises and async/await are the modern and easier way to do asynchronous things, most newer asynchronous interfaces in the nodejs world come with promise interfaces already.
You are throwing exceptions inside the callback function, so you can't expect the catch block of the route handler to receive them.
One way to handle this is by using util.promisify.
const util = require('util');
const xml2js = require('xml2js');
const parseString = util.promisify(xml2js.parseString);

// inside the async route handler:
try {
  const json = await parseString(xml);
  // use json here
} catch (err) {
  // ...
}

Node logging unexpected UnhandledPromiseRejectionWarning

I have a piece of code that's causing Node to log UnhandledPromiseRejectionWarning. But I'm not sure why. Here's the code boiled down:
export class Hello {
  async good(): Promise<string> {
    let errorP = this.throwError();
    let responseP = this.doSomething();
    let [error, response] = await Promise.all([errorP, responseP]);
    return response + '123';
  }

  async bad(): Promise<string> {
    let errorP = this.throwError();
    let responseP = this.doSomething();
    let response = (await responseP) + '123';
    let error = await errorP;
    return response;
  }

  private async throwError(): Promise<string> {
    await (new Promise(resolve => setTimeout(resolve, 1000)));
    throw new Error('error');
  }

  private async doSomething(): Promise<string> {
    await (new Promise(resolve => setTimeout(resolve, 1000)));
    return 'something';
  }
}
Calling try { await hello.bad(); } catch (err) {} causes node to log UnhandledPromiseRejectionWarning
Calling try { await hello.good(); } catch (err) {} does NOT log the warning
Full error:
(node:25960) UnhandledPromiseRejectionWarning: Error: error
at Hello.<anonymous> (C:\hello-service.ts:19:11)
at Generator.next (<anonymous>)
at fulfilled (C:\hello-service.ts:5:58)
at runNextTicks (internal/process/task_queues.js:58:5)
at listOnTimeout (internal/timers.js:523:9)
at processTimers (internal/timers.js:497:7)
(node:25960) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch()
. To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:25960) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
{"level":30,"time":1639745087914,"pid":25960,"hostname":"AZWP10801-12","reqId":"req-1","dd":{"trace_id":"2081604231398834164","span_id":"2081604231398834164","service":"#amerisave/example-service","version":"0.0.0"},"res":{"statu
sCode":200},"responseTime":1025.4359999895096,"msg":"request completed"}
(node:25960) PromiseRejectionHandledWarning: Promise rejection was handled asynchronously (rejection id: 1)
Some dependency versions:
node ver. 14.16.1
ts-node-dev ver. 1.1.8
ts-node ver. 9.1.1
typescript ver. 4.5.2
Why is good good, but bad bad?
The problem in bad() is that the errorP promise rejects BEFORE you get to await errorP, and thus it rejects when there is no reject handler in place for it. Nodejs detects that a promise rejected, your process gets back to the event loop, and that rejected promise still has no reject handler on it. That triggers the "unhandled rejection" warning.
Notice here that while await errorP doesn't directly apply a reject handler, it does tie errorP to the parent async function which does have a reject handler on it, so the await errorP indirectly assigns reject handling to errorP. Whereas errorP by itself will just reject and not cause anything to happen to the parent async function. It will just be a variable containing a now rejected promise with no reject handler on it.
To take advantage of async automatic error propagation of rejected promises, you have to await that promise.
Nodejs doesn't know you're going to add the await later, in code that will execute some time in the future, so it reports the unhandled rejection.
Code becomes subject to these types of errors whenever you put a promise into a variable with no reject handler of any kind on it, and then go await some other promise BEFORE you ever put any sort of reject handler on that first promise. The promise is just sitting there with no error handling on it. If, due to the timing of things, it happens to reject in that state, you will get the warning. The usual solutions are:
Immediately put error handling on the promise so it's never left sitting by itself.
Don't create the promise until you're ready to use it however you're going to use it with appropriate error handling (with a .then().catch() or in a Promise.all().catch() or in an await or whatever).
Don't await other promises while a promise is sitting in a variable without any reject handling.
I find that if you avoid putting a promise with no handling on it into a variable at all, and instead create the promise right in the place where it's going to be monitored for completion and errors, you generally don't even have to think about this issue.
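A rough sketch of the first two solutions applied to bad(), assuming the same throwError() and doSomething() helpers from the question (written as free functions here):
// Option 1: attach rejection handling immediately, so errorP never sits unhandled.
async function fixedBad() {
  const errorP = throwError().catch(err => err);   // rejection is converted into a value
  const response = (await doSomething()) + '123';
  const error = await errorP;                      // already handled; no warning is logged
  return response;
}

// Option 2: hand both promises to Promise.all() right away (what good() does), so a
// rejection in either one is handled in the same expression.
async function alsoFixed() {
  const [error, response] = await Promise.all([throwError(), doSomething()]);
  return response + '123';
}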
FYI, you can illustrate the same general concept of a promise rejecting before you add a reject handler in a simpler manner here if you run this in nodejs:
function bad(t) {
  return new Promise((resolve, reject) => {
    setTimeout(reject, t);
  });
}

const b = bad(500);

// this timer will fire after bad() rejects
setTimeout(() => {
  b.catch(err => {
    console.log("caught b rejection");
  });
}, 600);
You will get the "uncaught rejection" error because when the promise rejects, it does not yet have a .catch() handler. Your code has this same issue (though obscured a little more) because the reject handler comes from the await and the async function and the try/catch the caller of the async function is using.
Here's a hypothesis (that can be experimentally proven).
The difference in behavior between good and bad can be explained by the order of awaits.
In bad you're awaiting on throwError after you have already awaited on doSomething, while in good you're awaiting on Promise.all, which will not return until both promises are fulfilled or at least one is rejected (which will be the case here).
So in bad, the throw happens outside of an await, your catch is not triggered, and it is caught internally by node.
If you change your bad so that you await on throwError first, then your catch will get triggered:
async bad(): Promise<string> {
  let errorP = this.throwError();
  let responseP = this.doSomething();
  let error = await errorP;
  let response = (await responseP) + '123';
  return response;
}

Bluebird promise failing to short circuit on reject

Maybe I just don't understand promises but I've used this pattern before and never had issues. Using bluebird within node.
I have this function which is getting hit:
function getStores() {
  return new Bluebird((resolve, reject) => {
    return Api.Util.findNearbyStores(Address, (stores) => {
      if (!stores.result) {
        console.log('one');
        reject('no response');
        console.log('two');
      }
      const status = stores.results.status;
    });
  });
}
It then hits both of my logs, continues past the if, and throws:
'one'
'two'
TypeError: Cannot read property 'Status' of undefined
Basically it keeps going right past the reject.
My impression was the promise should immediately short circuit on reject and pass the rejection through as the resolution to the promise. Am I misunderstanding this?
Yes, you are misunderstanding this. reject(…) is not syntax (just like resolve(…) isn't either) and does not act like a return statement that exits the function. It's just a normal function call that returns undefined.
You should be using
if (!stores.result) reject(new Error('no response'));
else resolve(stores.results.status);
The "short-circuiting" behaviour of rejections is attributed to promise chains. When you have
getStores().then(…).then(…).catch(err => console.error(err));
then a rejection of the promise returned by getStores() will immediately reject all the promises in the chain and trigger the catch handler, ignoring the callbacks passed to then.
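Putting that together, a corrected getStores() might look roughly like this (Api.Util.findNearbyStores and Address are taken from the question as-is):
function getStores() {
  return new Bluebird((resolve, reject) => {
    Api.Util.findNearbyStores(Address, (stores) => {
      if (!stores.result) {
        // reject() doesn't exit the callback, so return (or use else) to stop here
        return reject(new Error('no response'));
      }
      resolve(stores.results.status);
    });
  });
}

// A rejection now short-circuits the chain and lands in catch:
getStores()
  .then(status => console.log('status:', status))
  .catch(err => console.error(err));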

unhandled rejections holding references

I have a question about running the JavaScript below in node.js.
var z = new Image();

function x() {
  var promise = new Promise();
  return promise;
}

var promise = x();
promise.then(function(){});
..........
promise.reject(z);
There is no reject handler added to the promise returned by x(). If at some point we reject with the value z, will z be garbage collected, or will it still be held because the unhandled rejection keeps a reference to it? When I add a catch/reject handler to the promise, I do see z get garbage collected.
Please clarify why garbage collection does not happen for z when it is passed to an unhandled rejection.
That's not how you use [native A+] promises.
Try like this:
function x() {
  return new Promise((resolve, reject) => {
    if (success()) {
      resolve(success_val);
    } else {
      reject(new Error("fail!"));
    }
  });
}

let promise = x();
promise.then(function(){});
You can't reject a promise outside of where it's created. You have to use the (resolve,reject) constructor callback.
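To tie this back to the garbage-collection question, here is a minimal sketch of rejecting with z from inside the executor and then handling that rejection (z is just a stand-in for the Image object from the question):
const z = { /* stand-in for the Image object from the question */ };

function x() {
  return new Promise((resolve, reject) => {
    // reject from inside the executor; promise.reject() does not exist on the outside
    reject(z);
  });
}

const promise = x();
promise.catch(value => {
  // The rejection is handled here, so nothing beyond your own code keeps a reference
  // to `value` (matching the collection behavior you observed in the question).
  console.log('rejected with', value);
});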

Handling simple value rejects in a promise library

When the Bluebird module detects a non-Error reject, it reports Warning: a promise was rejected with a non-error.
So when we are writing a reusable promise library, how should we handle the situation when it is the client's code that throws simple values?
Simple my-library example:
var Promise = require('bluebird');

module.exports = function (cb) {
  return new Promise(function (resolve, reject) {
    try {
      cb();
      resolve('some data');
    } catch (e) {
      // should we do this?
      reject(e);
      // or should we do this?
      // reject(e instanceof Error ? e : new Error(e));
    }
  });
};
A client that consumes it:
var lib = require('my-library');

lib(function () {
  throw 'Regular text';
})
  .catch(function (error) {
    // - should we expect exactly 'Regular text' here?
    // - or should it be wrapped into Error by the promise library?
  });
In the example above we are going to see the same Warning: a promise was rejected with a non-error again, which in this case is obviously the client's fault, but the problem is - it will be reported against the promise library that did the reject.
This brings a question of how a reusable library should handle such cases.
Option 1: do nothing (just use reject(e)), let the client see the warning message Warning: a promise was rejected with a non-error.
Option 2: verify if the error thrown is an instance of Error and if not - do reject(new Error(e)). This enables the client to throw and reject with simple values, which is probably bad.
I am looking for a general recommendation here.
Also consider that sometimes the client can be an old library that you are trying to wrap into promises.
You should definitely and objectively wrap non-errors with errors.
Non-errors are terrible for debugging; the bluebird warning is there because it is very hard to figure out where an error came from if it's a primitive. You can't attach properties to primitives, and they have no stack traces.
Stack traces mean you don't have to guess where the errors come from - you know.
That said, bluebird already does this automatically with promisify and promisifyAll, so consider using them. If you're doing this manually, just make sure you throw or reject things that carry stacks too.
Node in addition looks for the stack traces of your unhandled rejections - so rejecting with junk will not only give you useless information in Node but it might actually crash your program.
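Following that recommendation, the manual wrapping in the question's library would look roughly like this (it is essentially the commented-out option from the original code):
var Promise = require('bluebird');

module.exports = function (cb) {
  return new Promise(function (resolve, reject) {
    try {
      cb();
      resolve('some data');
    } catch (e) {
      // wrap non-Error values so the rejection always carries a stack trace
      reject(e instanceof Error ? e : new Error(e));
    }
  });
};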
Error messages should be wrapped in error objects, but not automatically.
Automatic wrapping reduces the usefulness of stack traces. For example, let's say the client function is f and your library function is g:
function f(x) { throw 'Hello world' }
function g(f, x) { try { f(x); } catch (e) { throw new Error(e); } }
g(f, 1)
The client function and line numbers where the error originated will be completely missing from the stack:
Error: Hello world
at g (repl:1:52)
at repl:1:1
While if the client function creates the error object:
function f(x) { throw new Error('Hello world'); }
function g(f, x) { try{ f(x); } catch (e) { throw e; } }
g(f, 1)
we get a full stack trace:
Error: Hello world
at f (repl:1:23)
at g (repl:1:25)
at repl:1:1
There is no way to recover the stack up to the point in your library, making it useless for users (they won't be able to debug their own code).
That said, this largely depends on what sort of library you're making and what sort of contract you would like it to present to the users. You should think very carefully before using user-passed callbacks, as they invert the normal chain of responsibility in the stack.
Promises make this easier because they un-invert the inversion: the library is called first to generate a promise, then that promise is passed a callback. As such, creating a new error together with the creation of the promise allows Bluebird to at least partially look into / augment the stack of the client code (the line that created the promise will be present in the stack). The warning is still there because it's impossible to provide a complete stack.
No, they simply want you to wrap your JSON error objects into something.
This is not freedom, it's dictatorship.
There's no point in wrapping simple JSON objects in bulky Error wrappers.
Don't listen to them.

Resources