Why use a promise framework like Q or Bluebird when promises are built in? [duplicate] - node.js

This question already has an answer here:
Are there still reasons to use promise libraries like Q or BlueBird now that we have ES6 promises? [closed]
While studying promises in Node.js, I have a doubt.
Promise is already defined in Node.js,
but it seems common to use an additional promise framework like Q, Bluebird, RSVP, etc.
Is there any reason?
Is the reason that core Node.js functions cannot support promises without something like the promise.denodeify function?

I can't speak for Q, but Bluebird is a lot faster than native Promises, and provides a bunch of extra features on top of the native promise.
It's the same reason people use lodash despite having [].map() for years now.
Additionally, Bluebird has better browser support than the native Promise implementation.
You normally only use Bluebird on the server, though. The extra features and speed are not worth the extra size of the library the user has to download.
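For example (a minimal sketch, assuming Bluebird is installed as a dependency), promisifyAll is the kind of extra feature native promises don't give you:

// Bluebird's promisifyAll adds a promise-returning *Async variant
// for every callback-style method of the fs module.
var Promise = require("bluebird");
var fs = Promise.promisifyAll(require("fs"));

fs.readFileAsync("package.json", "utf8")
    .then(function (contents) {
        console.log(JSON.parse(contents).name);
    })
    .catch(function (err) {
        console.error("read failed:", err);
    });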

Well, Promise is a native JavaScript object. All those libraries are userland implementations. For example, if we look at Bluebird, it has:
A lot of utility functions and helpers that make your life easier.
It has typed .catch which makes sure you don't catch programmer errors by mistake.
It has .some, .any, .map, .filter and so on for working with collections easily.
It has .reflect and synchronous inspection of promises.
It doesn't swallow errors by default when throwing in a then handler.
Built in coroutine (generator) support for flattening asynchronous flow.
It's faster (usually between 4x and 10x) than native promises in different browsers.
It provides more debugging hooks and better stack traces.
It provides warnings against common promise anti-patterns.
It lets you override the scheduler so you can determine how it schedules tasks.
It supports promise cancellation with sound semantics which is proposed for native promises but not yet adopted.
So, tl;dr:
It's faster.
It's more debuggable.
It has a richer API.
Now, whether or not you should use it is up to you - there is always overhead in including libraries - I'm biased as a core contributor.
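To make a couple of those points concrete, here is a hedged sketch (fetchUser is a made-up promise-returning function, and Bluebird is assumed to be installed) of the typed .catch and the Promise.map collection helper mentioned above:

var Promise = require("bluebird");

// Hypothetical promise-returning function, used only for illustration.
function fetchUser(id) {
    return Promise.resolve({ id: id, name: "user-" + id });
}

Promise.map([1, 2, 3], fetchUser, { concurrency: 2 }) // collection helper with a concurrency limit
    .then(function (users) {
        console.log(users.length, "users loaded");
    })
    .catch(TypeError, function (err) {
        // typed .catch: only TypeErrors land here, so programmer errors
        // such as a ReferenceError keep propagating instead of being swallowed
        console.error("bad input:", err.message);
    })
    .catch(function (err) {
        console.error("unexpected failure:", err);
    });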

Related

Do ES6 promises replace the need for async.js methods?

With the introduction of promises in ES6 (http://es6-features.org/#PromiseUsage), is async (https://github.com/caolan/async) still relevant? Do async.waterfall and async.series provide any benefit over promises?
Yes and no. The async library gives you those patterns ready-made rather than you building them yourself, and async still works everywhere. ES6 promises are pretty great. They let you do a lot, and when paired with the proposed ES7 async/await they might remove the need for the package altogether, depending on how it is implemented in the future.
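As a rough illustration of the overlap (a sketch only; stepOne and stepTwo are invented stand-ins for real async work), the same sequence can be written with async.waterfall or as a native promise chain:

var async = require("async");

// Hypothetical async steps, defined here only so the sketch is runnable.
function stepOne(cb) { setTimeout(function () { cb(null, 2); }, 10); }
function stepTwo(n, cb) { setTimeout(function () { cb(null, n * 21); }, 10); }

// Callback style: async.waterfall feeds each result into the next step.
async.waterfall([stepOne, stepTwo], function (err, result) {
    if (err) return console.error(err);
    console.log("waterfall result:", result); // 42
});

// Equivalent chain using native ES6 promises.
function stepOneP() { return new Promise(function (res) { setTimeout(function () { res(2); }, 10); }); }
function stepTwoP(n) { return new Promise(function (res) { setTimeout(function () { res(n * 21); }, 10); }); }

stepOneP()
    .then(stepTwoP)
    .then(function (result) { console.log("promise result:", result); }) // 42
    .catch(function (err) { console.error(err); });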

Koa / Co / Bluebird or Q / Generators / Promises / Thunks interplay? (Node.js) [closed]

I'm investigating building a web app in part with Koa, but I don't quite have a handle on the hows, whens, and whys of choosing between - and applying - the range of supportive "making async easier" technologies/approaches (listed below).
Overall the disparate guidance on the web about this subject still leaves things blurry, especially in respect to evolving best practices, or at least better ones, and under what scenarios. There seems to be little or nothing on the web that puts it all in context.
I'm hoping the responses to this big arse sprawling post can correct that. Also maybe the questions below can inspire someone to write a thorough blog post or the like to address this matter. My sense is I'm not even close to the only one who would benefit from that.
So I'd be pleased if the bright community could help answer and provide clarity to the following questions in respect to the technologies listed below (in bold type):
-- a) How, and under what circumstance (as applicable) are they complements, supplements, substitutes, and/or overlapping solutions to one another?
-- b) What are their trade-offs in respect to speed-performance, error handling ease, and debugging ease?
-- c) When, where, and why may it be better to use "this" versus "that" technology, technologies-combo, and/or approach?
-- d) Which technologies or approaches, if any, may be "dimming stars"?
(Hoping that the opinions that are part of answers can be well explained.)
==============================
Technologies:
* Koa *
My understanding:
Koa is a minimal foundation for building Node apps, geared toward taking advantage of ECMAScript-6 features, one feature in particular being generators.
* Co *
My understanding:
-- Co is a library of utilities for running ECMAScript-6 generators (which are native to Node 0.11 harmony), with the goal of alleviating some/much(?) of the need to write boilerplate code for running and managing generators.
-- Co is intrinsically part of Koa(?).
Specific questions:
-- If and how does one use Co differently in Koa than in a non-Koa context. In other words, does Koa wholly facade Co?
-- Could Co be replaced in Koa with some other like generator library if there is/was a better one? Are there any?
* Promise Libraries such as "Q" and Bluebird *
My understanding:
-- They are in a sense "polyfills" for implementing the Promises/A+ spec, if and until Node natively runs that spec.
-- They have some further non-spec convenience utilities for facilitating the use of promises, such as Bluebird's promisifyAll utility.
Specific questions:
-- My understanding is that the ECMAScript-6 spec does/will largely reflect the Promises/A+ spec, but even so, Node 0.11 harmony does not natively implement promises. (Is this correct?) However, when it does, will technologies such as Q and Bluebird be on their way out?
-- I've read something to the effect that "Q" and Bluebird support generators. What does this mean? Does it mean in part that, for example, they to some degree provide the same utility as Co, and if so to what degree?
* Thunks and Promises *
I think I have a fair handle on what they are, but I'm hoping someone can provide a succinct and clear "elevator pitch" definition of what each is, and of course, as asked above, explain when to use one versus the other -- in a Koa context and outside of it.
Specific questions:
-- Pros and cons of using something like Bluebird's promisify, versus say using thunkify (github.com/visionmedia/node-thunkify)?
==============================
To give some further context to this post and its questions, it might be interesting if the Koa techniques presented in the following webpages could be discussed and contrasted (especially on a pros vs. cons basis):
-- a) www.marcusoft.net/2014/03/koaintro.html (Where are the thunks or promises, or am I not seeing something?)
-- b) strongloop.com/strongblog/node-js-express-introduction-koa-js-zone (Again, where are the thunks or promises?)
-- c) github.com/koajs/koa/blob/master/docs/guide.md (What does the "next" argument equate to, and what sets it and where?)
-- d) blog.peterdecroos.com/blog/2014/01/22/javascript-generators-first-impressions (Not in a Koa context, but presents the use of Co with a promise library (Bluebird), so I'm assuming the technique/pattern presented here lends itself to usage in Koa(?). If so, then how well?)
Thanks all!
I've been working extensively with generators for a month now, so maybe I can take a stab at this. I'll try to keep the opinions to a minimum. Hopefully it helps clarify some of the confusion.
Part of the reason for the lack of best practices and better explanations is that the feature is still so new in JavaScript. There are still very few places where you can use generators, node.js and Firefox being the most prominent, though Firefox deviates from the standard a bit.
I would like to note that there are tools like traceur and regenerator that will let you use them for development and turn them into semi-equivalent ES5, so if you find working with them enjoyable there's no reason not to start using them, unless you're targeting archaic browsers.
Generators
Generators weren't originally thought of as a way to handle asynchronous control flows but they work wonderfully at it. Generators are essentially iterator functions that allow their execution to be paused and resumed through the use of yield.
The yield keyword essentially says return this value for this iteration and I'll pick up where I left off when you call next() on me again.
Generator functions are special functions in that they don't execute the first time they're called, but instead return an iterator object with a few methods on it and the ability to be used in for-of loops and array comprehensions.
send(): This sends a value into the generator, treating it as the last value of yield, and continues the next iteration (in the final ES6 spec this is done by passing the value to next()).
next(): This continues the next iteration of the generator.
throw(): This throws an exception INTO the generator, causing the generator to throw the exception as though it came from the last yield statement.
close(): This forces the generator to return execution and calls any finally code of the generator, which allows final error handling to be triggered if needed (ES6 exposes this as return()).
Their ability to be paused and resumed is what makes them so powerful at managing flow control.
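A bare-bones sketch of that pause/resume behaviour (nothing asynchronous yet, just the iterator protocol):

// Calling a generator function returns an iterator; the body only runs
// when next() is called, and pauses again at each yield.
function* counter() {
    var received = yield 1;   // next(value) resumes here with `value`
    console.log("got", received);
    yield 2;
    return 3;
}

var it = counter();
console.log(it.next());        // { value: 1, done: false }
console.log(it.next("hello")); // logs "got hello", then { value: 2, done: false }
console.log(it.next());        // { value: 3, done: true }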
Co
Co was built around the ability of generators to make handling flow control easier. It doesn't support all of the things that you can do with generators, but you can use most of them through it with less boilerplate and headache. For flow control purposes I haven't found that I needed anything outside of what co provides already. Although, to be fair, I haven't tried sending a value into a generator during flow control, but that does bring up some interesting possibilities....
There are other generator libraries out there some of them that I can think of off the top of my head are suspend, and gen-run. I've tried them all and co offers the most flexibility. Suspend may be a little easier to follow if you're not accustomed to generators yet but I can't say that with authority.
As far as node and best practices go I'd say co is currently winning hands down with the amount of support tools that have been created to go with it. With suspend the most likely runner up.
Co works with both promises and thunks; they are what you yield, so that co knows when to continue execution of the generator instead of you having to call next() manually. Co also supports yielding generators, generator functions, objects, and arrays for further flow control support.
By yielding an array or an object you can have co perform parallel operations on all of the yielded items. By yielding a generator or generator function, co will delegate further calls to the new generator until it is completed and then resume calling next() on the current generator, allowing you to effectively create very interesting flow control mechanisms with minimal boilerplate code.
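A small sketch of that in practice (assuming co 4.x, where co() returns a promise, and a made-up delay helper):

var co = require("co");

// Hypothetical helper: a promise that resolves with `value` after `ms` milliseconds.
function delay(ms, value) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve(value); }, ms);
    });
}

co(function* () {
    var first = yield delay(10, "first");              // yield a promise
    var both = yield [delay(10, "a"), delay(10, "b")]; // yield an array: items run in parallel
    return [first].concat(both);
}).then(function (results) {
    console.log(results); // [ 'first', 'a', 'b' ]
}).catch(function (err) {
    console.error(err);
});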
Promises
While I said I'd keep opinions to a minimum I would like to state that to me promises are probably the hardest concept to grasp. They are a powerful tool for maintaining code but they are hard to grasp the inner workings of and can come with quite a few gotchas if used for advanced flow control.
The easiest way that I can think of to explain promises is that they are an object returned by a function that maintains the state of the function and a list of callbacks to call when a specific state of the object is or has been entered.
The promise libraries themselves won't be going anywhere anytime soon. They add a great deal of nice-to-haves for promises, including done(), which didn't make it into the ES6 spec. Not to mention that the same libraries can be used in the browser and in Node, so we'll have them for a good long while.
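In code, that description roughly looks like this (a sketch using the native Promise constructor; readConfig is invented for illustration):

// A promise wraps an eventual value: the executor settles it exactly once,
// and callbacks registered with .then()/.catch() fire when that state is reached.
function readConfig() {
    return new Promise(function (resolve, reject) {
        setTimeout(function () {
            resolve({ port: 3000 });   // or reject(new Error(...)) on failure
        }, 10);
    });
}

readConfig()
    .then(function (config) { console.log("port:", config.port); })
    .catch(function (err) { console.error("failed:", err); });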
Thunks
Thunks are just functions that wrap an asynchronous function's arguments and return a new function that takes a single callback parameter.
This creates a closure that allows the calling code to invoke the wrapped function later, passing in its callback so that it can be told when the operation is complete.
Thunks are fairly straightforward to understand and use, in my opinion, but they aren't the right tool for everything. For example, spawn is a major pain to create a thunk for; you can do it, but it's not easy.
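A quick sketch of the idea (hand-rolling roughly what thunkify would generate for fs.readFile):

var fs = require("fs");

// A thunk factory: capture the arguments now, take the callback later.
function readFileThunk(path, encoding) {
    return function (callback) {
        fs.readFile(path, encoding, callback);
    };
}

// The caller (or a runner like co) supplies the callback when it is ready.
readFileThunk("package.json", "utf8")(function (err, contents) {
    if (err) return console.error(err);
    console.log(contents.length, "characters read");
});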
Thunks vs. Promises
These aren't mutually exclusive and can easily be used together, but it's usually better for your sanity to pick one and stick with it. Or at the very least pick a convention so you can easily tell which is which. Thunks run faster from my experience but I haven't benchmarked it. Most of this is probably because it's a smaller abstraction and doesn't have error handling mechanisms built in.
You'll usually be building something that requires error handling, though, so the overall performance gains of thunks could easily even out or swing in favor of promises, depending on your code.
When to Use
Generators - When you can safely say that your application will be able to run on the bleeding edge, whether it's firefox only for the browser or node > 0.11.3
I've been using them extensively at the company I'm at now and couldn't be happier with the control flow mechanisms and lazy evaluation that they allow.
Promises vs. Thunks - This is really up to you and how comfortable you are working with each. They don't provide the same benefits nor do they solve the same problem. Promises help deal with the async problem directly, thunks just ensure a function takes the needed callback parameter for other code to pass in.
You can use them both together and as long as you can keep it so that it's obvious which is which you won't have a problem.
Promises/Thunks with Generators - I suggest doing this anytime you are using generators for control flow. It's not necessary, but it's easier, just like using co as an abstraction for control flow with generators is easier. Less code to type, easier maintenance, and fewer chances that you'll hit an edge case that somebody else hasn't run into yet.
Koa
I'm not going to go into a lot of detail on koa. Suffice it to say that it is similar to Express but written to take advantage of generators. This does give it some unique advantages such as easier error handling and cascading middleware. There were ways to accomplish all of these tasks before, but they weren't elegant and sometimes not the most performant.
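A small sketch of the cascading middleware style (Koa 1.x, generator-based):

var koa = require("koa");
var app = koa();

// Downstream middleware runs when `yield next` is reached, then control
// flows back upstream, which makes timing and error wrapping straightforward.
app.use(function* (next) {
    var start = Date.now();
    yield next;
    this.set("X-Response-Time", (Date.now() - start) + "ms");
});

app.use(function* () {
    this.body = "Hello from Koa";
});

app.listen(3000);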
Special Note:
Generators open up a door of possibilities that we really haven't explored yet. Just like they can be used for control flow when that wasn't their initial design, I'm positive they can be used to solve a lot of other problems that we normally struggle with in JavaScript. It will probably be brighter minds than me that find out how else we can use them, but I'd at least start playing around with them and getting a better understanding of what they're capable of. There are still more goodies for generators coming in ES.next.

Bluebird instead of Co in Koa?

Seems like Bluebird overlaps Co in generator/coroutine related functionality. Bluebird is espoused to have exceptional speed-performance, so for discussion's sake (assuming the aforementioned overlap premise is true), if one wanted to substitute Bluebird for Co in Koa (Node.js context), could it easily be done without diminishing Koa's functionality, and if so, how?
(My guess is it can't practically be done since it seems Koa is built over Co and doesn't explicitly expose it, but facades it. Such a substitution it seems would be tantamount to replacing jQuery with something else in Bootstrap)
First of all, bluebird and co are not comparable like that. You mean Bluebird.coroutine vs co (short for coroutine).
Now, the difference between Bluebird.coroutine and co is that co only allows you to yield a certain set of hard-coded types, while Bluebird.coroutine can be configured to support yielding arbitrary types; the documentation, for example, contains examples of how you can add support for yielding thunks and callbacks.
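A hedged sketch of what that configuration can look like (teaching bluebird.coroutine to accept yielded thunks, i.e. plain functions expecting a single (err, result) callback):

var Promise = require("bluebird");

// Register a yield handler: whenever a coroutine yields a plain function,
// treat it as a thunk and convert it into a promise.
Promise.coroutine.addYieldHandler(function (value) {
    if (typeof value !== "function") return; // returning undefined lets other handlers try
    return new Promise(function (resolve, reject) {
        value(function (err, result) {
            if (err) reject(err);
            else resolve(result);
        });
    });
});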
Async generators are so trivial that the only differences there can be between implementations is what types you can yield and how it performs. Not much room to be better or worse.
However bluebird.coroutine is only a fraction of bluebird features.
Generators only solve the problem of making a sequence of actions less verbose. There is a lot of useful functionality for more advanced needs like resource management, concurrency coordination, error handling, cancellation+timeouts and long stack traces which are impossible or extremely painful if you only have async generators powered by thunks/callbacks/minimal promises.
You can make a drop-in replacement for co by configuring all the yield types that co supports and then just using bluebird.coroutine:
var co = require("bluebird").coroutine;
// Configure all yield types you need using co.addYieldHandler
// See documentation for examples
module.exports = co;
However, this doesn't really make much sense, since very little code should actually run directly in your request handler; it's the functions that the request handler calls that do the real work. And those functions are not helped by koa (hmm, so what is the point of koa again? :D), so they can be bluebird coroutines directly.
esailija said this about Bluebird,
a feature is being added that allows not only yielding callbacks, thunks etc but any arbitrary thing that comes to your mind. Bluebird is also the fastest. So after this version koa should be just using bluebird indeed. See https://github.com/petkaantonov/bluebird/issues/131#issuecomment-36975495
That said, I don't believe him. And, I don't believe a bluebird wrapper would be faster than Co -- if such a thing were possible. Co.js works, and there is no way possible to get Bluebird.js to pass the tests currently. If you're using ES6, ignore Bluebird entirely and use Co.

How to avoid deep nested code with mongoose / node?

I am trying to improve the readability of a biggish nodejs app that uses a lot of mongoose. The problem is that with a lot of dependent queries the callbacks get out of hand.
What are the practices of handling this issue?
There are three common solutions for your problem.
First one is async.js lib.
Second one is using Promises. There is more than one implementation of promises in node.js. I know three:
node promise
vow
Q promise
Third one is to use Fibers. There is a fibers promise library that does all the tricky work for you.
There were plenty of similar questions before. For example, check this one.
All these libraries do the same thing - they make node.js asynchronous code pretty and readable. So just choose the one that looks simplest to you.
As for me, I prefer async.js lib.
Update: mongoose.js has its own built-in promise library, mpromise. You can access it as mongoose.Promise. And whenever you call the exec() function on a query in mongoose, it returns a promise. I never actually used mongoose.js promises except in the REPL, but you may give it a try.
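A hedged sketch of what that looks like with dependent queries (User and Post are hypothetical models, and an open connection is assumed):

var mongoose = require("mongoose");

// Minimal hypothetical models, defined here only so the sketch is self-contained.
var User = mongoose.model("User", new mongoose.Schema({ name: String }));
var Post = mongoose.model("Post", new mongoose.Schema({ author: mongoose.Schema.Types.ObjectId, title: String }));

// Each exec() returns a promise, so dependent queries become a flat chain
// instead of callbacks nested inside callbacks.
function loadDashboard(userId) {
    return User.findById(userId).exec().then(function (user) {
        return Post.find({ author: user._id }).limit(10).exec().then(function (posts) {
            return { user: user, posts: posts };
        });
    });
}

loadDashboard("507f191e810c19729de860ea").then(function (data) {
    console.log(data.user.name, data.posts.length);
}, function (err) {
    console.error(err);
});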

Node.js control flow: callbacks or promises? [closed]

I know there are many control flow libraries for node.js. Some of them let one chain async functions with callbacks (like async, asyncblock, etc), the others use promise concept (Q, deferred, futures, etc). Given a long running script making a series of actions one after another that may fail at any time, which control flow would you prefer and why? What are the pros and cons?
Pros for callbacks:
Simple to understand and create.
Somewhat more efficient, because fewer objects are created and garbage collected.
Node opted for (error,result) callbacks throughout. I recommend following their argument order for consistency. (As opposed to say (result1, result2, result3, error).)
Pros for promises:
Provides a fluent interface, which can sometimes help to mitigate nested callback hell, as shown here. Code appears to flow linearly by chaining .then(foo).then(bar) calls.
A good promises library will let you run many asynchronous operations in parallel, and continue only when they are all completed. The Deferred library does this seamlessly through map, Q has allResolved, and ES6 Promises offer Promise.all(). (This is also possible with callbacks, e.g. using async.parallel(), but not built in.)
A good promises library will let you specify one error handling function which will be called if any of the queued functions fail. To do this with callbacks requires a little boilerplate: if (err) return callback(err); at the start of every callback.
It would make sense to use callbacks near the bottom of the stack, for code which will be run many times per second. Higher up the stack, promises may be preferable as they are easier to read and understand, and can handle errors more elegantly.
It is worth noting that promises can be built from callbacks at runtime. So you can implement your core code in the minimalist callback form, and still expose a promises version of the library if you want to. (As in Q.nfbind().)
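To make the error-handling contrast concrete, here is a sketch (file names are placeholders) of the repeated if (err) guard versus a single .catch() on a chain built with Q.nfbind:

var fs = require("fs");
var Q = require("q");

// Callback style: every level needs its own error guard.
fs.readFile("a.txt", "utf8", function (err, a) {
    if (err) return console.error(err);
    fs.readFile("b.txt", "utf8", function (err, b) {
        if (err) return console.error(err);
        console.log(a.length + b.length);
    });
});

// Promise style: Q.nfbind wraps the node-style function, and one handler
// at the end of the chain catches a failure from any step.
var readFile = Q.nfbind(fs.readFile);

readFile("a.txt", "utf8")
    .then(function (a) {
        return readFile("b.txt", "utf8").then(function (b) {
            return a.length + b.length;
        });
    })
    .then(console.log)
    .catch(console.error);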
I would be interested to hear other pros/cons.
Bonus tip: Always handle errors! With both methods, if you do not handle the error then it will simply disappear, leaving you in the dark about why your code did not work as expected.
Callbacks should always handle if (err) ... and Promises should always have a .catch() if they do not return.
Even if you expect errors sometimes, and don't need to handle those, not handling unexpected errors means you won't hear about errors from developer mistakes such as typos, if the code is changed in future.
An alternative to .catch() for Promises is to listen for unhandled rejections. Personally I use this to issue a warning that .catch() was missing!
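For example (a sketch; Node emits this event for promises that are rejected with no handler attached):

// Warn loudly whenever a rejection has no .catch() attached anywhere.
process.on("unhandledRejection", function (reason) {
    console.warn("Unhandled rejection (missing .catch()?):", reason);
});

// Without the listener above, this rejection would disappear silently.
Promise.reject(new Error("boom"));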
I don't think there are many objective pros and cons. Async is very popular (based on npm packages that depend on it).
I like the control flow libraries (specifically async), because they are easier for me to understand. Promises confuse me, while async is easily understandable. I suspect it's just a learning curve thing though, and promises would be more readable if I spent the effort in learning them. But should I expect that of people trying to read my code too?
There is a 3rd type too - Fibers. Fibers does not work on Windows yet, but (IMO) offers the clearest syntax for things that should execute in series.
I've been experimenting with a declarative approach with this library: http://chainsjs.org. There's still more work to do on it, but it gives you the ability to define an "execution map" where you can pretty much completely control the flow of execution from a simple mapping.
