Nesting .then() and .catch() in a JavaScript promise - Node.js

I'm not experienced with JavaScript promises and recently started using promises instead of callbacks in my projects.
When I tried to run several promise-returning functions one after another, I ended up in a nested chaos of then(). The code works exactly as expected, but if this is really how several promises are meant to be resolved one after another, what is the advantage of using promises instead of callbacks?
If I'm not doing it the right way, please show me the proper way of resolving nested promises.
Below is my code; I don't like the way it looks:
exports.editExpense = (req, res, next) => {
    Account.findAll().then(accounts => {
        Budget.findAll().then(budgets => {
            Expense.findAll().then(expenses => {
                Expense.findByPk(id).then(expense => {
                    res.render('expenses/index', {
                        urlQuery: urlQuery,
                        expenses: expenses,
                        expense: expense,
                        accounts: accounts,
                        budgets: budgets
                    });
                })
            })
        })
    }).catch(error => console.log(error));
};

You can use the async/await structure for better formatting:
exports.editExpense = async (req, res, next) => {
    try {
        let accounts = await Account.findAll();
        let budgets = await Budget.findAll();
        let expenses = await Expense.findAll();
        let expense = await Expense.findByPk(id);
        if (expense) {
            res.render('expenses/index', {
                urlQuery: urlQuery,
                expenses: expenses,
                expense: expense,
                accounts: accounts,
                budgets: budgets
            });
        } else {
            console.log('else'); // <-- Render/Handle the else condition, otherwise the request will hang.
        }
    } catch (error) {
        console.error(error);
    }
};
You should try to minimize the number of async calls you make in a function, as they will impact your performance.
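Also, since the four queries don't depend on each other's results, you could start them all at once and await them together with Promise.all. A minimal sketch, assuming the same models and the same id and urlQuery variables from the question:

exports.editExpense = async (req, res, next) => {
    try {
        // Start all four independent queries in parallel and wait for all of them.
        const [accounts, budgets, expenses, expense] = await Promise.all([
            Account.findAll(),
            Budget.findAll(),
            Expense.findAll(),
            Expense.findByPk(id)
        ]);
        if (!expense) {
            return res.status(404).send('Expense not found'); // or whatever fits your app
        }
        res.render('expenses/index', { urlQuery, expenses, expense, accounts, budgets });
    } catch (error) {
        console.log(error);
    }
};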

If you prefer to use the then/catch structure, then to take full advantage of it I recommend you not nest the calls. Of course you can, but then you would need a .catch() after each of them. That's why the introduction of async/await made code easier to read and errors easier to handle: it simplifies everything down to a single try/catch.
If you chain multiple .then() calls, you can return a promise from each one, and its resolved value is available inside the next one. The only catch is that you lose these values unless you save them, either as new properties on req or in variables declared outside the chain of .then() calls.
That's why, in this snippet, I declared all the variables at the beginning in order to save the values and use them in the final res.render:
exports.editExpense = (req, res, next) => {
    let accounts;
    let budgets;
    let expenses;
    Account.findAll()
        .then(fetchedAccounts => {
            accounts = fetchedAccounts;
            return Budget.findAll();
        })
        .then(fetchedBudgets => {
            budgets = fetchedBudgets;
            return Expense.findAll();
        })
        .then(fetchedExpenses => {
            expenses = fetchedExpenses;
            return Expense.findByPk(id);
        })
        .then(expense => {
            return res.render('expenses/index', {
                urlQuery: urlQuery,
                expenses: expenses,
                expense: expense,
                accounts: accounts,
                budgets: budgets
            });
        })
        .catch(error => console.log(error));
};

Related

I want to return an array of posts, but I get an empty array

postsArr never gets the data:
router.get('/user-post/:id', checkJwt, (req, res, next) => {
    let postsArr = [];
    db.userSchema.findOne({ _id: req.params.id })
        .populate('posts')
        .exec((err, da) => {
            for (let i = 0; i < da.posts.length; i++) {
                db.postSchema.find({ _id: da.posts[i]._id })
                    .populate('comments')
                    .exec((err, post) => {
                        postsArr.push(post);
                    });
            }
            console.log(postsArr);
        });
});
This is a whole lot easier if you use the promise interface on your database:
router.get('/user-post/:id', checkJwt, async (req, res, next) => {
    try {
        let da = await db.userSchema.findOne({ _id: req.params.id }).populate('posts').exec();
        let postsArray = await Promise.all(da.posts.map(post => {
            return db.postSchema.find({ _id: post._id }).populate('comments').exec();
        }));
        res.json(postsArray);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
The challenge with asynchronous operations in a loop is that they don't run sequentially - they all run in parallel. The for loop just starts all your asynchronous operations, and then you never know when they are all done unless you track them all somehow. That can be done without promises by using counters to keep track of when every single asynchronous result has come back, but it's a whole lot easier to just let Promise.all() do that for you. It will also put all the results in the right order for you.
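For comparison, the manual counter approach mentioned above would look roughly like this inside the outer .exec() callback, where da is available (a sketch only; Promise.all() remains the better choice):

        let remaining = da.posts.length;
        let postsArr = new Array(da.posts.length);
        // (an empty da.posts array would also need its own check)
        da.posts.forEach((p, i) => {
            db.postSchema.find({ _id: p._id }).populate('comments').exec((err, post) => {
                if (err) { /* you would also have to track errors yourself */ }
                postsArr[i] = post;        // store by index to keep the original order
                if (--remaining === 0) {   // every callback has fired
                    res.json(postsArr);
                }
            });
        });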
If you wanted to sequence the database operations and run them serially one at a time, you could do this:
router.get('/user-post/:id', checkJwt, async (req, res, next) => {
    try {
        let da = await db.userSchema.findOne({ _id: req.params.id }).populate('posts').exec();
        let postsArray = [];
        for (let post of da.posts) {
            let result = await db.postSchema.find({ _id: post._id }).populate('comments').exec();
            postsArray.push(result);
        }
        res.json(postsArray);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
This second version runs only one database operation at a time, sequentially. It will put less peak load on the database, but likely be slower to finish.
You will notice that the use of promises and await makes the error handling much simpler too as all errors will propagate to the same try/catch where you can log the error and send an error response. Your original code did not have error handling on your DB calls.
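If you want the request to succeed even when some individual post lookups fail, Promise.allSettled() (built into Node.js 12.9 and later) reports each result separately instead of rejecting on the first error. A sketch, using the same da as in the code above:

        let settled = await Promise.allSettled(da.posts.map(post => {
            return db.postSchema.find({ _id: post._id }).populate('comments').exec();
        }));
        // Each entry is { status: 'fulfilled', value } or { status: 'rejected', reason };
        // keep only the lookups that succeeded.
        let postsArray = settled
            .filter(r => r.status === 'fulfilled')
            .map(r => r.value);
        res.json(postsArray);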

Best way to structure async Express GET request

I have been looking to update my Express skills by incorporating async/await handling and had a quick question.
From the examples I have seen online, most request handlers are wrapped inside a try/catch block and assign the result of each await to a variable before handling it.
app.post('/signup', async (req, res) => {
    const { email, firstName } = req.body;
    const user = new User({ email, firstName });
    const ret = await user.save();
    res.json(ret);
});
My code looks like this:
app.route("/articles")
    // GET: articles
    .get(async (req, res) => {
        await Article.find((err, results) => {
            if (!err) {
                res.json(results);
            } else {
                res.send(err);
            }
        });
    });
Should I assign the result of my Mongoose find to a variable, as in the first code block, and handle it in a try/catch, or does my code essentially do the same thing and follow best practice as is?
Thanks in advance!
Cheers,
James
Be aware that there are two very different uses of the word async in JavaScript. The first is the general concept of asynchronous functions, which can be implemented in any language and is widely used in JavaScript with various design patterns (callbacks, promises, async/await). The second is the async keyword, which enables the use of await and only works with promises (it does not work with callbacks).
You seem to be confusing the two. Because of this I now advise people not to use the word "async" when referring to asynchronous functions and to only use it to refer to the async keyword.
Your Article.find() call does not return a promise because you passed a callback to it. As such it cannot be used with await. Because you cannot usefully use await, it makes no sense to mark the function as async. I consider your code buggy even though it works - the mechanism still functions (await conveniently passes non-Promise values through unchanged and does not generate an error), but it doesn't communicate your intent well and will confuse future maintainers of your code.
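A quick illustration of that pass-through behaviour:

async function demo() {
    const x = await 42;   // 42 is not a promise, so await simply yields it back
    console.log(x);       // prints 42, no error
}
demo();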
IMHO, the correct version of your code should be:
app.route("/articles")
    // GET: articles
    .get((req, res) => {
        Article.find((err, results) => {
            if (!err) {
                res.json(results);
            } else {
                res.send(err);
            }
        });
    });
The correct version of your code with async/await is:
app.route("/articles")
    // GET: articles
    .get(async (req, res) => {
        try {
            res.json(await Article.find());
        } catch (err) {
            res.send(err);
        }
    });
The correct version of your code with promises but without async/await should be:
app.route("/articles")
    // GET: articles
    .get((req, res) => {
        Article.find()
            .then(result => res.json(result))
            .catch(err => res.send(err));
    });
The above is of course just my opinion but I suggest you strongly consider it a guideline. Any of the three forms above would be perfectly acceptable to most javascript programmers and most people consider which to use a matter of taste. Personally I prefer plain promises without await but async/await is useful when you have some tricky flow control logic.
Note that Mongoose conveniently supports both promises and callbacks, so in this specific case you can just remove the callback to use async/await. However, not all libraries do this. If you need to convert a callback-based function to a promise, you need to wrap it in the Promise constructor:
function convertedToPromise() {
    return new Promise((resolve, reject) => {
        callbackBasedFunction((err, result) => {
            if (err) {
                reject(err);
            } else {
                resolve(result);
            }
        });
    });
}
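For functions that follow Node's standard (err, result) callback convention, the built-in util.promisify does this wrapping for you; here callbackBasedFunction stands in for whatever callback API you are converting:

const { promisify } = require('util');

const callbackBasedFunctionAsync = promisify(callbackBasedFunction);

callbackBasedFunctionAsync()
    .then(result => console.log(result))
    .catch(err => console.error(err));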

findById(req.params.id) returns null as response?

My issue is different from the others: when I pass a valid /:id the JSON is returned fine, but when I give a wrong ObjectId it returns null with status code 200 instead of an error saying the id is wrong. As I understood it, the .catch block should be called instead of the .then block because the id is not available in the database.
const get_id_docs = async (req, res) => {
    await models
        .findById(req.params.id)
        .then(result => {
            res.send(result)
        })
        .catch(err => {
            res.sendStatus(404).send("Link Not Found")
        })
};
There are two cases: an invalid id, and a valid id that doesn't exist in the db.
If you want to differentiate an invalid id, you can validate it before querying and return 404.
Also, you mixed async/await and .then()/.catch(); only one of them should be used in this case.
const mongoose = require("mongoose");

const get_id_docs = async (req, res) => {
    const isValidId = mongoose.Types.ObjectId.isValid(req.params.id);
    if (!isValidId) {
        return res.status(404).send("Link Not Found - invalid id");
    }
    try {
        const result = await models.findById(req.params.id);
        if (result) {
            return res.send(result);
        }
        return res.status(404).send("Link Not Found - does not exist");
    } catch (err) {
        res.status(500).send(err.message);
    }
};
And if you prefer then/catch:
const mongoose = require("mongoose");

const get_id_docs = (req, res) => {
    const isValidId = mongoose.Types.ObjectId.isValid(req.params.id);
    if (!isValidId) {
        return res.status(404).send("Link Not Found - invalid id");
    }
    models.findById(req.params.id)
        .then(result => {
            if (result) {
                return res.send(result);
            }
            return res.status(404).send("Link Not Found - does not exist");
        })
        .catch(err => {
            res.status(500).send(err.message);
        });
};
You are composing various general-purpose libraries that cover a variety of use cases. In particular, database abstraction frameworks typically try to present a collection-like facade over the underlying data stores. In JavaScript, methods like Array.prototype.find return undefined rather than throwing errors, and I think the authors of Mongoose are trying to write analogously behaving APIs.
In addition to providing intuitive default behavior, not throwing errors enables a wider range of use cases, such as checking for existence, to be handled without boilerplate.
Given that, you want something like the following:
const get_id_docs = async (req, res) => {
    const result = await models.findById(req.params.id);
    if (result) {
        return res.send(result);
    }
    res.status(404).send("Link Not Found");
};
Note that the above has other advantages, including that it does not arbitrarily report other kinds of errors as 404s.
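As a small illustration of the existence-check use case mentioned earlier: because findById resolves to null rather than rejecting when nothing matches, the check needs no try/catch (a sketch, using the same models):

const exists = async (id) => {
    const doc = await models.findById(id);
    return doc !== null;   // null just means "not found", not an error
};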

NodeJS: Handling transactions with NoSQL databases?

Consider a promise-chained chunk of code for example:
return Promise.resolve()
    .then(function () {
        return createSomeData(...);
    })
    .then(function () {
        return updateSomeData(...);
    })
    .then(function () {
        return deleteSomeData(...);
    })
    .catch(function (error) {
        return ohFishPerformRollbacks();
    })
    .then(function () {
        return Promise.reject('something failed somewhere');
    })
In the above code, let's say something went wrong in the function updateSomeData(...). Then one would have to revert the create operation that was executed before this.
In another case, if something went wrong in the function deleteSomeData(...), then one would want to revert the operations executed in createSomeData(...) and updateSomeData(...).
This would continue as long as all the blocks have some revert operations defined for themselves in case anything goes wrong.
If only there were a way, either in Node.js or in the database or somewhere in the middle, to revert all the operations happening under the same block of code.
One way I can think of making this happen is by flagging all the rows in the database with a transactionId (ObjectID) and a wasTransactionSuccessful (boolean), so that CRUD operations could be grouped together by their transactionId, and in case something goes wrong, those rows could simply be deleted from the database in the final catch block.
I read about rolling back transactions in https://docs.mongodb.com/manual/tutorial/perform-two-phase-commits/. But I want to see if it can be done in a simpler fashion, in a generic manner that NoSQL databases can adapt.
I am not sure if this would satisfy your use case, but I hope it would.
let indexArray = [1, 2, 3];
let promiseArray = [];

let sampleFunction = (index) => {
    return new Promise((resolve, reject) => {
        setTimeout(resolve, 100, index);
    });
};

indexArray.map((element) => {
    promiseArray.push(sampleFunction(element));
});

Promise.all(promiseArray).then((data) => {
    // do whatever you want with the results
}).catch((err) => {
    // perform your entire rollback here
});
async.waterfall([
    firstFunc,
    secondFunc
], function (err, result) {
    if (err) {
        // delete the entire thing
    }
});
Using the async library would give you a much more elegant solution than going with chaining.
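To make the rollback idea from the question more concrete, another option is to record an undo function for each step as it succeeds and run them in reverse order from the catch block. A sketch using the question's hypothetical createSomeData/updateSomeData/deleteSomeData operations (the undo functions are hypothetical as well):

async function runWithRollback() {
    const undoStack = [];
    try {
        const created = await createSomeData();
        undoStack.push(() => undoCreate(created));    // hypothetical compensating action

        const updated = await updateSomeData();
        undoStack.push(() => undoUpdate(updated));    // hypothetical compensating action

        await deleteSomeData();
    } catch (error) {
        // Run the compensating actions in reverse order of the successful steps.
        for (const undo of undoStack.reverse()) {
            await undo();
        }
        throw error;   // rethrow so the caller still sees the failure
    }
}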

How to handle chained promises in a loop in nodejs with bluebird

The gist of the problem is:
for (let i = 0; i < list.length; i++) {
    AsyncCall_1({ 'someProperty': list[i] })
        .then((resp_1) => {
            resp_1.doSomething();
            resp_1.AsyncCall_2()
                .then((resp_2) => {
                    resp_2.doSomething();
                })
        })
}

// after the last resp.AsyncCall_2:
.then(() => {
    // do something
})
I need to sequentially chain all the promises so that the loop waits for the "resp.AsyncCall_2" function to resolve before its next iteration. After the last "resp.AsyncCall_2" call, do something (since by then all the promises will have resolved).
Actual Problem:
for (var i = 0; i < todo.assignTo.length; i++) {
    Users.findOne({ 'username': todo.assignTo[i] })
        .then((user) => {
            user.assigned.push(todo.title);
            user.notificationCount.assignedTodosCount++;
            user.save()
                .then((user) => {
                    console.log("todo is assigned to the user: " + user.username)
                })
        })
}

// do something when the last call has resolved (I know this is the wrong way of doing this)
Users.find({})
    .then((users) => {
        var promises = [];
        for (var i = 0; i < users.length; i++) {
            users[i].notificationCount.totalTodosCount++;
            promises.push(users[i].save());
        }
        Promise.all(promises)
            .then(() => {
                res.statusCode = 200;
                res.setHeader('Content-Type', 'application/json');
                console.log("todo is successfully posted");
                res.json({ success: true, todo });
            }, (err) => next(err))
            .catch((err) => next(err));
    })
Thank you in advance.
In modern versions of node.js, you can just use async/await and don't need to use Bluebird iteration functions:
async function someMiddlewareFunc(req, res, next) {
    try {
        for (let item of list) {
            let resp_1 = await AsyncCall_1({ 'someProperty': item });
            resp_1.doSomething();
            let resp_2 = await resp_1.AsyncCall_2();
            resp_2.doSomething();
        }
        // then do something here after the last iteration
        // of the loop and its async operations are done
        res.json(...);
    } catch (err) {
        next(err);
    }
}
This will serialize the operations (which is what you asked for) so the 2nd iteration of the loop doesn't start until the async operations in the first iteration are done.
But it doesn't appear in your real code that you actually need to serialize the individual operations, and serializing things that don't have to be serialized usually makes the end-to-end completion time longer. So, you could run all the items in your loop in parallel, collect all the results at the end, and then send your response. Bluebird's Promise.map() is quite useful for that because it combines a .map() and a Promise.all() into one function call:
function someMiddlewareFunc(req, res, next) {
    Promise.map(list, (item) => {
        return AsyncCall_1({ 'someProperty': item }).then(resp_1 => {
            resp_1.doSomething();
            return resp_1.AsyncCall_2();
        }).then(resp_2 => {
            return resp_2.doSomething();
        });
    }).then(results => {
        // all done here
        res.json(...);
    }).catch(err => {
        next(err);
    });
}
FYI, when using res.json(...), you don't need to set these res.statusCode = 200; or res.setHeader('Content-Type', 'application/json'); as they will be done for you automatically.
Further notes about Bluebird's Promise.map(). It accepts a {concurrency: n} option that tells Bluebird how many operations are allowed to be "in flight" at the same time. By default, it runs them all in parallel at the same time, but you can pass any number you want as the concurrency option. If you pass 1, it will serialize things. Using this option can be particularly useful when parallel operation is permitted, but the array is very large and iterating all of them in parallel runs into either memory usage problems or overwhelms the target server. In that case, you can set the concurrency value to some intermediate value that still gives you some measure of parallel execution, but doesn't overwhelm the target (some number between 5 and 20 is often appropriate - it depends upon the target service). Sometimes, commercial services (like Google) also have limits about how many requests they will handle at the same time from the same IP address (to protect them from one account using too much of the service at once) and the concurrency value can be useful for that reason too.
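For example, limiting the snippet above to five requests in flight at a time only requires passing the option as the third argument (the value 5 is just an illustration):

    Promise.map(list, (item) => {
        return AsyncCall_1({ 'someProperty': item })
            .then(resp_1 => resp_1.AsyncCall_2());
    }, { concurrency: 5 });   // at most 5 items are processed at the same time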
Have you tried Promise.each?
const users = todo.assignTo.map(function (username) {
    return Users.findOne({ 'username': username });
});

Promise.each(users, function (user) {
    user.assigned.push(todo.title);
    user.notificationCount.assignedTodosCount++;
    return user.save()
        .then((user) => {
            console.log("todo is assigned to the user: " + user.username);
        });
});
