I've got a Node.js app I'm building (using Sails, but I guess that's irrelevant).
In my action, I have a number of requests to other services, data sources etc. that I need to load up. However, because of the heavy reliance on callbacks, my code is still executing long after the action has returned the HTML.
I must be missing something silly (or not quite getting the whole async thing), but how on earth do I stop my action from finishing until I have all my data ready to render the view?!
Cheers
I'd recommend getting very intimate with the async library.
The docs at the link above are pretty good, but it basically boils down to a bunch of very handy calls like:
async.parallel([
  function(callback){ /* ... */ callback(null, result); },
  function(callback){ /* ... */ callback(null, result); }
], callback);

async.series([
  function(callback){ /* ... */ callback(null, result); },
  function(callback){ /* ... */ callback(null, result); }
], callback);
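For example, a controller that needs two independent pieces of data before rendering could look roughly like this (a minimal sketch; loadUserData and loadStats are hypothetical stand-ins for your own service/datasource calls):
var async = require('async');

exports.myAction = function(req, res) {
  async.parallel({
    user: function(callback) {
      loadUserData(req.params.id, callback); // hypothetical async call
    },
    stats: function(callback) {
      loadStats(callback); // another hypothetical async call
    }
  }, function(err, results) {
    if (err) { return res.send(500); }
    // render only after both tasks have called back
    res.render('myview', { user: results.user, stats: results.stats });
  });
};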
Node is inherently async; you need to learn to love it.
It's hard to tell exactly what the problem is, but here is a guess. Assuming you have only one external call, your code should look like this:
exports.myController = function(req, res) {
  longExternalCallOne(someparams, function(result) {
    // you must render your view inside the callback
    res.render('someview', {data: result});
  });
  // do not render here, as you don't have the result yet
};
If you have more than one external call, your code will look like this:
exports.myController = function(req, res) {
  longExternalCallOne(someparams, function(result1) {
    longExternalCallTwo(someparams, function(result2) {
      // you must render your view inside the innermost callback
      var data = { /* some combination of result1 and result2 */ };
      res.render('someview', {data: data});
    });
    // do not render here, since you don't have result2 yet
  });
  // do not render here either, as you have neither result1 nor result2 yet
};
As you can see, once you have more than one long-running async call, things start to get tricky. The code above is just for illustration purposes. If your second call depends on the result of the first, you need something like it; but if longExternalCallOne and longExternalCallTwo are independent of each other, you should use a library like async to run them in parallel: https://github.com/caolan/async
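For example, if the two calls really are independent, the same controller could be flattened with async.parallel (a sketch that keeps the single-argument callback style used above):
exports.myController = function(req, res) {
  async.parallel({
    one: function(callback) {
      // adapt the single-result callback to the (err, result) style async expects
      longExternalCallOne(someparams, function(result1) { callback(null, result1); });
    },
    two: function(callback) {
      longExternalCallTwo(someparams, function(result2) { callback(null, result2); });
    }
  }, function(err, results) {
    // both calls have finished; combine and render exactly once
    res.render('someview', { data: { one: results.one, two: results.two } });
  });
};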
You cannot stop your code. All you can do is check in each callback whether everything has completed. If yes, go on with your code. If not, wait for the next callback and check again.
You should not stop your code, but rather render your view in your other resource's callback, so that you wait for the resource to be fetched before rendering. That's the common pattern in Node.js.
If you have to wait for several callbacks to be called, you can manually check each time one fires whether the others have been called too (with a simple boolean flag for each, for example — see the sketch below), and call your render function if so. Or you can use async or other libraries which make this easier. Promises (with the bluebird library) could be an option too.
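A minimal sketch of that manual approach (fetchA and fetchB are hypothetical async calls):
var resultA, resultB;
var doneA = false, doneB = false;

function tryRender() {
  // render only once both callbacks have fired
  if (doneA && doneB) {
    res.render('someview', { a: resultA, b: resultB });
  }
}

fetchA(function(a) { resultA = a; doneA = true; tryRender(); });
fetchB(function(b) { resultB = b; doneB = true; tryRender(); });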
I am guessing here, since there is no code example, but you might be running into something like this:
// let's say you have a function; you pass it an argument and a callback
function myFunction(arg, callback) {
  // now you do something asynchronous with the argument
  doSomethingAsyncWithArg(arg, function() {
    // now you've got your arg formatted or whatever, render the result
    res.render('someView', {arg: arg});
    // now do the callback
    callback();
    // but you also have stuff here!
    doSomethingElse();
  });
}
So, after you render, your code keeps running. How to prevent it? return from there.
return callback();
Now your inner function will stop processing after it calls callback.
Related
So I'm making a web application and I'm trying to send variables to an EJS file, but when they are sent from outside the Mongo functions they come out as undefined, because it's a different scope for some reason. It's hard to explain, so let me try to show you.
router.get("/", function(req, res){
  var bookCount;
  var userCount;

  Books.count({}, function(err, stats){
    if(err){
      console.log("Books count failed to load.");
    }else{
      bookCount = stats;
    }
  });

  User.count({}, function(err, count){
    if(err){
      console.log("User count failed to load.");
    }else{
      userCount = count;
      console.log(userCount);
    }
  });

  console.log(userCount);

  //Get all books from DB
  Books.find({}, function(err, allbooks){
    if(err){
      console.log("Problem getting all books");
    }else{
      res.render("index", {allbooks: allbooks, bookCount: bookCount, userCount: userCount});
    }
  });
});
So in User.count and Books.count I'm finding the number of documents in a collection, which works, and the number is stored in the variables declared at the very top.
After assigning the number, I did console.log(userCount), which outputs the correct number, which is 3. If I were to do console.log(userCount) outside of the User.count function, it would log undefined, which comes from the declaration at the very top.
What is really weird is that Books.find() has the correct userCount even though it's a totally different function. The whole goal I'm trying to accomplish is doing res.render("index", {userCount: userCount}); outside of Books.find(). I can do it, but of course for some reason it passes undefined instead of 3. I hope this made a shred of sense.
I seem to have found a solution, but if anyone knows a different way I would love to know. Basically all you need to do is move the User.count call outside of the router.get() function. Not completely sure about the logic of that, but it works...
This is a classic asynchronous-operation problem: Your methods (Books.count, Books.find, User.count) are called immediately, but the callback functions you pass to them are not. userCount is undefined in your log because console.log is called before the assignment in the callback function is made. Your code is similar to:
var userCount;
setTimeout(function() {
  userCount = 3;
}, 1000);
console.log(userCount); // undefined
User.count takes time to execute before calling back with the result, just like setTimeout takes the specified time before calling its callback. The problem is that JS doesn't pause and wait for the timeout to complete before moving on to the console.log below it; it calls setTimeout, calls console.log immediately after, and then the callback function is called one second later.
To render a complete view, you need to be sure you have all of the data before you call res.render. To do so you need to wait for all of the methods to call back before calling res.render. But wait, I just told you that JS doesn't pause and wait, so how can this be accomplished? Promise is the answer. Multiple promises, actually.
It looks like you are using Mongoose models. Mongoose has been written so that if you don't pass a callback function to your methods, they return a promise.
Books.count({}) // returns a promise
JS promises have a method then which takes a callback function that is called when the promise has been resolved with the value of the asynchronous method call.
Books.count({}) // takes some time
  .then(function(bookCount) { // called when Books.count is done
    // use the bookCount here
  })
The problem is, you want to wait for multiple operations to complete, and multiple promises, before continuing. Luckily JS has a utility just for this purpose:
Promise.all([ // wait for all of these operations to finish before calling the callback
  Books.count({}),
  User.count({}),
  Books.find({})
])
.then(function(array) { // all done!
  // the results are in an array, in the same order as above
  bookCount = array[0];
  userCount = array[1];
  allBooks = array[2];
})
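Putting it together, the route handler could look roughly like this (a sketch assuming the Books and User models from the question return promises when no callback is passed):
router.get("/", function(req, res) {
  Promise.all([
    Books.count({}),
    User.count({}),
    Books.find({})
  ])
  .then(function(results) {
    // the results arrive in the same order as the promises above
    res.render("index", {
      bookCount: results[0],
      userCount: results[1],
      allbooks: results[2]
    });
  })
  .catch(function(err) {
    console.log("Failed to load counts or books:", err);
    res.status(500).send("Something went wrong");
  });
});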
I understand what a callback is and what asynchronous means; what I don't get is how to run asynchronous functions in Node.
For example, how is this
var action = (function(data, callback) {
  var result = data + 1;
  callback(result);
});

http.createServer(function (req, res) {
  action(5, function(r){
    res.end(r.toString());
  });
}).listen(80);
different from this
var action = (function(data) {
  var result = data + 1;
  return result;
});

http.createServer(function (req, res) {
  var r = action(5);
  res.end(r.toString());
}).listen(80);
?
I guess in the first example I'm doing it asynchronously, yet I don't know how Node knows when to do it sync or async... Is it a matter of the return? Or the fact that in the sync version we're doing var x = func(data);?
And also: when to use sync or async? Because obviously you don't want to use it when adding +1... Is it OK to use async only when performing I/O tasks, such as reading from a DB?
For example, I'm using the library crypto to encrypt a short string (50 chars at most), is this case a good example where I should already be using async?
I guess in the first example I'm doing it asynchronously...
Your first example isn't async :) Merely passing a callback and calling it when you're done doesn't make a function asynchronous.
Asynchronous means that, basically, you're telling Node: "here, do this for me, and let me know when you're done while I continue doing other stuff".
Your example is not handing anything to Node for future completion. It's doing a calculation and calling the callback immediately after that. That's functionally the same as your second example, where you return the result of the calculation.
However, you can change your first example to something that is asynchronous:
var action = (function(data, callback) {
  setTimeout(function() {
    var result = data + 1;
    callback(result);
  }, 1000);
});
Here, you're telling Node to delay calling the callback for one second by using setTimeout. In the meantime, Node won't get stuck waiting for that second; it will happily accept more HTTP requests, and each one will be delayed one second before the response is sent.
When to use sync or async?
Asynchronous code is "viral": if you rely on functions that are async, your own code that uses those functions will also have to be async (generally by accepting a callback, or using another mechanism to deal with asynchronicity, like promises).
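For instance (a small sketch, not from the question), a helper that relies on the async fs.readFile has to expose a callback itself, and so does anything that calls it:
var fs = require('fs');

function readConfig(path, callback) {
  fs.readFile(path, 'utf8', function(err, data) {
    if (err) return callback(err);
    var config;
    try {
      config = JSON.parse(data);
    } catch (parseErr) {
      return callback(parseErr);
    }
    callback(null, config);
  });
}

// the caller can't just do `var config = readConfig(...)`; it has to go async too
readConfig('./config.json', function(err, config) {
  if (err) return console.error(err);
  console.log(config);
});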
For example, I'm using the library crypto to encrypt a short string (50 chars at most), is this case a good example where I should already be using async?
This depends on which function you're using. AFAIK, most encryption functions in crypto aren't asynchronous, so you can't "make" them asynchronous yourself.
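For illustration, typical use of crypto looks like plain synchronous code; a minimal hashing sketch (the input string is just a placeholder) shows there is no callback involved:
var crypto = require('crypto');

// createHash/update/digest run synchronously; the digest is available immediately,
// so there is nothing to pass a callback to here
var digest = crypto.createHash('sha256')
  .update('a short string, 50 chars at most')
  .digest('hex');

console.log(digest);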
Both examples will run synchronously. Simple async operations are setTimeout and setInterval.
Node actually doesn't care what code you are running. You can block or not (blocking/non-blocking).
In other words, you have an event loop. If your code is async, it will hand control back to the event loop so Node can execute any other work that needs to be done. If not, it won't.
If you want a function to work asynchronously, you can do that using promises. Look at the code below:
function is_asynch(){
  return new Promise((resolve, reject) => {
    resolve( here_your_synch_function() )
  })
}
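A quick usage sketch — whatever needs the value goes inside .then:
is_asynch().then(function(result) {
  // runs after the promise resolves
  console.log(result);
});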
Is it bad practice to emit events with callbacks as arguments in node?
var someonesListened = self.emit('doSomething', param, callback);
if (!someonesListened) {
  callback();
}

// in another module somewhere
this.on('doSomething', function(param, callback) {
  // Something async....
  // Then sometime later
  callback();
})
EDIT: After writing this question, I realised that providing a continuation callback to an event that can be intercepted by multiple listeners defeats the purpose, so I don't think I will be taking this approach.
No, it's not a bad practice if you know what you are doing.
But keep in mind that this callback could be called multiple times, or not called at all depending on how many listeners there are. If you're fine with that, by all means use callbacks.
I'm having to try and read between the lines, but it appears that you have the following requirements:
You have a callback that needs to run exactly once.
If no one else invokes the callback, you need to do it yourself, but...
...someone else may invoke that callback asynchronously.
Assuming that this is the case, you could use an approach like this:
// define a callback that will exit early if it has already been invoked,
// AND will invoke itself after a 10 second delay, if no one else has
var hasRun = false, timeoutId = setTimeout(callback, 10000);

function callback() {
  if (hasRun) return;
  hasRun = true;
  clearTimeout(timeoutId);
  // do something cool
}
self.emit('some-event', callback);
But, of course, I may have completely misread your requirements :)
I have some Node.js code running on a server and would like to know if it is blocking or not. It is kind of similar to this:
function addUserIfNoneExists(name, callback) {
  userAccounts.findOne({name: name}, function(err, obj) {
    if (obj) {
      callback('user exists');
    } else {
      // Add the user 'name' to the DB and run the callback when done.
      // This is non-blocking up to here.
      var user = addUser(name, callback);

      // Do something heavy; it doesn't matter when this completes.
      // Is this part blocking?
      doSomeHeavyWork(user);
    }
  });
}
Once addUser completes the doSomeHeavyWork function is run and eventually places something back into the database. It does not matter how long this function takes, but it should not block other events on the server.
With that, is it possible to test if node.js code ends up blocking or not?
Generally, if it reaches out to another service, like a database or a web service, then it is non-blocking and you'll need to have some sort of callback. However, any function will block until something (even if nothing) is returned...
If the doSomeHeavyWork function is non-blocking, then it's likely that whatever library you're using will allow for some sort of callback. So you could write the function to accept a callback like so:
var doSomeHeavyWork = function(user, callback) {
  // callTheNonBlockingStuff is whatever async call your library exposes; it likely
  // takes a callback with an error (in case something bad happened) and possibly a
  // "whatever" result, which is what you're looking to get back.
  callTheNonBlockingStuff(function(error, whatever) {
    if (error) {
      console.log('There was an error!!!!');
      console.log(error);
      return callback(error, null); // call callback with the error and stop here
    }
    callback(null, whatever); // call callback with the object you're hoping to get back
  });
  return; // This line will most likely run before the callback gets called, which makes this a non-blocking (asynchronous) function. That's why you need the callback.
};
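Back in addUserIfNoneExists, the heavy work can then be kicked off without anything waiting on it (a small sketch reusing the callback form above; nothing downstream depends on the result):
doSomeHeavyWork(user, function(error, whatever) {
  if (error) {
    console.log('Heavy work failed:', error);
    return;
  }
  // put the result back into the database here; nothing else is waiting on it
});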
You should avoid, in any part of your Node.js code, synchronous blocks that don't call system or I/O operations and whose computation takes a long time (in computer terms), e.g. iterating over big arrays. Instead, move this type of code to a separate worker or divide it into smaller synchronous pieces using process.nextTick(). You can find an explanation of process.nextTick() here, but read all the comments too.
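A rough sketch of the "divide it into smaller pieces" idea (processItem and CHUNK_SIZE are hypothetical; the sketch yields with setImmediate rather than process.nextTick, since on newer Node versions nextTick callbacks run before pending I/O):
function processBigArray(items, done) {
  var CHUNK_SIZE = 1000;
  var index = 0;

  function nextChunk() {
    var end = Math.min(index + CHUNK_SIZE, items.length);
    for (; index < end; index++) {
      processItem(items[index]); // hypothetical synchronous per-item work
    }
    if (index < items.length) {
      setImmediate(nextChunk); // yield to the event loop before the next chunk
    } else {
      done();
    }
  }

  nextChunk();
}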
What is the proper way to partially render a view following an async parallel request?
Currently I am doing the following
// an example using an object instead of an array
async.parallel({
  one: function(callback){
    setTimeout(function(){
      callback(null, 1);
      // can I partially merge the results and render here?
    }, 200);
  },
  two: function(callback){
    setTimeout(function(){
      callback(null, 2);
      // can I partially merge the results and render here?
    }, 100);
  }
},
function(err, results) {
  // results is now equal to: {one: 1, two: 2}
  // merge the results and render a view
  res.render('mypage.ejs', { title: 'Results'});
});
It is basically working fine, but if I have function1, function2, ..., functionN, the view will be rendered only when the slowest function has completed.
I would like to find the proper way to render the view as soon as the first function returns, to minimise the user-perceived delay, and to add the results of the other functions as soon as they become available.
What you want is Facebook's BigPipe: https://www.facebook.com/note.php?note_id=389414033919. Fortunately, this is easy with Node.js because streaming is built in. Unfortunately, template systems are bad at this because async templates are a pain in the butt. However, this is much better than doing additional AJAX requests.
The basic idea is that you first send a layout:
res.render('layout.ejs', function (err, html) {
  if (err) return next(err)
  res.setHeader('Content-Type', 'text/html; charset=utf-8')
  res.write(html.replace('</body></html>', ''))
  // Ends the response.
  // `writePartials` should not return anything in the callback!
  writePartials(res.end.bind(res, '</body></html>'))
})
You can't send </body></html> yet because your document isn't finished. Then writePartials would be a bunch of async functions (partials or pagelets) executed in parallel:
function writePartials(callback) {
  async.parallel([partial1, partial2, partial3], callback)
}
Note: since you've already written a response, there's not much you can do with errors except log them.
What each partial will do is send inline JavaScript to the client. For example, the layout can have a .stream element, and the pagelet will replace .stream's innerHTML upon arrival, i.e. when "the callback finishes":
function partialStream(callback) {
  res.render('stream.partial.ejs', function (err, html) {
    // Don't return the error in the callback.
    // You may want to display an error message or something instead.
    if (err) {
      console.error(err.stack)
      callback()
      return
    }
    res.write('<script>document.querySelector(".stream").innerHTML = ' +
      JSON.stringify(html) + ';</script>')
    callback()
  })
}
Personally, I have a .stream.placeholder element and replace it with a new .stream element. The reason is that I basically do .placeholder, .placeholder ~ * {display: none} so things don't jump around the page. However, this requires a DIY front-end framework since suddenly the JS gets more complicated.
There, your response is now streaming. The only requirement is that the client supports JavaScript.
I think you can't do it just on the backend.
To minimise users' delay, you need to send the minimal page to the browser and then request the rest of the information from the browser via AJAX. Another approach to minimising delays is to send all templates to the browser on the first page load, together with the rendered page, and render all subsequent pages in the browser based on data you request from the server. That's the way I do it. The beauty of Node.js is that you can use the same templating engine in both the backend and frontend and also share modules.
If your page is composed in such a way that the slow information appears later in the HTML than the fast information, you can write the response partially without using res.render (which renders a complete page) and use res.write instead. I don't think this approach deserves serious attention, though, as you would get stuck with it sooner than you'd notice...
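For completeness, a rough sketch of that res.write approach (loadSlowPart and the markup are hypothetical placeholders):
app.get('/page', function(req, res) {
  res.setHeader('Content-Type', 'text/html; charset=utf-8');
  // flush the fast part of the page immediately
  res.write('<html><body><div class="fast">Fast content here</div>');

  loadSlowPart(function(err, slowHtml) {
    // append the slow part once it arrives, then close the document
    res.write('<div class="slow">' + (err ? 'Failed to load' : slowHtml) + '</div>');
    res.end('</body></html>');
  });
});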