Query callback with arguments and synchronous queries - node.js

I have two problems implementing a RESTful service using Node.js, the node-postgres lib and PostgreSQL, and both are due to the async nature of JS.
A) I need to pass an extra argument to a callback in a client.query(query, callback) call
Inside the callback of a query I am going through an array of rows just fetched from the DB, and I want to launch a subsequent query for each of them:
var query = client.query('SELECT * FROM event', queryAllEventsHandler);
function queryAllEventsHandler(err, result) {
  allEvents = result.rows;
  /* allEvents is a JSON array with the following format:
     [ {"id_event":1, "name":"name of the event"},
       {"id_event":2, "name":"name of the event"} ]
  */
  for (var i = 0; i < allEvents.length; i++) {
    client.query('SELECT * FROM days where id_event = $1', [allEvents[i].id_event], function(err, result) {
      // I want to have a reference to variable i here
    });
  }
}
In the above example I want to do something like:
client.query('SELECT * FROM days where id_event = $1', [allEvents[i].id_event], function(AN_EXTRA_ARG, err, result));
Where AN_EXTRA_ARG is an extra argument or a closure in the callback function... How can I achieve this? Should I create a closure over the value of i and pass it as the callback's argument? How? :|
B) "Synchronizing" queries
I need to launch various queries and create a custom JSON from all of them. Since every query and its callback are asynchronous (waiting for no one) I was looking for a way to "tame" it, and among other stuff I found a solution that had occurred to me in the first place, but seemed a bit bad/lousy:
Keeping a query count is really the way to go, as @jslatts suggests in Synchronous database queries with Node.js?

With regards to question A, you could create a function that runs both of your queries and only calls back when the last query has executed, returning both results to the callback.
for (var i = 0; i < allEvents.length; i++) {
  query(client, allEvents[i], function(result1, result2) {
    // do something
  });
}

function query(client, event, callback) {
  client.query('SELECT * FROM days where id_event = $1', [event.id_event], function(err1, result1) {
    client.query('SELECT * FROM days where id_event = $1', [event.id_event], function(err2, result2) {
      callback(result1, result2);
    });
  });
}

I don't like answering my own question, but this might be of interest to someone... Regarding part A of my question: you can bind a custom object to this in your callback function.
As you know, inside a plain function the keyword this refers to the global object (Window in the browser, global in Node.js) unless the function is called as a method. Using the bind function you can change what this refers to, pointing it at your own object...
So what I did was create a named function queryCallback:
function queryCallback(err, result) {
  // this == the global object (default)
}
and changed the anonymous callback function to the named queryCallback, bound to a custom object:
client.query('SELECT * ... where id_event = $1', [allEvents[i].id_event], queryCallback.bind({"position": i}));
Now, note the queryCallback.bind({"position": i}) part; err and result are not passed to bind, they will still be supplied by client.query when it invokes the callback.
What bind(my_custom_this, [other args]) does is bind a custom object (in my case {"position": i}) to this inside the function on which bind was called...
Now we have this scenario:
function queryCallback(err, result) {
  // this == {"position": i}
}
Bind explained: http://fitzgeraldnick.com/weblog/26/
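Putting the pieces together, here is a minimal sketch of the loop from the question using this approach (same client and allEvents as above; the error handling is only illustrative):

function queryCallback(err, result) {
  if (err) { return console.error(err); }
  // this.position holds the loop index that was bound below
  console.log('days for event #' + this.position, result.rows);
}

for (var i = 0; i < allEvents.length; i++) {
  client.query('SELECT * FROM days where id_event = $1',
    [allEvents[i].id_event],
    queryCallback.bind({ "position": i }));
}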

A) I personally like lodash's (or underscore's, if you prefer) partial() for this. It takes a function and a number of arguments and returns a function with the provided arguments already applied and the remaining arguments still open. It's very much like the functional concept of currying (strictly speaking, partial application).
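A minimal sketch of what that looks like for the query loop in the question (assuming lodash is available as _; daysHandler is a name made up for this example):

var _ = require('lodash');

function daysHandler(position, err, result) {
  // position was pre-filled by _.partial; err and result come from node-postgres
  if (err) { return console.error(err); }
  console.log('result #' + position, result.rows);
}

for (var i = 0; i < allEvents.length; i++) {
  client.query('SELECT * FROM days where id_event = $1',
    [allEvents[i].id_event],
    _.partial(daysHandler, i)); // i is applied now, err/result arrive later
}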
B) For combining multiple asynchronous results I highly recommend async. The syntax will take a little getting used to, but it makes things like this very easy. Quick sample:
async.parallel({
  one: function(callback) {
    db.fetch(options, callback);
  },
  two: function(callback) {
    db.fetch(options, callback);
  }
},
function(err, results) {
  // this callback gets called when either parallel call gives an error
  // or when both have called their callback
  if (err) {
    // handle error
    return;
  }
  // get the results from results.one and results.two
});
== Added in edit ==
Actually lodash also provides a nicer (imho), albeit slightly more expensive (due to the extra function calls), solution for your problem A):
_(allEvents).each(function(event, index, array) {
  client.query('SELECT * FROM days where id_event = $1', [event.id_event], function(err, result) {
    // Just use 'index' here, which doesn't change during the each
  });
});

For B), your options include async or a Promise library (such as Q, when.js, Bluebird, etc.).
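For illustration, a rough sketch of part B with plain Promises; the libraries above offer the same pattern through their own all()/when() helpers, and queryP is a hypothetical wrapper around client.query:

function queryP(sql, params) {
  return new Promise(function(resolve, reject) {
    client.query(sql, params, function(err, result) {
      if (err) { reject(err); } else { resolve(result); }
    });
  });
}

Promise.all([
  queryP('SELECT * FROM event', []),
  queryP('SELECT * FROM days where id_event = $1', [1])
]).then(function(results) {
  // results[0] and results[1] arrive together, in order
}).catch(function(err) {
  // the first error from either query lands here
});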

Related

Using callbacks with sqlite3

Okay, so below is a snippet of my code where I have cut out many unnecessary and unrelated things, but left the part dealing with the question.
I am using callbacks when calling the functions that run the necessary queries. Since I have many queries like the ones below, I was wondering if that's the right way to ensure the wanted order of execution. I know I could remove the functions and simply put everything inside a serialize() call, but it's really ugly to repeat the same code, so I put the queries in functions. To put it more clearly, here is my question.
Question: if I have many queries inside functions, is using callbacks as I have done the correct way to ensure they get executed in the wanted order, even in cases where you don't want to return anything (e.g. when updating a row/table in the DB)?
get_data(pel, function(results) {
  var cntl = results;
  get_user(pel, function(results_from_user) {
    update_data(0, 0, function(cb_result) {
      // do some stuff
    });
  });
});

function get_data(dt, callback)
{
  db.get(`SELECT * FROM my_table`, function(error, row) {
    var data_to_return = [..];
    return callback(data_to_return);
  });
}

function update_data(vdr, dwe, callback)
{
  db.run(`UPDATE my_table SET val1='${..}', val2 = '${..}'`);
  //..
  return callback("updated");
}

function get_user(ms, callback)
{
  db.get(`SELECT id FROM my_table_2 WHERE id=${..};`, function(error, row) {
    if (row == undefined) db.run(`INSERT INTO my_table_2 (id) VALUES (?)`, [0]);
    //..
    var id_to_return = [..];
    return callback(id_to_return);
  });
}
Perhaps I should add that my code is working as expected; I am just making sure I am not using a weird approach.
I can assure you that you have made a typical solution. In fact, callbacks are used to wait for the response before moving on to the next statement. Good job.

How to properly get result array in async eachSeries for Node?

I'm trying to use async's eachSeries in order to compute the report count for every category. Categories and Reports are stored in separate collections, so I first get the available categories and then perform a count query for each of them.
This is my code:
Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
  async.eachSeries(foundCategories,
    function (item, callback) {
      Report.count({category: item.name}, function (err, count) {
        var name = item.name;
        console.log(count);
        return callback(null, {name: count});
      });
    },
    function (err, results) {
      if (err)
        response.send(err);
      response.send(JSON.stringify(results));
    });
});
The problem is that I'm receiving nothing, even though console.log outputs actual numbers there. What am I doing wrong?
The API of eachSeries does not provide any results to the final callback - only an error in the failure case. In the success case, it's just a pure control flow "eachSeries is done" indicator, but does not provide a mechanism for passing values from the worker function. mapSeries does provide the functionality you need.
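A minimal sketch of the same code switched to async.mapSeries (same Category/Report models and response object assumed):

Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
  async.mapSeries(foundCategories, function (item, callback) {
    Report.count({category: item.name}, function (err, count) {
      callback(err, {name: item.name, count: count});
    });
  }, function (err, results) {
    if (err) return response.send(err);
    // results is an array like [{name: 'sports', count: 3}, ...]
    response.send(JSON.stringify(results));
  });
});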
Similar to Peter's answer, async.waterfall gives you waterfall execution of your functions, passing each return value on to the next async function in the waterfall chain.
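A rough sketch of what that could look like here; the mapping step inside the second function is just one way to collect the counts:

async.waterfall([
  function (cb) {
    Category.find({}, {_id: 0, name: 1}, cb);          // -> (err, categories)
  },
  function (categories, cb) {
    async.mapSeries(categories, function (item, mcb) {
      Report.count({category: item.name}, function (err, count) {
        mcb(err, {name: item.name, count: count});
      });
    }, cb);                                            // -> (err, results)
  }
], function (err, results) {
  if (err) return response.send(err);
  response.send(JSON.stringify(results));
});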

Iterate through Array, update/create Objects asynchronously, when everything is done call callback

I have a problem, but I have no idea how one would go about this.
I'm using LoopBack, but I think I would face the same problem in MongoDB sooner or later. Let me explain what I am doing:
1. I fetch entries from another REST service, then I prepare the entries for my API response (the entries are not ready yet, because they don't have an id from my database).
2. Before I send the response I check whether each entry already exists in the database (determined by source_id): if it doesn't, create it; if it does, use it and update it to the newer version.
3. Send the response with the entries (which now have database ids assigned to them).
This seems okay and easy to implement, but it isn't, as far as my knowledge goes. I will try to explain further in code:
// This will not work, since there are many async calls and fixedResults will be empty at the end
var fixedResults = [];
// results is an array of entries
results.forEach(function(item) {
  Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
    // Did we find it in the database?
    if (res === null) {
      // Create object, another async call here
      fixedResults.push(newObj);
    } else {
      // Update object, another async call here
      fixedResults.push(updatedObj);
    }
  });
});
callback(null, fixedResults);
Note: I left some of the code out, but I think it's pretty self-explanatory if you read through it.
So I want to iterate through all the objects, create or update them in the database, and then, when all are updated/created, use them. How would I do this?
You can use promises. They let you register callbacks that are invoked once some asynchronous operation has completed. Here's an example of chaining promises together: https://coderwall.com/p/ijy61g.
The q library is a good one - https://github.com/kriskowal/q
This question how to use q.js promises to work with multiple asynchronous operations gives a nice code example of how you might build these up.
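As a rough sketch, under the assumptions that Q is installed and that createEntry/updateEntry are promise-returning helpers you would write yourself, the whole loop could look like this:

var Q = require('q');

var promises = results.map(function (item) {
  return Q.ninvoke(Entry, 'findOne', {where: {source_id: item.source_id}})
    .then(function (res) {
      // create it if it is missing, otherwise update it to the newer version
      return res === null ? createEntry(item) : updateEntry(res, item);
    });
});

Q.all(promises).then(function (fixedResults) {
  callback(null, fixedResults);   // every entry now has a database id
}, function (err) {
  callback(err);
});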
This pattern is generically called an 'async map'
var fixedResults = [];
var outstanding = 0;
// results is an array of entries
results.forEach(function(item, i) {
  outstanding++; // count the item before its async work starts
  Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
    // Did we find it in the database?
    if (res === null) {
      // Create object, another async call here
      DoCreateObject(function (err, result) {
        if (err) return callback(err);
        fixedResults[i] = result;
        if (--outstanding === 0) callback(null, fixedResults);
      });
    } else {
      // Update object, another async call here
      DoOtherCall(function (err, result) {
        if (err) return callback(err);
        fixedResults[i] = result;
        if (--outstanding === 0) callback(null, fixedResults);
      });
    }
  });
});
// Note: don't call callback(null, fixedResults) here; the calls above fire it once every item is done.
You could use async.map for this. For each element in the array, the iterator function does whatever you want to do to that element and then runs its callback with the result (instead of fixedResults.push), and the map callback is triggered when all are done. Each iteration and database call would then run in parallel.
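A minimal sketch, reusing the hypothetical DoCreateObject/DoOtherCall helpers from the answer above:

var async = require('async');

async.map(results, function (item, cb) {
  Entry.findOne({where: {source_id: item.source_id}}, function (err, res) {
    if (err) return cb(err);
    if (res === null) {
      DoCreateObject(cb);  // create it, then cb(err, createdObj)
    } else {
      DoOtherCall(cb);     // update it, then cb(err, updatedObj)
    }
  });
}, function (err, fixedResults) {
  // fixedResults is in the same order as results
  callback(err, fixedResults);
});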
Mongo's update has an upsert option.
http://docs.mongodb.org/manual/reference/method/db.collection.update/
It does exactly what you ask for without needing the checks. You can fire all the requests async and just validate that the result comes back as true. No need for additional processing.
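A rough sketch of a single upsert call with the MongoDB driver, bypassing the LoopBack model layer; item is one entry from results and the collection name 'entry' is an assumption:

db.collection('entry').update(
  { source_id: item.source_id },   // match on source_id
  { $set: item },                  // update the document if found, insert it otherwise
  { upsert: true },
  function (err, result) {
    // result reports whether the document was inserted or modified
  }
);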

Node.JS MySQL query nested function returning array

function searchCoords() {
  var result = result;
  connection.query('SELECT * FROM monitoring', function(err, result) {
    if (err) {
      console.log(err);
    }
    return {
      result: result
    };
  });
}
That's my code. I'm using it to find the last coordinates of some devices and display them in Google Maps, but I first need to be able to access the array from the outside, so I can do something like:
myModule.searchCoords().result
or
myModule.searchCoords()().result
However, I still can't access the array (result) from outside the function, let alone from another module.
I've been reading up on closures, scopes, nested functions, anonymous functions, etc., but I still can't find the solution. What am I doing wrong?
The problem is that the query is asynchronous, so it can't return a value in the normal flow. If you pass a function when you call searchCoords, that function can be called after the results come back, which could be after a long delay. This is necessary to prevent the program flow from being blocked while potentially long operations are in progress.
// accept a callback function to execute after getting results...
function searchCoords(callback) {
  connection.query('SELECT * FROM monitoring', function(err, result) {
    if (err) {
      console.log(err);
    }
    // run the callback function, passing the results...
    callback({result: result});
  });
}

// call like this...
// pass a function accepting a results object that will be executed as the callback
// once results have been returned...
searchCoords(function(resultsObject) {
  console.log(resultsObject.result);
});

Sending multiple query results to res.render() using node-mysql and mysql-queue

I [new to node.js and programming in general] have two mysql query results (member info and a list of workshops that members can attend) and need to send them to res.render() to be presented in a .jade template (the member edit page).
To do this I'm using the node-mysql and mysql-queue modules. The problem is that I don't know how to pass a callback that renders the response once queue.execute() finishes, so I made a workaround: I put the first two queries in the queue (a mysql-queue feature), executed the queue, and afterwards added a third "dummy query" whose callback renders the template.
My question is: can I use this workaround, and what would be the proper way to do this using these modules?
exports.memberEdit = function (req, res) {
  var q = connection.createQueue();
  var membersResults,
      htmlDateSigned,
      htmlBirthDate,
      servicesResults;
  q.query("SELECT * FROM members WHERE id= ?;", req.id, function (err, results) {
    console.log("Members: " + results[0]);
    membersResults = results[0];
    htmlDateSigned = dater.convertDate(results[0].dateSigned);
    htmlBirthDate = dater.convertDate(results[0].birthDate);
  });
  q.query("SELECT * FROM services", function (err, results) {
    console.log("Services: " + results);
    servicesResults = results;
  });
  q.execute();
  // dummy query that processes the response after all the queries and callbacks
  // added before the execute() statement
  q.query("SELECT 1", function (err, result) {
    res.render('memberEdit', { title: 'Edit member',
      query: membersResults,
      dateSigned: htmlDateSigned,
      birthDate: htmlBirthDate,
      services: servicesResults });
  });
};
I think an alternative could be to use a transaction to wrap your queries with:
var trans = connection.startTransaction();
trans.query(...);
trans.query(...);
trans.commit(function(err, info) {
  // here, the queries are done
  res.render(...);
});
commit() will call execute() and it provides a callback which will be called when all query callbacks are done.
This is still a bit of a workaround, though; it would make more sense if execute() provided the option of passing a callback (but it doesn't). Alternatively, you could use a module which provides a Promise implementation, but that's still a workaround.
