I have a complex solution and I just need to run knex synchronously. Is that possible?
I have a scenario where a knex query is run inside Promise.mapSeries for an array with an unknown number of elements. For each element some knex query is called, including an insert query.
So this insert could affect the result for the next element of the array.
var descriptionSplitByCommas = desc.split(",");
Promise.mapSeries(descriptionSplitByCommas, function (name) {
  // knex.select
  // knex.insert if select doesn't return results
});
This was not my initial code, so maybe even Promise.mapSeries should be removed. But I need each element of the descriptionSplitByCommas array to be processed synchronously.
Otherwise, while processing the next description in the array I often get an SQL error, because duplicate values get inserted into a column with a unique index. This would not happen if the queries were synchronous.
I am using native promises, so I have no experience with mapSeries, and therefore I cannot tell you exactly what is going on in your current code.
However, running several asynchronous commands in series instead of in parallel is quite common. There is one important thing you have to know: once you create a Promise, you have no control over how and when it will be resolved. So if you create 100 Promises, they all start resolving in parallel.
This is the reason there is no Promise.series method for native promises - it is not possible.
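A tiny self-contained sketch of that behaviour (the delay helper is made up here just for illustration): both timers start the moment the promises are created, so the total wait is about one second, not two.
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
// both promises are already running; Promise.all only waits for them
const p1 = delay(1000);
const p2 = delay(1000);
Promise.all([p1, p2]).then(() => console.log('both done after ~1 second'));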
What are your options? If you need to "create the promise in one place, but run it in another", then a factory function is your friend:
const runPromiseLater = () => Promise.resolve(25);
// some code
const myRealPromise = runPromiseLater();
myRealPromise.then(value => console.log(value)); // logs 25
Of course, you can build an array of these factory functions - then the question is: how do you run them in series?
If you can use a Node version with support for async/await, then a for loop is good enough:
async function runInSeries(array) {
  for (let i = 0; i < array.length; i++) {
    await array[i]();
    // or, if the array holds plain values instead of factory functions,
    // take the value and call your async method: await myMethod(array[i])
  }
}
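For the original knex scenario, a hedged sketch of how the factory array could be built - the table and column names ('descriptions', 'name') are made up for illustration:
// each factory selects first and inserts only if nothing was found
const tasks = descriptionSplitByCommas.map(name => () =>
  knex('descriptions').where({ name }).first().then(row =>
    row ? row : knex('descriptions').insert({ name })
  )
);
runInSeries(tasks).then(() => console.log('all descriptions processed'));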
If you can't use that, then the async library is your friend: https://caolan.github.io/async/docs.html#series
If you need to use the value from previous calls, you can use async.waterfall.
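A minimal sketch of the async.series approach, assuming each task reports back through its callback:
const async = require('async');
async.series([
  callback => {
    // first task: pass an error or a result to the callback
    callback(null, 'first result');
  },
  callback => {
    callback(null, 'second result');
  }
], (err, results) => {
  if (err) return console.error(err);
  console.log(results); // ['first result', 'second result']
});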
Related
I have to check if a foreignKey exists, but I can't make a loop work with my asynchronous query function.
function checkAllFK(tables, foreignKeys) {
  let index = -1;
  for (var key in foreignKeys) {
    index++;
    QueryFunction(tables[index], key, foreignKeys[key])
      .then(result => {
        if (result == null) {
          // here, if result is null, that means the foreignKey doesn't exist,
          // so we have to stop the loop and return false;
          return false;
        }
        else if (index == (tables.length - 1)) {
          // here, that means we are at the end of the loop and we didn't break it with
          // the previous if, so all foreignKeys exist and we return true;
          return true;
        }
      });
  }
}
The problem is that at the end of the first iteration you exit the function, and the result of the return depends only on the first iteration: false if the if condition is met on the first iteration, null if not.
Even having looked at many similar topics here, I haven't found a solution to my problem.
Your operation "check all foreignKeys against all tables" can be written in one line.
function checkAllFK(tables, foreignKeys) {
  return Promise.all(tables.map(t => Promise.all(foreignKeys.map(k => QueryFunction(t, k)))));
}
This function returns a promise that resolves when all queries are done, so you call it like
checkAllFK(tables, foreignKeys)
  .then(/* success */)
  .catch(/* error */);
However, depending on how many foreignKeys and tables there are and how complex QueryFunction is, this can put enormous stress on the database server. If there are 10 tables and 1000 foreign keys, this would attempt to run 10,000 queries in parallel against the database server. This is not a smart thing to do.
SQL is made to handle these situations. Instead of running 10,000 queries for one thing each, you can decide to run one query for 10,000 things. Or 10 queries for 1000 things each. Both are obviously better than hammering the database server with 10,000 requests.
For example, this returns all foreign keys that do not exist in table_1 in one step.
SELECT
  k.key_column
FROM
  foreign_keys k
  LEFT JOIN table_1 t ON t.key_column = k.key_column
WHERE
  t.key_column IS NULL
How the actual SQL needs to look depends on what you do in your QueryFunction.
The fact that you have more than one table to check the same foreign keys against is worrying as well; this is usually an indication of poor database design.
There are a few common beginner mistakes here. Let's start with the tricky one: using the var keyword in a for loop in an asynchronous context. As you can see, this will print only 10s, not 0, 1, 2, ...
for (var i = 0; i < 10; i++) {
  setTimeout(() => console.log(i), 100);
}
The fix is easy in this case - just use let, which has a different scope than var and works as you would expect.
for (let i = 0; i < 10; i++) {
  setTimeout(() => console.log(i), 100);
}
The second issue is the asynchronous context - the for loop ends before the async work inside the promise returned by QueryFunction is executed. If you can use a newer version of Node.js, then async/await is your saviour: just mark your function as async and write
const result = await QueryFunction(tables[index], key, foreignKeys[key])
However, be aware - once you are in a Promise/asynchronous context, you basically can't get back to the synchronous context. So all your logic needs to be aware that it runs in the asynchronous part. That basically means all the results will be promises and you will need to then them or await them. It's not a bug or anything; it's just behaviour you need to count on.
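A minimal sketch of what that could look like here, assuming QueryFunction returns a promise that resolves to null when a foreign key is missing:
async function checkAllFK(tables, foreignKeys) {
  let index = -1;
  for (const key in foreignKeys) {
    index++;
    const result = await QueryFunction(tables[index], key, foreignKeys[key]);
    if (result == null) {
      // foreign key not found: stop immediately
      return false;
    }
  }
  // every foreign key was found
  return true;
}
// callers now receive a promise:
// checkAllFK(tables, foreignKeys).then(ok => console.log(ok));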
You can do the same and, for example, print a message to the console or write to a file to get a visible result.
If you want to get the result value, please use 'await'.
I'm trying to map a large array (around 11k items). The actual mapping function is super simple, but the number of items in the array is just too large and it blocks everything.
What's the best approach to avoid this? I tried using Async map, but I'm getting the same problem.
You can change the sync (map) operation into an async operation using a Promise or setTimeout. A recursive function can be used to progressively process the items of the large array.
For example:
const largeArrays = []; // your ~11k source items
const resultArrays = [];

function process(source, target, index) {
  if (index === source.length) {
    // now the result array holds all processed data
    return;
  }
  // dummy map action here for example, please change it to your own one
  target.push(source[index] + 1);
  // defer the next item to a new task so the event loop is not blocked
  setTimeout(() => { process(source, target, index + 1) }, 0);
}

process(largeArrays, resultArrays, 0);
You can wrap the above code in a Promise and resolve it instead of using the return statement above.
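A rough sketch of that promise-wrapped variant (same dummy map action as above):
function processAsync(source) {
  return new Promise(resolve => {
    const target = [];
    function step(index) {
      if (index === source.length) {
        // all items processed: hand the result back through the promise
        return resolve(target);
      }
      // dummy map action, replace with your own
      target.push(source[index] + 1);
      setTimeout(() => step(index + 1), 0);
    }
    step(0);
  });
}
// processAsync(largeArrays).then(result => console.log(result.length));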
You don't need any fancy library, just native JavaScript functions. You can check two of my blog posts illustrating ideas for these kinds of problems:
How to avoid Stack overflow error on recursion
How to make long running loop breakable?
I did not try this, but wouldn't it help to use an async function that handles the mapping part, and then call that function in every iteration with the necessary information (index, array item, etc.)?
I've got a problem with redis and nodejs. I have to loop through a list of phone numbers and check whether each number is present in my redis database. Here is my code:
function getContactList(contacts, callback) {
  var contactList = {};
  for (var i = 0; i < contacts.length; i++) {
    var phoneNumber = contacts[i];
    if (utils.isValidNumber(phoneNumber)) {
      db.client().get(phoneNumber).then(function(reply) {
        console.log("before");
        contactList[phoneNumber] = reply;
      });
    }
  }
  console.log("after");
  callback(contactList);
};
The "after" console log appears before the "before" console log, and the callback always return an empty contactList. This is because requests to redis are asynchronous if I understood well. But the thing is I don't know how to make it works.
How can I do ?
You have two main issues.
Your phoneNumber variable will not be what you want it to be. That can be fixed by changing to a .forEach() or .map() iteration of your array because that will create a local function scope for the current variable.
You have to create a way to know when all the async operations are done. There are lots of duplicate questions/answers that show how to do that. You probably want to use Promise.all().
I'd suggest this solution that leverages the promises you already have:
function getContactList(contacts) {
  var contactList = {};
  return Promise.all(contacts.filter(utils.isValidNumber).map(function(phoneNumber) {
    return db.client().get(phoneNumber).then(function(reply) {
      // build custom object
      contactList[phoneNumber] = reply;
    });
  })).then(function() {
    // make contactList be the resolve value
    return contactList;
  });
}
getContactList(contacts).then(function(contactList) {
  // use the contactList here
}, function(err) {
  // process errors here
});
Here's how this works:
Call contacts.filter(utils.isValidNumber) to filter the array to only valid numbers.
Call .map() to iterate through that filtered array
return db.client().get(phoneNumber) from the .map() callback to create an array of promises.
After getting the data for the phone number, add that data to your custom contactList object (this is essentially a side effect of the .map() loop).
Use Promise.all() on the returned array of promises to know when they are all done.
Make the contactList object we built up be the resolve value of the returned promise.
Then, to call it just use the returned promise with .then() to get the final result. No need to add a callback argument when you already have a promise that you can just return.
The simplest solution may be to use MGET with a list of phone numbers and put the callback in the 'then' section.
You could also put the promises in an array and use Promise.all().
At some point you might want your function to return a promise rather than with callback, just to stay consistent.
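A rough sketch of the MGET idea, assuming db.client() also exposes a promise-returning mget (as the get call above suggests):
function getContactList(contacts) {
  var validNumbers = contacts.filter(utils.isValidNumber);
  // one round trip instead of one GET per number
  return db.client().mget(validNumbers).then(function(replies) {
    var contactList = {};
    validNumbers.forEach(function(phoneNumber, i) {
      contactList[phoneNumber] = replies[i];
    });
    return contactList;
  });
}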
Consider refactoring your NodeJS code to use Promises.
Bluebird is an excellent choice: http://bluebirdjs.com/docs/working-with-callbacks.html
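For example, a sketch of the promisifyAll pattern from those docs, applied to the redis client (the *Async methods are generated by Bluebird):
var Promise = require('bluebird');
var redis = require('redis');
// adds promise-returning *Async variants of every node-style method
Promise.promisifyAll(redis.RedisClient.prototype);

var client = redis.createClient();
client.getAsync('somePhoneNumber').then(function(reply) {
  console.log(reply);
});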
You put async code into a for loop (a sync operation), so each iteration of the for loop does not wait for the db.client(...) call to finish.
Take a look at this Stack Overflow answer; it explains how to make async loops:
Here
I found 2 ways to execute queries using mongoose find(), but was wondering if they are different:
When should we use:
Model.find({},cb)
And when should we use:
Model.find({}).exec(cb)
Both execute the query and then run the callback.
The main difference is that the first one returns a Query object, while the second returns a Promise, which is useful if you need promises.
const query = Model.find({}, cb);
Then you can work with the query variable.
While the promise...
const promise = Model.find({}).exec();
Then you can work with the promise and do things like:
promise.then(cb);
promise.catch((err) => {
  console.error(err);
});
But if you do Model.find({}).exec(cb); the callback is also called without using promises...
I hope it helps
The difference is that the first one executes the query and calls your callback, whereas in the second form, where you omit the callback from find(), the query will not be executed right away. Instead it returns a Query object which can be used to chain methods, specify search terms, cursor options, etc., and which only runs once you call exec().
http://mongoosejs.com/docs/2.7.x/docs/query.html
If you don't need to do any kind of chaining, or anything else with the underlying Cursor then just use the first method.
But the second method can be helpful to do things like:
findCars: function(options, callback) {
  var query = Model.find({});
  if (options.limit) query.limit(options.limit);
  if (options.skip) query.skip(options.skip);
  if (options.populate) query.populate(options.populate);
  return query.exec(callback);
}
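A hypothetical call to the helper above (the option values are made up):
findCars({ limit: 10, skip: 20, populate: 'owner' }, function(err, cars) {
  if (err) return console.error(err);
  console.log(cars.length);
});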
I have a User model function all that returns all users in an array.
User = {
  all: function() {
    var users = [];
    globalLibrary.db.collection('users').find({}).toArray(function(err, items) {
      test.equal(null, err);
      users = items;
    });
    return users;
  }
}
I want to ensure that the function doesn't finish or return before the mongodb query is complete.
Currently, this function just returns [] and queries mongodb asynchronously. I want the function to wait until the query is complete and then return an array filled with users.
Note:
globalLibrary.db is just a cached mongodb connection.
My solution using promises
Since some people closed the question as a duplicate, I'll write my answer here within the question. Hopefully someone else who is not familiar with asynchronous programming finds this useful.
Problem
The point is that you need to use a callback - there's no way to block
on something asynchronous. (...) – Aaron Dufour
The function User.all() I wrote above will return an empty array because nothing stops the process while the mongodb query is happening. There are some primitive ways to deal with this.
You can crank up some hacky stuff using setTimeout(). This way sucks, though, because you have to use some arbitrary delay that might be longer than the time the mongodb query actually needs.
You can also use some event-based approach that #AaronDufour linked in a comment (now deleted), i.e. a pair of an event emitter and a listener to replace the setTimeout() approach. #Someone points out, though, that you shouldn't use this kind of blocking in node.js.
Now finally, the conventional way of dealing with this problem is using callbacks, as pointed out by the answer below. This is fine, but callbacks can quickly get out of control once you start stacking multiple callbacks inside one another.
I am using https://github.com/kriskowal/q for promises. While using promises doesn't solve all the woes of callbacks, it looks most like a simple synchronous programming style, which I think is a huge plus.
First run npm install --save q to start using the Q package.
Here's my new User.all() function.
var Q = require('q');

var User = {
  all: function() {
    var deferred = Q.defer();
    globalLibrary.db.collection('users').find({}).toArray(function(err, items) {
      if (err) {
        deferred.reject(new Error(err));
      } else {
        deferred.resolve(items);
      }
    });
    return deferred.promise;
  }
}
Now if you want to use User.all().
User.all()
  .then(function(docs) {
    // docs is the array of returned users from the mongodb query.
    console.log(docs);
  }, function(error) {
    // do something if there's an error.
  }, function(progress) {
    // do something while the query is running.
  });
The preferred way of doing this in node.js is to embrace its async nature and pass a callback into the function; a short sketch follows the links below. A few decent tutorials (the last one includes mongodb examples):
http://justinklemm.com/node-js-async-tutorial/
http://msdn.microsoft.com/en-us/magazine/dn754378.aspx
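A minimal sketch of that callback approach, reusing the collection call from the question:
User = {
  all: function(callback) {
    globalLibrary.db.collection('users').find({}).toArray(function(err, items) {
      if (err) return callback(err);
      callback(null, items);
    });
  }
}

// usage:
// User.all(function(err, users) {
//   if (err) return console.error(err);
//   console.log(users);
// });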
If you feel you must go against the grain and go synchronous, I'd take a look at this sync library for node.js and mongodb:
https://www.npmjs.com/package/mongo-sync