Redis Node - Get from hash - Not inserting into array - node.js

My goal is to insert the values retrieved from a Redis hash into an array. I am using the redis package for Node.js.
My code is the following:
getFromHash(ids) {
  const resultArray = [];
  ids.forEach((id) => {
    common.redisMaster.hget('mykey', id, (err, res) => {
      resultArray.push(res);
    });
  });
  console.log(resultArray);
},
The array logged at the end of the function is empty, even though res is not empty. What could I do to fill this array, please?

You need to use some control flow, either the async library or Promises (as described in the redis docs).
Put your console.log inside the callback that runs when the results return from the Redis call; then you will see the results printed. Use one of the control flow patterns for your .forEach as well, since the loop itself is synchronous while the hget calls are not.

If you modify your code to something like this, it will work nicely:
var getFromHash = function getFromHash(ids) {
  const resultArray = [];
  ids.forEach((id) => {
    common.redisMaster.hget('mykey', id, (err, res) => {
      resultArray.push(res);
      if (resultArray.length === ids.length) {
        // All done.
        console.log('getFromHash complete: ', resultArray);
      }
    });
  });
};
In your original code you're printing the result array before any of the hget calls have returned.
Another approach would be to create an array of promises and then call Promise.all on it.
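For instance, here is a minimal sketch of that approach, assuming the same callback-style client as the snippets above (error handling kept deliberately simple):
const getFromHash = (ids) => {
  // Wrap each hget call in a Promise so they can be awaited together.
  const promises = ids.map((id) => new Promise((resolve, reject) => {
    common.redisMaster.hget('mykey', id, (err, res) => {
      if (err) return reject(err);
      resolve(res);
    });
  }));
  // Resolves with the results in the same order as ids.
  return Promise.all(promises).then((resultArray) => {
    console.log('getFromHash complete: ', resultArray);
    return resultArray;
  });
};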
You'll see this kind of behavior a lot with Node; remember that it uses asynchronous calls for almost all I/O. When you're coming from a language where most function calls are synchronous, you get tripped up by this kind of problem a lot!

Related

Multiple Callback Functions with the Node Twitter Library

I'm trying to make requests to Twitter with Twitter for Node and store responses in an array for handling later. I'm pushing returned tweets to an array with each push() happening in a callback, which seems to be working fine. My problem is that I'm having trouble accessing the full array with all pushed tweets.
The cause, of course, is that any attempt to work with that array is getting called before the results from the Twitter API have arrived, so I'm getting an empty array.
How can I (and should I) get my function working with the full array into another callback? I'm asking this from the perspective of someone still trying to get a firm grasp on asynchronous programming - especially multiple callbacks or functions that have to run asynchronously.
Again, current result is tweetHold = [], and I'd like tweetHold to contain all matched tweets for all users in the searchArray.
let searchArray = {
  users: ['ByBuddha', 'thetweetofgod']
}
let tweetHold = [];
let T = new Twitter(config);

for (user of searchArray.users) {
  let params = {
    q: 'from:' + user,
    count: 1,
    tweet_mode: 'extended',
    result_type: 'recent',
    lang: 'en'
  }
  T.get('search/tweets', params, returnedTweets);
}

function returnedTweets(err, tweets, response) {
  tweetHold.push(tweets);
}

// obviously, doesn't work as the array is logged to console before T.get() is done
console.log(tweetHold);
T.get accepts a callback function that gets called once the asynchronous operation is done. But since you want to get multiple responses, and not just one, using just the callbacks by themselves would be a bit messy. For example, inside returnedTweets, you could increment a persistent counter variable, and call the next function once counter === searchArray.users.length, but it would be more elegant to use Promises instead.
Map each T.get call to a Promise that resolves with the tweets variable you're interested in, and then call Promise.all on an array of those Promises. Promise.all takes an array of Promises and returns a Promise that resolves once every Promise in the passed array has resolved.
Note that it looks like you're currently ignoring the err that might come back from T.get. That's probably not a good idea; it would be better to be able to check when errors occur and then handle them somehow (otherwise, the tweetHold array may sometimes contain broken data). Luckily, if you use Promises, implementing this is easy: just reject if there's an err, and catch after the Promise.all:
const T = new Twitter(config);
const searchObject = {
  users: ['ByBuddha', 'thetweetofgod']
};
const searchPromises = searchObject.users.map((user) => {
  return new Promise((resolve, reject) => {
    const params = {
      q: 'from:' + user,
      count: 1,
      tweet_mode: 'extended',
      result_type: 'recent',
      lang: 'en'
    };
    T.get('search/tweets', params, (err, tweets) => {
      if (err) reject(err);
      else resolve(tweets);
    });
  });
});

Promise.all(searchPromises)
  .then((tweetHold) => {
    // tweetHold will be an array containing the `tweets` variable for each user searched
    console.log(tweetHold);
  })
  .catch((err) => {
    // there was an error, best to handle it somehow
    // the `.then` above will not be entered
  });

Node.js DNS Lookup scope error? (POST request)

I'm making a DNS Lookup API using Node.js and the Express.js framework, such that when it receives a POST request it should return the addresses for the requested record types.
app.post('/', (req, res) => {
  // Request format
  // const l = {
  //   lookup: 'twitter.com',
  //   recordTypes: ['A', 'TXT']
  // };
  // Using destructuring to fetch properties
  const { lookup, recordTypes } = req.body;
  console.log(lookup, recordTypes);
  // For each record type
  recordTypes.forEach(function(type) {
    // setTimeout to get something async
    setTimeout(function() {
      dns.resolve(lookup.toLowerCase(), type, (err, addresses) => {
        console.log(type);
        if (err) {
          return console.log(`\nType(${type}):\n`, err);
        }
        result = result + JSON.stringify({ type: `${type}`, response: { addresses } });
        console.log(result);
      });
    }, 2000);
  });
  res.send(result);
});
It logs the correct stuff in the console but when it comes to the response, it returns an empty string. I used setTimeout to mimic the asynchronous nature of the request but it just does not work.
Please assume that I have declared stuff like result etc., because it is working. Also, please don't redirect me to the Node.js documentation, because I have already read that stuff and that's not the problem here. The problem is that I need to get every record type in an array and send that back as a response.
Here's what I have tried:
- Pushing the response for each record type into the result array
- Using a for...of loop instead of forEach
Please help!
The way I'm reading your code is that for each item in the array you correctly use callbacks to do each individual bit of processing.
However, remember that forEach itself is not asynchronous. Thus you are setting up a bunch of tasks that will complete sometime, then returning undefined... then your results start to trickle in.
There are a couple of ways to do this correctly. As you are using callbacks here, I will use that style. You want to get a callback when all items in an array have been completely processed. The async module does this very well, providing a lot of high quality methods that act on arrays and such and give you a way to have a callback when they are all done.
Your function will look something like:
// assumes the async module: const async = require('async');
async.each(recordTypes,
  (type, done) => {
    dns.resolve(lookup.toLowerCase(), type, (err, addresses) => {
      result = result + JSON.stringify({ type: `${type}`, response: { addresses } });
      done(err);
    });
  },
  (allOverError) => {
    // called once every record type has been resolved
    res.send(result);
  }
);
Notice there are two function parameters here: the first one is called for every item in the list, and the last is called when every item in the list has been completely processed.
There are other ways too: promises or the async/await keywords (confusingly named, given the async module), but callbacks are fine here.
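As a rough sketch of the promise-based route, Node's built-in dns.promises API can replace the callback version. Note this sketch sends back an array of per-type objects rather than a concatenated string, and it lets a failure for any one record type fail the whole request:
const { promises: dnsPromises } = require('dns');

app.post('/', (req, res) => {
  const { lookup, recordTypes } = req.body;
  // Kick off one resolution per record type and wait for all of them.
  Promise.all(recordTypes.map((type) =>
    dnsPromises.resolve(lookup.toLowerCase(), type)
      .then((addresses) => ({ type, response: { addresses } }))
  ))
    .then((results) => res.send(results))
    .catch((err) => res.status(500).send(err.message));
});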

How to properly get result array in async eachSeries for Node?

I'm trying to use async's eachSeries in order to compute the report count for every category. Categories and reports are stored in separate collections, so I first get the available categories and then perform a count query for each of them.
This is my code:
Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
  async.eachSeries(foundCategories,
    function (item, callback) {
      Report.count({category: item.name}, function (err, count) {
        var name = item.name;
        console.log(count);
        return callback(null, {name: count});
      });
    },
    function (err, results) {
      if (err)
        response.send(err);
      response.send(JSON.stringify(results));
    });
});
The problem is that I'm receiving nothing in the response, even though the console.log outputs actual numbers. What am I doing wrong?
The API of eachSeries does not provide any results to the final callback - only an error in the failure case. In the success case, it's just a pure control flow "eachSeries is done" indicator, but does not provide a mechanism for passing values from the worker function. mapSeries does provide the functionality you need.
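A rough sketch of the same logic with mapSeries, keeping the names from the question:
Category.find({}, {_id: 0, name: 1}, function (err, foundCategories) {
  async.mapSeries(foundCategories, function (item, callback) {
    Report.count({category: item.name}, function (err, count) {
      // pass each per-category result to mapSeries
      callback(err, {name: item.name, count: count});
    });
  }, function (err, results) {
    // results is an ordered array of {name, count} objects
    if (err) return response.send(err);
    response.send(JSON.stringify(results));
  });
});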
Similar to Peter's answer, async.waterfall gives you waterfall execution of your functions, passing each return value on to the next async function in the chain.
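A brief, hedged sketch of that shape, reusing the models above (the waterfall just threads the found categories into the counting step):
async.waterfall([
  function (callback) {
    // step 1: find the categories; (err, docs) flows to the next step
    Category.find({}, {_id: 0, name: 1}, callback);
  },
  function (foundCategories, callback) {
    // step 2: count reports per category
    async.mapSeries(foundCategories, function (item, cb) {
      Report.count({category: item.name}, function (err, count) {
        cb(err, {name: item.name, count: count});
      });
    }, callback);
  }
], function (err, results) {
  if (err) return response.send(err);
  response.send(JSON.stringify(results));
});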

Iterate through Array, update/create Objects asynchronously, when everything is done call callback

I have a problem, but I have no idea how one would go about this.
I'm using LoopBack, but I think I would've faced the same problem with MongoDB sooner or later. Let me explain what I am doing:
1. I fetch entries from another REST service, then I prepare the entries for my API response (the entries are not ready yet, because they don't have ids from my database).
2. Before I send the response I want to check whether each entry exists in the database (determined by source_id). If it doesn't, create it; if it does, use it and update it to the newer version.
3. Send the response with the entries (the entries now have database ids assigned to them).
This seems okay and easy to implement, but it's beyond what I know how to do. I will try to explain further in code:
//This will not work since there are many async calls, and fixedResults will be empty at the end
var fixedResults = [];
//results is an array of entries
results.forEach(function(item) {
  Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
    //Did we find it in database?
    if(res === null) {
      //Create object, another async call here
      fixedResults.push(newObj);
    } else {
      //Update object, another async call here
      fixedResults.push(updatedObj);
    }
  });
});
callback(null, fixedResults);
Note: I left some of the code out, but I think it's pretty self-explanatory if you read through it.
So I want to iterate through all objects, create or update them in database, then when all are updated/created, use them. How would I do this?
You can use promises. They let you register callbacks that will be invoked after some other asynchronous work has completed. Here's an example of chaining together promises https://coderwall.com/p/ijy61g.
The q library is a good one - https://github.com/kriskowal/q
This question how to use q.js promises to work with multiple asynchronous operations gives a nice code example of how you might build these up.
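Very roughly, that could look like the sketch below, assuming Entry.findOne keeps its node-style callback signature; createEntry and updateEntry are hypothetical stand-ins for your create/update calls:
var Q = require('q');

var promises = results.map(function (item) {
  // Q.ninvoke turns a node-style method call into a promise
  return Q.ninvoke(Entry, 'findOne', {where: {source_id: item.source_id}})
    .then(function (res) {
      // createEntry / updateEntry are hypothetical async helpers returning promises
      return res === null ? createEntry(item) : updateEntry(res, item);
    });
});

Q.all(promises)
  .then(function (fixedResults) {
    callback(null, fixedResults);
  })
  .catch(function (err) {
    callback(err);
  });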
This pattern is generically called an 'async map'
var fixedResults = [];
var outstanding = 0;
//results is array of entries
results.forEach(function(item, i) {
  // count the item before the async call starts, so the counter
  // can't reach zero while requests are still being dispatched
  outstanding++;
  Entry.findOne({where: {source_id: item.source_id}}, function(err, res) {
    //Did we find it in database?
    if(res === null) {
      //Create object, another async call here
      DoCreateObject(function (err, result) {
        if (err) return callback(err);
        fixedResults[i] = result;
        if (--outstanding === 0) callback(null, fixedResults);
      });
    } else {
      //Update object, another async call here
      DoOtherCall(function (err, result) {
        if (err) return callback(err);
        fixedResults[i] = result;
        if (--outstanding === 0) callback(null, fixedResults);
      });
    }
  });
});
You could use async.map for this. For each element in the array, run the array iterator function doing what you want to do to each element, then run the callback with the result (instead of fixedResults.push), triggering the map callback when all are done. Each iteration and database call would then be run in parallel.
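A hedged sketch of that, with createObject/updateObject standing in for whatever async create/update calls you actually make:
async.map(results, function (item, done) {
  Entry.findOne({where: {source_id: item.source_id}}, function (err, res) {
    if (err) return done(err);
    if (res === null) {
      // hypothetical async create; must call done(err, createdObj)
      createObject(item, done);
    } else {
      // hypothetical async update; must call done(err, updatedObj)
      updateObject(res, item, done);
    }
  });
}, function (err, fixedResults) {
  // called once every iterator has finished (or immediately on the first error)
  callback(err, fixedResults);
});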
Mongo has an upsert option on update.
http://docs.mongodb.org/manual/reference/method/db.collection.update/
It does exactly what you ask for without needing the existence checks. You can fire all the requests async and just validate that each result comes back OK. No need for additional processing.
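For example, with the plain MongoDB Node driver, an upsert per entry might look roughly like this (the db handle, collection name, and $set payload are assumptions):
var upserts = results.map(function (item) {
  return new Promise(function (resolve, reject) {
    db.collection('entries').updateOne(
      { source_id: item.source_id },  // match on source_id
      { $set: item },                 // update to the newer version
      { upsert: true },               // create it if it doesn't exist yet
      function (err, result) {
        if (err) return reject(err);
        resolve(result);
      }
    );
  });
});

Promise.all(upserts).then(function () {
  // all upserts acknowledged; safe to send the response
});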

Returning an Array using Firebase

Trying to find the best-use example of returning an array of data in Node.js with the Q library (or any similar library; I'm not partial) when using Firebase's .on("child_added").
I've tried using Q.all() but it never seems to wait for the promises to fill before returning. This is my current example:
function getIndex()
{
  var deferred = q.defer();
  deferred.resolve(new FirebaseIndex( Firebase.child('users').child(user.app_user_id).child('posts'), Firebase.child('posts') ) );
  return deferred.promise;
}

function getPost( post )
{
  var deferred = q.defer();
  deferred.resolve(post.val());
  return deferred.promise;
}

function getPosts()
{
  var promises = [];
  getIndex().then( function (posts) {
    posts.on( 'child_added', function (_post) {
      promises.push( getPost(_post) );
    });
  });
  return q.all(promises);
}
The problem occurs in getPosts(). It pushes a promise into your array inside an async function; that won't work, since q.all is called before the promise objects have been added.
Also, child_added is a real-time event notification. You can't use that as a way to grab "all of the data" because there is no such thing as "all"; the data is constantly changing in real-time environments. FirebaseIndex is also using child_added callbacks internally, so that's not going to work with this use case either.
You can grab all of the posts using the 'value' callback (but not a specific subset of records) as follows:
function getPosts() {
  var def = q.defer();
  Firebase.child('users').once('value', function(snap) {
    var records = [];
    snap.forEach(function(ss) {
      records.push( ss.val() );
    });
    def.resolve(records);
  });
  return def.promise;
}
But at this point, it's time to consider things in terms of real-time environments. Most likely, there is no reason "all" data needs to be present before getting to work.
Consider just grabbing each record as they come in and appending them to whatever DOM or Array where they need to be stored, and working from an event driven model instead of a GET/POST centered approach.
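For instance, a minimal event-driven sketch along those lines (the path follows the question; renderPost is a hypothetical placeholder for whatever appends the record to your DOM or output):
var posts = [];

Firebase.child('posts').on('child_added', function (snap) {
  var post = snap.val();
  posts.push(post);
  renderPost(post); // hypothetical: show/append the post as it arrives
});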
With luck, you can bypass this use case entirely.
