Export value from callback in node.js

I am building a small node.js website with a user interface that features a dropdown with a list of countries.
Previously the list of countries was hard coded in a json file that I would read:
exports.countries = require('./json/countries.json');
Then I realized I shouldn't hard-code it like that when I can do a distinct query to get the list from the MongoDB database.
db.collection.distinct('c', {}, function(err, data) {
  // something
});
But then there's the question of how to extract the value of the data variable in that callback function. I discovered that this works:
db.collection.distinct('c', {}, function(err, data) {
  if (err) {
    throw Error('mongodb problem - unable to load distinct values');
  } else {
    exports.countries = data;
  }
});
I am new to node.js and this seems fishy to me. Is this OK code? Is it better to do this with generators or promises? If I wanted to use generators or promises to do this, how would I do that?
The end result is used in a template; ref.countries is the actual list of countries, populated by my fishy code. If I had a Promise instead of the list of countries, how would I change this code?
<% ref.countries.forEach(function(c) { -%>
  <option value="<%= c %>">
    <%= ref.isoCodes[c] -%>
  </option>
<% }); -%>
I am using node v6.10.3.

The export that you say "works" is impossible to use reliably: code that loads your module has no idea when exports.countries has actually been set, because it is assigned in an asynchronous callback that finishes at some indeterminate time in the future. In addition, you have no means of handling any error from that function.
The modern way of doing this would be to export a function that, when called, returns a promise that resolves to the data (or rejects with an error). The code loading your module, then calls that exported function, gets the promise, uses .then() on the promise and uses the data in the .then() handler. That could look something like this:
function getCountries() {
  return new Promise(function(resolve, reject) {
    db.collection.distinct('c', {}, function(err, data) {
      if (err) {
        reject(err);
      } else {
        resolve(data);
      }
    });
  });
}
module.exports.getCountries = getCountries;
The caller would then do something like this:
const myModule = require('./myModule');
myModule.getCountries().then(function(countries) {
  console.log(countries);
  // use country data here
}).catch(function(err) {
  // deal with error here
});
Most databases for node.js these days have some level of promise support built in so you often don't have to create your own promise wrapper around your DB functions like was shown above, but rather can use a promise directly returned from the DB engine. How that works is specific to the particular database/version you are using.
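For example, with recent versions of the official mongodb driver, omitting the callback makes distinct() return a promise directly. A minimal sketch (verify against your driver version):
// Assumes a mongodb driver (2.x or later) that returns a promise
// when the callback argument is omitted.
function getCountries() {
  return db.collection.distinct('c', {});
}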
If you are using the list of countries in a template rendering operation, then you will have to fetch the list of countries (and any other data needed for the template rendering) and only call res.render() when all the data has been successfully retrieved. This probably also leads to what you should do when there's an error retrieving the necessary data. In that case, you would typically respond with a 5xx error code for the page request and may want to render some sort of error page that informs the end-user about the error.
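A minimal Express route sketch of that pattern; the route path, template name, and isoCodes lookup are illustrative, not from the question:
// Hypothetical route; 'signup' and isoCodes are placeholders that
// mirror the template shown in the question.
app.get('/signup', function(req, res) {
  getCountries().then(function(countries) {
    res.render('signup', { ref: { countries: countries, isoCodes: isoCodes } });
  }).catch(function(err) {
    res.status(500).send('Unable to load country list');
  });
});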

I am using Node 6.10, so I don't have async and await, but if I did they would help me here:
https://developers.google.com/web/fundamentals/getting-started/primers/async-functions
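For reference, on Node 7.6+ the same idea could be written with native async/await (a sketch only, using the getDistinctValues helper defined below; it will not run on 6.10):
// Requires Node 7.6+ for native async/await.
async function loadReferenceData() {
  exports.countries = await getDistinctValues('c');
  exports.categories = await getDistinctValues('w');
}
loadReferenceData();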
Instead I can use the asyncawait library:
https://github.com/yortus/asyncawait
Code looks like this:
var async = require('asyncawait/async');
var await = require('asyncawait/await');
const db = require('_/db');

function getDistinctValues(key) {
  return new Promise((resolve, reject) => {
    db.collection.distinct(key, {}, function(err, data) {
      if (err) {
        reject(new Error('mongodb problem - unable to load distinct values'));
      } else {
        resolve(data);
      }
    });
  });
}

async(function () {
  exports.countries = await(getDistinctValues('c'));
  exports.categories = await(getDistinctValues('w'));
})();
Now I can be sure ref.countries and ref.categories are available after this is loaded.

Related

Node.js mysql query: attempting to add .then

Currently I'm working on a project in Node.js. Specifically, I'm using some boilerplate for Adobe CEP, which allows you to run some JS in a panel in their programs. In the code there is the following snippet, which works fine:
mysqlConn.query(query, function(err, result) {
  // do something with error and result
});
When this executes, it gives me an error or a result depending on whether there is data or there was a problem, etc. What I need to do is run another function after this executes and gives me the result. My knowledge of promises is limited (even though I've read extensively on the subject and done tons of tutorials). In my limited knowledge I assumed mysqlConn.query returns a promise, so I figured I could just do this:
mysqlConn.query(query, function(err, result) {
  // do something with error and result
}).then(console.log('anything here?'));
This logs 'anything here?' to the console, but it also gives me this error:
Uncaught TypeError: mysqlConn.query(...).then is not a function
Any idea what I'm doing wrong or how I can achieve the desired results?
This indicates that the mysqlConn.query method does not return a Promise.
Instead, you will need to "promisify" the method so it can be chained with .then() and friends:
function myFunc(query) {
  return new Promise((resolve, reject) => {
    mysqlConn.query(query, function(err, result) {
      if (err) return reject(err);
      resolve(result);
    });
  });
}
Now we have myFunc, a Promise-based interface wrapping the callback-based query function. We can use it like so:
return myFunc(query)
  .then((result) => { ... })  // result will be the result of the query
  .catch((err) => { ... });   // err from the query as well
This can also be achieved in a slightly more involved way through other tools, but I highly recommend you understand this example first.
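One such tool is Node's built-in util.promisify (Node 8+), which does the wrapping for you. A minimal sketch, assuming mysqlConn.query follows the standard error-first callback convention:
const util = require('util');

// Bind 'this' so the promisified method still runs on mysqlConn.
const queryAsync = util.promisify(mysqlConn.query.bind(mysqlConn));

queryAsync(query)
  .then((result) => { /* use the query result */ })
  .catch((err) => { /* handle the query error */ });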

Node.js DNS Lookup scope error? (POST request)

I'm making a DNS Lookup API using Node.js and the Express.js framework, such that when it receives a POST request, it should return the addresses for different record types.
app.post('/', (req, res) => {
  // Request format
  // const l = {
  //   lookup: 'twitter.com',
  //   recordTypes: ['A', 'TXT']
  // };

  // Using destructuring to fetch properties
  const { lookup, recordTypes } = req.body;
  console.log(lookup, recordTypes);

  // For each record type
  recordTypes.forEach(function(type) {
    // setTimeout to get something async
    setTimeout(function() {
      dns.resolve(lookup.toLowerCase(), type, (err, addresses) => {
        console.log(type);
        if (err) {
          return console.log(`\nType(${type}):\n`, err);
        }
        result = result + JSON.stringify({ type: `${type}`, response: { addresses } });
        console.log(result);
      });
    }, 2000);
  });
  res.send(result);
});
It logs the correct stuff in the console, but when it comes to the response, it returns an empty string. I used setTimeout to mimic the asynchronous nature of the request, but it just does not work.
Please assume that I have declared variables like result, etc., because that part is working. Also, please don't redirect me to the Node.js documentation, because I have already read that stuff and that's not the problem here. The problem is that I need to get every record type in an array and send that back as a response.
Here's what I have tried:
Tried to push the response for each record type into the result array,
Tried to use a for...of loop instead of forEach
Please help!
The way I'm reading your code is that for each item in the array you correctly use callbacks to do each individual bit of processing.
However, remember that forEach itself is not asynchronous. Thus you are setting up a bunch of tasks that will complete sometime, then returning undefined... then your results start to trickle in.
There are a couple of ways to do this correctly. As you are using callbacks here, I will use that style. You want a callback when all items in the array have been completely processed. The async module does this very well, providing a lot of high-quality methods that act on arrays and give you a callback when they are all done.
Your function will look something like:
const async = require('async');

async.each(recordTypes,
  (type, done) => {
    dns.resolve(lookup.toLowerCase(), type, (err, addresses) => {
      result = result + JSON.stringify({ type: `${type}`, response: { addresses } });
      done(err);
    });
  },
  (allOverError) => {
    // runs once every item has been processed (or on the first error)
    res.send(result);
  }
);
Notice there are two function parameters here: the first one is called for every item in the list, and the last is called when every item in the list has been completely processed.
There are other ways too, promises or the async/await keywords (confusing because of the name of the async module), but callbacks are good.
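For comparison, here is a promise-based sketch of the same route body, promisifying dns.resolve with Node's util module (assumes your Node version ships util.promisify):
const util = require('util');
const resolveAsync = util.promisify(dns.resolve);

// Resolve all record types in parallel and respond once they have all settled.
Promise.all(recordTypes.map((type) => {
  return resolveAsync(lookup.toLowerCase(), type).then((addresses) => {
    return { type: type, response: { addresses: addresses } };
  });
})).then((results) => {
  res.send(JSON.stringify(results));
}).catch((err) => {
  res.status(500).send(err.message);
});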

How to process a big array, applying an async function to each element, in Node.js?

I am working with zombie.js to scrape a site, and I must use the callback style to connect to each URL. The point is that I have an array of URLs and I need to process each one using an async function. This is my first approach:
var urls = ['http...', 'http...'];

function process_url(index)
{
  if (index == urls.length)
    return;
  async_function(urls[index],
    function() {
      // ... parse the url ...
      // Process the next url
      process_url(index + 1);
    }
  );
}

process_url(0);
Without using a third-party Node.js library to run the async function as if it were sync, or to wait for it (wait.for, synchronize, mocha), this is the way I thought to solve this problem. I don't know what would happen if the array is too big. Is each function released from memory when the next one is called, or do all the functions stay in memory until the end?
Any ideas?
Your scheme will work. I call it "manually sequencing async operations".
A general purpose version of what you're doing would look like this:
function processItem(data, callback) {
  // do your async function here
  // for example, let's suppose it was an http request using the request module
  request(data, callback);
}

function processArray(array, fn) {
  var index = 0;

  function next() {
    if (index < array.length) {
      fn(array[index++], function(err, result) {
        // process error here
        if (err) return;
        // process result here
        next();
      });
    }
  }
  next();
}

processArray(arr, processItem);
As to your specific questions:
I don't know what would happen if the array is too big. Is the function released from memory when the next function is called, or are all the functions in memory until the end?
Memory in JavaScript is released when it is no longer referenced by any running code and when the garbage collector gets time to run. Since you are running a series of asynchronous operations here, the garbage collector likely gets a chance to run regularly while waiting for the http response from the async operation, so memory can get cleaned up then. Functions are just another type of object in JavaScript and they get garbage collected just like anything else. When they are no longer referenced by running code, they are eligible for garbage collection.
In your specific code, because you are re-calling process_url() only in an async callback, there is no stack build-up (as in normal recursion). The prior instance of process_url() has already completed BEFORE the async callback is called and BEFORE you call the next iteration of process_url().
In general, management and coordination of multiple async operations is much, much easier using promises which are built into the current versions of node.js and are part of the ES6 ECMAScript standard. No external libraries are required to use promises in current versions of node.js.
For a list of a number of different techniques for sequencing your asynchronous operations on your array, both using promises and not using promises, see:
How to synchronize a sequence of promises?.
The first step in using promises is to "promisify" your async function so that it returns a promise instead of taking a callback.
function async_function_promise(url) {
  return new Promise(function(resolve, reject) {
    async_function(url, function(err, result) {
      if (err) {
        reject(err);
      } else {
        resolve(result);
      }
    });
  });
}
Now, you have a version of your function that returns promises.
If you want your async operations to proceed one at a time so the next one doesn't start until the previous one has completed, then a usual design pattern for that is to use .reduce() like this:
function process_urls(array) {
  return array.reduce(function(p, url) {
    return p.then(function(priorResult) {
      return async_function_promise(url);
    });
  }, Promise.resolve());
}
Then, you can call it like this:
var myArray = ["url1", "url2", ...];
process_urls(myArray).then(function(finalResult) {
  // all of them are done here
}, function(err) {
  // error here
});
There are also Promise libraries that have some helpful features that make this type of coding simpler. I, myself, use the Bluebird promise library. Here's how your code would look using Bluebird:
var Promise = require('bluebird');
var async_function_promise = Promise.promisify(async_function);

function process_urls(array) {
  return Promise.map(array, async_function_promise, {concurrency: 1});
}

process_urls(myArray).then(function(allResults) {
  // all of them are done here and allResults is an array of the results
}, function(err) {
  // error here
});
Note, you can change the concurrency value to whatever you want here. For example, you would probably get faster end-to-end performance if you increased it to something between 2 and 5 (it depends upon the server implementation how this is best optimized).

Returning an Array using Firebase

Trying to find the best-use example of returning an array of data in Node.js with the Q library (or any similar library; I'm not partial) when using Firebase's .on("child_added").
I've tried using Q.all(), but it never seems to wait for the promises to be filled before returning. This is my current example:
function getIndex()
{
  var deferred = q.defer();
  deferred.resolve(new FirebaseIndex( Firebase.child('users').child(user.app_user_id).child('posts'), Firebase.child('posts') ));
  return deferred.promise;
}

function getPost( post )
{
  var deferred = q.defer();
  deferred.resolve(post.val());
  return deferred.promise;
}

function getPosts()
{
  var promises = [];
  getIndex().then( function (posts) {
    posts.on( 'child_added', function (_post) {
      promises.push( getPost(_post) );
    });
  });
  return q.all(promises);
}
The problem occurs in getPosts(): it pushes promises into the array inside an async callback, so that won't work, since q.all() is called before any promise objects have been added.
Also, child_added is a real-time event notification. You can't use it as a way to grab "all of the data", because there is no such thing as "all"; the data is constantly changing in real-time environments. FirebaseIndex also uses child_added callbacks internally, so that's not going to work for this use case either.
You can grab all of the posts using the 'value' callback (but not a specific subset of records) as follows:
function getPosts() {
  var def = q.defer();
  Firebase.child('users').once('value', function(snap) {
    var records = [];
    snap.forEach(function(ss) {
      records.push( ss.val() );
    });
    def.resolve(records);
  });
  return def.promise;
}
But at this point, it's time to consider things in terms of real-time environments. Most likely, there is no reason "all" data needs to be present before getting to work.
Consider just grabbing each record as it comes in and appending it to whatever DOM element or array it needs to be stored in, working from an event-driven model instead of a GET/POST-centered approach.
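A minimal sketch of that event-driven style, where renderPost is a hypothetical display helper:
var posts = [];

// Append each post as it arrives rather than waiting for "all" of them.
Firebase.child('posts').on('child_added', function(snap) {
  posts.push(snap.val());
  renderPost(snap.val()); // hypothetical: update the UI incrementally
});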
With luck, you can bypass this use case entirely.

How to wait for all async calls to finish

I'm using Mongoose with Node.js and have the following code, which calls the callback after all the save() calls have finished. However, this feels like a very dirty way of doing it, and I would like to see the proper way to get this done.
function setup(callback) {
  // Clear the DB and load fixtures
  Account.remove({}, addFixtureData);

  function addFixtureData() {
    // Load the fixtures
    fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
      if (err) { throw err; }
      var jsonData = JSON.parse(data);
      var count = 0;
      jsonData.forEach(function(json) {
        count++;
        var account = new Account(json);
        account.save(function(err) {
          if (err) { throw err; }
          if (--count == 0 && callback) callback();
        });
      });
    });
  }
}
You can clean up the code a bit by using a library like async or Step.
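For example, with the async library the fixture loop could look roughly like this (a sketch reusing the question's fs, Account, and callback):
var async = require('async');

function addFixtureData() {
  fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
    if (err) return callback(err);
    // async.each calls the final callback once every save has finished,
    // or immediately on the first error.
    async.each(JSON.parse(data), function(json, done) {
      new Account(json).save(done);
    }, callback);
  });
}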
Also, I've written a small module that handles loading fixtures for you, so you just do:
var fixtures = require('./mongoose-fixtures');

fixtures.load('./fixtures/account.json', function(err) {
  // Fixtures loaded, you're ready to go
});
Github:
https://github.com/powmedia/mongoose-fixtures
It will also load a directory of fixture files, or objects.
I did a talk about common asynchronous patterns (serial and parallel) and ways to solve them:
https://github.com/masylum/i-love-async
I hope it's useful.
I've recently created a simpler abstraction called wait.for to call async functions in sync mode (based on Fibers). It's at an early stage, but it works. It is at:
https://github.com/luciotato/waitfor
Using wait.for, you can call any standard Node.js async function as if it were a sync function, without blocking Node's event loop. You can code sequentially when you need to.
Using wait.for, your code would be:
// in a fiber
function setup(callback) {
  // Clear the DB and load fixtures
  wait.for(Account.remove, {});
  // Load the fixtures
  var data = wait.for(fs.readFile, './fixtures/account.json', 'utf8');
  var jsonData = JSON.parse(data);
  jsonData.forEach(function(json) {
    var account = new Account(json);
    wait.forMethod(account, 'save');
  });
  callback();
}
That's actually the proper way of doing it, more or less. What you're doing there is a parallel loop. You can abstract it into its own "async parallel forEach" function if you want (and many do), but that's really the only way of doing a parallel loop.
Depending on what you intended, one thing that could be done differently is the error handling. Because you're throwing, if there's a single error, that callback will never get executed (count won't be decremented). So it might be better to do:
account.save(function(err) {
  if (err) return callback(err);
  if (!--count) callback();
});
And handle the error in the callback. It's better node-convention-wise.
I would also change another thing to save you the trouble of incrementing count on every iteration:
var jsonData = JSON.parse(data)
  , count = jsonData.length;

jsonData.forEach(function(json) {
  var account = new Account(json);
  account.save(function(err) {
    if (err) return callback(err);
    if (!--count) callback();
  });
});
If you are already using underscore.js anywhere in your project, you can leverage its after method. You need to know in advance how many async calls will be made, but aside from that it's a pretty elegant solution.
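A minimal sketch of that approach, reusing jsonData, Account, and callback from the code above:
var _ = require('underscore');

// _.after returns a function that invokes callback only on the Nth call.
var done = _.after(jsonData.length, callback);

jsonData.forEach(function(json) {
  new Account(json).save(function(err) {
    if (err) return callback(err);
    done();
  });
});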
