I want to use Node.js to read 60k records from a MySQL database and write them to an ArangoDB database. I will later use ArangoDB's aggregation features etc. to process my dataset.
Coming from PHP, where a script usually runs synchronously, and because I believe it makes sense here, my initial (naive) attempt was to make my Node.js script run synchronously too. However, it doesn't work as expected:
I print to the console, call a function via .sync() that connects to the ArangoDB server and prints all existing databases, then print to the console again. But everything below the sync call to my ArangoDB function is completely ignored (it neither prints to the console again, nor does it seem to execute anything else).
What am I overlooking? Does .done() in the function called via .sync() cause trouble?
var mysql = require('node-mysql');
var arango = require('arangojs');
//var sync = require('node-sync'); // Wrong one!
var sync = require('sync');

function test_arango_query() {
    var db = arango.Connection("http://localhost:8529");
    db.database.list().done(function(res) {
        console.log("Databases: %j", res);
    });
    return "something?";
}

sync(function() {
    console.log("sync?");
    var result = test_arango_query.sync();
    console.log("done."); // DOES NOT PRINT, NEVER EXECUTED?!
    return result;
}, function(err, result) {
    if (err) console.error(err);
    console.log(result);
});
Your function test_arango_query doesn't use a callback. sync only works with functions that take a callback: it needs to know when the data is ready so it can return it from .sync(). If your function never calls the callback, sync can never return a result.
Update your function to call a callback function when you want it to return:
function test_arango_query(callback) {
    var db = arango.Connection("http://localhost:8529");
    db.database.list().done(function(res) {
        console.log("Databases: %j", res);
        callback(null, 'something'); // node-style callback: error first, then the result
    });
}
Related
I was building a server with Node.js that needs to get data from MongoDB. I retrieve the data after require('../db.js'). Somebody said the MongoDB connection needn't be closed in Node.js, because Node.js is a single process.
My question: do I need to call dropCollection to close the collection after invoking the db functions many times? If so, how and where should I do that? Please, thanks.
You don't need to drop the collection after invoking db functions; simply call db.close() when you are finished, though even that is not strictly needed. But if you do want to drop it, you can do it as follows:
var dropRestaurants = function(db, callback) {
    db.collection('restaurants').drop(function(err, response) {
        console.log(response);
        callback();
    });
};
var storedArticleArray = db.collection('storedArticle').find(query).toArray;
console.dir(storedArticleArray);
db.close();
How can I ensure that console.dir(storedArticleArray) displays its argument only after the database has completed the query and stored the result in storedArticleArray?
Also, how do I make sure db.close() does not close the connection before the query has completed?
Does this work:
var storedArticleArray = db.collection('').find(query).toArray(function() {
    console.dir(storedArticleArray);
    db.close();
});
You must use callback functions when you query your MongoDB database from Node.js, because queries are asynchronous operations.
Following your example you can use:
var storedArticleArray = [];

db.collection('storedArticle').find(query).toArray(function(error, data) {
    if (error) throw error;
    storedArticleArray = data; // toArray delivers a plain array to the callback
    console.dir(storedArticleArray);
    db.close();
});
The callback function will be executed once the query has completed and returned its data (or an error, which you must always handle). Inside the callback you can be sure it is safe to close the db connection.
I am building an application in Meteor that relies on real-time updates from the database. The way Meteor lays out its examples is to have the database call inside the template helper. I've found that when dealing with medium-sized datasets this becomes impractical. I am trying to move the query to the server and have the results passed back to the client.
I have looked at similar questions on SO but have found no immediate answers.
Here is my server side function:
Meteor.methods({
    "getTest": function() {
        var res = Data.find({}, { sort: { time: -1 }, limit: 10 });
        var r = res.fetch();
        return (r);
    }
});
And client side:
Template.matches._matches = function() {
    var res = {};
    Meteor.call("getTest", function(error, result) {
        res = result;
    });
    return res;
};
I have tried variations of the above code, such as returning from inside the callback. As far as I can tell, passing a callback makes the call asynchronous, so the helper returns before the result arrives; the value cannot be produced synchronously on load.
I would like to pass all database queries server side to lighten the front end load. Is this possible in Meteor?
Thanks
The way to do this is to use subscriptions instead of remote method calls. See the counts-by-room example in the docs. So, for every database call you have a collection that exists client-side only. The server then decides the records in the collection using set and unset.
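A minimal sketch of the subscription approach (the publication name "topMatches" is illustrative; returning a cursor from Meteor.publish is the common shorthand for the set/unset bookkeeping described above):

```javascript
// server-side: publish only the 10 most recent records
Meteor.publish("topMatches", function () {
    // Returning a cursor publishes the matching documents and
    // keeps them updated on the client in real time.
    return Data.find({}, { sort: { time: -1 }, limit: 10 });
});

// client-side: subscribe, then query the local cache (minimongo)
Meteor.subscribe("topMatches");

Template.matches._matches = function () {
    // Only the 10 published records ever reach the client,
    // so this find() is cheap regardless of total dataset size.
    return Data.find({}, { sort: { time: -1 } });
};
```

This way the server decides what subset of the data the client sees, and the template helper stays synchronous.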
I am trying to write a Node.js application, and we need to deploy it in production.
We need to make sure that Node.js does not hang when there are any long-running processes/operations, like querying or database server access.
So, I am trying to make a call to Mongo or to the filesystem that takes a very long time to finish, so that I can verify that the Node.js server is free to serve other requests while that takes place.
Sadly, I am not able to craft an insert that takes Mongo a really long time to finish, or a synchronous call to the filesystem.
Can someone tell me how to do it?
Thanks
Tuco
The trick is to do a console.log right after the block that makes the call, and another console.log inside the callback. If the message after the call appears first in the console, the call is actually asynchronous.
I'm using mongojs as the driver for Mongo:
collection.find({}, function(err, res) {
    console.log("done");
});
console.log("sending signal");
If it's asynchronous, the console shows:
sending signal
done
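The same before/after logging trick works without a database, using any async primitive such as setTimeout in place of the driver call:

```javascript
// The line after the async call runs first; the callback runs
// later, on a subsequent tick of the event loop.
var order = [];

setTimeout(function () {      // stands in for the async db call
    order.push('done');
}, 0);

order.push('signal sent');
console.log(order);           // at this point only ['signal sent']
```

This is handy for convincing yourself the event loop stays free while the "slow" work is pending.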
Now, for the chained behaviour, you can make something like this:
var dbChain = (function() {
    var chain = [], busy = false;
    var chainin = {
        push: function(aFn) {
            if (!busy) {
                busy = true;
                aFn();           // nothing in flight: run the call immediately
            } else {
                chain.push(aFn); // a call is in flight: queue this one
            }
        },
        next: function() {
            if (chain.length) {
                chain.shift()(); // run the oldest queued call
            } else {
                busy = false;    // queue drained, accept new calls directly
            }
        }
    };
    return chainin;
})();
and then every db call you make goes through:
dbChain.push(...(a function wrapping the call)...)
and at the end of each of your callbacks:
dbChain.next()
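A self-contained sketch of the same serialisation idea (makeChain and the sample tasks are illustrative names, not part of mongojs): each task receives a done callback, and the queue runs tasks strictly one at a time, so completion order matches push order even when a later task's async work is faster:

```javascript
// Minimal serial task queue: tasks run one at a time, in push order.
function makeChain() {
    var queue = [], busy = false;

    function next() {
        if (queue.length) queue.shift()(next); // start the oldest queued task
        else busy = false;                     // queue drained
    }

    return {
        push: function (task) {  // task = function (done) { ... done(); }
            if (busy) queue.push(task);
            else { busy = true; task(next); }
        }
    };
}

var chain = makeChain();
var log = [];
chain.push(function (done) { setTimeout(function () { log.push('a'); done(); }, 20); });
chain.push(function (done) { setTimeout(function () { log.push('b'); done(); }, 0); });
chain.push(function (done) { log.push('c'); done(); });
```

Despite the second task's shorter delay, log ends up as ['a', 'b', 'c'].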
I'm using Mongoose with Node.js and have the following code, which calls the callback after all of the save() calls have finished. However, I feel that this is a very dirty way of doing it and would like to see the proper way to get it done.
function setup(callback) {
    // Clear the DB and load fixtures
    Account.remove({}, addFixtureData);

    function addFixtureData() {
        // Load the fixtures
        fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
            if (err) { throw err; }
            var jsonData = JSON.parse(data);
            var count = 0;
            jsonData.forEach(function(json) {
                count++;
                var account = new Account(json);
                account.save(function(err) {
                    if (err) { throw err; }
                    if (--count == 0 && callback) callback();
                });
            });
        });
    }
}
You can clean up the code a bit by using a library like async or Step.
Also, I've written a small module that handles loading fixtures for you, so you just do:
var fixtures = require('./mongoose-fixtures');

fixtures.load('./fixtures/account.json', function(err) {
    //Fixtures loaded, you're ready to go
});
Github:
https://github.com/powmedia/mongoose-fixtures
It will also load a directory of fixture files, or objects.
I did a talk about common asynchronous patterns (serial and parallel) and ways to solve them:
https://github.com/masylum/i-love-async
I hope it's useful.
I've recently created a simpler abstraction called wait.for to call async functions in sync mode (based on Fibers). It's at an early stage, but it works. It is at:
https://github.com/luciotato/waitfor
Using wait.for, you can call any standard Node.js async function as if it were a sync function, without blocking node's event loop. You can code sequentially when you need it.
Using wait.for, your code will be:
//in a fiber
function setup(callback) {
    // Clear the DB and load fixtures
    wait.for(Account.remove, {});
    // Load the fixtures
    var data = wait.for(fs.readFile, './fixtures/account.json', 'utf8');
    var jsonData = JSON.parse(data);
    jsonData.forEach(function(json) {
        var account = new Account(json);
        wait.forMethod(account, 'save');
    });
    callback();
}
That's actually more or less the proper way of doing it. What you have there is a parallel loop. You can abstract it into its own "async parallel foreach" function if you want (and many do), but that's really the only way of doing a parallel loop.
Depending on what you intended, one thing that could be done differently is the error handling. Because you're throwing, if there's a single error, the callback will never get executed (count won't be decremented). So it might be better to do:
account.save(function(err) {
    if (err) return callback(err);
    if (!--count) callback();
});
And handle the error in the callback. It's better node-convention-wise.
I would also change another thing to save you the trouble of incrementing count on every iteration:
var jsonData = JSON.parse(data)
  , count = jsonData.length;

jsonData.forEach(function(json) {
    var account = new Account(json);
    account.save(function(err) {
        if (err) return callback(err);
        if (!--count) callback();
    });
});
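The countdown itself can be demonstrated without a database; in this sketch setTimeout stands in for account.save, and the final message fires only once every simulated save has reported back:

```javascript
// Countdown pattern with simulated async saves.
var items = [1, 2, 3];
var count = items.length;
var finished = false;

items.forEach(function (n) {
    setTimeout(function () {   // stands in for account.save(cb)
        if (!--count) {
            finished = true;   // the last "save" just completed
            console.log('all done');
        }
    }, 5 * n);
});
```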
If you are already using underscore.js anywhere in your project, you can leverage its after function. You need to know in advance how many async calls will be made, but aside from that it's a pretty elegant solution.
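underscore's _.after(n, fn) returns a wrapper that only invokes fn from the nth call onward; a minimal hand-rolled equivalent (this after is a sketch, not underscore's implementation) makes the idea concrete:

```javascript
// Stand-in for _.after(n, fn): ignore the first n-1 calls,
// invoke fn on the nth.
function after(n, fn) {
    return function () {
        if (--n < 1) return fn.apply(this, arguments);
    };
}

var calls = 0;
var done = after(3, function () { calls++; });
done(); done();     // not yet
done();             // third call fires the wrapped function
console.log(calls); // 1
```

Passed as the save callback, the wrapped function would fire exactly once, after the last save completes.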