nodeunit fails to exit from my asynchronous tests - node.js

Whenever I run my nodeunit test in an IDE or the console, it runs well but fails to exit. Please help me with it!
var store = require('../lib/db');
var list = require('../source/models/mock_deals');
var logger = require('../lib/logging').logger;

exports.setUp = function(done){
    logger.info('start test...');
    done();
};

exports.tearDown = function(done){
    logger.info('end test...');
    done();
};

exports.testInsertDeal = function(test){
    var length = list.length;
    test.equals(length, 2);

    store.mongodb.open(function(err, db){
        if(err){
            logger.error(err);
            return;
        }
        logger.info("mongodb is connected!");
        db.collection('deals', function(err, collection){
            for(var i = 0; i < length; i++){
                var item = list[i];
                collection.insert(item, function(err, result){
                    if(err){
                        logger.error('Fail to insert document deal [' + item.id + ']');
                        return;
                    }
                    logger.info('index ' + i + ' : ' + JSON.stringify(item));
                });
            }
        });
        test.expect(1);
    });

    test.done();
};

I changed to mongoose instead of the mongodb driver; the test still could not exit automatically. But when I disconnected mongoose in the tearDown method of my nodeunit test, the test exited correctly.
Add the following to your test:
exports.tearDown = function(done){
    mongoose.disconnect(function(err){
        if(err){
            logger.error(err);
        } else {
            logger.info('mongoose is disconnected');
        }
        done(); // complete tearDown only once the disconnect has finished
    });
};
Furthermore, if I use log4js for logging in my test and configure log4js with reloadSecs: 500, the test will not exit either. After I set reloadSecs to 0, the test exits well. So we need to configure logging.json with the option reloadSecs: 0.
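For reference, a minimal sketch of how that option can be passed (this assumes a log4js 0.x-style configure call; the file name logging.json comes from my project):
var log4js = require('log4js');
// reloadSecs: 0 disables the config-file watcher; a non-zero value keeps a
// timer alive in the event loop, which prevents the test process from exiting
log4js.configure('logging.json', { reloadSecs: 0 });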
To summarize: we need to make sure nothing is still running after all test methods are done; then the test will exit correctly.

If you know when your program should exit, you can simply use the following line of code to exit:
process.exit(0);
where 0 is the return code of the program.
Now, that isn't really fixing the problem. There is probably a callback still waiting, or a connection still active, keeping your program up and running that isn't shown in the code you posted here. If you don't care to find it, just use process.exit. If you really care to find it, you will have to dig some more. I've never used nodeunit, but I have used other node libraries that leave stuff running in their inner workings that keeps the program from exiting. In those cases, I usually don't feel like wading through other people's source code to find out what is going on, so I just make the aforementioned process.exit call.
This should at least give you an option.
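If you do want to dig, one debugging trick is to dump whatever is still keeping the event loop alive just before the point where you expect the program to exit. Note these are undocumented Node internals, so treat this as a diagnostic aid only:
// undocumented internals -- the output format varies across Node versions
console.log(process._getActiveHandles());  // open sockets, servers, timers
console.log(process._getActiveRequests()); // in-flight async operations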

Related

Running Mocha multiple times with the same data produces different results

I have been noticing, with no discernible pattern, that sometimes I run my mocha tests and get a failed assertion; then I run them again immediately, without making any changes, and the assertions all pass. Is there something that I'm doing wrong? Is there test data that is cached?
Unfortunately, I don't know what to post, since I can't seem to pin this down to any one thing, but here is one function I've seen it happen to:
mysqltest.js
describe('get_tablet_station_name', function() {
    it('isSuccessful', function() {
        mysqlDao.init();
        var callback = sinon.spy();
        mysqlDao.get_tablet_station_name(state.TABLET_NAME, function(err, result) {
            assert(err == undefined);
        });
    });
    it('isUnsuccessful', function() {
        mysqlDao.init();
        var callback = sinon.spy();
        mysqlDao.get_tablet_station_name("'''%\0", function(err, result) {
            assert(err != undefined);
        });
    });
});
When the assertions fail, it is
assert(err != undefined);
err is returning null despite the bad SQL statement.
You should share the error you get when the test case fails.
But from your code and the problem described, I can guess that your mysql calls take longer than the default timeout of mocha test cases, and that on the second run mysql returns data from its cache, which avoids the timeout.
To debug, try increasing the default timeout of your test case by putting this inside your describe call:
describe('get_tablet_station_name', function() {
    this.timeout(10000);
    ......//rest of the code
Please note this may not be the solution; if increasing the timeout does not help, please share why the assertion is failing.
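As a hedged aside: the assertions in the posted tests live inside asynchronous callbacks, and mocha has to be told, via the done callback, when such a test is finished; otherwise the test can be marked as passed before the assertion ever runs. A minimal sketch of the first test with done wired in:
it('isSuccessful', function(done) {
    mysqlDao.init();
    mysqlDao.get_tablet_station_name(state.TABLET_NAME, function(err, result) {
        assert(err == undefined);
        done(); // mocha now waits for the callback (and the assertion) to run
    });
});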

Exiting a process after a database call in Node?

I'm experimenting with calling a database from Node, and am using the following client.execute() sample code:
socket.on('send', function(data){
    client.execute('SELECT * FROM db.main', [], function(err, result) {
        if (err) {
            //do something
        } else {
            for (var i = 0; i < result.rows.length; i++) {
                console.log('id=' + result.rows[i].get('topic_id'));
            }
            process.exit(0);
        }
    });
});
As seen above, I'm running this code inside a socket.io listener. However, the server stops whenever it is executed. On the other hand, when I remove process.exit(0), things seem to run just fine.
So is that line necessary?
The line process.exit(0); will exit your program; I guess it was put there for debugging purposes or something similar.
You generally should never need to manually call process.exit(0). If there is nothing left to do, the process will exit naturally.
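Applied to the snippet above, that just means dropping the exit call: the process stays up because the socket.io server handle is intentionally still open, and each query callback simply finishes on its own:
socket.on('send', function(data){
    client.execute('SELECT * FROM db.main', [], function(err, result) {
        if (err) {
            // handle the error
        } else {
            for (var i = 0; i < result.rows.length; i++) {
                console.log('id=' + result.rows[i].get('topic_id'));
            }
            // no process.exit(0) here -- it would tear down the whole server
        }
    });
});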

Node.js. What should I use? Next()?

app.get("/server", function (req, res){
connection.query("SELECT * from serverdb", function(err, rows)
{
var data = rows;
var reachabilityResultString="";
var serverCount = rows.length;
var arrayWithReachabilityResultStrings = new Array();
var insertReachabilityResultStringIntoArray;
for (var counterForServername = 0 ; counterForServername < serverCount; counterForServername++)
{
ls = childProcess.exec('ping ' + rows[counterForServername].ipadresse,function (error, stdout, stderr)
{
if (error)
{
console.log(error.stack);
console.log('Error code: '+error.code);
console.log('Signal received: '+error.signal);
var errorSignal = ("Signal received: " + error.signal);
var errorReachability = "Error";
}
else
{
console.log('Child Process STDOUT: '+stdout);
console.log('Child Process STDERR: '+stderr);
pingOutput = String(stdout);
console.log(reachabilityResult(pingOutput));
insertReachabilityResultStringIntoArray = arrayWithReachabilityResultStrings.push(reachabilityResult(pingOutput));
console.log(arrayWithReachabilityResultStrings);
};
ls.on('exit', function (code) {
console.log('Child process exited with exit code '+code);
});
});
};
});
res.render("all.jade,{servers: data, status: arrayWithReachabilityResultStrings});
});
..well, this is my code. My problem is that the program invokes the website with the jade code first; I hope you know what I mean. I want to deliver arrayWithReachabilityResultStrings to all.jade, so the program must wait until the for loop is finished. But I don't know how to make it wait. I know the "problem" is the asynchronous behavior of node.js, but I don't know how to solve this.
Just fix your missing " and move your
res.render("all.jade", {servers: data, status: arrayWithReachabilityResultStrings});
up a line. It needs to be invoked inside the connection.query callback; as it is now, it is invoked much sooner.
It would also be nice if you read a bit about JavaScript variable scoping; this SO question does a good job of explaining it.
P.S.: Glad to see new people learning node.
If you need to run an arbitrary number of subcommands and wait until they are all done, you should consider a helper library such as async.js and its async.queue flow control function. This kind of coordination is actually somewhat tricky to code by hand in node without any flow control facilities. However, it is certainly possible: you would need a separate done counter that you increment on each 'exit' event, and once all of your child processes have been started and all have finished, you're done; see the sketch below.
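A minimal sketch of that hand-rolled counter, reusing the names from the question (ipadresse, reachabilityResult); the empty-result guard is my addition:
app.get("/server", function (req, res) {
    connection.query("SELECT * from serverdb", function (err, rows) {
        var statuses = [];
        var finished = 0;
        if (rows.length === 0) {
            return res.render("all.jade", {servers: rows, status: statuses});
        }
        rows.forEach(function (row) {
            childProcess.exec('ping ' + row.ipadresse, function (error, stdout) {
                statuses.push(error ? "Error" : reachabilityResult(String(stdout)));
                // render only once the last ping has reported back
                if (++finished === rows.length) {
                    res.render("all.jade", {servers: rows, status: statuses});
                }
            });
        });
    });
});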

node.js file system problems

I keep banging my head against the wall because of tons of different errors. This is the code I'm trying to use:
fs.readFile("balance.txt", function (err, data) //At the beginning of the script (checked, it works)
{
if (err) throw err;
balance=JSON.parse(data);;
});
fs.readFile("pick.txt", function (err, data)
{
if (err) throw err;
pick=JSON.parse(data);;
});
/*....
.... balance and pick are modified
....*/
if (shutdown)
{
fs.writeFile("balance2.txt", JSON.stringify(balance));
fs.writeFile("pick2.txt", JSON.stringify(pick));
process.exit(0);
}
At the end of the script, the files have not been modified in the slightest. I then found out on this site that the files were being opened two times simultaneously, or something like that, so I tried this:
var balance, pick;

var stream = fs.createReadStream("balance.txt");
stream.on("readable", function()
{
    balance = JSON.parse(stream.read());
});

var stream2 = fs.createReadStream("pick.txt");
stream2.on("readable", function()
{
    pick = JSON.parse(stream2.read());
});

/****
****/

fs.unlink("pick.txt");
fs.unlink("balance.txt");

var stream = fs.createWriteStream("balance.txt", {flags: 'w'});
var stream2 = fs.createWriteStream("pick.txt", {flags: 'w'});
stream.write(JSON.stringify(balance));
stream2.write(JSON.stringify(pick));
process.exit(0);
But this time, both files are empty... I know I should catch errors, but I just don't see where the problem is. I don't mind storing the two objects in the same file, if that helps. Besides that, I never did any JavaScript in my life before yesterday, so please give me a simple explanation if you know what failed here.
What I think you want to do is use readFileSync instead of readFile to read your files, since you need them to be read before doing anything else in your program (http://nodejs.org/api/fs.html#fs_fs_readfilesync_filename_options).
This will make sure you have read both the files before you execute any of the rest of your code.
Make your code do this:
try
{
    balance = JSON.parse(fs.readFileSync("balance.txt"));
    pick = JSON.parse(fs.readFileSync("pick.txt"));
}
catch (err)
{
    throw err;
}
I think you will get the functionality you are looking for by doing this.
Note, you will not be able to check for an error in the same way you can with readFile. Instead, you will need to wrap each call in a try/catch, or use existsSync before each operation to make sure you aren't trying to read a file that doesn't exist.
How to capture no file for fs.readFileSync()?
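A quick sketch of the existsSync guard mentioned above, using the file names from the question:
// check for the file up front instead of catching the read error
if (fs.existsSync("balance.txt")) {
    balance = JSON.parse(fs.readFileSync("balance.txt"));
} else {
    // handle the missing file (fall back to a default, log, abort, ...)
}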
Furthermore, you have the same problem with the writes. You are kicking off async writes and then immediately calling process.exit(0). A better way to do this would be either to write the files sequentially and asynchronously and then exit, or to write them sequentially and synchronously and then exit.
Async option:
if (shutdown)
{
    fs.writeFile("balance2.txt", JSON.stringify(balance), function(err){
        fs.writeFile("pick2.txt", JSON.stringify(pick), function(err){
            process.exit(0);
        });
    });
}
Sync option:
if (shutdown)
{
    fs.writeFileSync("balance2.txt", JSON.stringify(balance));
    fs.writeFileSync("pick2.txt", JSON.stringify(pick));
    process.exit(0);
}

How to wait for all async calls to finish

I'm using Mongoose with Node.js and have the following code, which calls the callback after all the save() calls have finished. However, I feel that this is a very dirty way of doing it and would like to see the proper way to get this done.
function setup(callback) {
    // Clear the DB and load fixtures
    Account.remove({}, addFixtureData);

    function addFixtureData() {
        // Load the fixtures
        fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
            if (err) { throw err; }
            var jsonData = JSON.parse(data);
            var count = 0;
            jsonData.forEach(function(json) {
                count++;
                var account = new Account(json);
                account.save(function(err) {
                    if (err) { throw err; }
                    if (--count == 0 && callback) callback();
                });
            });
        });
    }
}
You can clean up the code a bit by using a library like async or Step.
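For instance, here is a sketch of the fixture loading with async (assuming npm install async; async.each runs the saves in parallel and calls back exactly once, after all succeed or on the first error):
var async = require('async');

function addFixtureData(callback) {
    fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
        if (err) return callback(err);
        async.each(JSON.parse(data), function(json, next) {
            new Account(json).save(next); // next(err) marks this item as done
        }, callback);
    });
}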
Also, I've written a small module that handles loading fixtures for you, so you just do:
var fixtures = require('./mongoose-fixtures');
fixtures.load('./fixtures/account.json', function(err) {
    //Fixtures loaded, you're ready to go
});
Github:
https://github.com/powmedia/mongoose-fixtures
It will also load a directory of fixture files, or objects.
I did a talk about common asynchronous patterns (serial and parallel) and ways to solve them:
https://github.com/masylum/i-love-async
I hope it's useful.
I've recently created a simpler abstraction called wait.for to call async functions in sync mode (based on Fibers). It's at an early stage, but it works. It is at:
https://github.com/luciotato/waitfor
Using wait.for, you can call any standard nodejs async function as if it were a sync function, without blocking node's event loop. You can code sequentially when you need it.
Using wait.for, your code would be:
//in a fiber
function setup(callback) {
    // Clear the DB and load fixtures
    wait.for(Account.remove, {});
    // Load the fixtures
    var data = wait.for(fs.readFile, './fixtures/account.json', 'utf8');
    var jsonData = JSON.parse(data);
    jsonData.forEach(function(json) {
        var account = new Account(json);
        wait.forMethod(account, 'save');
    });
    callback();
}
That's actually the proper way of doing it, more or less. What you're doing there is a parallel loop. You can abstract it into its own "async parallel foreach" function if you want (and many do), but that's really the only way of doing a parallel loop.
Depending on what you intended, one thing that could be done differently is the error handling. Because you're throwing, if there's a single error, the callback will never get executed (count won't be decremented). So it might be better to do:
account.save(function(err) {
    if (err) return callback(err);
    if (!--count) callback();
});
And handle the error in the callback; that's better node-convention-wise.
I would also change another thing to save you the trouble of incrementing count on every iteration:
var jsonData = JSON.parse(data)
  , count = jsonData.length;

jsonData.forEach(function(json) {
    var account = new Account(json);
    account.save(function(err) {
        if (err) return callback(err);
        if (!--count) callback();
    });
});
If you are already using underscore.js anywhere in your project, you can leverage its after method. You need to know in advance how many async calls will be made, but aside from that it's a pretty elegant solution.
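A sketch of that approach, assuming underscore is available; _.after(n, fn) returns a function that only invokes fn on its nth call:
var _ = require('underscore');

var jsonData = JSON.parse(data);
// runs `callback` only after being called jsonData.length times,
// so jsonData is assumed to be non-empty
var finished = _.after(jsonData.length, callback);
jsonData.forEach(function(json) {
    new Account(json).save(function(err) {
        if (err) return callback(err);
        finished();
    });
});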
