Right way to terminate testing mongodb server - node.js

We are using the mongodb-prebuilt package to perform unit/integration testing (CI/CD) for software that interacts with a mongodb database.
The common use case is the following: launch a testing mongodb server with the prebuilt package, run the test suite, and then terminate the server. The first and second steps are done easily, but the third one causes some problems.
If the mongodb-prebuilt-started server is not terminated, the test runner hangs forever. If instead we try to terminate the server with a testConnection.command({ shutdown: 1 }) command, then an unhandledRejection fires, referring to a closed connection - which, of course, was forcefully closed by stopping the server.
What is the right way to dispose of mongodb-prebuilt in the afterAll test section? The test engine is jest, but that does not matter much.
Example code here:
import { MongodHelper } from 'mongodb-prebuilt';
import { MongoClient } from 'mongodb';
describe('Test suite', () => {
  let testConnection;

  beforeAll(async () => {
    const mongodHelper = new MongodHelper(['--port', '27018']);
    await mongodHelper.run();
    testConnection = await MongoClient.connect(`mongodb://localhost:27018/admin`);
  });

  afterAll(async () => {
    await testConnection.dropDatabase();
    await testConnection.command({ shutdown: 1 });
    await testConnection.close();
    testConnection = null;
  });

  /* Here follows some test suite that uses testConnection */
});
There were some attempts to solve the problem:
1) Don't await the promise produced by testConnection.command({ shutdown: 1 }) and immediately initiate closing the client connection - this works on some machines, but most likely depends on execution speed, so it is unstable.
2) Since terminating the client connection at the end of the tests does not matter much, it's possible to set up a process.on('unhandledRejection', ...) handler in the afterAll section and just mute the exception (see the sketch below) - this works, but seems ideologically incorrect.
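A minimal sketch of workaround (2), with a hypothetical handler that is not part of the original code:

afterAll(async () => {
  // hypothetical: mute the rejection triggered by the connection being
  // forcefully closed when the embedded mongod shuts down
  process.on('unhandledRejection', (err) => {
    console.warn('Ignoring rejection after mongod shutdown:', err);
  });
  await testConnection.dropDatabase();
  await testConnection.command({ shutdown: 1 }).catch(() => {}); // shutdown kills the connection
  await testConnection.close();
  testConnection = null;
});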
So, is there a canonical solution for the original task?
IMPORTANT NOTE: suggesting mock packages instead of a live mongodb is not appropriate at all, since the developed software is a specific adapter for mongodb and should respect all aspects of a live database, including timings, admin commands and so on.

The most convenient solution for the original problem is the mongo-unit npm package which, despite its name, provides an API for automated integration testing against a real prebuilt mongodb server, including launching and stopping it: https://github.com/mikhail-angelov/mongo-unit/
An example test suite with mongo-unit looks like the following:
import 'regenerator-runtime/runtime';
import mongoUnit from 'mongo-unit';
import { MongoClient } from 'mongodb';
describe('MongoDB-based test suite', () => {
  let testConnection;

  beforeAll(async () => {
    const connectionUrl = await mongoUnit.start({ dbName: 'admin' });
    testConnection = await MongoClient.connect(connectionUrl);
  });

  afterAll(async () => {
    await testConnection.dropDatabase();
    await testConnection.close(true);
    await mongoUnit.stop();
    testConnection = null;
  });

  /* Test suite goes here */
});
The package also allows selecting the initial database at start-up and performs a clean stop of the mongodb server process at the end of the test suite.

Related

How to delay server from running before Tests run

I've been trying to get around to writing functional tests for my services using jest.
The action in the tests gets resolved, but the test never passes because it keeps complaining that the server is already in use.
startServer()

describe("api tests", () => {
  it("createUser: should create user successfully", async () => {
    const user = await createUserService(userCredentials)
    expect(user.email).toBe("test2@test.com")
  }, 12000)
})
If I check my database, the user actually gets created in the process, but for some reason the tests don't pass and return an error saying the port address is in use.
I have also tried to call startServer() in jest's beforeAll(() => {}). It still results in the same behaviour.
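For reference, a sketch of the beforeAll/afterAll variant described above; it assumes, hypothetically, that startServer() resolves with the underlying http.Server so the port can be freed between test files:

let server;

beforeAll(async () => {
  server = await startServer(); // assumed (not shown in the post) to resolve with the http.Server
});

afterAll(done => {
  server.close(done); // release the port before the next test file runs
});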

BadRequestException: no open transaction, qldb, nodejs driver

I set up my nodejs app with qldb to implement a wallet service. I set up some tests, with some success tests and some expected-error tests, and once in a while the error 'BadRequestException: no open transaction' happens and causes my tests to fail. If I run the tests again, they pass. Then, once in a while, the error happens again unexpectedly and causes the tests to fail. I noticed that when I commented out my expected-error tests, the error didn't happen, or did not happen as often. This error happens not only to the expected-error tests but also to the successful tests.
This is what my tests look like:
describe('createWallet()', () => {
  it('should return an object with wallet Id', async () => {
    let result6 = await controller.createWallet({ body: mocks.walletInfo6.info });
    documentId6 = result6.walletId;
    expect(result6).to.have.property('walletId').that.is.a.uuid;
  });

  it('One player should have only one active wallet for each currency', async () => {
    try {
      let res = await controller.createWallet({ body: mocks.walletInfo1.info });
      assert.fail('expected error was not thrown');
    } catch (e) {
      expect(e.message).to.equal('Player already owns an active wallet in this currency.');
    }
  });
});

describe('suspendWallet()', () => {
  it('should change wallet status to suspend', async () => {
    let res = await controller.suspendWallet({ documentId: documentId3 });
    await controller.suspendWallet({ documentId: documentId5 });
    expect(res).to.be.a.string;
    expect(res).to.equal(documentId3);
  });

  it('should not change wallet status if wallet Id is invalid', async () => {
    try {
      let res = await controller.suspendWallet({ documentId: mocks.invalidWalletId });
      assert.fail('expected error was not thrown');
    } catch (e) {
      expect(e.message).to.equal('Did not find any record with this document Id.');
    }
  });
});
It's hard to be certain about how your application is running into this error without looking at how the driver is being used to execute transactions.
The driver APIs (for example, execute) return a promise. One way the application could be seeing the "no open transaction" error is that the promise is not resolved before further commands are sent.
Cookbook - Refer to the QLDB JS driver cookbook, which lists code samples for CRUD operations. Note how the samples use await inside the transactions to wait for the promises to resolve. Not waiting for the promises returned by execute can cause the driver to commit the transaction before the execute call is processed, and hence a "no open transaction" error.
Sample code for executing transactions -
var qldb = require('amazon-qldb-driver-nodejs');
const qldbDriver = new qldb.QldbDriver("vehicle-registration");

(async function() {
  await qldbDriver.executeLambda(async (txn) => {
    await txn.execute("CREATE TABLE Person");
  });
})();
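For contrast, a minimal sketch of the anti-pattern described above (illustration only, not from the original post): because the inner execute is not awaited, the lambda's promise can resolve, and the driver can close the transaction, before the statement is processed.

(async function() {
  await qldbDriver.executeLambda(async (txn) => {
    txn.execute("INSERT INTO Person VALUE {'name': 'Ava'}"); // missing await -- may surface as "no open transaction"
  });
})();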
In case you still face issues, please share the code snippet where you use the driver to execute transactions.
Update on this issue. I use nodejs driver version 2.1.0.
My team and I found out that the problem was caused by rollbacks that happen after the error tests, and we don't know when those rollbacks are done. While the rollback of the previous test is still running, the transaction for that test is still open, so if the next test tries to open a new transaction it conflicts and cannot open a new transaction. To fix this, we simply do not throw errors inside the transaction, which prevents rollbacks from happening. This works for our code, but a better solution would be a way to detect from the driver when the rollback is done and to wait for the transaction to close before opening a new one.
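A hedged sketch of that approach, with hypothetical table and parameter names that are not from the original post: signal the failure with a return value inside the transaction, and throw only after executeLambda has committed.

async function createWalletOnce(walletInfo, playerId, currency) { // hypothetical signature
  const outcome = await qldbDriver.executeLambda(async (txn) => {
    const existing = await txn.execute(
      "SELECT * FROM Wallet WHERE playerId = ? AND currency = ?", playerId, currency);
    if (existing.getResultList().length > 0) {
      // return instead of throwing, so the driver commits rather than rolls back
      return { error: 'Player already owns an active wallet in this currency.' };
    }
    await txn.execute("INSERT INTO Wallet ?", walletInfo);
    return { ok: true };
  });

  if (outcome.error) {
    throw new Error(outcome.error); // thrown only after the transaction has closed
  }
  return outcome;
}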

What is the proper way to handle connecting and closing the MongoDB Client from NodeJS (not using Mongoose!)?

export const client = new MongoClient(
  process.env.ATLAS_URI,
  // TODO: Figure out what this is and why it's needed to turn off deprecation warning
  {
    useUnifiedTopology: true,
  }
);
I'm following this guide and it all makes sense...but she is just doing one 'call' and then close().
I need to keep making repeated calls:
export const getAllProducts = async () => {
  try {
    await client.connect();
    const cursor = await client.db("products").collection("data").find();
    return await cursor.toArray();
  } catch (err) {
    throw new Error(err);
  } finally {
    await client.close();
  }
};
The first call is fine. After that: Error: MongoError: Topology is closed, please connect
I honestly don't quite understand what Topology means, but evidently it's the close() that's contributing to the issue.
It also doesn't make sense that I set up a new MongoClient and the ATLAS_URI already has the database name in it...so why do I have to specify that again when connecting?
Anyway, the main part of my ❓ stands: Do I just keep a separate process going and not close it? Do I start over with a whole new MongoClient each time? 😕
I'll just put a brief answer here in case anyone runs into this.
The MongoDB documentation for the Node.js driver gives you simple examples that include the client.connect() and client.close() methods just to give you a runnable example of making a simple call to the database, but in a real server application you open the connection to the client once during start-up and typically only close it when the server application is shutting down.
So in short: you don't need to open and close a connection every time you want to perform some action on your database.
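A minimal sketch of that pattern, assuming a 4.x or newer driver (where useUnifiedTopology is no longer needed) and reusing the names from the question:

import { MongoClient } from 'mongodb';

// one client for the whole application
export const client = new MongoClient(process.env.ATLAS_URI);

// call once when the server application boots
export const connect = () => client.connect();

// individual operations reuse the already-connected client -- no connect()/close() here
export const getAllProducts = () =>
  client.db("products").collection("data").find().toArray();

// call only when the application itself is shutting down
export const disconnect = () => client.close();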

mocha test failing with " MongoError: server sockets closed"

My mocha tests are failing with:
MongoError: server XXXX sockets closed
I have a workaround to fix them:
const https = require('https');
const server = https.createServer(..);
close() {
  mongoose.disconnect(); // <-------- I will comment out this line
  this.server.close();
};
If I comment out the line mongoose.disconnect();, my test suite starts working. But I would like to clean up after my tests too. Each of my test files recreates the server and starts from scratch. It seems like the error appears because there needs to be some 'waiting' before the next test file executes.
How can I correct this error?
Solution - Captain Hook to the rescue!
If I understand correctly, you wish to start up and clean up your server around the tests. You also have a series of repetitive tasks you need to do before and after each test.
Mocha has the perfect solution for you: Say hello to Mr. Hook!
Mocha hooks are functions that you can run before all tests, after all tests, before each test, or after each test:
https://mochajs.org/#hooks
The documentation is pretty complete and I really do recommend it. In your case, however, since you are dealing with databases, you will probably be dealing with async hooks.
Sounds complex? Don't worry!
This is how normal sync hooks work:
describe('hooks', function() {
  before(function() {
    // runs before all tests in this block
  });

  after(function() {
    // runs after all tests in this block
  });

  beforeEach(function() {
    // runs before each test in this block
  });

  afterEach(function() {
    // runs after each test in this block
  });

  // tests
  it("This is a test", () => {
    assert.equal(1, 1);
  });
});
Async hooks have only one difference: they take a parameter, done, which you call once your task is finished. Let's assume that we are setting up a DB that takes 1.5 seconds to set up. We want to do this before all the tests, and we only want to do it once.
Let's assume this is our listen function from our DB:
const listen = callback => {
  setTimeout(callback, 1500);
};
So after 1.5 seconds, it calls the callback function, signaling that it is ready for action.
Now let's see how we would write an async hook:
describe('hooks', function() {
  let myDB;

  before(done => {
    myDB = newDB();
    myDB.listen(done); // `done` is passed as the DB's listen callback
  });

  // tests
});
And that's it! Hope it helps!
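Applied to the question's setup, an async after hook might look roughly like this (a sketch only, assuming server and mongoose are in scope as in the question):

after(done => {
  // stop the HTTPS server first so no new requests touch the database,
  // then disconnect mongoose and tell mocha the cleanup is finished
  server.close(() => {
    mongoose.disconnect(done);
  });
});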

Scaffolding a Node.js app properly without Express (the app doesn't receive requests)

[This question is quite vague, I apologize for it. I'm trying to address my various troubles by answering the question myself]
I am building a Node.js app which has to perform various tasks at given intervals. Here is the global scaffold (it involves bluebird promises and mongoose for DB interactions):
var Promise = require("bluebird");
var mongoose = require('mongoose');
mongoose.Promise = require('bluebird');

// Personal modules
var bootApp = require(...);
var doStuffA = require(...);
var doStuffB = require(...);
var doStuffC = require(...);

// running locally, but meant to be deployed at some point
mongoose.connect('mongodb://localhost:27017/myDatabase');
var db = mongoose.connection;

db.on('error', () => {
  console.log("Error : lost connection !");
  process.exit(1);
});

db.once('open', () => {
  bootApp() // always start by booting
    .then(() => { // then start the infinite loop of events
      setInterval(doStuffA, 1000*60*60); // 1x/1h
      setInterval(doStuffB, 1000*60*10); // 1x/10min
      setInterval(doStuffC, 1000*60*3);  // 1x/3min
    }).catch((e) => { // errors are handled by doStuffX(), so we should never catch anything here
      console.log(e.message);
      process.exit(1);
    });
});
Each doStuffX module is a function that returns a promise, handles its own errors, and should finish at some point.
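For illustration, a doStuffX module of that shape might look roughly like this (hypothetical helper names, not from the original post):

// returns a promise, handles its own errors, and always settles eventually
module.exports = function doStuffA() {
  return fetchLatestData()   // hypothetical async step returning a promise
    .then(saveResultsToDb)   // hypothetical follow-up step
    .catch(function (err) {
      console.log('doStuffA failed:', err.message); // errors handled inside the module
    });
};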
Expected behaviour for the entire app:
The app should be able to run forever
The app should try to doStuffX() at the given interval, regardless of whether it succeeded or failed last time.
[Optional:] The app should close smoothly without retrying any doStuff upon receiving a "shut down" signal.
My question: how do I build a clean scaffold for such an app? Can I get rid of setInterval and use promises instead? One of my main concerns is to make sure the previous instance of doStuffX() has finished before starting the next one, even if that involves "killing" it in some way.
I am open to any link about scaffolding apps, but PLEASE DO NOT GIVE ME AN ANSWER/LINK INVOLVING EXPRESS: I don't need Express, since my app doesn't receive any requests. (Everything I found so far starts with Express :/)
If you don't want to start the next doStuffX() until the previous one is done, then you can replace your setInterval() with repeated setTimeout() calls.
function runA() {
  setTimeout(function() {
    doStuffA().then(runA).catch(function(err) {
      // decide what to do differently if doStuffA has an error
    });
  }, 1000*60*60);
}
runA();
You could also add a timeout to this so that if doStuffA() doesn't respond within a certain amount of time, then you take some other action. This would involve using another timer and a timeout flag.
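A rough sketch of that timer-and-flag idea (illustration only, not from the original answer):

function runAWithTimeout() {
  setTimeout(function() {
    var timedOut = false;

    // watchdog: if doStuffA() has not settled within 30 minutes, move on
    var watchdog = setTimeout(function() {
      timedOut = true;
      console.log('doStuffA took too long; scheduling the next round anyway');
      runAWithTimeout();
    }, 1000*60*30);

    doStuffA().then(function() {
      if (!timedOut) {          // ignore a result that arrives after the watchdog fired
        clearTimeout(watchdog);
        runAWithTimeout();
      }
    }, function(err) {
      if (!timedOut) {
        clearTimeout(watchdog);
        runAWithTimeout();      // restart even after an error
      }
    });
  }, 1000*60*60);
}
runAWithTimeout();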
[I answer my own question, trying to put here everything I changed afterwards, in case someone falls into this page someday...]
For the Mongoose part of the scaffold, here is what I have so far for a reliable long-term DB connection:
The Mongoose documentation gives a fancy way to ensure the driver will never give up on trying to reconnect, with reconnectTries
I don't really understand socketOptions and keepalive, which seem related to replicas, so I leave them out of my code for now
Since Mongoose should auto-reconnect whenever something goes wrong, I'll keep db.once('open') as the entry point to the app code itself, even though I don't yet really understand the difference from db.on('connected')
I recommend reading this.
var Promise = require("bluebird");
var mongoose = require('mongoose');
mongoose.Promise = require('bluebird');

// Personal modules
var bootApp = require(...);
var doStuffA = require(...);
var doStuffB = require(...);
var doStuffC = require(...);

// running locally, but meant to be deployed at some point
var uri = 'mongodb://localhost:27017/myDatabase';
// the added option makes sure the app will always try to reconnect...
mongoose.connect(uri, { server: { reconnectTries: Number.MAX_VALUE } });
var db = mongoose.connection;

db.on('error', () => {
  console.log("Error with Mongoose connection.");
});

db.once('open', () => {
  bootApp() // always start by booting
    .then(() => { // then start the infinite loop of events
      //////////////////////////////////
      /// Here goes the actual stuff ///
      //////////////////////////////////
    }).catch((e) => { // errors are handled by doStuffX(), so we should never catch anything here
      console.log(e.message);
    });
});
Now, for the actual repetitive stuff, my objective is to make sure everything runs smoothly and that no process gets stuck. About the changes I made:
The methods used are not native ES6 but are specific to bluebird. You can read about .timeout() and .delay(), which I find very useful for chaining timeouts and intervals in clean code.
In my mind, .then(runA, runA) should always launch ONE UNIQUE instance of runA, but I'm concerned about whether I could actually end up launching two parallel instances...
// Instead of using setInterval in a bluebird promised environment...
setInterval(doStuffA, 1000*60*60); // 1x/1h

// I would have liked a full promise chain, but as jfriend00 stated,
// it will end up crashing because the initial promise is never resolved...
function runA() {
  return doStuffA()
    .timeout(1000*60*30) // kill the running instance if it takes longer than 30min
    .delay(1000*60*60)   // wait 60min
    .then(runA, runA);   // whatever the outcome, restart the process
}
runA();

// Therefore, a solution like jfriend00's seems like the way to go:
function runA() {
  setTimeout(function() {
    doStuffA()
      .timeout(1000*60*30)
      .then(runA, runA);
  }, 1000*60*60);
}
runA();
