How do I test child processes using chai and mocha? - node.js

I'm creating a framework to execute processes at a specific time (cron-like), and to test it I'm using chai, mocha, and grunt.
The architecture of solution is based on this example. Basically, we have:
A Master process, which calls the Child (via child_process.fork) a specific number of times.
A Child process, which executes something using setInterval();
A process to call the Master.start() function.
With this architecture, how do I test that the child processes are executed at the correct time using mocha and chai (with the 'assert' library)?
In other words, how do I make chai 'listen' to the child processes and check whether they run at the correct time?

I'm not sure you need chai itself to listen to your child processes. If you're building off of the example you linked, this should be pretty straightforward, because Master.js is already an EventEmitter and it already re-emits every event it hears from the child processes.
Your test structure could be as simple as this:
describe('ForkExample test', function() {
    // Set an appropriate test timeout here
    this.timeout(1000);

    it('should do stuff at the right time', function(done) {
        var fe = new ForkExample();
        fe.start(1);
        fe.on('event', function(type, pid, e) {
            if (type === 'child message') {
                // Check here that the timing was within some expected range
                done();
            }
        });
    });
});
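To make the timing assertion concrete, here is a minimal sketch using chai's assert.closeTo. The 1000 ms expected interval and the 200 ms tolerance are assumptions about your scheduler, not values taken from the example:

var assert = require('chai').assert;

describe('ForkExample timing', function() {
    this.timeout(3000);

    it('should emit the child message at roughly the expected time', function(done) {
        var fe = new ForkExample();
        var startedAt = Date.now();

        fe.on('event', function(type, pid, e) {
            if (type === 'child message') {
                var elapsed = Date.now() - startedAt;
                // assumed: the child fires about 1 second after start(),
                // give or take 200ms of scheduling slack
                assert.closeTo(elapsed, 1000, 200);
                done();
            }
        });

        fe.start(1);
    });
});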

Related

Electron: Race condition between main and renderer process

In my Electron application, I am seeing weird behavior. On Windows, the renderer process sometimes executes before Electron's initialization finishes, which causes an issue at startup.
For example: I set up a Sequelize database and register IPC channels in the constructor of the Main.ts file, so to my knowledge the app.on('ready') event should fire once that setup finishes. But sometimes, on Windows only, the ready event fires even before the database setup, and my renderer process is already calling the database to fetch the default records for the MainWindow.
I think this is a race condition between the renderer process and the main process execution. Does anyone know how to fix that?
Main.ts
export class Main {
    private mainWindow: BrowserWindow;
    static instance: Main;

    public async init(ipcChannels: IpcChannelInterface[]) {
        Main.instance = this;

        // Register the IPC channels
        await this.registerIpcChannels(ipcChannels);

        var config = require('../../package.json');
        app.setAsDefaultProtocolClient(config.build.protocols.name);
        app.setAppUserModelId(config.build.appId);

        app.on('ready', Main.createWindow);
        app.on('window-all-closed', Main.onWindowAllClosed);
        app.on('activate', Main.onActivate);

        // The statement below sets up the database
        await SequelizeDB.setup();
    }
}

(new Main()).init([new IpcChannel1(), new IpcChannel2()]);
The ready event fires when Electron's setup has finished. It has nothing to do with your constructor or init method. From the docs:
Emitted once, when Electron has finished initializing
It sounds like you're saying that your createWindow function has a dependency on the database setup function. In that case, you can just do the setup first:
await SequelizeDB.setup();
await app.whenReady(); // this can replace your on("ready", ...) stuff
Main.createWindow();
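Applied to the init method above, a minimal sketch of that reordering might look like this (keeping the rest of the class unchanged):

public async init(ipcChannels: IpcChannelInterface[]) {
    Main.instance = this;
    await this.registerIpcChannels(ipcChannels);

    var config = require('../../package.json');
    app.setAsDefaultProtocolClient(config.build.protocols.name);
    app.setAppUserModelId(config.build.appId);

    app.on('window-all-closed', Main.onWindowAllClosed);
    app.on('activate', Main.onActivate);

    // Finish the database setup before creating the window,
    // so the renderer never queries an uninitialized database.
    await SequelizeDB.setup();
    await app.whenReady(); // replaces app.on('ready', Main.createWindow)
    Main.createWindow();
}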

Is there any way to add callbacks to Jest when all tests succeed or fail?

I'd like to run a callback when all the tests in a describe block pass (or fail). Is there a hook or something in the Jest API to do this? I could not find anything applicable in the docs.
I'm making several API requests to collect data and comparing it to data in a CSV file to diff the contents. When the tests have all passed, I would like to save all the API responses to a file, so I need some sort of 'all tests passed' callback.
You can run Jest programmatically. Note that this approach is a hack, because there is no official support for running Jest like this.
see: https://medium.com/web-developers-path/how-to-run-jest-programmatically-in-node-js-jest-javascript-api-492a8bc250de
There is afterAll, which is scoped to its describe block but runs regardless of test results. It can be used as part of a helper function that aggregates data from tests:
let responses;

const testAndSaveResponses = (name, fn) => {
    if (!responses) {
        responses = [];

        // register the hook once, the first time the helper is used
        afterAll(async () => {
            if (!responses.includes(null)) {
                // no errors, proceed with response processing
            }
        });
    }

    test(name, async () => {
        try {
            responses.push(await fn());
        } catch (err) {
            responses.push(null);
            throw err;
        }
    });
};
It's meant to be used in place of Jest's test and can be enhanced to support multiple describe scopes.
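A hypothetical usage inside a describe block might look like this (fetchUsers and fetchOrders are placeholder request helpers, not part of the original answer):

describe('API vs CSV diff', () => {
    // each call registers a regular Jest test and records its response
    testAndSaveResponses('GET /users matches CSV', () => fetchUsers());
    testAndSaveResponses('GET /orders matches CSV', () => fetchOrders());
    // the afterAll registered inside the helper runs once this block finishes
    // and only processes the responses if none of the tests threw
});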
There is also a custom environment. The Circus runner allows hooking test events, finish_describe_definition in particular. It applies to all tests, is unaware of custom data (e.g. responses that need to be saved), and has to interact with it through global variables.
Finally, there is a custom reporter, which receives a list of passed and failed tests. It applies to all tests, is unaware of custom data defined in tests, and doesn't have access to globals from the test scope, so it cannot be used to collect responses.

Mocha/Chai: How to perform async test of something that didn't happen?

I am trying to write a short mocha/chai Node test for an async process, expecting it to ignore irrelevant input. It basically looks like this (compared to the test of relevant input). The problem is: how do I write the second test? It's an async process that eventually does nothing; there are no error/success emits...
it('should process input', function(done) {
    object
        .on('success', function(result) {
            expect(result).to.equal("OK");
            done();
        })
        .asyncDoSomething('relevant input');
});

it('should ignore input', function(done) {
    object.asyncDoSomething('irrelevant input');
    // TODO: how do I verify the async process eventually did nothing?
});
That's a good one - the only solution that comes to mind is to wait for a timeout and assume that if it didn't happen within that time, it never will. But this is not good design and needlessly slows down the test suite.
Have you thought about isolating the decision logic somewhere it could be tested synchronously, and then writing a test for that?
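If you do settle for that wait-and-see approach, a minimal sketch (with an assumed 100 ms grace period) could look like this:

it('should ignore input', function(done) {
    var fired = false;
    object.on('success', function() { fired = true; });
    object.on('error', function() { fired = true; });
    object.asyncDoSomething('irrelevant input');
    // assume that if nothing was emitted within 100ms, nothing ever will be
    setTimeout(function() {
        expect(fired).to.equal(false);
        done();
    }, 100);
});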
For the moment (still awaiting possibly better solutions?), I have updated the emitter to emit some sort of an 'ignored' event for all cases where it decides to ignore the input asynchronously. For testing, I check the "cause" of the ignore using:
expect(cause).to.equal(expectedCause)
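In test form, that might look roughly like this (the 'ignored' event name and expectedCause are whatever your emitter defines):

it('should ignore input', function(done) {
    object
        .on('ignored', function(cause) {
            expect(cause).to.equal(expectedCause);
            done();
        })
        .asyncDoSomething('irrelevant input');
});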

How to avoid timeouts in mocha testcases?

Here is my code. I am passing the done callback and using supertest for the request. Since I am using assert/expect inside the request.end block in my test case, why do I need to worry about the timeout? What mistake am I making here?
it('should get battle results', function(done) {
    request(url)
        .post('/compare?vf_id=' + vf_id)
        .set('access_token', access_token)
        .send(battleInstance)
        .end(function(err, res) { // why is a timeout needed here?
            if (err) return done(err);
            console.log(JSON.stringify(res.body));
            expect(res.body.status).to.deep.equal('SUCCESS');
            done();
        });
});
Running the test case produces the following error:
Error: timeout of 2000ms exceeded. Ensure the done() callback is being called in this test.
If I run my test cases with the plain mocha command, I get this error, while if I run mocha --timeout 15000 the test case passes. But I want to avoid the timeout. How can I do that?
You can't avoid timeouts, since it looks like you're testing a remote service. If, for whatever reason, the request to that service takes a long time, you will run into timeouts.
You can tell Mocha to disable timeout checking by setting the timeout to 0, but that's probably also not ideal, because it may cause test cases to take an excessive amount of time.
As an alternative, you can mock request (which I assume is superagent) so you can control the entire HTTP request/response flow, but since it looks like you're testing a remote service (one which you have no control over) that would make this particular test case moot.
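For completeness, here is a minimal sketch of that mocking idea using nock (the choice of nock is my own assumption; url, vf_id, access_token and battleInstance are the variables from the test above):

var nock = require('nock');

it('should get battle results (mocked)', function(done) {
    // intercept the POST so no real network round trip happens
    nock(url)
        .post('/compare')
        .query(true) // match any query string, including vf_id
        .reply(200, { status: 'SUCCESS' });

    request(url)
        .post('/compare?vf_id=' + vf_id)
        .set('access_token', access_token)
        .send(battleInstance)
        .end(function(err, res) {
            if (err) return done(err);
            expect(res.body.status).to.deep.equal('SUCCESS');
            done();
        });
});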
Mocha sets a default timeout of 2 seconds (2000 ms).
You can extend the default (global) timeout from the command line using the --timeout xxxx flag.
If instead you want to change the timeout for a specific test case, you can use the this.timeout(xxxx) function, where xxxx is a number of milliseconds such as 20000 - note that it does not work inside arrow functions.
it('My test', function() {
    this.timeout(5000);
    // ... rest of your code
});
You can also set a timeout of a set of test cases (wrapped by a describe):
describe("My suite", function(){
// this will apply for both "it" tests
this.timeout(5000);
it( "Test 1", function(){
...
});
it( "Test 2", function(){
...
});
});
It also works for before, beforeEach, after, afterEach blocks.
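For example, a slow hook can get its own timeout (the 10000 ms value and the 3-second fake delay are just illustrations):

beforeEach(function(done) {
    this.timeout(10000); // allow each setup step up to 10 seconds
    setTimeout(done, 3000); // stand-in for a slow setup such as seeding a database
});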
More documentation is available here: https://mochajs.org/#timeouts
Consider that 2 seconds is usually enough time to run your tests, so extending the default timeout should be the exception, not the rule.
Also, if your test is not async and you have to extend the timeout, I would strongly suggest reviewing the function that is taking so long before extending the timeout.
Mocha: Timeouts
Test-specific timeouts may also be applied, or this.timeout(0) may be used to disable timeouts altogether:
To disable the timeout, simply set it to 0. I use mocha <file> --timeout 0 when I'm debugging, so the timeout error does not get thrown.
Here is what you need
mocha timeouts
describe('a suite of tests', function() {
    this.timeout(500);

    it('should take less than 500ms', function(done) {
        setTimeout(done, 300);
    });

    it('should take less than 500ms as well', function(done) {
        setTimeout(done, 250);
    });
});

Nodejs event handling

Following is my Node.js code:
var emitter = require('events'),
    eventEmitter = new emitter.EventEmitter();

eventEmitter.on('data', function (result) { console.log('Im From Data'); });
eventEmitter.on('error', function (result) { console.log('Im Error'); });

require('http').createServer(function (req, res) {
    res.end('Response');

    var start = new Date().getTime();

    eventEmitter.emit('data', true);
    eventEmitter.emit('error', false);

    while (new Date().getTime() - start < 5000) {
        // Let me sleep
    }

    process.nextTick(function () {
        console.log('This is event loop');
    });
}).listen(8090);
Node.js is single-threaded; it runs an event loop, and that same thread serves the events.
So, in the above code, on a request to my localhost:8090 the Node thread should be kept busy serving the request [there is a sleep for 5s].
At the same time, two events are emitted by eventEmitter. So both these events should be queued in the event loop for processing once the request is served.
But that is not happening; I can see the events being handled synchronously as they are emitted.
Is that expected? I understand that if it worked as I expect, there would be no point in extending the events module. But how are the events emitted by eventEmitter handled?
Only things that require asynchronous processing are pushed onto the event loop. The standard EventEmitter in Node dispatches an event immediately, synchronously. Only code using things like process.nextTick, setTimeout, or setInterval, or code that explicitly schedules work from C++ (as Node's own libraries do), goes through the event loop.
For example, when you use Node's fs library for something like createReadStream, it returns a stream but opens the file in the background. When the file is open, Node queues a callback on the event loop, and when that callback runs it triggers the 'open' event on the stream object. Then Node loads blocks from the file in the background and queues further callbacks that trigger 'data' events on the stream.
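For example (the file path below is just a placeholder), the stream's 'open' and 'data' events only fire on later turns of the event loop, after the synchronous code has finished:

var fs = require('fs');

var stream = fs.createReadStream('./some-file.txt'); // placeholder path
stream.on('open', function (fd) {
    console.log('file opened, fd =', fd); // fires on a later event-loop turn
});
stream.on('data', function (chunk) {
    console.log('got', chunk.length, 'bytes'); // also asynchronous
});
console.log('this logs first'); // synchronous code finishes before any stream event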
If you wanted those events to be emitted after 5 seconds, you'd want to use setTimeout or put the emit calls after your busy loop.
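For instance:

// schedule the emits through the event loop instead of firing them synchronously
setTimeout(function () {
    eventEmitter.emit('data', true);
    eventEmitter.emit('error', false);
}, 5000); // the handlers now run roughly 5 seconds later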
I'd also like to be clear: you should never have a busy loop like that in Node code. I can't tell if you were just doing it to test the event loop or if it is part of some real code. If you need more info, please expand on the functionality you are looking to achieve.
