How to persist() when using nockBack with jest and Node? - node.js

I am currently working on some unit tests for an express app.
I am using "jest": "^29.4.1" and "nock": "^13.3.0".
The tests I am writing use nockBack.
Imagine I have 3 separate test files that run the code below. The first 2 run properly: they save a nock fixture in the proper directory and then re-run just fine. As soon as I introduce a 3rd test, it runs and passes the first time (and saves a fixture etc...), but if I re-run the 3rd test it fails with this error: Error [NetworkingError]: Nock: No match for request.... I read in the docs that a way to alleviate this is to use the persist() method, BUT this is only documented for interceptors defined directly with nock against pseudo endpoints, not for nockBack. I am testing 3rd-party API calls that need to go out over the network initially; subsequent calls should then be served from the fixtures.
I tried clearing interceptor use by adding these to all my tests:
beforeEach(() => nock.cleanAll());
afterEach(() => nock.cleanAll());
But this does not help to make the 3rd test pass when re-running.
I also tried adding persist() like so: const { nockDone } = await nockBack('post-data.json').persist(); <---- but this fails since it's not a recognized method.
Is there a way to make this work when using nockBack?
Test 1
const nockBack = require('nock').back;
const path = require('path');
const { getPosts } = require('./post');

nockBack.fixtures = path.join(__dirname, '__nock-fixtures__');
nockBack.setMode('record');

test('return a list of posts by a user', async () => {
  const userId = 1;
  const { nockDone } = await nockBack('post-data.json');

  const data = await getPosts(userId);

  expect(data.length).toBeGreaterThan(0);
  data.forEach((post) => {
    expect(post).toEqual(
      expect.objectContaining({
        userId,
      })
    );
  });

  nockDone();
});
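For what it's worth, nockBack does expose a way to get persist() behavior, just not as a chained method: its options object accepts hooks that run around nock.define. A hedged sketch follows; the `after` hook receiving the loaded scope is my reading of nock's nockBack options and should be checked against your nock version, and the surrounding jest wiring is assumed from the question.

```javascript
// Hypothetical fix (verify against your nock version): nockBack's options
// accept an `after` hook that runs after nock.define loads the fixture;
// calling scope.persist() there is the nockBack analogue of the persist()
// documented for hand-built interceptors.
const nockBackPersistOptions = {
  after: (scope) => scope.persist(),
};

// In the test file it would be used like this (wiring assumed):
//   const { nockDone } = await nockBack('post-data.json', nockBackPersistOptions);
//   const data = await getPosts(userId);
//   nockDone();
```

With persisted interceptors, re-running the 3rd test no longer exhausts the recorded requests, which is what produces the "No match for request" error.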

Related

Handle async imports in jest.mock factory

I am reusing db mocks in different tests, so I created factories (so to speak) for mocking a single table, and I call them in jest.mock():
jest.mock('db', () => {
  const {
    UsersMockFactory,
    RequestMockFactory,
  } = jest.requireActual('../../../../mocks');

  return {
    Users: UsersMockFactory(),
    Requests: RequestMockFactory(),
  };
});
The problem is that 19 tests will pass with the mock from that file, but the 20th will throw an error: RequestMockFactory is not a function.
I've tried using const mocks = require('../../../../mocks') and then console.log the result, but RequestMockFactory still wasn't there for some reason. I don't know why, but it is the main reason I have this problem. I've tried to use await import() like this, and it imports correctly:
jest.mock('db', async () => {
  const {
    UsersMockFactory,
    RequestMockFactory,
  } = await import('../../../../mocks');

  return {
    Users: UsersMockFactory(),
    Requests: RequestMockFactory(),
  };
});
Now the problem is that mock factory returns a Promise for an entire mock and obviously methods on a model are undefined.
I've tried to mock inside describe(), beforeAll(), and inside it() - jest.mock does not work there.
How to handle imports in jest.mock without such pain?
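One detail worth knowing: jest.mock factories are hoisted above the imports and must return synchronously, so an async factory hands jest a Promise instead of a mock, and "X is not a function" usually means the mocks module was not fully initialized when the factory ran (often a circular-require symptom). A technique that sidesteps both problems is lazy property getters, which defer each factory call until first access. Here is a plain-JavaScript sketch of that technique; the registry object stands in for the '../../../../mocks' module and all names are illustrative, with the jest wiring assumed:

```javascript
// Lazy getters: the mock object is built immediately, but each property
// resolves its backing factory only when first accessed, so the factory
// merely needs to exist by the time a test touches the mock.
const registry = {}; // stands in for the '../../../../mocks' module

function makeLazyMock(source, names) {
  const mock = {};
  for (const name of names) {
    Object.defineProperty(mock, name, {
      get() { return source[name](); }, // call the factory on first access
      enumerable: true,
    });
  }
  return mock;
}

const dbMock = makeLazyMock(registry, ['Users', 'Requests']);

// The factories can be registered after the mock object already exists:
registry.Users = () => ({ findAll: () => [] });
registry.Requests = () => ({ findAll: () => [] });
```

Inside jest.mock('db', () => ...), the same getter object would be returned synchronously, with jest.requireActual('../../../../mocks') playing the role of registry.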

node.js: how to update the value of an already required module

I am one little step away from a very simple database cache system with nodejs. I don't want to install a complete data-handling solution like Redis; I just need to read and cache data and sometimes reload it from the DB, without any other handling features, so I implemented this:
In my cache module, for the first evaluation I query DB and return the result in module.exports
So all following require for this module will return the cached value instead of querying the DB again >> I made it so far and I divided by 5 the consumed time.
Then I need a function to reload the module >> I failed there...
I feel like I am one very little step away from succeeding. Here is what I implemented (I run node.js v14.15.1, and I simplified all the code to make it more readable):
Firstly I made the module to export the permissions by itself, it works, I got the list of permissions when I require the module:
/** FILE cached_permissions.js **/
const permissionModel = require("../models/permission.model")

const getPermissionsAtInit = async () => {
  return await permissionModel.find().exec()
}

module.exports = getPermissionsAtInit()
Then, the classic route file:
/** FILE permissions.route.js **/
// Permission service
const PermissionService = require("../services/permission.service")

// Route to get permissions from DB
app.get("/api/permissions", async (req, res) => {
  res.send(await PermissionService.getPermissionsFromDB())
})

// Route to get permissions from cache
app.get("/api/cached-permissions", async (req, res) => {
  res.send(await PermissionService.getCachedPermissions())
})

// Route to reset permission cache
app.get("/api/reset-cache", async (req, res) => {
  await PermissionService.resetCache()
  res.send("cache reset")
})
Here is the permission service module
/** FILE permission.service.js */
// Module with cached permissions
const cachedPermissions = require("./cached_permissions.js")
// Module to query the db
const permissionModel = require("../models/permission.model")

class PermissionService {
  static getPermissionsFromDB = async () => {
    // Mongoose query to get data from DB
    return await permissionModel.find().exec()
  }

  static getCachedPermissions = () => {
    // Directly returns value from cached module
    return cachedPermissions
  }

  static resetCache = async () => {
    // firstly we delete the cached module
    delete require.cache[require.resolve("./cached_permissions.js")]
    // then we require again the module to force the re-cache
    require("./cached_permissions.js")
  }
}

module.exports = PermissionService
So I can run the following:
http call /api/permissions >> returns the list of permissions from DB ~50ms
http call /api/cached-permissions >> returns the same list of permissions ~10ms
I manually update data in mongoDB
2nd http call to /api/permissions >> returned updated data
2nd http call to /api/cached-permissions >> returned first set of data
So far so good; this is the expected behavior.
Then http call to /api/reset-cache
3rd call to /api/cached-permissions >> still return not updated data
I know the /api/reset-cache is done because I console.log Object.keys(require.cache), and I can see it deleted after the delete require.cache[...] and I can see it appear again after the new require("./cached_permissions.js").
But anyway, when I call /api/cached-permissions again, the value has not been updated. It feels like the const cachedPermissions = require("./cached_permissions.js") in my permission.service.js file is loaded only once, when the server starts (when permissions.route.js requires permission.service.js), and not each time a route is called, even if I reload a specific module.
Do you know a way I could reload the module in an already loaded file, or update the value of a module in an already loaded file, or some other way to make this work? I feel like I am one very little step away from a very simple DB cache system...
It feels like the const cachedPermissions = require("./cached_permissions.js") from my file permission.service.js has been loaded only once when the server starts, and not each time a route is called
Well, that's exactly what your code is written to do: cachedPermissions is a constant in your permission.service.js module, and you never update that constant. If you instead did
static getCachedPermissions = () => {
  // Directly returns value from cached module
  return require("./cached_permissions.js");
}
it would load the module (from the module cache, or evaluate the module code if fresh) on every request.
But really there's no reason to use the module cache for this. A simple variable in your permission service module would suffice:
/* permission.service.js */
const permissionModel = require("../models/permission.model"); // Module to query the db

async function getPermissionsFromDB() {
  // Mongoose query to get data from DB
  return await permissionModel.find().exec()
}

let cachedPermissions = getPermissionsFromDB();

module.exports.getPermissionsFromDB = getPermissionsFromDB;
module.exports.getCachedPermissions = () => cachedPermissions;
module.exports.resetCache = () => {
  cachedPermissions = getPermissionsFromDB();
};
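The variable-based cache can be exercised in isolation. In this sketch, a stubbed fetchFromDB (a hypothetical stand-in for permissionModel.find().exec()) shows that the cached Promise is only replaced when resetCache() is called:

```javascript
// Stub in place of the Mongoose query; bumping dbVersion simulates a
// manual update in the database.
let dbVersion = 1;
async function fetchFromDB() {
  return { version: dbVersion };
}

// The cache is just a module-level variable holding the Promise.
let cachedPermissions = fetchFromDB();

const getCachedPermissions = () => cachedPermissions;
const resetCache = () => {
  cachedPermissions = fetchFromDB(); // re-query and replace the cached Promise
};
```

Callers await getCachedPermissions(); until resetCache() runs, they keep seeing the first result even after the "database" changes, which mirrors the behavior observed with the /api/cached-permissions route.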

How to do callback in our component using react jest test cases

How can we invoke the callbacks for the success and failure cases in the lines of code below, for test coverage using jest?
const handleService = () => {
  window.domain.service("1321", '', onSuccess, onFailure)
}

const onSuccess = () => {
  // ...update state values
}

const onFailure = () => {
  // ...update state values
}
Something like this:
Spy on window.domain.service to gain access to the calls it receives. This will allow you to access the parameters of those calls, which will be "1321", '', onSuccess, onFailure
Assign the function you wish to test to a variable
Invoke the function to execute the code in it (this will get you the coverage)
(Optional) assert that the callback functions behave correctly
Here is a snippet to help demonstrate
it('should run', () => {
  // Some setup to create the function on the window, may not be needed if done elsewhere.
  // Could be good to do this in a beforeEach and clean up in afterEach
  // to avoid contaminating the window object
  window.domain = {
    service: () => {},
  }

  // Spy on the window.domain.service method.
  // Provide a mock implementation if you don't want the real one to be called
  const serviceSpy = jest.spyOn(window.domain, 'service');

  executeYourCode();

  // capture the arguments to the call
  const [_arg1, _arg2, onSuccess, onFailure] = serviceSpy.mock.calls[0];

  // execute the callbacks
  onSuccess();
  onFailure();
});
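The capture-and-invoke idea is independent of jest. This stdlib sketch hand-rolls the bookkeeping that jest.spyOn provides, using illustrative stand-ins for the question's service and state:

```javascript
// Minimal spy: records the arguments of every call so a test can pull the
// callbacks back out and invoke them directly.
function makeSpy(impl = () => {}) {
  const spy = (...args) => {
    spy.calls.push(args);
    return impl(...args);
  };
  spy.calls = [];
  return spy;
}

// Stand-ins for the component under test:
const domain = { service: makeSpy() }; // plays the role of window.domain
let state = 'pending';
const onSuccess = () => { state = 'success'; };
const onFailure = () => { state = 'failure'; };
const handleService = () => domain.service('1321', '', onSuccess, onFailure);

// The test drives the code, then triggers a captured callback itself:
handleService();
const [, , capturedSuccess] = domain.service.calls[0];
capturedSuccess();
```

Invoking the captured callbacks is exactly what gives the onSuccess/onFailure bodies their coverage; serviceSpy.mock.calls[0] in the jest version corresponds to domain.service.calls[0] here.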

Populating mongodb in one unit test interferes with another unit test

I'm trying to run all of my unit tests asynchronously, but calling a function to populate the database with some dummy data interferes with the other unit tests that run at the same time and that make use of the same data.
collectionSeed.js file:
const {ObjectID} = require('mongodb');
import { CollectionModel } from "../../models/collection";

const collectionOneId = new ObjectID();
const collectionTwoId = new ObjectID();

const collections = [{
  _id: collectionOneId
}, {
  _id: collectionTwoId
}];

const populateCollections = (done) => {
  CollectionModel.remove({}).then(() => {
    var collectionOne = new CollectionModel(collections[0]);
    collectionOne.save(() => {
      var collectionTwo = new CollectionModel(collections[1]);
      collectionTwo.save(() => {
        done();
      });
    });
  });
};
unitTest1 file:
beforeEach(populateCollections);

it('Should run', (done) => {
  // do something with collection[0]
})
unitTest2 file:
beforeEach(populateCollections);

it('Should run', (done) => {
  // do something with collection[0]
})
I'm running unit tests that change, delete, and add data in the database, so using beforeEach is preferable to keep the data consistent. But the CollectionModel.remove({}) call often runs between an it function from one file and an it function in the other unit test file, so one unit test works fine while the second it is trying to use data that no longer exists.
Is there any way to prevent the different unit test files from interfering with each other?
I recommend you create a database per test file, for example by adding the name of the file to the DB name. Then you only have to take care that tests don't interfere within the same file, and you can forget about tests in other files.
I think that managing fixtures is one of the most troublesome parts of unit testing, so with this, creating and fixing unit tests becomes smoother.
As a trade-off, each test file will take more execution time, but in my opinion in most cases it is worth it.
Ideally each test should be independent of the rest, but in general that would add way too much overhead, so I recommend the once-per-test-file approach.

Resemblejs in jest hangs

I'm using ResembleJS for image comparison. I can get it to run when I run it in a standalone script. Here's the code:
var compareImages = require('resemblejs/compareImages');
var fs = require('fs');
var path = require('path');

// The parameters can be Node Buffers
// data is the same as usual with an additional getBuffer() function
async function getDiff() {
  var img = path.join(__dirname, 'small.jpg');

  const data = await compareImages(
    fs.readFileSync(img),
    fs.readFileSync(img)
  );

  console.log(data);
  fs.writeFileSync('./output.png', data.getBuffer());
}

getDiff();
getDiff();
Everything works as expected.
But when I run the comparison inside of a test with the jest framework, it hangs and eventually times out. At first I thought maybe it was just running really slowly, so I set my max timeout in jest to 1 minute. Still failed. So I set my test image to be 1 pixel so it's the simplest possible test. Still wouldn't finish.
Running from a docker container with Node 8.9.4 (which is what comes from the docker hub node:8 image). Running jest 22.0.4.
Anybody else have issues running these two together?
I know Resemblejs runs tests with Jest, so not sure what could be causing the issue.
Could you please post the code for your tests?
Are you sure you are returning something from your test block? In order for a test not to hang, you need to return a promise which resolves before the timeout. Below are two examples:
test("test", () => {
  // test is done when writeFile resolves
  return new Promise((resolve, reject) => {
    fs.writeFile("path", "encoding", (err) => {
      if (err) {
        reject(err);
      } else {
        resolve();
      }
    });
  });
});

test("test", async function () {
  // test is done after the assertion
  const result = await fetch();
  expect(result).toBe(); // test
});
I had a similar problem with slow tests with Jest, React and Docker (but I'm not using Resemblejs).
I found the solution on Github:
And for me the solution was simply adding "roots": ["./src"] to jest.config.js
