I'm using jest (^28.1.3) for running unit and e2e tests on the backend (Apollo Server, TypeORM and PostgreSQL). There are no problems running unit tests (one file, two, or all of them), but I have a problem with e2e testing.
On local machine
If I run one file with e2e tests (e.g. regression.e2e-spec.ts), I get an error.
If I run two files (e.g. regression.e2e-spec.ts + someUnit.test.ts, or regression.e2e-spec.ts + otherE2ETest.e2e-spec.ts), I don't get any error and testing finishes successfully.
On GitHub Actions
I get the same error in every case (running one, two or all files).
Does jest execute tests in the main process/thread when it finds only one file, and split execution across several processes when it finds several files?
The problem might be isolation: with only one file the environment isn't clean.
Files are filtered in jest.config.js; I put the file name in testMatch.
/** @type {import('ts-jest/dist/types').InitialOptionsTsJest} */
module.exports = {
  roots: ['<rootDir>/src'],
  testEnvironment: 'node',
  testMatch: ['**/src/**/*.(test|e2e-spec).ts'], // keep in sync with src/__test__/setup/E2ERegExp.ts
  globalSetup: './src/__test__/setup/globalSetup.ts',
  globalTeardown: './src/__test__/setup/globalTeardown.js',
  setupFilesAfterEnv: ['./src/__test__/setup/setup.ts'],
  transform: {
    '^.+\\.ts': 'ts-jest',
  },
};
In globalSetup and globalTeardown I set up and drop the databases. In setup.ts I start the server and connect to the DB.
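To give an idea of what the globalSetup does, here is a simplified sketch (the real helper code, database name and env var differ, this is only the shape of it):

// src/__test__/setup/globalSetup.ts (simplified sketch)
import { Client } from 'pg';

export default async function globalSetup(): Promise<void> {
  // Connect to the maintenance database and create a fresh test database
  // before any suite runs. ADMIN_DATABASE_URL and app_test are placeholders.
  const admin = new Client({ connectionString: process.env.ADMIN_DATABASE_URL });
  await admin.connect();
  await admin.query('DROP DATABASE IF EXISTS app_test');
  await admin.query('CREATE DATABASE app_test');
  await admin.end();
}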
Tests are executed in parallel.
"docker:test": "RUN_DB=true NODE_OPTIONS=--max-old-space-size=4096 jest --colors --maxWorkers=50%",
Related
I'm running tests in a node app using jest. I can get tests running properly, but I can't seem to tell jest to ignore directories. For example, when I try to test a specific file, convertExistingImages.ts, with the command npm test convertExistingImages, I get this response in my terminal:
> mfa-bot-2022@1.0.0 test
> jest
FAIL dist/utils/maintenance.ts/convertExistingImages.test.js
● Test suite failed to run
Your test suite must contain at least one test.
(...)
FAIL src/utils/maintenance/convertExistingImages.test.ts
● Test suite failed to run
(...)
As you can see, a duplicate file in my /dist folder is also being tested, which I don't want.
I've tried updating my jest.config.ts file as follows:
module.exports = {
  "preset": "@shelf/jest-mongodb",
  "modulePathIgnorePatterns": ["/build/"],
  "testPathIgnorePatterns": ["/node_modules/", "/build/"]
}
But the modulePathIgnorePatterns and testPathIgnorePatterns settings aren't having any effect.
Can anyone tell me what I'm doing wrong?
You configured it to ignore the build folder, but your conflict is in the dist folder. Change build to dist in your ignore settings.
You can read more about this config on the Jest site here.
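In other words, something along these lines (adjust the patterns to your actual output folder):

module.exports = {
  "preset": "@shelf/jest-mongodb",
  // Ignore the compiled output in /dist/ instead of /build/.
  "modulePathIgnorePatterns": ["/dist/"],
  "testPathIgnorePatterns": ["/node_modules/", "/dist/"]
}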
I have an Express.js server which uses jest and supertest as its testing framework.
It has been working excellently.
When I call my test npm script, it runs npx jest and all of my test files run in parallel.
However, I ran my tests recently and they ran sequentially, which takes a very long time; they have done this ever since.
I haven't changed any jest or npm configuration, nor have I changed my test files themselves.
Has anyone experienced this? Or is it possible that something in my configuration is incorrect?
jest.config
export default {
  setupFilesAfterEnv: ['./__tests__/jest.setup.js'],
}
jest.setup.js
import { connectToDatabase } from '/conn'

// Override dotenv to use a different .env file
require('dotenv').config({
  path: '.env.test',
})

beforeAll(() => {
  connectToDatabase()
})

test('', () => {
  // just a dummy test case
})
EDIT: Immediately after posting the question, I re-ran the tests and they ran in parallel, without me changing anything. If anyone has any knowledge around this, I'd be interested to get a second opinion.
After intermittently switching between parallel and sequential for unknown reasons, I have found it works consistently by adding the --no-cache arg to the npx jest call.
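Concretely, that just means invoking it like this (adjust to however your test script calls jest):

npx jest --no-cache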
See below for where I found the answer:
GitHub -> jest not always running in parallel
In my application, while developing, I run:
npm run test src/components/component.test.tsx
This runs the specific test suite for the component I'm working on.
On top of that, I can then change it to:
npm run test src/components/component.test.tsx -- --coverage --coverageReporters=text-summary --collectCoverageFrom=src/components/component.tsx
Which will print a coverage report for that specific file once the tests have been run.
As you can see, this is extremely wordy, and it only gets worse if I want to test two or three files at the same time.
Is there any way to automate collectCoverageFrom to collect coverage from the files that have been tested (not from all files in the project) so that I don't have to type it out manually every single time?
Just omit the "collectCoverageFrom" (or explicitly set it to an empty glob if you're overriding the config file).
Jest will then only collect coverage from files that are used during the test run.
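With that, the command from the question can shrink to something like:

npm run test src/components/component.test.tsx -- --coverage --coverageReporters=text-summary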
Set it up in your jest configuration file.
Your npm script will look like jest -c path/to/jest.config.js.
jest.config.js will look like this:
module.exports = {
  collectCoverage: true,
  // The directory where Jest should output its coverage files
  coverageDirectory: "./coverage",
  // Indicates which provider should be used to instrument code for coverage
  coverageProvider: "v8",
  // A list of reporter names that Jest uses when writing coverage reports
  coverageReporters: ["html", "text", "cobertura"],
}
If you run jest --init, it will help you build a new config file.
Side note: You may want to set up a jest wildcard so you don't need to individually write down every file you want to test.
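For example, passing a directory (or any pattern that matches several test file paths) to jest runs all the matching suites at once, so something like this covers every component test without listing each file:

npm run test src/components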
I have a react app and I don't know why I don't need to require the jest module.
import Task from './Task';

describe('class Task', () => {
  it('inProgress()', () => {
    var t = new Task("prova");
    expect(t.isInProgress()).not.toBeTruthy();
  });
});
The test command for create-react-app runs react-scripts test --env=jsdom.
The script for react-scripts test requires jest on this line and after configuring everything it runs jest on this line.
jest then finds your test files, loads them, and runs them.
In other words, your tests don't load and run jest; it's jest that loads and runs your tests.
Since your tests run within jest they can take advantage of the globals, expect, environment, etc. provided by jest without having to "require or import anything to use them".
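If you ever do want explicit imports, newer versions of Jest also expose those globals as a module, so the same test could be written like this (optional, not something create-react-app requires):

// Explicit imports instead of relying on Jest's injected globals.
import { describe, it, expect } from '@jest/globals';
import Task from './Task';

describe('class Task', () => {
  it('inProgress()', () => {
    var t = new Task("prova");
    expect(t.isInProgress()).not.toBeTruthy();
  });
});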
I've been messing around with node-replay (https://github.com/assaf/node-replay) to see if there is a way I can hook it up with my protractor tests to get my tests to run with recorded data (so they run quicker and not so damn slow).
I installed node-replay as instructed on the GitHub page. Then in my test file I include some node-replay code as follows:
describe('E2E: Checking Initial Content', function(){
  'use strict';

  var ptor;
  var Replay = require('replay');
  Replay.localhost('127.0.0.1:9000/');

  // keep track of the protractor instance
  beforeEach(function(){
    browser.get('http://127.0.0.1:9000/');
    ptor = protractor.getInstance();
  });
and my config file looks like this:
exports.config = {
  seleniumAddress: 'http://0.0.0.0:4444/wd/hub',

  // Capabilities to be passed to the webdriver instance.
  capabilities: {
    'browserName': 'chrome'
  },

  // Spec patterns are relative to the current working directory when
  // protractor is called.
  specs: ['test/e2e/**/*.spec.js'],

  // Options to be passed to Jasmine-node.
  jasmineNodeOpts: {
    showColors: true,
    defaultTimeoutInterval: 300000
  }
};
Then I try to run my tests with grunt by saying
REPLAY=record grunt protractor
But I get tons of failures. grunt protractor was running all of my tests fine with no failures before I added node-replay, so maybe my logic is flawed in how to connect these two together. Any suggestions as to what I'm missing?
1) E2E: Sample test 1
Message:
UnknownError:
Stacktrace:
UnknownError:
at <anonymous>
The problem is that HTTP requests to 127.0.0.1:9000 are made by the browser, not within your Node.js Protractor code, so replay won't work in this infrastructure scenario.
There is ongoing discussion on Protractor Tests without a Backend here, and some folks rely on mocking the backend client-side with Protractor's addMockModule in a similar way they already do for Karma unit tests.
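For reference, that approach looks roughly like this (a sketch; it assumes angular-mocks is loaded in the page, and the endpoint and response are placeholders):

// Inject an Angular module that stubs HTTP calls with ngMockE2E's $httpBackend.
browser.addMockModule('httpBackendMock', function () {
  angular.module('httpBackendMock', ['ngMockE2E'])
    .run(function ($httpBackend) {
      // Placeholder endpoint with a canned response.
      $httpBackend.whenGET('/api/items').respond([{ id: 1, name: 'stub' }]);
      // Let everything else (templates, assets) pass through to the real server.
      $httpBackend.whenGET(/.*/).passThrough();
    });
});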
Personally I don't agree with mocking for e2e, since the whole point of end-to-end is to test the whole real app.
HTTP replay may not be such a bad idea to make things go faster.
Ideally what I hoped to find was a tool that works like this:
Run a proxy/capture HTTP server the first time, for later replay:
capture 127.0.0.1:9000 --into-port 3333
Run your e2e tests against a baseUrl = '127.0.0.1:3333';. All requests/responses will be cached/saved.
Serve the cached content from now on:
replay --at-port 3333
Run your e2e tests again, still on baseUrl port 3333. This time it should run faster since it's serving cached content.
Couldn't find it, let me know if you have better luck!