I have my regular unit tests in folders alongside my services.
I then created a new folder called integration/, and inside it all my tests look like anotherFolder/testSomeApi.integration.js.
I did this so that when I run jest, it runs all the unit tests but not the integration tests. I want to run the integration tests from my Docker container with a separate command.
How can I call something like jest *integration.js so that all tests in the integration folder with the extension .integration.js get run?
jest --testPathPattern=".*/folderName/.*.spec.ts"
is working for me.
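Adapted to the layout in the question, the same flags could be wired into package.json like this (the script names and the ignore pattern are my own suggestion, not from the question):

```json
{
  "scripts": {
    "test": "jest --testPathIgnorePatterns=\"integration\"",
    "test:integration": "jest --testPathPattern=\"integration/.*\\.integration\\.js$\""
  }
}
```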
Inside your integration folder, create a Jest config file, e.g. jest-integration.json:
{
"rootDir": ".",
"testEnvironment": "node",
"testRegex": "\\.integration\\.js$"
}
Now you can run jest from your project root like so:
jest --config ./integration/jest-integration.json
You could save this command as an npm script in your package.json and run it with npm run test:integration.
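For example, the scripts section could look like this (assuming the config path from the answer above):

```json
{
  "scripts": {
    "test:integration": "jest --config ./integration/jest-integration.json"
  }
}
```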
In the end I did
jest "(/integration/.*|\\.(integration))\\.(js)$"
I'm running tests in a node app using jest. I can get tests running properly, but I can't seem to tell jest to ignore directories. For example, when I try to test a specific file convertExistingImages.ts with the command: npm test convertExistingImages I get a response in my terminal of:
> mfa-bot-2022@1.0.0 test
> jest
FAIL dist/utils/maintenance.ts/convertExistingImages.test.js
● Test suite failed to run
Your test suite must contain at least one test.
(...)
FAIL src/utils/maintenance/convertExistingImages.test.ts
● Test suite failed to run
(...)
As you can see, a duplicate file in my /dist folder is also being tested, which I don't want.
I've tried updating my jest.config.ts file as follows:
module.exports = {
"preset": "#shelf/jest-mongodb",
"modulePathIgnorePatterns": ["/build/"],
"testPathIgnorePatterns": ["/node_modules/", "/build/"]
}
But the modulePathIgnorePatterns and testPathIgnorePatterns settings aren't having any effect.
Can anyone tell me what I'm doing wrong?
You configured it to ignore the build folder, but your conflict is in the dist folder. Change build to dist in your ignore settings.
You can read more about these options (testPathIgnorePatterns and modulePathIgnorePatterns) in the Jest configuration docs.
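With that change, the config from the question would become (a sketch that keeps the original preset):

```js
module.exports = {
"preset": "@shelf/jest-mongodb",
// ignore the compiled output in dist/ rather than build/
"modulePathIgnorePatterns": ["/dist/"],
"testPathIgnorePatterns": ["/node_modules/", "/dist/"]
}
```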
I have a SAM application with a bunch of Lambda functions and layers, using Mocha/Chai to run unit tests on the individual functions.
The issue is that I am also using Layers for shared local modules.
The SAM project structure is like this:
functions/
function-one/
app.js
package.json
function-two/
app.js
package.json
layers/
layer-one/
moduleA.js
moduleB.js
package.json
layer-two/
moduleC.js
package.json
According to AWS, once the function and layers are deployed, you require a local layer module from a function with this path:
const moduleA = require('/opt/nodejs/moduleA');
However, when running locally as a unit test, that path won't resolve to anything.
Any idea on how to resolve the paths to the layer modules when running unit tests?
I could set an ENV var, and then set a base path for the layers based on that, but I was wondering if there was a more elegant solution I was missing...
Is there any way to alias the paths when running Mocha ?
Another option is to use sam local invoke, but that has massive overhead and is more integration testing...
I swapped over to using Jest, which does support module mappings.
In the package.json...
...
"scripts": {
"test": "jest"
},
"jest": {
"moduleNameMapper": {
"^/opt/nodejs/(.*)$": "<rootDir>/layers/common/$1"
}
}
...
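Conceptually, moduleNameMapper is just a regex rewrite of the request path before resolution. This is a simplified illustration of that idea (not Jest's actual resolver):

```javascript
// Simplified sketch of what a moduleNameMapper entry does: rewrite the
// requested module path before it is resolved on disk.
const moduleNameMapper = {
  "^/opt/nodejs/(.*)$": "<rootDir>/layers/common/$1",
};

function mapRequest(request, rootDir) {
  for (const [pattern, target] of Object.entries(moduleNameMapper)) {
    const re = new RegExp(pattern);
    if (re.test(request)) {
      // Apply the capture-group substitution, then expand <rootDir>.
      return request.replace(re, target).replace("<rootDir>", rootDir);
    }
  }
  // Requests that match no mapping are resolved as-is.
  return request;
}

console.log(mapRequest("/opt/nodejs/moduleA", "/project"));
// → /project/layers/common/moduleA
```

So a test file can keep the deployed-style require('/opt/nodejs/moduleA') and Jest will resolve it to the local layer source.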
In my application, while developing, I run:
npm run test src/components/component.test.tsx
This runs the specific test suite for the component I'm working on.
On top of that, I can then change it to:
npm run test src/components/component.test.tsx -- --coverage --coverageReporters=text-summary --collectCoverageFrom=src/components/component.tsx
Which will print a coverage report for that specific file once the tests have been run.
As you can see this is extremely wordy and only gets worse if I want to test two or three files at the same time.
Is there any way to automate collectCoverageFrom to collect coverage from the files that have been tested (not from all files in the project) so that I don't have to type it out manually every single time?
Just omit the "collectCoverageFrom" (or explicitly set it to an empty glob if you're overriding the config file).
Jest will then only collect coverage from files that are used during the test run.
Set it up in your Jest configuration file.
Your npm script will look like jest -c path/to/jest.config.js
jest.config.js will look like:
module.exports = {
collectCoverage: true,
// The directory where Jest should output its coverage files
coverageDirectory: "./coverage",
// Indicates which provider should be used to instrument code for coverage
coverageProvider: "v8",
// A list of reporter names that Jest uses when writing coverage reports
coverageReporters: ["html", "text", "cobertura"],
}
If you run jest --init, it will help you build a new config file.
Side note: You may want to set up a jest wildcard so you don't need to individually write down every file you want to test.
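For instance, the side note above might translate to scripts like these (the paths are only examples):

```json
{
  "scripts": {
    "test": "jest -c path/to/jest.config.js",
    "test:components": "jest -c path/to/jest.config.js src/components"
  }
}
```

Passing a path or pattern after jest restricts the run to matching test files, while the coverage settings stay in the config file.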
I'm using Jest to write integration tests for my Serverless Framework service. Currently I have .env files specifying the environment variables, and that's where my problem lies.
In my package.json I have:
...
"scripts": {
...
"start": "npx sls offline start --env local --httpPort xxxx --port xxxx --lambdaPort xxxx"
}
...
When I call yarn start, the service starts correctly, reading from the .env.local file. But when I call exec('yarn start') inside the beforeAll function (because I need to run the service to test the endpoints), the service starts with the configuration from the .env file, not .env.local.
I've run out of ideas for setting the right variables. I have used Jest's setupFiles and tried to set the variables manually, like process.env.ENV1='XX', but it did not work. So far the only thing that has worked is changing my test script from jest to ENV1=X ENV2=Y ENV3=Z jest, but that doesn't feel right.
There is a nice Serverless plugin called serverless-export-env; it exports all the environment variables you set in serverless.yml so that you can use them in Jest or when invoking functions locally.
After installing the plugin, put it as the first item under the plugins key, like:
plugins:
- serverless-export-env
- serverless-plugin-log-retention
- serverless-offline
Also specify the export settings under custom:
custom:
export-env:
filename: .env
overwrite: false
enableOffline: true
In this example, the environment variables are exported to a .env file in your project root.
Then you can run serverless export-env to export the environment variables to .env.
In addition, you can automate this process by adding the command to the scripts section of your package.json, so that when you run npm test it also runs serverless export-env for you; see the plugin's documentation for more.
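A minimal way to wire that up is npm's pre-script convention: npm runs a pretest script automatically before test, so the .env file is regenerated on every test run:

```json
{
  "scripts": {
    "pretest": "serverless export-env",
    "test": "jest"
  }
}
```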
Hope it helps.
I'm interested in creating fully mocked unit tests, as well as integration tests that check whether some async operation has returned correctly. I'd like one command for the unit tests and one for the integration tests so that I can run them separately in my CI tools. What's the best way to do this? Tools like Mocha and Jest only seem to focus on one way of doing things.
The only option I see is using mocha and having two folders in a directory.
Something like:
__unit__
__integration__
Then I'd need some way of telling mocha to run all the __unit__ tests in the src directory, and another to tell it to run all the __integration__ tests.
Thoughts?
Mocha supports directories, file globbing, and test name grepping, all of which can be used to create "groups" of tests.
Directories
test/unit/whatever_spec.js
test/int/whatever_spec.js
Then run tests against all js files in a directory with
mocha test/unit
mocha test/int
mocha test/unit test/int
File Prefix
test/unit_whatever_spec.js
test/int_whatever_spec.js
Then run mocha against specific files with
mocha test/unit_*_spec.js
mocha test/int_*_spec.js
mocha
Test Names
Create outer blocks in mocha that describe the test type and class/subject.
describe('Unit::Whatever', function(){})
describe('Integration::Whatever', function(){})
Then run the named blocks with Mocha's "grep" argument --grep/-g
mocha -g ^Unit::
mocha -g ^Integration::
mocha
It is still useful to keep the file or directory separation when using test names so you can easily differentiate the source file of a failing test.
package.json
Store each test command in your package.json scripts section so it's easy to run with something like yarn test:int or npm run test:int.
{
"scripts": {
"test": "mocha test/unit test/int",
"test:unit": "mocha test/unit",
"test:int": "mocha test/int"
}
}
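If you go the test-name route instead, the same idea works with grep (the block-name prefixes are the ones assumed in the examples above):

```json
{
  "scripts": {
    "test:unit": "mocha -g \"^Unit::\"",
    "test:int": "mocha -g \"^Integration::\""
  }
}
```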
Mocha does not support labels or categories; you understand correctly.
You must create two folders, unit and integration, and call Mocha like this:
mocha unit
mocha integration