I'm using Chutzpah, with Jasmine, to unit test a number of AMD modules using Require.js.
My unit test project is separate from both the modules under test and the require.js config file.
I'm using chutzpah.json to connect these together, as such:
{
"Framework": "jasmine",
"TestHarnessReferenceMode": "AMD",
"TestHarnessLocationMode": "SettingsFileAdjacent",
"EnableTestFileBatching": true,
"AMDBasePath": "matches baseUrl path in require.js config file",
"References" : [
{"Path" : "path to require.js" },
{ "Path": "path to require.js config file" }
],
"Tests" : [
{"Path": "Specs"}
]
}
The tests run okay as expected.
The issue is that somewhere in the magic of resolving dependencies, I'm getting errors saying that a number of the CSS files cannot be located. These are referenced by relative paths, and my guess is that because the tests are initiated from a separate project, the base path isn't being resolved correctly.
As I say, this isn't an issue when running the tests locally, but it would be a problem when integrating with a CI build.
Has anybody experienced this before and know of a workaround?
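For context, here is a minimal sketch of the kind of require.js config involved (the 'css' plugin and the 'Content/styles' folder below are placeholders, not my real names); the relative CSS paths end up being resolved against whatever baseUrl the test harness uses:
require.config({
    // should match AMDBasePath in chutzpah.json
    baseUrl: '/Scripts',
    paths: {
        // the stylesheets are pulled in via a css loader plugin such as require-css
        css: 'lib/css.min',
        // the css files live outside baseUrl, hence the relative path that breaks
        'Content/styles': '../Content/styles'
    }
});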
I want to run an e2e test using Jest. I'm using the Nx monorepo architecture, with all my assets in a library folder and NestJS microservices for my backend. All the proto files for my microservices live in the library, and I load them in each microservice like this:
protoPath: join(__dirname, 'assets-shared/job.proto'),
In my workspace.json, the build target maps the assets folder to assets-shared like this:
"targets": {
"build": {
"options": {
"assets": [
{
"input": "libs/backend/shared/src/lib/assets",
"glob": "**/*",
"output": "assets-shared"
}
]
},
The build is fine, but when I run the test the path isn't rewritten, and I get this error because it looks in the app's own folder instead of the library folder:
ENOENT: no such file or directory, open '/home/dev/Project/apps/backend/api/src/modules/product/assets-shared/job.proto'
I tried moduleNameMapper to point it at the libs folder manually, but to no avail:
moduleNameMapper: {
'^.+\\.(proto)$':
'<rootDir>/libs/backend/shared/src/lib/assets/$1',
// '^assets-shared(.*)': '/libs/backend/shared/src/lib/assets/$1',
},
Neither of these two worked.
Have you considered publishing the libraries to a private npm registry or something like Artifactory, e.g. @my-company/assets?
The approach you are trying may work locally, but for a CI/CD pipeline it would be much better to have a versioned artifact in npm or Artifactory.
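As a rough sketch (the package name and glob are placeholders, not something from your repo), the assets library would ship its proto files with a package.json along these lines:
{
  "name": "@my-company/assets",
  "version": "1.0.0",
  "files": ["**/*.proto"]
}
The microservice could then resolve the file from the installed package, e.g. protoPath: require.resolve('@my-company/assets/job.proto'), which behaves the same locally, under Jest, and in the CI build (assuming the package doesn't restrict subpath access via an exports field).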
I'm trying to run e2e tests for a monorepo application that also utilises several libraries from within the monorepo. All imports throughout the application are resolved using "@app" imports, for example import { ConfigService } from "@app/config".
However, when trying to run e2e tests via the command:
"test:public": "jest --config ./apps/public-microservice/test/jest-e2e.json",
Jest throws:
Cannot find module '@app/config' from '../src/public-microservice.module.ts'
I've looked at this demo repo from @jmcdo29 and can't find anything that differs from my configuration.
I've noticed there was an issue about wrong configurations being generated via jest here in 2019, but it has long been resolved, and my configuration for jest in package.json does indeed mention:
"moduleNameMapper": {
"#app/config/(.*)": "<rootDir>/libs/config/src/$1",
"#app/config": "<rootDir>/libs/config/src",
whilst the file targeted by the package.json script command only contains:
{
"moduleFileExtensions": ["js", "json", "ts"],
"rootDir": ".",
"testEnvironment": "node",
"testRegex": ".e2e-spec.ts$",
"transform": {
"^.+\\.(t|j)s$": "ts-jest"
}
}
Is there something that's missing from my command or my configuration?
Is there anything I need to specify to tell Jest to extend the Jest configuration available in package.json?
Any help investigating this is appreciated, thanks.
I added the "moduleNameMapper" key to my jest e2e suite config and updated the "../" relative paths to match where my libs are, in my specific scenario, it looks like this:
"moduleNameMapper": {
"#app/config/(.*)": "<rootDir>../../../libs/config/src/$1",
"#app/config": "<rootDir>../../../libs/config/src",
},
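For completeness, the full jest-e2e.json then looks something like this (paths are from my setup above; yours may differ). Since rootDir is ".", i.e. the test folder next to this config, the ../../../ climbs back up to the repo root where libs/ lives:
{
  "moduleFileExtensions": ["js", "json", "ts"],
  "rootDir": ".",
  "testEnvironment": "node",
  "testRegex": ".e2e-spec.ts$",
  "transform": {
    "^.+\\.(t|j)s$": "ts-jest"
  },
  "moduleNameMapper": {
    "@app/config/(.*)": "<rootDir>../../../libs/config/src/$1",
    "@app/config": "<rootDir>../../../libs/config/src"
  }
}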
I have my Node server at F:\proj\dev-react-node-java\src\server. I used 'jasmine init' to create a spec folder there, and running 'jasmine' in the terminal runs the specs (tests) correctly.
I wish to run the tests from F:\proj\dev-react-node-java so I used the command
jasmine --config=src/server/spec/support/jasmine.json
at this path but I get the message 'No specs found'. Why is it not using the correct configuration file (jasmine.json)?
I am sure --config finds this file because:
Giving a wrong path gives a 'Cannot find module' error.
Writing invalid JSON also generates an error.
Here is the jasmine.json code for reference:
{
"spec_dir": "spec",
"spec_files": [
"**/*[sS]pec.js"
],
"helpers": [
"helpers/**/*.js"
],
"stopSpecOnExpectationFailure": false,
"random": true
}
spec/support/jasmine.json is the default path as far as I understand, since running the 'jasmine' command from, say, F:\proj\dev-react-node-java\src\server\spec also results in 'No specs found'.
jasmine version is 3.6.1
P.S. This is my first question asked here. Please inform if I made any mistakes in asking. Thank you.
I did find the reason. It is indeed not an issue with the config flag but rather with my jasmine.json file.
I thought the --config flag simply specified the path to the file to use instead of the default spec/support/jasmine.json, and that it would then behave exactly as if the config had been found at the default location relative to that directory.
But
F:\proj\dev-react-node-java>jasmine --config=src/server/spec/support/jasmine.json
is not the same as
F:\proj\dev-react-node-java\src\server>jasmine --config=spec/support/jasmine.json
What it does instead is resolve the paths inside the file relative to the directory the command was called from, and then use those to find the tests.
Hence, what worked was changing the spec_dir field:
{
"spec_dir": "src/server/spec",
"spec_files": [
"**/*[sS]pec.js"
],
"helpers": [
"helpers/**/*.js"
],
"stopSpecOnExpectationFailure": false,
"random": true
}
A little more clarification or an example in the docs would have been nice, but perhaps I misunderstood the functionality.
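For anyone wiring this into CI, a script in a package.json at the repo root (assumed layout) keeps the invocation in one place:
{
  "scripts": {
    "test:server": "jasmine --config=src/server/spec/support/jasmine.json"
  }
}
npm run test:server then runs from the repo root, which is the directory spec_dir is resolved against.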
With a Node.js project, I've added eslint-plugin-security and it is giving a lot of warnings for code in my test/spec files (using mochajs). Since the test code won't be running in production, these don't seem as useful as they do in the project's actual code (a lot of Generic Object Injection Sink warnings).
Is there a way to have the security plugin ignore certain files other than putting /* eslint-disable */ at the top of every spec file?
The best way I found to deal with this case is based on this answer.
You can override parts of your eslint file in a subfolder. In my case I'm disabling problematic rules from a jest plugin inside my e2e tests folder. Example .eslintrc.js in /e2e-tests/:
module.exports = {
overrides: [
{
files: ["*.spec.js"],
rules: {
"jest/valid-expect": 0
}
}
]
};
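For the security plugin from the question, a similar override would turn off the rule behind the "Generic Object Injection Sink" warnings for spec files only (a sketch, assuming the specs match *.spec.js):
module.exports = {
  overrides: [
    {
      files: ["*.spec.js"],
      rules: {
        // the rule that reports "Generic Object Injection Sink"
        "security/detect-object-injection": "off"
      }
    }
  ]
};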
There are three ways to ignore files or folders:
1. Create a .eslintignore in your project root folder listing what you want to ignore:
**/*.js
2. Use the ESLint CLI with --ignore-path to specify another file where your ignore rules are located:
eslint --ignore-path .jshintignore file.js
3. Use the eslintIgnore field in your package.json:
{
"name": "mypackage",
"version": "0.0.1",
"eslintConfig": {
"env": {
"browser": true,
"node": true
}
},
"eslintIgnore": ["*.spec.ts", "world.js"]
}
Official Documentation
On my side, I had an issue with IntelliJ IDEA where ESLint was checking files in a folder dedicated only to TypeScript (+ TSLint), which was a pain, so I picked solution 3.
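Applied to the spec files from the question, option 3 might look like this (the glob is an assumption about where the specs live); note that eslintIgnore skips those files entirely rather than only silencing the security rules, so the overrides approach above may be preferable if you still want other linting on your specs:
{
  "name": "mypackage",
  "version": "0.0.1",
  "eslintIgnore": ["test/**/*.spec.js"]
}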
I'm currently migrating a codebase from Babel 6 to 7. The code is made up of multiple individual projects with their own configs.
The main project imports files from external; however, the scripts imported from external by main aren't being transpiled and fail with "Unexpected token import". Scripts located directly in main do transpile correctly.
I'm using the following command within the main project to transpile the scripts:
babel-node ./index.js
Another project uses Webpack to do the same thing and handles everything correctly.
This setup also worked fine with Babel 6.
.babelrc for main
{
"ignore": [
"node_modules"
],
"presets": [
["#babel/preset-env", {
"targets": {
"node": "current"
},
"useBuiltIns": "entry"
}]
],
"plugins": [
[
"module-resolver", {
"alias": {
"External": "../external"
}
}
],
"#babel/plugin-proposal-decorators",
"#babel/plugin-proposal-object-rest-spread",
"#babel/plugin-proposal-export-default-from",
"#babel/plugin-proposal-export-namespace-from",
"#babel/plugin-proposal-class-properties"
]}
.babelrc for external
{
"presets": [
"#babel/preset-react"
],
"plugins": [
"#babel/plugin-proposal-class-properties",
"#babel/plugin-proposal-object-rest-spread",
"#babel/plugin-transform-runtime"
]}
I've created an example to detail my problem at:
https://gitlab.com/nerdyman/babel-7-external-import-broken
TL;DR: I'm trying to import scripts from outside a project's root directory, but they don't get transpiled by Babel; scripts from within the project do transpile.
I've managed to fix this by following this comment.
The solution is:
Move .babelrc in the main project to babel.config.js and make it a CommonJS module
Add --ignore=node_modules when running babel-node from the main project
This still seems pretty hacky, and Babel doesn't seem to acknowledge the ignore property within babel.config.js; it must be specified as a flag.
Babel 7 appears to only transpile imports within the directory its config file lives in; however, explicitly setting the --ignore flag overrides this.
You can view my working demo and the diff of what I changed to get it working.
The issue is still open on GitHub so there may be a better solution in the future.
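Roughly, the resulting babel.config.js is the .babelrc above re-expressed as a CommonJS module (a sketch; the decorators/class-properties options are my additions, since Babel 7 insists on a decorators mode, and are not part of the original config):
// babel.config.js in the main project
module.exports = {
  presets: [
    ['@babel/preset-env', { targets: { node: 'current' }, useBuiltIns: 'entry' }]
  ],
  plugins: [
    ['module-resolver', { alias: { External: '../external' } }],
    // legacy mode is an assumption; Babel 7 requires choosing a decorators proposal
    ['@babel/plugin-proposal-decorators', { legacy: true }],
    '@babel/plugin-proposal-object-rest-spread',
    '@babel/plugin-proposal-export-default-from',
    '@babel/plugin-proposal-export-namespace-from',
    // loose mode to match legacy decorators; also an assumption
    ['@babel/plugin-proposal-class-properties', { loose: true }]
  ]
};
The run command then becomes babel-node ./index.js --ignore=node_modules.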
The current directory's .babelrc won't be loaded when importing files from the external directory. You may place a .babelrc in that directory and set its presets by relative path (instead of by short name):
{
"presets": ["../pad/node_modules/babel-preset-env"],
"retainLines": true
}