Change the path of asset files in Jest - jestjs

I want to run an e2e test using Jest. I'm using an Nx monorepo architecture, with all my assets in a library folder and NestJS microservices for my backend. All the proto files for my microservices live in the library, and I load them in each microservice like this:
protoPath: join(__dirname, 'assets-shared/job.proto'),
In workspace.json, the build target copies the assets into assets-shared like this:
"targets": {
"build": {
"options": {
"assets": [
{
"input": "libs/backend/shared/src/lib/assets",
"glob": "**/*",
"output": "assets-shared"
}
]
},
All is good for the build, but when I run the test, the path is not rewritten; it still points at the module's own folder instead of the library folder, and I get this error:
ENOENT: no such file or directory, open '/home/dev/Project/apps/backend/api/src/modules/product/assets-shared/job.proto'
I tried moduleNameMapper to point it at the libs folder manually, but to no avail.
moduleNameMapper: {
  '^.+\\.(proto)$':
    '<rootDir>/libs/backend/shared/src/lib/assets/$1',
  // '^assets-shared(.*)': '/libs/backend/shared/src/lib/assets/$1',
},
Neither of these worked.
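One thing worth noting: moduleNameMapper only rewrites module specifiers in import/require statements, so it would not affect a path built at runtime with join(__dirname, ...) and then opened from disk, which is what produces the ENOENT above. For completeness, a mapping that at least preserves the captured file name might look like the sketch below; treat it as an illustration rather than a confirmed fix.

// jest.config.js (sketch only)
module.exports = {
  moduleNameMapper: {
    // Keep the captured relative path ($1) instead of only the 'proto' extension;
    // note this applies to import/require specifiers, not to fs paths.
    '^assets-shared/(.*)$': '<rootDir>/libs/backend/shared/src/lib/assets/$1',
  },
};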

Have you considered publishing the libraries to a private npm registry or something like Artifactory, e.g. @my-company/assets?
The approach you are trying may work locally, but for a CI/CD pipeline it would be much better to have a versioned artifact in npm or Artifactory.
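If the assets were published that way, the proto path could then be resolved from the installed package at runtime rather than from a copied assets folder. A rough sketch, using @my-company/assets as a placeholder package name that ships job.proto at its root:

// Hypothetical package name; assumes the package includes job.proto at its root
// and does not restrict subpaths via an "exports" map.
const protoPath = require.resolve('@my-company/assets/job.proto');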

Related

Using NodeJS development dependencies in Heroku review-app post-deploy step

I have a (demo) application hosted on Heroku. I've enabled Heroku's "review app" feature to spin up new instances for pull request reviews. These review instances all get a new MongoDB (on mLab) provisioned for them through Heroku's add-on system. This works great.
In my repository, I've defined some seeder scripts to quickly get a test database up and running. Running yarn seed (or npm run seed) will fill the database with test data. This works great during development, and it would be perfect for review apps as well. I want to execute the seeder command in the postdeploy hook of the Heroku review app, which can be done by specifying it under the environment.review section of the app.json file. Like so:
{
  "name": "...",
  "addons": [
    "mongolab:sandbox"
  ],
  "environments": {
    "review": {
      "addons": [
        "mongolab"
      ],
      "scripts": {
        "postdeploy": "npm run seed"
      }
    }
  }
}
The problem is, the seeder script relies on some development-only dependencies (faker, ts-node [this is a TypeScript project], and mongo-seeding) to execute, and these dependencies are not available in the postdeploy phase of a Heroku app.
I also don't think that "compiling" the TypeScript in the regular build step is the best idea. This seeder script is only used in development (and review apps). Besides, I'm not sure that would resolve the issue with missing dependencies like faker.
How would one go about this? Any tricks I'm missing?
Can I maybe skip Heroku's step where it actively deletes development dependencies? But only for review apps? Or even better, can I "exclude" just the couple of dependencies I need, and only for review apps?
The Heroku docs indicate that when the NODE_ENV variable contains anything but "production", the devDependencies will not be removed after the build step.
To make sure this only happens for Heroku review apps, you can set the NODE_ENV variable under the environments.review section of the app.json file. The following config should do the trick:
{
  "name": "...",
  "addons": [
    "mongolab"
  ],
  "environments": {
    "review": {
      "addons": [
        "mongolab:sandbox"
      ],
      "env": {
        "NODE_ENV": "development"
      },
      "scripts": {
        "postdeploy": "npm run seed"
      }
    }
  }
}
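For reference, the seed command itself is just an ordinary package.json script; a hypothetical setup (the script name and file path are assumptions) might look like this, with faker, mongo-seeding and ts-node staying in devDependencies:

"scripts": {
  "seed": "ts-node scripts/seed.ts"
}

With NODE_ENV set to "development" for review apps, those devDependencies survive the build, so npm run seed can find them in the postdeploy hook.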

Generate package.json on nx build / deployment

I have a monorepo using Nx with multiple Node/NestJS apps. Some of the apps don't require all the packages used in the other apps. Because it's a monorepo, I need to install all packages for every app during deployment.
Is there a way to generate a package.json on build that contains only the packages needed for the app I'm building?
I've tried using "generate-package-json-webpack-plugin" to generate the package.json, but it only detects half the dependencies.
I've also tried building a single JS file containing all the apps, but it doesn't seem to work and always requires tslib.
After looking at the Nx source code, I found the answer.
Set generatePackageJson to true in workspace.json under <project-name>/targets/build/options.
This will generate a package.json with the necessary dependencies for your app.
Here is an example:
"node-api": {
"root": "apps/node-api",
"sourceRoot": "apps/node-api/src",
"projectType": "application",
"prefix": "node-api",
"targets": {
"build": {
"executor": "#nrwl/node:build",
"outputs": ["{options.outputPath}"],
"options": {
"showCircularDependencies": false,
"outputPath": "dist/apps/node-api",
"main": "apps/node-api/src/main.ts",
"tsConfig": "apps/node-api/tsconfig.app.json",
"assets": ["apps/node-api/src/assets"],
"generatePackageJson": true <----------------------
},
....
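With that flag set, the build output should contain a trimmed package.json next to the compiled entry point, so the app can be installed and started from the dist folder; a typical sequence (paths taken from the example above) would be something like:

nx build node-api
cd dist/apps/node-api
npm install --production   # installs only the dependencies detected for this app
node main.js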
Nx encourages a single-version policy and has a single package.json.
If the issue is that you are installing all the dependencies every time in CI before building, then you might need to rely on the functionality provided by your CI system to cache them between runs. A lot of existing CI systems provide this:
* GitLab: https://docs.gitlab.com/ee/ci/caching/
* CircleCI: https://circleci.com/docs/2.0/caching/
* Travis: https://docs.travis-ci.com/user/caching/
However, this comes with its own set of issues (e.g. parallel jobs where one or more of them changes the dependencies).
We could explore having a command in Nx, something like "affected:dep-install", that detects which packages to install as part of the affected command. Please create an issue for it here: https://github.com/nrwl/nx/issues

Sharing code between Firebase Functions and React

I'm using Firebase functions with a React application. I have some non-trivial code that I don't want to duplicate, so I want to share it between the deployed functions and my React client. I've got this working locally in my React client (though I haven't tried deploying) - but I can't deploy my functions.
The first thing I tried was npm link. This worked locally, but the functions won't deploy (which makes sense, since this leaves no dependency in your package.json). Then I tried npm install ../shared/ - this looked promising because it did leave a dependency in package.json with a file: prefix - but Firebase still won't deploy with this (error below).
My project directory structure looks like this:
/ProjectDir
  firebase.json
  package.json (for the react app)
  /src
    * (react source files)
  /functions
    package.json (for firebase functions)
    index.js
  /shared
    package.json (for the shared module)
    index.js
My shared module package.json (extraneous details omitted):
{
  "name": "myshared",
  "scripts": {
  },
  "dependencies": {
  },
  "devDependencies": {
  },
  "engines": {
    "node": "8"
  },
  "private": true,
  "version": "0.0.1"
}
My firebase functions package.json (extraneous details omitted):
{
  "name": "functions",
  "scripts": {
  },
  "dependencies": {
    "myshared": "file:../shared"
  },
  "devDependencies": {
  },
  "engines": {
    "node": "8"
  },
  "private": true
}
When I try to deploy with:
firebase deploy --only functions
It's telling me it can't load the module:
Function failed on loading user code. Error message: Code in file index.js can't be loaded.
Did you list all required modules in the package.json dependencies?
And I don't think the issue is how I export/import my code, but just in case:
The export:
exports.myFunc = () => { some code };
The import (functions/index.js):
const myFunc = require('myshared');
And in my React code:
import { myFunc } from 'myshared';
So far the searching I've done hasn't yielded anything that works. Someone did mention entering the shared module path in firebase.json, but I couldn't find any details (including in the Firebase docs) showing what that would look like. Thanks for any tips to get this going.
I found a solution. I'm not sure if it's the only or even the best solution, but it seems to work for this scenario and is easy. As Doug noted above, Firebase doesn't want to upload anything that isn't in the functions directory. The solution was to simply make my shared module a subdirectory under functions (i.e. ./functions/shared/index.js). I can then import it into my functions like a normal JS file. However, my shared folder also has a package.json, for use as a dependency of the React app. I install it using:
npm install ./functions/shared
This creates a dependency in my React app, which seems to resolve correctly. I've created a production build without errors. I haven't deployed the React app yet, but I don't think this would be an issue.
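To illustrate, with the shared module living at functions/shared/index.js, the function import becomes a plain relative require (a sketch, with file names taken from the layout above):

// functions/index.js
const { myFunc } = require('./shared');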
Another solution is to create a symlink. In terminal, under /ProjectDir, execute:
ln -s shared functions/shared
cd functions
npm i ./shared
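Either way, npm should record the local folder under the shared package's own name in functions/package.json, roughly like this (a sketch based on the "myshared" name above):

"dependencies": {
  "myshared": "file:shared"
}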

Is there a way to ignore test files for eslint-plugin-security?

With a Node.js project, I've added eslint-plugin-security and it is giving a lot of warnings for code in my test/spec files (using Mocha). Since the test code won't be running in production, these don't seem as useful as they do in the project's actual code. (A lot of "Generic Object Injection Sink" warnings.)
Is there a way to have the security plugin ignore certain files other than putting /* eslint-disable */ at the top of every spec file?
The best way I found to deal with this case is based on this answer.
You can override parts of your ESLint config in a subfolder. In my case I'm disabling problematic rules from a Jest plugin inside my e2e tests folder. Example .eslintrc.js in /e2e-tests/:
module.exports = {
  overrides: [
    {
      files: ["*.spec.js"],
      rules: {
        "jest/valid-expect": 0
      }
    }
  ]
};
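Adapted to the security plugin from the question, a similar override could turn off the rule behind the "Generic Object Injection Sink" warnings just for spec files; a sketch, assuming the rule in question is security/detect-object-injection:

module.exports = {
  overrides: [
    {
      files: ["*.spec.js"],
      rules: {
        // assumed to be the eslint-plugin-security rule emitting
        // "Generic Object Injection Sink" warnings
        "security/detect-object-injection": "off"
      }
    }
  ]
};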
There are three ways to ignore files or folders:
1. Create a .eslintignore file in your project's root folder listing what you want to ignore:
**/*.js
2. Use the ESLint CLI with --ignore-path to point at another file containing your ignore rules:
eslint --ignore-path .jshintignore file.js
3. Use your package.json:
{
  "name": "mypackage",
  "version": "0.0.1",
  "eslintConfig": {
    "env": {
      "browser": true,
      "node": true
    }
  },
  "eslintIgnore": ["*.spec.ts", "world.js"]
}
Official Documentation
On my side, I had an issue with IntelliJ IDEA where ESLint was checking files in a folder dedicated only to TypeScript (+ TSLint), which was a pain, so I picked solution 3.

Errors with relative paths when using Chutzpah and Require.js

I'm using Chutzpah, with Jasmine, to unit test a number of AMD modules using Require.js.
My unit test project is separate to both the modules under test and the require.js config file.
I'm using chutzpah.json to connect these together, as such:
{
  "Framework": "jasmine",
  "TestHarnessReferenceMode": "AMD",
  "TestHarnessLocationMode": "SettingsFileAdjacent",
  "EnableTestFileBatching": true,
  "AMDBasePath": "matches baseUrl path in require.js config file",
  "References": [
    { "Path": "path to require.js" },
    { "Path": "path to require.js config file" }
  ],
  "Tests": [
    { "Path": "Specs" }
  ]
}
The tests run okay as expected.
The issue is that somewhere in the magic of resolving dependencies, I'm getting errors that a number of the CSS files cannot be located. These are relative paths, and I'm guessing that because I'm initiating the test from a separate project, it cannot correctly identify the base path.
As I say, this isn't an issue when running tests locally, but it would cause a problem when integrating with a CI build.
Has anybody experienced this before and found a workaround?
