Unit test for AWS Lambda using Jest - Node.js

const invokeApi = require("/opt/nodejs/kiwiCall");
const decrypt = require("/opt/nodejs/encryption");
const cors = require("/opt/nodejs/cors");
When I am testing my index.js file by manually mocking these dependencies in the __mocks__ directory as follows:
__mocks__
|_invokeApi
|_decrypt
|_cors
it says
FAIL ./index.test.js
● Test suite failed to run
Cannot find module '/opt/nodejs/kiwiCall' from 'index.js'
However, Jest was able to find:
'../../../../lambdas/Flights/Locations/index.js'
You might want to include a file extension in your import, or update your 'moduleFileExtensions', which is currently ['js', 'json', 'jsx', 'ts', 'tsx', 'node'].
See https://jestjs.io/docs/en/configuration#modulefileextensions-array-string
1 | "use strict";
2 |
> 3 | const invokeApi = require("/opt/nodejs/kiwiCall");
I wanted to know how I can mock the dependencies of the AWS Lambda in the index.test.js file.

In your package.json or jest.config.js you could add a moduleNameMapper entry for that directory:
"jest": {
  "moduleNameMapper": {
    "/opt/nodejs/(.*)": "<rootDir>/../nodejs/$1"
  }
}
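With that mapping in place, Jest resolves the layer import to a real file, which you can then mock like any other module. A minimal sketch for index.test.js (the handler name, the co-located index.js path, and the export shapes of the layer modules are assumptions):

// index.test.js -- a minimal sketch; the layer modules are assumed to export functions
jest.mock("/opt/nodejs/kiwiCall", () => jest.fn());      // resolved via moduleNameMapper
jest.mock("/opt/nodejs/encryption", () => jest.fn());
jest.mock("/opt/nodejs/cors", () => jest.fn());

const invokeApi = require("/opt/nodejs/kiwiCall");
const { handler } = require("./index");

test("calls the kiwi API", async () => {
  invokeApi.mockResolvedValue({ statusCode: 200 });
  await handler({});
  expect(invokeApi).toHaveBeenCalled();
});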

So I managed to figure out something based on my repository.
I'm using the moduleNameMapper to map the absolute path to another location in my repository to where I have the layer stored.
Eg.
moduleNameMapper: {'^/opt/config/config': '<rootDir>/src/layers/layers-core/config/config'}
In your case you could use a regular expression to match /opt/nodejs/ and map it elsewhere. Hope that helped.
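For example, something like this would map every /opt/nodejs/* import onto a layers directory (the target path here is an assumption based on my layout above):
moduleNameMapper: {'^/opt/nodejs/(.*)$': '<rootDir>/src/layers/layers-core/nodejs/$1'}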
EDIT:
I completely changed my approach and used babel-plugin-module-resolver with babel-rewire. I did this because the above method was incompatible with rewire. It's quite an easy setup: you just need to set up a Babel alias within .babelrc.
eg.
{
  "plugins": [
    ["rewire"],
    ["babel-plugin-module-resolver", {
      "alias": {
        "/opt/config/config": "./src/layers/layers-core/config/config",
        "/opt/utils/util-logger": "./src/layers/layers-core/utils/util-logger",
        "/opt/slack": "./src/layers/layers-slack/slack"
      }
    }]
  ]
}
Combine this with IDE jsconfig.json path alias and you get full IDE support.
{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es2018",
    "baseUrl": "./",
    "paths": {
      "/opt/config/config": ["src/layers/layers-core/config/config"],
      "/opt/utils/util-logger": ["src/layers/layers-core/utils/util-logger"],
      "/opt/slack/*": ["src/layers/layers-slack/slack/*"]
    }
  },
  "exclude": ["node_modules", "dist"]
}
You can then reference your layers with jest.doMock('/opt/config/config', mockConfig);
EDIT 2:
Found a way to get Jest to mock it. Just slip {virtual: true} into the mock!
jest.doMock('/opt/config/config', mockConfig, {virtual: true});
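Putting it together, a minimal sketch of a test using the virtual mock (the config shape and the path of the module under test are assumptions):

// jest.doMock is not hoisted, so require the code under test after registering the mock
const mockConfig = () => ({ apiUrl: "https://example.test" });
jest.doMock("/opt/config/config", mockConfig, { virtual: true });

const { handler } = require("./src/handler");

test("runs with the mocked config", async () => {
  await expect(handler({})).resolves.toBeDefined();
});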

I have pretty much the same issue. I have defined a layer which contains common code that's shared between other functions in my project. My project structure looks something like this:
project/
  functions/
    function1/
      app.js
    function2/
      app.js
  shared/
    shared.js
I import my shared library like this:
const { doSomething } = require('/opt/shared');

exports.handler = async (event) => {
  const result = await doSomething();
  // etc...
  return { statusCode: 200 };
};
This works when I deploy to AWS Lambda because /opt/shared exists and can be referenced correctly. It also works if I run it on my machine using sam local invoke Function1, because it runs in a container, which makes /opt/shared available to the code.
However, I'm struggling to work out how I can mock this dependency in a unit test. If I simply do this: jest.mock('/opt/shared'), I'm getting: Cannot find module '/opt/shared' from app.test.js

You can use the modulePaths option (see the Jest documentation). In package.json (or the equivalent in jest.config.js):
"jest": {
  "modulePaths": [
    "<rootDir>/src/layers/base/nodejs/node_modules/"
  ]
}
You can dynamically create this array by scanning a directory:
const fs = require('fs');

const testFolder = './functions/';
const modulePaths = fs.readdirSync(testFolder)
  .reduce((modulePaths, dirName) => {
    modulePaths.push(`functions/${dirName}/dependencies/nodejs/node_modules/`);
    return modulePaths;
  }, []);
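A sketch of how that scan could be wired into the config (the functions/ layout and the dependencies/nodejs path are assumptions carried over from above):

// jest.config.js -- sketch assuming each function keeps its layer under dependencies/nodejs
const fs = require('fs');

const modulePaths = fs.readdirSync('./functions/')
  .map((dirName) => `functions/${dirName}/dependencies/nodejs/node_modules/`);

module.exports = {
  modulePaths,
};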

Related

React: Absolute Paths from Root Folder using ViteJS with TypeScript

So, in React, using Vite, I'm trying to set up the following structure, but it seems I can't get it to work because I'm missing a concept or something. The structure is as follows:
src/utils
src/routes
src/index.tsx
src/main.tsx
And in index.tsx, I want to import utils and routes, and then call them at any root level as follows: import { Routes, Utils } from "#", but the way I did it is not working.
Meanwhile, this is how I configured it with vite:
resolve: {
  alias: {
    "#": path.resolve(__dirname, "src"),
  },
},
Make sure index.tsx exports everything from src/utils and src/routes:
// src/index.tsx
export * from './routes'
export * from './utils'
And configure TypeScript with a path alias for #:
// tsconfig.json
{
  "compilerOptions": {
    "paths": {
      "#": ["./src"], // 👈 needed for barrel imports from '#'
      "#/*": ["./src/*"]
    }
  }
}
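With both the Vite alias and the TypeScript paths in place, the barrel import should work anywhere under src (assuming Routes and Utils are re-exported from src/routes and src/utils as shown above):

// usage sketch in any module under src/
import { Routes, Utils } from "#";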

jest.config.ts: "registerer.enabled is not a function" error when running jest from Github Actions

When running jest locally, it instantiates my app and runs tests without any issues.
When running jest inside github actions, I'm getting this error:
Error: Jest: Failed to parse the TypeScript config file /home/runner/work/myproject/myproject/jest.config.ts
TypeError: registerer.enabled is not a function
at readConfigFileAndSetRootDir (/home/runner/work/myproject/myproject/node_modules/@jest/core/node_modules/jest-config/build/readConfigFileAndSetRootDir.js:118:13)
the package.json script entry is just: "test": "jest"
and the jest.config.ts file is:
import tsJestUtils from 'ts-jest/utils'
import tsConf from './tsconfig.json'

const rootDir = __dirname
const { pathsToModuleNameMapper } = tsJestUtils
const {
  compilerOptions: { paths },
} = tsConf

const config = {
  preset: 'ts-jest',
  testEnvironment: 'node',
  roots: [`${rootDir}/src`],
  transform: {
    '^.+\\.tsx?$': 'ts-jest',
  },
  moduleNameMapper: pathsToModuleNameMapper(paths, {
    prefix: `${rootDir}/src`,
  }),
}

export default config
So I just bypassed the use of TypeScript for my Jest config entirely, and went with an equivalent jest.config.js file based on the docs. It works in GitHub Actions now, the runner does not fail! \o/
I am still not sure what the issue was, but I think ts-node just wasn't processing the config file properly. I feel like the actual failure was with the attempt to load a .ts config file, specifically at this point in the source code when it tries to call registerer.enabled().
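For reference, a sketch of what the equivalent jest.config.js could look like, mirroring the TypeScript config above (note that newer ts-jest versions export pathsToModuleNameMapper from 'ts-jest' rather than 'ts-jest/utils'):

// jest.config.js -- plain-JS equivalent of the jest.config.ts above
const { pathsToModuleNameMapper } = require('ts-jest/utils');
const { compilerOptions } = require('./tsconfig.json');

module.exports = {
  preset: 'ts-jest',
  testEnvironment: 'node',
  roots: ['<rootDir>/src'],
  transform: {
    '^.+\\.tsx?$': 'ts-jest',
  },
  moduleNameMapper: pathsToModuleNameMapper(compilerOptions.paths, {
    prefix: '<rootDir>/src',
  }),
};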
It can be fixed by upgrading to "ts-node": "^8.5.0"

Package.json exports with webpack 5 - dynamically imported module not found

I am having a bit of trouble reconciling the path of a dynamic import for i18n locales. Here's the relevant code -
function getLoader(
  lang: SupportedLanguage,
  ns: SupportedNamespace
): NamespaceLoader | undefined {
  const matrixToCheck = UNSUPPORTED_MATRIX[ns];
  const isSupported = matrixToCheck && matrixToCheck.indexOf(lang) === -1;
  if (isSupported) {
    const path = `./locales/${lang}/${ns}.json`;
    const name = `${lang}_${ns}`;
    const named = {
      [name]: () => import(`${path}`),
    };
    return named[name];
  }
}
...
// eventual output
const SUPPORTED_LANGUAGES = { en: { namespace1: () => import('./locales/en/namespace1.json') } }
My goal is to manage all of the relevant translations in a single npm package, handle all of the dynamic-import set-up at build time, and then consumers can invoke the getter (getTranslation in this case) in their respective apps for the language and namespace of their choice to get the payload at runtime.
Based on this GH thread, I wanted to reconcile the locale dist path via the package.json
...
"exports": {
  ".": "./dist/src/main.js",
  "./": "./dist/"
},
...
e.g. when I publish the package, based on that exports config, the consumer would know how to reconcile the path, either relative or package-name-prefixed, when the getter is invoked:
const fn = () => import('./locales/fr/myNamespace.json') /// doesn't work
const anotherFn = () => import('@examplePackageName/locales/fr/myNamespace.json') /// doesn't work
Since everything is dynamic, I am using the CopyWebpackPlugin to include the locales in the dist folder.
This works as expected locally, but when I create the dist, I get the error Error: Module not found ./relative/path/to/the/json/I/want.json.
My question:
What am I missing? Is there a simple way to expose these translations so that other apps can include them in their bundles via an npm-installed package?
Here's my Webpack config, happy to provide other info as needed
const path = require("path");
const CopyPlugin = require("copy-webpack-plugin");
const { CleanWebpackPlugin } = require("clean-webpack-plugin");

const getPlugins = () => {
  return [
    new CleanWebpackPlugin(),
    new CopyPlugin({
      patterns: [{ from: "locales", to: "locales" }],
    }),
  ];
};

module.exports = {
  mode: "production",
  entry: {
    main: "./src/main.ts",
  },
  output: {
    path: path.join(__dirname, "dist"),
    filename: "src/[name].js",
    chunkFilename: "chunk.[name].js",
    libraryTarget: "commonjs2",
  },
  resolve: {
    extensions: [".json", ".ts", ".js"],
    alias: {
      "#locales": path.resolve(__dirname, "locales/*"),
    },
  },
  plugins: getPlugins(),
  module: {
    rules: [
      {
        test: /\.ts$/,
        exclude: [/\.test\.ts$/],
        include: path.join(__dirname, "src"),
        loader: "ts-loader",
      },
    ],
  },
};
The exports directive requires you to explicitly define all files that are allowed to be imported (documentation). It lets developers hide a package's internal file structure: whatever is not exported by this directive can only be imported inside the package, not outside of it. It's made to simplify maintenance, since developers can rename files or change the file structure without fear of breaking dependent packages and applications.
So if you want to make internal files visible for import, you should export them explicitly with the exports directive, like this:
{
  "exports": {
    ".": "./dist/esm/src/main.js",
    "./dist/shared/locale/fr_fr.json": "./dist/shared/locale/fr_fr.json"
  }
}
I'm not sure whether Webpack handles this case, because it's still an experimental feature, but this is how Node.js works now.
Why it is so
Changing your app's file structure is a major change in semver terms, so you would need to bump a version every time you rename or delete files. To avoid this you can specify which files are part of the public interface of the package.
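If listing every locale file individually becomes unwieldy, Node also supports subpath patterns in exports; a sketch of that variant (assuming the locales are copied into dist/locales by the CopyPlugin step above):

{
  "exports": {
    ".": "./dist/src/main.js",
    "./locales/*": "./dist/locales/*"
  }
}

A consumer could then resolve paths such as '@examplePackageName/locales/fr/myNamespace.json'.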

Webpack import from files that use module.exports

I have a React app and a file where I want to store things related to the API.
const proxy = require('http-proxy-middleware');
const path = require('path');
//.....

const targetApi = (objectWithUrlEntries) => {
  Object.keys(objectWithUrlEntries).forEach((key) => {
    objectWithUrlEntries[key] = path.join('/api/', objectWithUrlEntries[key]);
  });
};

module.exports.proxyExpressCalls = proxyExpressCalls;
module.exports.devServerProxyConfig = devServerProxyConfig;
module.exports.targetApi = targetApi;
Some of those things will be used by webpack itself, and some will be used inside the app (to correctly target api calls).
However when I try to import things in my app:
// #flow
import { buildUrl } from 'data/utils';
import type { Axios } from './flow.types';
import { targetApi } from './api';
console.log(targetApi());
I get errors. In terminal:
WARNING in ./src/data/redux/api/user.js 6:12-21 "export 'targetApi' was not found in './api'"
in browser:
api.js?d669:39 Uncaught TypeError: Cannot set property 'proxyExpressCalls' of undefined
at Object.eval (api.js?d669:39)
at eval (api.js:60)
at Object../src/data/redux/api/api.js (client.bundle.js:11620)
at __webpack_require__ (client.bundle.js:708)
at fn (client.bundle.js:113)
at eval (user.js:15)
at Object../src/data/redux/api/user.js (client.bundle.js:11668)
at __webpack_require__ (client.bundle.js:708)
at fn (client.bundle.js:113)
at eval (user.js:18)
So the problem is that when the app is being bundled the CommonJS exports fail, but if I used ES6 export syntax then Node would fail.
I had a similar problem: I had a JavaScript class with some validation rules that I wanted to use in Node.js and also in the client code. What worked for me was converting everything to CommonJS: the shared code, the Node code, and the client code. But I still had some problems. Then I added "modules": "commonjs" to the .babelrc of the folder that imports the shared code and it finally worked. This is my .babelrc file:
{
  "presets": [
    "react",
    [
      "env",
      {
        "debug": true,
        "modules": "commonjs",
        "targets": {
          "browsers": [
            "last 2 versions",
            "safari >= 7"
          ]
        }
      }
    ]
  ],
  "plugins": [
    "transform-object-rest-spread",
    "transform-es2015-arrow-functions",
    "transform-class-properties"
  ]
}
Another possible solution is (not tested!) to create a library out of your shared code, using webpack. Check the output.library and output.libraryTarget options to see which options you have to expose the library in different module systems. Then import your shared library in your Node and client code.
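A rough sketch of those output options (untested; the entry point, filename, and library name are assumptions):

// webpack.config.js -- exposing the shared code as a library
module.exports = {
  entry: './src/data/redux/api/api.js',
  output: {
    filename: 'shared.js',
    library: 'shared',
    libraryTarget: 'umd', // usable from CommonJS, AMD and as a browser global
  },
};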
The browser error holds the key: it looks like module.exports is null. And sure enough, you're setting values on it but it was not initialized. If instead you do this:
module.exports = {
  proxyExpressCalls: proxyExpressCalls,
  devServerProxyConfig: devServerProxyConfig,
  targetApi: targetApi
};
(or simply set module.exports = {} first) this should solve the problem. The console warning is likely a side effect of code that keeps going even after the failure to set values on a null variable.
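With module.exports initialized as an object, the named ES import in the app should resolve again; a small usage sketch (the URL map is an assumption):

// usage sketch in the app code
import { targetApi } from './api';

const urls = { user: 'user' };
targetApi(urls); // each entry is now prefixed with '/api/'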

TypeScript AMD compilation and "barrel" modules

I'm trying to set up a Node.js + TypeScript project using Intern for testing. Everything works fine when I compile the project using "commonjs" (which I do for the normal build); and TypeScript is equally happy when compiling for "amd", which is required by Intern. However, when passing the tests with intern-client, it complains about a couple of things:
First, imports from "index.ts" files (so-called "barrel" modules) won't work. My setup is something like this (everything in the same directory):
// index.ts
export { x } from './x'

// x.ts
export function x() {}

// x.test.ts
import { x } from '.' // "Error: Failed to load module ..."
In fact, the generated JavaScript code (for x.test.ts) looks something like this:
define(["require", "exports", "."], function (...) { ... })
And I'm not sure that AMD knows how to handle the ".".
The second issue happens under the same circumstances (TypeScript compiles happily, but intern-client complains). In summary, I get an error when doing:
import jsdom = require('jsdom')
Which I need to transform to:
const jsdom = require('jsdom')
For Intern to be able to deal with it.
Here is the tsconfig.json file I use to compile the tests:
{
"compilerOptions": {
"target": "es6",
"module": "amd",
"moduleResolution": "node",
"sourceMap": true,
"rootDir": "src",
"outDir": "build/tests",
"noImplicitAny": true,
"suppressImplicitAnyIndexErrors": true
}
}
And here is my intern.js configuration file, in case it helps:
define({
  suites: ['build/tests/**/*.test.js'],
  excludeInstrumentation: true,
  filterErrorStack: true
})
Edit (2017-05-03)
To help understand the issue, here is an excerpt of the directory tree of the project:
build
  tests          // The compiled tests will end up here
src
  core
  utils
    x.ts
    x.test.ts
    // Other files, each containing a function that I would like to unit-test...
intern.js
package.json
tsconfig.json
...
Regarding the first issue, AMD's handling of an import like '.' is different than Node's. While both of them will map '.' to a package, Node uses a default module name of index.js, while AMD uses main.js. To get things working in an AMD loader, you'll need to first define a package for '.', and then tell the AMD loader what default module to use for that package. Given your project layout, you could configure Intern like this:
loaderOptions: {
  map: {
    // When a module in src/ references 'src/utils', redirect
    // it to 'utils'
    'src': {
      'src/utils': 'utils'
    }
  },
  packages: [
    // Define a package 'utils' with files in 'src/utils' that defaults
    // to the module index.js
    { name: 'utils', location: 'src/utils', main: 'index.js' }
  ]
}
Regarding the second issue, it's not clear what the problem actually is. Import statements will be transpiled into define dependencies by TypeScript, so Intern should never be seeing them.
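For illustration, this is roughly what TypeScript emits for an import-equals statement when compiling with "module": "amd" (simplified output):

// source: import jsdom = require('jsdom')
// roughly what tsc emits with "module": "amd":
define(["require", "exports", "jsdom"], function (require, exports, jsdom) {
    "use strict";
    // ...module body uses the jsdom parameter directly...
});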
