How to mock @google-cloud/kms using jest - node.js

I'm trying to write unit tests for decrypt. I have my own implementation for decrypting an encrypted file, but while trying to import decrypt.mjs I'm facing the following error:
Must use import to load ES Module: /node_modules/bignumber.js/bignumber.mjs
My application has a React frontend and a Node.js backend, and I've used ES modules on the Node.js side. Here is my decrypt.mjs file:
import { readFile } from 'fs/promises';
import path from 'path';
import { KeyManagementServiceClient } from '@google-cloud/kms';

const decrypt = async (APP_MODE, __dirname) => {
  if (APP_MODE === 'LOCALHOST') {
    const keys = await readFile(
      new URL(`./stagingfile.json`, import.meta.url)
    ).then((data) => JSON.parse(data));
    return keys;
  }
  const { projectId, locationId, keyRingId, cryptoKeyId, fileName } =
    getKMSDefaults(APP_MODE);
  const ciphertext = await readFile(path.join(__dirname, `/${fileName}`));
  const client = new KeyManagementServiceClient();
  const formattedName = client.cryptoKeyPath(
    projectId,
    locationId,
    keyRingId,
    cryptoKeyId
  );
  const request = {
    name: formattedName,
    ciphertext,
  };
  const [result] = await client.decrypt(request);
  return JSON.parse(result.plaintext.toString('utf8'));
};
const getKMSDefaults = (APP_MODE) => {
  // Based on APP_MODE, the following object contains different values
  return {
    projectId: PROJECT_ID,
    locationId: LOCATION_ID,
    keyRingId: KEY_RING_ID,
    cryptoKeyId: CRYPTO_KEY_ID,
    fileName: FILE_NAME,
  };
};
export default decrypt;
I tried to mock @google-cloud/kms using a manual mock (jest), but it didn't work. I tried multiple mocking approaches, but nothing worked and every attempt ended with the Must use import to load ES Module error.

I've successfully used jest to mock @google-cloud/kms with TypeScript, so hopefully the same process works for ES modules.
Example working code:
// jest will "hoist" jest.mock to the top of the file on its own anyway
jest.mock("@google-cloud/kms", () => {
  return {
    KeyManagementServiceClient: jest.fn().mockImplementation(() => {
      return {
        encrypt: kmsEncryptMock,
        decrypt: kmsDecryptMock,
        cryptoKeyPath: () => kmsKeyPath,
      };
    }),
  };
});
// give names to mocked functions for easier access in tests
const kmsEncryptMock = jest.fn();
const kmsDecryptMock = jest.fn();
const kmsKeyPath = `project/location/keyring/keyname`;
// import of SUT must be after the variables used in jest.mock() are defined, not before.
import { encrypt } from "../../src/crypto/google-kms";
describe("Google KMS encryption service wrapper", () => {
  const plaintext = "some text to encrypt";
  const plaintextCrc32 = 1897295827;

  it("sends the correct request to the kms service and raises an error on an empty response", async () => {
    // encrypt is an async function that throws a "new Error(...)"
    await expect(encrypt(plaintext)).rejects.toMatchObject({
      message: "Encrypt: no response from KMS",
    });
    expect(kmsEncryptMock).toHaveBeenNthCalledWith(1, {
      name: kmsKeyPath,
      plaintext: Buffer.from(plaintext),
      plaintextCrc32c: { value: plaintextCrc32 },
    });
  });
});
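For completeness, if you want to keep running the .mjs file as a native ES module instead of transpiling it, jest's ESM support offers jest.unstable_mockModule, where the module under test is imported dynamically after the mock is registered. Below is a rough sketch, not a drop-in answer: the relative path to decrypt.mjs, the stubbed fs/promises (so no real ciphertext file is read), and the assumption that getKMSDefaults resolves its values for the given APP_MODE are all mine.

// decrypt.test.mjs — requires running jest with ESM enabled, e.g.
// NODE_OPTIONS=--experimental-vm-modules npx jest
import { jest } from '@jest/globals';

const kmsDecryptMock = jest.fn();
const kmsKeyPath = 'project/location/keyring/keyname';

// register mocks BEFORE the dynamic import of the module under test
jest.unstable_mockModule('@google-cloud/kms', () => ({
  KeyManagementServiceClient: jest.fn().mockImplementation(() => ({
    decrypt: kmsDecryptMock,
    cryptoKeyPath: () => kmsKeyPath,
  })),
}));

// stub file access so no real ciphertext file is needed (assumption)
jest.unstable_mockModule('fs/promises', () => ({
  readFile: jest.fn().mockResolvedValue(Buffer.from('fake-ciphertext')),
}));

// dynamic import AFTER the mocks are registered (path is an assumption)
const { default: decrypt } = await import('./decrypt.mjs');

test('decrypts through the mocked KMS client', async () => {
  kmsDecryptMock.mockResolvedValue([{ plaintext: Buffer.from('{"key":"value"}') }]);
  const result = await decrypt('STAGING', '/tmp');
  expect(result).toEqual({ key: 'value' });
  expect(kmsDecryptMock).toHaveBeenCalledTimes(1);
});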

Related

Fastify CLI decorators undefined

I'm using fastify-cli for building my server application.
For testing I want to generate some test JWTs. Therefore I want to use the sign method of the fastify-jwt plugin.
If I run the application with fastify start -l info ./src/app.js everything works as expected and I can access the decorators.
But in the testing setup I get an error that the jwt decorator is undefined. It seems that the decorators are not exposed, and I just can't find the error. For the tests I use node-tap with this command: tap "test/**/*.test.js" --reporter=list
app.js
import { dirname, join } from 'path'
import autoload from '@fastify/autoload'
import { fileURLToPath } from 'url'
import jwt from '@fastify/jwt'

export const options = {
  ignoreTrailingSlash: true,
  logger: true
}

export default async (fastify, opts) => {
  await fastify.register(jwt, {
    secret: process.env.JWT_SECRET
  })
  // autoload plugins and routes
  await fastify.register(autoload, {
    dir: join(dirname(fileURLToPath(import.meta.url)), 'plugins'),
    options: Object.assign({}, opts),
    forceESM: true,
  })
  await fastify.register(autoload, {
    dir: join(dirname(fileURLToPath(import.meta.url)), 'routes'),
    options: Object.assign({}, opts),
    forceESM: true
  })
}
helper.js
import { fileURLToPath } from 'url'
import helper from 'fastify-cli/helper.js'
import path from 'path'

// config for testing
export const config = () => {
  return {}
}

export const build = async (t) => {
  const argv = [
    path.join(path.dirname(fileURLToPath(import.meta.url)), '..', 'src', 'app.js')
  ]
  const app = await helper.build(argv, config())
  t.teardown(app.close.bind(app))
  return app
}
root.test.js
import { auth, build } from '../helper.js'
import { test } from 'tap'

test('requests the "/" route', async t => {
  t.plan(1)
  const app = await build(t)
  const token = app.jwt.sign({ ... }) // -> jwt is undefined
  const res = await app.inject({
    method: 'GET',
    url: '/'
  })
  t.equal(res.statusCode, 200, 'returns a status code of 200')
})
The issue is the plugin encapsulation of your application: when you write const app = await build(t), the app variable points to the Root Context, but your app.js (where the jwt decorator is registered) lives in a child context, so its decorators are not visible from the root.
To solve it, you just need to wrap your app.js file with fastify-plugin, because it breaks the encapsulation:
import fp from 'fastify-plugin'
export default fp(async (fastify, opts) => { ... })
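Applied to the app.js above, the wrapped version looks roughly like this (a sketch; whether the autoload registrations also belong inside the wrapped plugin depends on your setup):

// app.js — same registrations as before, now wrapped in fastify-plugin
// so the decorators added here (e.g. app.jwt) escape the child context
import fp from 'fastify-plugin'
import { dirname, join } from 'path'
import { fileURLToPath } from 'url'
import autoload from '@fastify/autoload'
import jwt from '@fastify/jwt'

export const options = { ignoreTrailingSlash: true, logger: true }

export default fp(async (fastify, opts) => {
  await fastify.register(jwt, { secret: process.env.JWT_SECRET })
  await fastify.register(autoload, {
    dir: join(dirname(fileURLToPath(import.meta.url)), 'plugins'),
    options: Object.assign({}, opts),
    forceESM: true,
  })
  await fastify.register(autoload, {
    dir: join(dirname(fileURLToPath(import.meta.url)), 'routes'),
    options: Object.assign({}, opts),
    forceESM: true,
  })
})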
Note: you can visualize this structure by using fastify-overview (together with the fastify-overview-ui plugin).

How to mock react-query useQuery in jest

I'm trying to mock out an axios call that happens inside an async function wrapped by useQuery:
import { useQuery, QueryKey } from 'react-query'

export const fetchWithAxios = async () => {
  ...
  const response = await someAxiosCall()
  ...
  return data
}

export const useFetchWithQuery = () => useQuery(key, fetchWithAxios, {
  refetchInterval: false,
  refetchOnReconnect: true,
  refetchOnWindowFocus: true,
  retry: 1,
})
and I want to use moxios
moxios.stubRequest('/some-url', {
  status: 200,
  response: fakeInputData,
})

useFetchWithQuery()

moxios.wait(function () {
  done()
})
but I'm getting all sorts of issues with missing context, store, etc., which I'm interested in mocking out completely.
Don't mock useQuery, mock Axios!
The pattern you should follow in order to test your usages of useQuery should look something like this:
const fetchWithAxios = async (axios, ...parameters) => {
  const data = await axios.someAxiosCall(parameters);
  return data;
}

export const useFetchWithQuery = (...parameters) => {
  const axios = useAxios();
  return useQuery(key, () => fetchWithAxios(axios, ...parameters), {
    // options
  })
}
Where does useAxios come from? You need to write a context to pass an axios instance through the application.
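A minimal sketch of such a context (AxiosContext, AxiosProvider and useAxios are hypothetical names, not part of axios or react-query):

// axios-context.jsx — hypothetical helper module
import React, { createContext, useContext } from 'react';
import axios from 'axios';

// default to the real axios instance so production code needs no extra setup
const AxiosContext = createContext(axios);

// wrap the app (or, in tests, the hook under test) with a chosen instance
export const AxiosProvider = ({ instance, children }) => (
  <AxiosContext.Provider value={instance}>{children}</AxiosContext.Provider>
);

// the hook consumed by useFetchWithQuery; tests pass a mocked instance
export const useAxios = () => useContext(AxiosContext);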
This will allow your tests to look something like this in the end:
const { result, waitFor, waitForNextUpdate } = renderHook(
  () => useFetchWithQuery(...),
  { wrapper: makeWrapper(withQueryClient, withAxios(mockedAxios)) }
);

await waitFor(() => expect(result.current.isFetching).toBeFalsy());

How to override url for RTK query

I'm writing pact integration tests, which require performing actual calls to a specific mock server while the tests run.
I can't find a way to change the RTK Query baseUrl after the api has been initialised.
it('works with rtk', async () => {
  // ... setup pact expectations
  const reducer = {
    [rtkApi.reducerPath]: rtkApi.reducer,
  };
  // proxy call to configureStore()
  const { store } = setupStoreAndPersistor({
    enableLog: true,
    rootReducer: reducer,
    isProduction: false,
  });
  // eslint-disable-next-line @typescript-eslint/no-explicit-any
  const dispatch = store.dispatch as any;
  dispatch(rtkApi.endpoints.GetModules.initiate());
  // sleep for 1 second
  await new Promise((resolve) => setTimeout(resolve, 1000));
  const data = store.getState().api;
  expect(data.queries['GetModules(undefined)']).toEqual({ modules: [] });
});
Base api
import { createApi } from '@reduxjs/toolkit/query/react';
import { graphqlRequestBaseQuery } from '@rtk-query/graphql-request-base-query';
import { GraphQLClient } from 'graphql-request';

export const client = new GraphQLClient('http://localhost:12355/graphql');

export const api = createApi({
  baseQuery: graphqlRequestBaseQuery({ client }),
  endpoints: () => ({}),
});
The query is very basic:
query GetModules {
  modules {
    name
  }
}
I tried digging into customizing baseQuery but was not able to get it working.
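One thing that may be worth trying (a sketch under assumptions, not a verified answer): since the base api above closes over the exported GraphQLClient instance, repointing that instance in the test redirects every endpoint without recreating the api. graphql-request's GraphQLClient exposes setEndpoint for this; the import path and mock-server URL below are made up.

// pact.test.js — hypothetical test setup
import { client } from './base-api'; // assumed path to the file exporting `client`

beforeAll(() => {
  // point the already-created client at the pact mock server (URL is an assumption)
  client.setEndpoint('http://localhost:1234/graphql');
});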

Mock exported class in Typescript Jest

Hi, I wrote the following code to fetch blobs from Azure Blob Storage.
import { BlobServiceClient, ContainerClient, ServiceFindBlobsByTagsSegmentResponse } from '@azure/storage-blob';
import { GetBlobPageInput, GetBlobPageOutput, PutBlobItemsInput, GetBlobItem } from './interfaces/blob.service.interface';

export const getBlobsPage = async <T>(input: GetBlobPageInput) => {
  const blobServiceClient = BlobServiceClient.fromConnectionString(input.blobConnectionString);
  const iterator = blobServiceClient
    .findBlobsByTags(input.condition)
    .byPage({ maxPageSize: input.pageSize });
  return getNextPage<T>({
    iterator,
    blobServiceClient,
    blobContainer: input.blobContainer,
  });
};
[...]
I am trying to write a unit test for it, but I have trouble when I try to mock BlobServiceClient from @azure/storage-blob. I wrote a sample test and mock like this:
import { getBlobsPage } from './../../services/blob.service';

const fromConnectionStringMock = jest.fn();

jest.mock('@azure/storage-blob', () => ({
  BlobServiceClient: jest.fn().mockImplementation(() => ({
    fromConnectionString: fromConnectionStringMock,
  })),
}));

describe('BLOB service tests', () => {
  beforeEach(() => {
    jest.clearAllMocks();
  });

  it('should fetch first page and return function to get next', async () => {
    const input = {
      blobConnectionString: 'testConnectionString',
      blobContainer: 'testContainer',
      condition: "ATTRIBUTE = 'test'",
      pageSize: 1,
    };
    const result = await getBlobsPage(input);
    expect(fromConnectionStringMock).toHaveBeenCalledTimes(1);
  });
});
But when I try to run the test I am getting:
TypeError: storage_blob_1.BlobServiceClient.fromConnectionString is not a function
24 |
25 | export const getBlobsPage = async<T>(input: GetBlobPageInput) => {
> 26 | const blobServiceClient = BlobServiceClient.fromConnectionString(input.blobConnectionString);
| ^
27 |
28 | const iterator = blobServiceClient
29 | .findBlobsByTags(input.condition)
at nhsdIntegration/services/blob.service.ts:26:47
at nhsdIntegration/services/blob.service.ts:1854:40
at Object.<anonymous>.__awaiter (nhsdIntegration/services/blob.service.ts:1797:10)
at Object.getBlobsPage (nhsdIntegration/services/blob.service.ts:25:65)
at tests/services/blob.service.test.ts:27:26
at tests/services/blob.service.test.ts:8:71
Any tips on how I should properly implement the mock for the Azure module?
I've tried following several different answers on Stack Overflow and looked through articles on the web (like: https://dev.to/codedivoire/how-to-mock-an-imported-typescript-class-with-jest-2g7j). Most of them show that this is the proper solution, so I guess I am missing some small thing here but can't figure it out.
The exported BlobServiceClient is used as an object with a static fromConnectionString member, but you're currently mocking it as a constructor function, which is the issue. So you need to mock it as an object literal instead. Another problem is that accessing the variable fromConnectionStringMock from outside the mock factory's scope would cause yet another issue.
So here's possibly the right mock:
jest.mock('@azure/storage-blob', () => ({
  ...jest.requireActual('@azure/storage-blob'), // keep other props as they are
  BlobServiceClient: {
    fromConnectionString: jest.fn().mockReturnValue({
      findBlobsByTags: jest.fn().mockReturnValue({
        byPage: jest.fn(),
      }),
    }),
  },
}));
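If getNextPage actually consumes the pager (for example by awaiting iterator.next() or looping with for await), you may also want byPage to return something async-iterable rather than a bare jest.fn(). A sketch, defining everything inside the factory to avoid the out-of-scope problem mentioned above (the page shape is an assumption):

jest.mock('@azure/storage-blob', () => ({
  ...jest.requireActual('@azure/storage-blob'),
  BlobServiceClient: {
    fromConnectionString: jest.fn().mockReturnValue({
      findBlobsByTags: jest.fn().mockReturnValue({
        // behaves like an async iterator: .next() resolves to one fake page
        byPage: jest.fn().mockReturnValue({
          next: jest.fn().mockResolvedValue({
            value: { blobs: [{ name: 'blob-1' }] }, // assumed page shape
            done: false,
          }),
          [Symbol.asyncIterator]() { return this; },
        }),
      }),
    }),
  },
}));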

Jest - mocking and testing pino multi streams based on log levels

I am struggling to find the correct way of mocking and testing pino in a logging service.
Here is my implementation of the pino logger; it writes to different file streams based on log levels.
getChildLoggerService(fileNameString): pino.Logger {
  const streams: Streams = [
    { level: 'fatal', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-fatal.log')) },
    { level: 'error', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-error.log')) },
    { level: 'debug', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-debug.log')) },
    { level: 'info', stream: fs.createWriteStream(path.join(process.cwd(), './logs/database-connect-info.log')) },
  ];
  return pino({
    useLevelLabels: true,
    base: {
      hostName: os.hostname(),
      platform: os.platform(),
      processId: process.pid,
      timestamp: this.appUtilService.getCurrentLocaleTimeZone(),
      // tslint:disable-next-line: object-literal-sort-keys
      fileName: this.appUtilService.getFileName(fileNameString),
    },
    level: this.appUtilService.getLogLevel(),
    messageKey: LOGGER_MSG_KEY,
    prettyPrint: this.appUtilService.checkForDevEnv(process.env.NODE_ENV),
    timestamp: () => {
      return this.appUtilService.getCurrentLocaleTimeZone()
    },
  }, multistream(streams)).child({
    connectorReqId: (process.env.REQ_APP_NAME === null ? 'local' : process.env.REQ_APP_NAME)
      + uuid.v4().toString()
  });
}
The most important part I wanted to test is the multistream setup, where I need to write to different log files based on the log levels, and so far I couldn't figure out a way to do that.
import pino, { DestinationStream } from 'pino';
const sinon = require('sinon');
import pinoms from 'pino-multi-stream';
const fs = require('fs');
const path = require('path');
const stream = require('stream');
const { PassThrough } = require('stream');

class EchoStream extends stream.Writable {
  _write(chunk, enc, next) {
    console.log('ssdsdsd', chunk.toString());
    next();
  }
}

import * as _ from 'lodash';
import { Writable } from 'stream';
import { mocked } from 'ts-jest/utils';
import { LogServiceInstance } from './log.service';
// jest.mock('pino', () => jest.fn().mockImplementation(() => { ====> Tried this inline mock, doesnt work
// return {
// child: jest.fn().mockReturnValue(jest.requireActual('pino').Logger)
// }
// }));
// jest.mock('pino', () => {
// return jest.fn().mockImplementation(() => {
// return {
// child: jest.fn().mockReturnValue(jest.requireActual('pino').Logger),
// stream: jest.fn().mockImplementation(() => {
// return [
// {
// level: 'info',
// stream: fs.createWriteStream(
// path.resolve(process.cwd(), '/test/database-connector-logs/info.log')
// ),
// },
// {
// level: 'warn',
// stream: fs.createWriteStream(
// path.resolve(process.cwd(), '/test/database-connector-logs/warn.log')
// ),
// },
// ];
// }),
// };
// });
// });
describe('Test suite for Log service', () => {
  // const mockedPino = mocked(pino, true);
  test('Test case for getLoggerInstance', () => {
    const mockedPinoMsStream = [
    const mockedPinoStream = (pino.prototype.stream = jest.fn(() => mockedPinoMsStream));
    console.dir(pino);
    const prop = Reflect.ownKeys(pino).find((s) => {
      return s === 'symbols';
    });
    // Tried this but it did not work, as the actual files are written with the values
    pino[prop]['streamSym'] = jest.fn().mockImplementation(() => {
      return fs.createWriteStream(path.resolve(process.cwd(), './test/database-connector-logs/info.log'))
    });
    console.dir(pino);
    const log = LogServiceInstance.getChildLoggerService(__filename);
    console.dir(Object.getPrototypeOf(log));
    log.info('test logging');
    expect(2).toEqual(2);
  });
Could someone let me know where the mocking is wrong and how to mock it properly?
UPDATE:
I came to understand that mocking pino-multi-stream might do the trick, so I tried it this way. This was added at the very top, and all other mocks were removed (including inside the test suite):
const mockedPinoMultiStream = {
  stream: jest.fn().mockImplementation(() => {
    return { write: jest.fn().mockReturnValue(new PassThrough()) }
  })
}

jest.mock('pino-multi-stream', () => {
  return {
    multistream: jest.fn().mockReturnValue(mockedPinoMultiStream)
  }
});
I wanted the mock to test whether, based on the level, the respective named files are being used, but this also results in an exception:
TypeError: stream.write is not a function
at Pino.write (/XXX/node_modules/pino/lib/proto.js:161:15)
at Pino.LOG (/XXXX/node_modules/pino/lib/tools.js:39:26)
LATEST UPDATE:
So I resolved the exception by modifying the way the pino multistream is mocked:
const { PassThrough } = require('stream');
...
...
const mockedPinoMultiStream = {
  write: jest.fn().mockImplementation((data) => {
    return new PassThrough();
  })
};
Now there is no more exception, and the write method is properly mocked when I print "pino". But I do not understand how to test the different files based on different log levels. Could someone let me know how that is to be done?
Note: I tried setting a return value of fs.createWriteStream instead of a PassThrough, but that didn't work.
At last, I found the answer to making use of pino streams based on different log levels.
I went ahead and created a test directory to house the test log files; in reality, we do not want pino polluting the actual log files. So I decided to mock the pino streams at the start of the jest run, in a setup file that gets executed before any test suite is triggered. I modified the jest configuration in package.json like this:
"setupFiles": [
"<rootDir>/jest-setup/stream.logger.js"
],
In the stream.logger.js file, I added:
const pinoms = require('pino-multi-stream');
const fs = require('fs');
const path = require('path');
const stream = require('stream');
const Writable = require('stream').Writable;
const { PassThrough } = require('stream');
const pino = require('pino');

class MyWritable extends Writable {
  constructor(options) {
    super(options);
  }
  _write(chunk, encoding, callback) {
    const writeStream = fs.createWriteStream(path.resolve(process.cwd(), './test/logs/info.log'));
    writeStream.write(chunk, 'utf-8');
    writeStream.emit('close');
    writeStream.end();
  }
}

const mockedPinoMultiStream = {
  write: jest.fn().mockImplementation((data) => {
    const writeStream = new MyWritable();
    return writeStream._write(data);
  })
};

jest.mock('pino-multi-stream', () => {
  return {
    multistream: jest.fn().mockReturnValue(mockedPinoMultiStream)
  }
});
Now I went ahead and created the test file - log.service.spec.ts
import * as pino from 'pino';
const sinon = require('sinon');
import pinoms from 'pino-multi-stream';
const fs = require('fs');
const path = require('path');
const stream = require('stream');
import * as _ from 'lodash';
import { Writable } from 'stream';
import { mocked } from 'ts-jest/utils';
import { LogServiceInstance } from './log.service';

describe('Test suite for Log service', () => {
  // const mockedPino = mocked(pino, true);
  afterEach(() => {
    // delete the contents of the log files after each test suite
    fs.truncate(path.resolve(process.cwd(), './test/logs/info.log'), 0, () => {
      console.dir('Info log file deleted');
    });
    fs.truncate(path.resolve(process.cwd(), './test/logs/warn.log'), 0, () => {
      console.dir('Warn log file deleted');
    });
    fs.truncate(path.resolve(process.cwd(), './test/logs/debug.log'), 0, () => {
      console.dir('Debug log file deleted');
    });
  });

  test('Test case for getLoggerInstance', () => {
    const pinoLoggerInstance = LogServiceInstance.getChildLoggerService(__filename);
    pinoLoggerInstance.info('test logging');
    _.map(Object.getOwnPropertySymbols(pinoLoggerInstance), (mapItems: any) => {
      if (mapItems.toString().includes('Symbol')) {
        if (mapItems.toString().includes('pino.level')) {
          expect(pinoLoggerInstance[mapItems]).toEqual(20);
        }
      }
      if (mapItems.toString().includes('pino.chindings')) {
        const childInstance = pinoLoggerInstance[mapItems].toString().substr(1);
        const jsonString = '{' + childInstance + '}';
        const expectedObj = Object.create(JSON.parse(jsonString));
        expect(expectedObj.fileName).toEqual('log.service.spec');
        expect(expectedObj.appName).toEqual('AppJestTesting');
        expect(expectedObj.connectorReqId).toEqual(expect.objectContaining(new String('AppJestTesting')));
      }
    });
    // make sure the info.log file is written in this case
    const infoBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/info.log')).read(1024);
    expect(infoBuffRead).toBeDefined();
    // now write a warn log
    pinoLoggerInstance.warn('test warning log');
    const warnBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/warn.log')).read(1024);
    expect(warnBuffRead).toBeDefined();
    // now write a debug log
    pinoLoggerInstance.debug('test debug log');
    const debugBuffRead = fs.createReadStream(path.resolve(process.cwd(), './test/logs/debug.log')).read(1024);
    expect(debugBuffRead).toBeDefined();
  });
});
I also made sure that the test log files do not get overwhelmed with data over time by deleting their contents after each execution.
Hope this helps people trying to test pino multistream.
