How to Mock `fs.promises.writeFile` with Jest - node.js

I am trying to mock the promise version of fs.writeFile using Jest, and the mocked function is not being called.
Function to be tested (createFile.js):
const { writeFile } = require("fs").promises;
const createNewFile = async () => {
await writeFile(`${__dirname}/newFile.txt`, "Test content");
};
module.exports = {
createNewFile,
};
Jest Test (createFile.test.js):
const fs = require("fs").promises;
const { createNewFile } = require("./createFile.js");
it("Calls writeFile", async () => {
const writeFileSpy = jest.spyOn(fs, "writeFile");
await createNewFile();
expect(writeFileSpy).toHaveBeenCalledTimes(1);
writeFileSpy.mockClear();
});
I know that writeFile is actually being called because I ran node -e "require(\"./createFile.js\").createNewFile()" and the file was created.
Dependency Versions
Node.js: 14.1.0
Jest: 26.6.3
-- Here is another attempt at the createFile.test.js file --
const fs = require("fs");
const { createNewFile } = require("./createFile.js");
it("Calls writeFile", async () => {
const writeFileMock = jest.fn();
jest.mock("fs", () => ({
promises: {
writeFile: writeFileMock,
},
}));
await createNewFile();
expect(writeFileMock).toHaveBeenCalledTimes(1);
});
This throws the following error:
ReferenceError: /Users/danlevy/Desktop/test/src/createFile.test.js: The module factory of `jest.mock()` is not allowed to reference any out-of-scope variables.
Invalid variable access: writeFileMock

Since writeFile is destructured at import time instead of being consistently referred to as the fs.promises.writeFile method, it cannot be affected by spyOn.
fs should be mocked like any other module:
jest.mock("fs", () => ({
promises: {
writeFile: jest.fn().mockResolvedValue(),
readFile: jest.fn().mockResolvedValue(),
},
}));
const fs = require("fs");
...
await createNewFile();
expect(fs.promises.writeFile).toHaveBeenCalledTimes(1);
It makes sense to mock fs entirely, because unmocked functions produce side effects and can have a negative impact on the test environment.
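For comparison (my addition, not part of the answer above): if createFile.js kept the fs reference intact and looked up fs.promises.writeFile at call time, the spyOn approach from the question would work. A minimal sketch under that assumption:
// createFile.js (hypothetical variant that does not destructure writeFile)
const fs = require("fs");

const createNewFile = async () => {
  await fs.promises.writeFile(`${__dirname}/newFile.txt`, "Test content");
};

module.exports = { createNewFile };

// createFile.test.js
const fs = require("fs");
const { createNewFile } = require("./createFile.js");

it("Calls writeFile", async () => {
  const writeFileSpy = jest
    .spyOn(fs.promises, "writeFile")
    .mockResolvedValue(undefined); // keep the test from touching the real filesystem
  await createNewFile();
  expect(writeFileSpy).toHaveBeenCalledTimes(1);
  writeFileSpy.mockRestore();
});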

Mock "fs/promises" async functions in jest
Here is a simple example using fs.readdir(), but it would also apply to any of the other async fs/promises functions.
files.service.test.js
import fs from "fs/promises";
import FileService from "./files.service";
jest.mock("fs/promises");
describe("FileService", () => {
var fileService: FileService;
beforeEach(() => {
// Create a brand new FileService before running each test
fileService = new FileService();
// Reset mocks
jest.resetAllMocks();
});
describe("getJsonFiles", () => {
it("throws an error if reading the directory fails", async () => {
// Mock the rejection error
fs.readdir = jest.fn().mockRejectedValueOnce(new Error("mock error"));
// Call the function to get the promise
const promise = fileService.getJsonFiles({ folderPath: "mockPath", logActions: false });
expect(fs.readdir).toHaveBeenCalled();
await expect(promise).rejects.toEqual(new Error("mock error"));
});
it("returns an array of the .json file name strings in the test directory (and not any other files)", async () => {
const allPotentialFiles = ["non-json.txt", "test-json-1.json", "test-json-2.json"];
const onlyJsonFiles = ["test-json-1.json", "test-json-2.json"];
// Mock readdir to return all potential files from the dir
fs.readdir = jest.fn().mockResolvedValueOnce(allPotentialFiles);
// Get the promise
const promise = fileService.getJsonFiles({ folderPath: "mockPath", logActions: false });
expect(fs.readdir).toBeCalled();
await expect(promise).resolves.toEqual(onlyJsonFiles); // function should only return the json files
});
});
});
files.service.ts
import fs from "fs/promises";
export default class FileService {
constructor() {}
async getJsonFiles(args: FilesListArgs): Promise<string[]> {
const { folderPath, logActions } = args;
try {
// Get list of all files
const files = await fs.readdir(folderPath);
// Filter to only include JSON files
const jsonFiles = files.filter((file) => {
return file.includes(".json");
});
return jsonFiles;
} catch (e) {
throw e;
}
}
}
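The same jest.mock("fs/promises") pattern carries over to the writeFile case from the original question, provided createFile.js imports from "fs/promises" rather than destructuring require("fs").promises (mocking "fs/promises" does not affect require("fs").promises). A hedged sketch of that variant:
// createFile.js (assumed variant importing from "fs/promises")
const { writeFile } = require("fs/promises");

const createNewFile = async () => {
  await writeFile(`${__dirname}/newFile.txt`, "Test content");
};

module.exports = { createNewFile };

// createFile.test.js
jest.mock("fs/promises"); // auto-mock: every exported function becomes a jest.fn() returning undefined
const { writeFile } = require("fs/promises");
const { createNewFile } = require("./createFile.js");

it("Calls writeFile with the expected path and content", async () => {
  await createNewFile();
  expect(writeFile).toHaveBeenCalledWith(
    expect.stringContaining("newFile.txt"),
    "Test content"
  );
});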

I know this is an old thread, but in my case I wanted to handle different results from readFile (or writeFile in your case). So I used the solution Estus Flask suggested, with the difference that I define the readFile implementation in each test instead of using mockResolvedValue.
I'm also using TypeScript.
import { getFile } from './configFiles';
import fs from 'fs';
jest.mock('fs', () => {
return {
promises: {
readFile: jest.fn()
}
};
});
describe('getFile', () => {
beforeEach(() => {
jest.resetAllMocks();
});
it('should return results from file', async () => {
const mockReadFile = (fs.promises.readFile as jest.Mock).mockImplementation(async () =>
Promise.resolve(JSON.stringify('some-json-value'))
);
const res = await getFile('some-path');
expect(mockReadFile).toHaveBeenCalledWith('some-path/some_config.json', { encoding: 'utf-8' });
expect(res).toEqual('some-json-value');
});
it('should gracefully handle error', async () => {
const mockReadFile = (fs.promises.readFile as jest.Mock).mockImplementation(async () =>
Promise.reject(new Error('not found'))
);
const res = await getFile('some-path');
expect(mockReadFile).toHaveBeenCalledWith('some-path/some_config.json', { encoding: 'utf-8' });
expect(res).toEqual([{}]); // or whatever your fallback value is
});
});
Note that I had to cast fs.promises.readFile as jest.Mock in order to make it work for TS.
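As an aside (my addition, not part of this answer): ts-jest's mocked() helper, which also shows up later in this thread, gives a typed handle on the mock so the manual cast isn't needed. A sketch, assuming ts-jest is installed; depending on your @types/node version the resolved value may still need a cast:
import { mocked } from 'ts-jest/utils';
import fs from 'fs';

// mocked() is purely a typing helper; fs.promises.readFile is already a jest.fn() from the module factory above
const mockReadFile = mocked(fs.promises.readFile);
mockReadFile.mockResolvedValue(JSON.stringify('some-json-value') as any);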
Also, my configFiles.ts looks like this:
import { promises as fsPromises } from 'fs';
const readConfigFile = async (filePath: string) => {
const res = await fsPromises.readFile(filePath, { encoding: 'utf-8' });
return JSON.parse(res);
};
export const getFile = async (path: string): Promise<MyType[]> => {
try {
const fileName = 'some_config.json';
return await readConfigFile(`${path}/${fileName}`); // await so a rejection is caught by the catch below
} catch (e) {
// some fallback value
return [{}];
}
};

Related

sinon stub for lambda function which is inside another lambda

I am writing unit test cases for my code. Since I am calling another Lambda function inside my Lambda, I am not sure how to mock the inner Lambda invocation, and because of this my test case is timing out. Attaching my code below.
Test case file
"use strict";
const sinon = require("sinon");
const AWS = require("aws-sdk");
const expect = require("chai").expect;
const models = require("common-lib").models;
const { Organization } = models;
const DATA_CONSTANTS = require("./data/deleteOrganization");
const wrapper = require("../../admin/deleteOrganization");
const sandbox = sinon.createSandbox();
describe("Start Test updateOrganization", () => {
beforeEach(() => {
sandbox.stub(Organization, "update").resolves([1]);
});
afterEach(async () => {
sandbox.restore();
});
it("Test 03: Test to check success returned by handler", async () => {
const mLambda = {
invoke: sinon.stub().returnsThis(),
promise: sinon.stub(),
};
const response = await wrapper.handler(
DATA_CONSTANTS.API_REQUEST_OBJECT_FOR_200
);
console.log({ response });
expect(response.statusCode).to.be.equal(200);
const body = JSON.parse(response.body);
expect(body.message).to.be.equal("Updated successfully");
});
});
Code function
exports.handler = asyncHandler(async (event) => {
InitLambda("userService-deleteOrganization", event);
const { id } = event.pathParameters;
if (isEmpty(id)) {
return badRequest({
message: userMessages[1021],
});
}
try {
const orgrepo = getRepo(Organization);
const [rowsUpdated] = await orgrepo.update(
{ isDeleted: true },
{ org_id: id }
);
if (!rowsUpdated) {
return notFound({
message: userMessages[1022],
});
}
const lambda = new AWS.Lambda({
region: process.env.region,
});
await lambda
.invoke({
FunctionName:
"user-service-" + process.env.stage + "-deleteOrganizationDetail",
InvocationType: "Event",
Payload: JSON.stringify({
pathParameters: { id },
headers: event.headers,
}),
})
.promise();
return success({
message: userMessages[1023],
});
} catch (err) {
log.error(err);
return failure({
error: err,
message: err.message,
});
}
});
It seems that you are not properly stubbing the AWS.Lambda object.
Try this:
const sinon = require("sinon");
const AWS = require("aws-sdk");
const expect = require("chai").expect;
const models = require("common-lib").models;
const { Organization } = models;
const DATA_CONSTANTS = require("./data/deleteOrganization");
const wrapper = require("../../admin/deleteOrganization");
const sandbox = sinon.createSandbox();
describe("Start Test updateOrganization", () => {
beforeEach(() => {
sandbox.stub(Organization, "update").resolves([1]);
});
afterEach(async () => {
sandbox.restore();
});
it("Test 03: Test to check success returned by handler", async () => {
const mLambda = { invoke: sinon.stub().returnsThis(), promise: sinon.stub() };
// you missed the below line
sinon.stub(AWS, 'Lambda').callsFake(() => mLambda);
const response = await wrapper.handler(
DATA_CONSTANTS.API_REQUEST_OBJECT_FOR_200
);
console.log({ response });
expect(response.statusCode).to.be.equal(200);
const body = JSON.parse(response.body);
expect(body.message).to.be.equal("Updated successfully");
sinon.assert.calledOnce(AWS.Lambda);
sinon.assert.calledWithMatch(mLambda.invoke, { InvocationType: "Event" }); // invoke receives the full params object, so match only part of it
sinon.assert.calledOnce(mLambda.promise);
});
});
I can also see that you are writing the entire logic inside your handler function, which makes it less testable.
To overcome this, divide your code into small functions that are easy to mock in test files or to test independently. The handler function should only call those functions and return the result to the caller.
For example:
Lambda Handler:
exports.lambdaHandler = async (event) => {
// do some init work here
const lambdaInvokeResponse = await exports.invokeLambda(params);
}
exports.invokeLambda = async (params) => {
const response = await lambda.invoke(params).promise();
return response;
}
test cases:
it('My Test Case - Success', async () => {
const invokeLambdaResponse = {
// some dummy response
};
// replace before calling the handler so the fake is used
sinon.replace(app, 'invokeLambda', sinon.fake.returns(invokeLambdaResponse));
const result = await app.lambdaHandler(event);
});
This now mocks only the Lambda invoke part.
You can mock all the external calls like this (DynamoDB, invoke, SNS, etc.).
You can also set spies and check that each method is called with the desired arguments, as in the sketch below.
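A small sketch of that last point (my addition; the app module path, the event, and the params shape are illustrative):
const sinon = require("sinon");
const app = require("../../app"); // hypothetical module exporting lambdaHandler and invokeLambda

it("delegates to invokeLambda with the expected params", async () => {
  const invokeFake = sinon.fake.resolves({ StatusCode: 202 });
  sinon.replace(app, "invokeLambda", invokeFake);

  const event = { pathParameters: { id: "123" } }; // illustrative event
  await app.lambdaHandler(event);

  sinon.assert.calledOnce(invokeFake);
  // match only the fields you care about rather than the whole payload
  sinon.assert.calledWithMatch(invokeFake, { FunctionName: sinon.match.string });

  sinon.restore();
});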

Mock a third-party method with Jest

I am using Jest for my backend unit testing. I need to mock third-party library module methods with it. I tried the following code:
My controller file:
const edgejs = require('apigee-edge-js');
const apigeeEdge = edgejs.edge;
async get(req, res) {
const abc= await apigeeEdge.connect(connectOptions);
const Details = await abc.developers.get(options);
return res.status(200).send(Details);
}
test.spec.js
let edgejs = require('apigee-edge-js');
const ctrl = require('../../controller');
describe("Test suite for abc", () => {
test("should return ...", async() =>{
edgejs.edge = jest.fn().mockImplementationOnce(async () =>
{return {"connect":{"developers":{"get":[{}]}}}}
);
ctrl.get(req, res)
});
But it's not mocking; it's calling the actual library connect method. What am I doing wrong here? Please share your ideas. Thanks in advance.
WORKING CODE
jest.mock('apigee-edge-js', () => {
return { edge: { connect: jest.fn() } };
});
const edgejs = require('apigee-edge-js');
test("should return ...", async () => {
edgejs.edge.connect.mockImplementationOnce(() => Promise.resolve(
{"developers":{"get":[{}]}}
));
edgejs.edge.connect()
expect(edgejs.edge.connect).toBeCalled();
})
ERROR CODE:
jest.mock('apigee-edge-js', () => {
return { edge: { connect: jest.fn() } };
});
const Ctrl = require('../../controllers/controller'); ----> Extra line
const edgejs = require('apigee-edge-js');
test("should return ...", async () => {
edgejs.edge.connect.mockImplementationOnce(() => Promise.resolve(
{"developers":{"get":[{}]}}
));
const req = mockRequest();
const res = mockResponse();
await Ctrl.get(req, res) ---> Extra line
expect(edgejs.edge.connect).toBeCalled();
});
Receiving error: TypeError: edgejs.edge.connect.mockImplementationOnce is not a function
The mock doesn't affect anything because the controller dereferences edgejs.edge right after it's imported (apigeeEdge = edgejs.edge). This would be different if it used edgejs.edge.connect instead of apigeeEdge.connect.
Methods shouldn't be mocked with ... = jest.fn(), because this prevents them from being restored and may affect other tests that run afterwards; this is what jest.spyOn is for. Furthermore, edge is an object, not a method.
Jest provides module mocking functionality, and third-party libraries generally need to be mocked in unit tests.
It should be:
jest.mock('apigee-edge-js', () => {
return { edge: { connect: jest.fn() } };
});
const ctrl = require('../../controller');
const edgejs = require('apigee-edge-js');
test("should return ...", async () => {
edgejs.edge.connect.mockImplementationOnce(() => Promise.resolve(
{"developers":{"get":[{}]}}
));
await ctrl.get(req, res)
...
});
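To make ctrl.get(req, res) resolve in that test, the mocked connect needs to resolve to an object whose developers.get also resolves, and req/res need simple stand-ins. A sketch (the req/res shapes and the returned data are my assumptions):
jest.mock('apigee-edge-js', () => {
  return { edge: { connect: jest.fn() } };
});
const ctrl = require('../../controller');
const edgejs = require('apigee-edge-js');

test("should return developer details", async () => {
  // connect resolves to a fake connection whose developers.get resolves to data
  edgejs.edge.connect.mockResolvedValueOnce({
    developers: { get: jest.fn().mockResolvedValue([{ name: 'dev-1' }]) },
  });
  const req = {};
  const res = { status: jest.fn().mockReturnThis(), send: jest.fn() };

  await ctrl.get(req, res);

  expect(edgejs.edge.connect).toHaveBeenCalledTimes(1);
  expect(res.status).toHaveBeenCalledWith(200);
  expect(res.send).toHaveBeenCalledWith([{ name: 'dev-1' }]);
});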

nodeJS share variable between modules

I have a problem with my project: I need to share the configuration parameters between modules. Here is my entry point:
app.js
const { requestConfiguracion } = require('./clases/servicios');
( async () => {
const dbConfig = await requestConfiguracion();
})();
servicios.js
const axios = require('axios').default;
const requestConfiguracion = async () => {
try {
const request = await axios.get('http://localhost/config/getConfig');
return request.data;
} catch (error) {
console.error(error)
}
}
module.exports = {
requestConfiguracion,
}
I need the configuration in dbConfig to be available to the other modules.
You can create a separate module to read and export that dbConfig.
// db-config.js
const { requestConfiguracion } = require('./clases/servicios');
let dbConfig = null;
module.exports.getDbConfig = async () => {
if (dbConfig) return dbConfig; // cache dbConfig for next time
dbConfig = await requestConfiguracion();
return dbConfig;
};
And in other modules:
// app.js
const { getDbConfig } = require('./db-config');
async function someFunction() {
const dbConfig = await getDbConfig();
// do something
}
Since your requestConfiguracion() function is async, you have to use await every time you need that config.
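Any other module can then await the same cached config; a small sketch (the consumer module name and the dbConfig property names are illustrative):
// repositorio.js (hypothetical consumer module)
const { getDbConfig } = require('./db-config');

module.exports.connect = async () => {
  const dbConfig = await getDbConfig(); // first caller triggers the HTTP request, later callers get the cached object
  console.log(dbConfig.host); // use whatever fields your config endpoint actually returns
};
One caveat: if two modules call getDbConfig() before the first request has resolved, both will trigger a request; caching the promise itself (assigning requestConfiguracion() before awaiting it) avoids that.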

Jest mocking google-cloud/storage typescript

I have been trying to mock @google-cloud/storage for my implementation so that I could test it without having to hit Cloud Storage in GCP, and so far it has all been in vain.
I have tried to mock the node_modules scope folder using the Jest docs, and that didn't work out.
Hence I tried the approach below.
This is my implementation class
import { GcloudAuthenticationInstance } from '../common/services/gcloud.authentication';
import * as fs from 'fs';
import pump from 'pump';
import pino from 'pino';
import * as _ from 'lodash';
import {
ENV_NAME_DEV,
GCLOUD_DATABASE_BUCKET_DEV,
GCLOUD_DATABASE_BUCKET_PROD,
GCLOUD_ENV_STR_BUCKET_NAME,
GCLOUD_STORED_FILE_NAME_DEV,
GCLOUD_STORED_FILE_NAME_PROD,
GCLOUD_UPLOAD_FILE_DEV_LOCAL_PATH,
GCLOUD_UPLOAD_FILE_PROD_LOCAL_PATH,
} from '../common/util/app.constants';
import { PinoLoggerServiceInstance } from '../common/services/pino.logger.service';
import { AppUtilServiceInstance } from '../common/services/app.util.service';
export const uploadEnvFiles = async (env_name: string) => {
const LOGGER: pino.Logger = PinoLoggerServiceInstance.getLogger(__filename);
return new Promise(async (res, rej) => {
// This just returns the Storage() instance with keyFileName and projectID
//of google cloud console being set so authentication takes place
const str = GcloudAuthenticationInstance.createGcloudAuthenticationBucket();
const bucketToUpload = GCLOUD_ENV_STR_BUCKET_NAME;
let uploadLocalFilePath;
let destinationBucketPath;
if (!AppUtilServiceInstance.isNullOrUndefined(env_name)) {
uploadLocalFilePath = ENV_NAME_DEV === env_name ? GCLOUD_UPLOAD_FILE_DEV_LOCAL_PATH : GCLOUD_UPLOAD_FILE_PROD_LOCAL_PATH;
destinationBucketPath = ENV_NAME_DEV === env_name ? GCLOUD_DATABASE_BUCKET_DEV : GCLOUD_DATABASE_BUCKET_PROD;
}
LOGGER.info('after authentication');
pump(
fs.createReadStream(uploadLocalFilePath),
str
.bucket(bucketToUpload)
.file(destinationBucketPath)
.createWriteStream({
gzip: true,
public: true,
resumable: true,
})
)
.on('error', (err) => {
LOGGER.error('Error occured in uploading:', err);
rej({ status: 'Error', error: err, code: 500 });
})
.on('finish', () => {
LOGGER.info('Successfully uploaded the file');
res({ status: 'Success', code: 201, error: null });
});
});
};
export const downloadEnvFiles = async (env_name): Promise<any> => {
const LOGGER: pino.Logger = PinoLoggerServiceInstance.getLogger(__filename);
return new Promise(async (res, rej) => {
const str = GcloudAuthenticationInstance.createGcloudAuthenticationBucket();
try {
const [files] = await str.bucket(GCLOUD_ENV_STR_BUCKET_NAME).getFiles();
const filteredFile =
ENV_NAME_DEV === env_name
? _.find(files, (file) => {
return file.name.includes(GCLOUD_STORED_FILE_NAME_DEV);
})
: _.find(files, (file) => {
return file.name.includes(GCLOUD_STORED_FILE_NAME_PROD);
});
res({
status: 'Success',
code: 200,
error: null,
stream: str
.bucket(GCLOUD_ENV_STR_BUCKET_NAME)
.file(filteredFile.name)
.createReadStream()
});
} catch (err) {
LOGGER.error('Error in retrieving files from gcloud:'+err);
rej({ status: 'Error', error: err, code: 500 });
}
});
};
This is my Jest test file (TypeScript):
bucket.operations.int.spec.ts
I've tried to include the mock inline.
import { GcloudAuthenticationInstance } from '../common/services/gcloud.authentication';
const { Storage } = require('@google-cloud/storage');
const { Bucket } = require('@google-cloud/storage');
import { File } from '@google-cloud/storage';
import { mocked } from 'ts-jest/utils'
const fs = require('fs');
import * as path from 'path';
import pump from 'pump';
import * as BucketOperations from './bucket.operations';
import { GCLOUD_ENV_STR_BUCKET_NAME } from '../common/util/app.constants';
const { PassThrough } = require('stream');
const fsMock = jest.mock('fs');
// Here we are trying to mock pump with a returned function,
// since pump() is the actual function. We mock it to return a value
// that is just an "on" event listener, indicating that it will be substituted
// with another mocked function.
jest.genMockFromModule('@google-cloud/storage');
jest.mock('@google-cloud/storage', () => {
const mockedFile = jest.fn().mockImplementation(() => {
return {
File: jest.fn().mockImplementation(() => {
return {
name: 'dev.txt',
createReadStream: jest
.fn()
.mockReturnValue(
fs.createReadStream(
path.resolve(process.cwd(), './tests/cloud-storage/sample-read.txt')
)
),
createWriteStream: jest
.fn()
.mockReturnValue(
fs.createWriteStream(
path.resolve(process.cwd(), './tests/cloud-storage/sample-write.txt')
)
)
};
})
};
});
const mockedBUcket = jest.fn().mockImplementation(() => {
return {
Bucket: jest.fn().mockImplementation(() => {
return {
constructor: jest.fn().mockReturnValue('test-bucket'),
getFiles: jest.fn().mockReturnValue([mockedFile])
}
})
}
});
return {
Storage: jest.fn().mockImplementation(() => {
return {
constructor: jest.fn().mockReturnValue('test-storage'),
bucket: mockedBUcket,
file: mockedFile,
createWriteStream: jest.fn().mockImplementation(() =>
fs.createWriteStream(path.resolve(process.cwd(), './tests/cloud-storage/sample-write.txt')))
};
})
};
});
jest.mock('pump', () => {
const mPump = { on: jest.fn() };
return jest.fn(() => mPump);
});
describe('Test suite for testing bucket operations', () => {
const mockedStorage = mocked(Storage, true);
const mockeddFile = mocked(File, true);
const mockeddBucket = mocked(Bucket, true);
function cancelCloudStorageMock() {
//mockCloudStorage.unmock('@google-cloud/storage');
mockedStorage.mockClear();
mockeddBucket.mockClear();
mockeddFile.mockClear();
jest.unmock('@google-cloud/storage');
jest.requireActual('@google-cloud/storage');
}
function cancelFsMock() {
jest.unmock('fs');
jest.requireActual('fs');
}
afterEach(() => {
jest.clearAllMocks();
//jest.restoreAllMocks();
});
test('test for uploadfiles - success', async (done) => {
cancelFsMock();
pump().on = jest.fn(function(this: any, event, callback) {
if (event === 'finish') {
callback();
}
return this;
});
const actual = await BucketOperations.uploadEnvFiles('dev');
expect(actual).toEqual(
expect.objectContaining({
status: 'Success',
code: 201,
})
);
done();
});
test('test downloadEnvFiles - success', async (done) => {
jest.unmock('fs');
const fMock = (File.prototype.constructor = jest.fn().mockImplementation(() => {
return {
storage: new Storage(),
bucket: 'testBucket',
acl: 'test-acl',
name: 'dev.txt',
parent: 'parent bucket',
};
}));
const bucketGetFilMock = (Bucket.prototype.getFiles = jest.fn().mockImplementation(() => {
return [fMock];
}));
// Get files should be an array of File from google-cloud-storage
//Bucket.prototype.getFiles = jest.fn().mockReturnValue([mockedFsConstructor]);
//Storage.prototype.bucket = jest.fn().mockReturnValue(new Storage());
const mockReadable = new PassThrough();
const mockWritable = new PassThrough();
jest.spyOn(fs, 'createReadStream').mockReturnValue(
fs.createWriteStream(path.resolve(process.cwd(), './tests/cloud-storage/sample-read.txt'))
);
await BucketOperations.downloadEnvFiles('dev');
done();
});
});
This is the exception I end up with. Upon debugging, I see that the mocked instances are trying to execute, but it doesn't execute the file method in the Storage mock. This is not available in @google-cloud/storage, but I did try to mock it. Is there a way to mock just the usage of @google-cloud/storage using Jest?
EDIT:
Here is the exception:
TypeError: str.bucket(...).file is not a function
at /home/vijaykumar/Documents/Code/Nestjs/cloud-storage-app/src/gcloud/bucket.operations.ts:37:6
at Generator.next (<anonymous>)
at /home/vijaykumar/Documents/Code/Nestjs/cloud-storage-app/src/gcloud/bucket.operations.ts:8:71
at new Promise (<anonymous>)
at Object.<anonymous>.__awaiter (/home/vijaykumar/Documents/Code/Nestjs/cloud-storage-app/src/gcloud/bucket.operations.ts:4:12)
at /home/vijaykumar/Documents/Code/Nestjs/cloud-storage-app/src/gcloud/bucket.operations.ts:22:40
at new Promise (<anonymous>)
at /home/vijaykumar/Documents/Code/Nestjs/cloud-storage-app/src/gcloud/bucket.operations.ts:22:9
Thanks to @ralemos, I was able to work out the answer; here is how I mocked it.
Here is the complete implementation. I've added a few more test cases as well.
The @google-cloud/storage module in particular needs to be mocked with jest.mock() in a different way. The Bucket of the Storage holds all the details of the files in GCP storage, so that needs to be mocked first; I also mocked the File (the @google-cloud/storage type). I then added the mockedFile to the mockedBucket, and the mockedBucket to the mockedStorage, and implemented mocks for all of their methods and properties.
There is a lodash node_module usage in my test file, so I mocked that implementation as well. Now everything works fine.
import { GcloudAuthenticationInstance } from '../common/services/gcloud.authentication';
const { Storage } = require('@google-cloud/storage');
const fs = require('fs');
import * as path from 'path';
import pump from 'pump';
import * as BucketOperations from './bucket.operations';
const { PassThrough } = require('stream');
const fsMock = jest.mock('fs');
const mockedFile = {
name: 'dev.txt',
createWriteStream: jest.fn().mockImplementation(() => {
return fs.createWriteStream(path.resolve(process.cwd(), './tests/cloud-storage/sample-write.txt'));
}),
createReadStream: jest.fn().mockImplementation(() => {
return fs.createReadStream(path.resolve(process.cwd(), './tests/cloud-storage/sample-read.txt'));
}),
};
jest.mock('lodash', () => {
return {
find: jest.fn().mockImplementation(() => {
return mockedFile;
})
};
});
const mockedBucket = {
file: jest.fn(() => mockedFile),
getFiles: jest.fn().mockImplementation(() => {
const fileArray = new Array();
fileArray.push(mockedFile);
return fileArray;
})
};
const mockedStorage = {
bucket: jest.fn(() => mockedBucket)
};
jest.mock('@google-cloud/storage', () => {
return {
Storage: jest.fn(() => mockedStorage)
};
});
jest.mock('pump', () => {
const mPump = { on: jest.fn() };
return jest.fn(() => mPump);
});
describe('Test suite for testing bucket operations', () => {
function cancelCloudStorageMock() {
jest.unmock('@google-cloud/storage');
jest.requireActual('@google-cloud/storage');
}
function cancelFsMock() {
jest.unmock('fs');
jest.requireActual('fs');
}
afterEach(() => {
jest.clearAllMocks();
//jest.restoreAllMocks();
});
test('test for uploadfiles - success', async (done) => {
pump().on = jest.fn(function(this: any, event, callback) {
if (event === 'finish') {
callback();
}
return this;
});
const actual = await BucketOperations.uploadEnvFiles('dev');
expect(actual).toEqual(
expect.objectContaining({
status: 'Success',
code: 201,
})
);
done();
});
test('test downloadEnvFiles - success', async (done) => {
jest.unmock('fs');
const downloadRes = await BucketOperations.downloadEnvFiles('dev');
expect(downloadRes).toBeDefined();
expect(downloadRes).toEqual(expect.objectContaining({code:200, status: 'Success'}));
done();
});
test('test for uploadfiles- failure', async (done) => {
cancelCloudStorageMock();
const bucketStorageSpy = jest
.spyOn(GcloudAuthenticationInstance, 'createGcloudAuthenticationBucket')
.mockImplementation(() => {
return new Storage({
projectId: 'testId',
keyFilename: path.resolve(process.cwd(), './tests/cloud-storage/sample-read.txt'),
scopes: ['testScope'],
autoRetry: false,
});
});
const mockReadable = new PassThrough();
const mockWritable = new PassThrough();
fs.createWriteStream = jest.fn().mockReturnValue(mockWritable);
fs.createReadStream = jest.fn().mockReturnValue(mockReadable);
pump().on = jest.fn(function(this: any, event, callback) {
if (event === 'error') {
callback();
}
return this;
});
const actual = BucketOperations.uploadEnvFiles('prod');
expect(actual).rejects.toEqual(
expect.objectContaining({
status: 'Error',
code: 500,
})
);
expect(bucketStorageSpy).toHaveBeenCalledTimes(1);
done();
});
test('test download - make the actual call - rej with auth error', async (done) => {
cancelCloudStorageMock();
console.dir(Storage);
const mockReadable = new PassThrough();
const mockWritable = new PassThrough();
fs.createWriteStream = jest.fn().mockReturnValue(mockWritable);
fs.createReadStream = jest.fn().mockReturnValue(mockReadable);
const createGcloudAuthenticationBucketSpy = jest
.spyOn(GcloudAuthenticationInstance, 'createGcloudAuthenticationBucket')
.mockImplementation(() => {
return new Storage();
});
try {
await BucketOperations.downloadEnvFiles('dev');
} catch (err) {
expect(err.code).toBe(500);
expect(err.status).toBe('Error');
}
expect(createGcloudAuthenticationBucketSpy).toHaveBeenCalledTimes(1);
createGcloudAuthenticationBucketSpy.mockReset();
done();
});
});

I want to import an asynchronous function into my app in Node.js

The following is the async code I want to import into my app.js file:
module.exports = {
foo : async () =>{
const axios = require('axios')
const [, pairA, pairB ] = require('./pairs.json')
const { SERVER_URL } = require('./lib/sdk')
const fundAccounts = async (pairs) => await Promise.all(
pairs.map(
async (pair) => await axios.get('/friendbot', {
baseURL: SERVER_URL,
params: { addr: pair.publicKey }
})
)
)
fundAccounts([pairA, pairB])
.then(() => console.log('ok'))
.catch((e) => { console.error(e); throw e})
}
}
The following is my server.js file, where I try to import it and check the output:
var funding = require('./scripts/2_fundAccounts');
console.log(typeof(funding));
console.log(funding.foo())
// I am getting undefined
Next I tried with the setTimeout method:
function calling() {
var funding = require('./scripts/2_fundAccounts');
funding.foo()
}
setTimeout(calling,4000)
Any suggestions on how to import the code and make it work?
Try this:
module.exports = async function foo () {
// etc etc...
}
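A note on the undefined output (my addition, not part of the answer above): foo never returns a value, so console.log(funding.foo()) prints a pending Promise and awaiting it yields undefined. Returning the result and awaiting the call makes the value visible; a sketch keeping the original file layout:
// scripts/2_fundAccounts.js
const axios = require('axios');
const [, pairA, pairB] = require('./pairs.json');
const { SERVER_URL } = require('./lib/sdk');

const fundAccounts = (pairs) =>
  Promise.all(
    pairs.map((pair) =>
      axios.get('/friendbot', {
        baseURL: SERVER_URL,
        params: { addr: pair.publicKey },
      })
    )
  );

module.exports = {
  foo: async () => {
    const responses = await fundAccounts([pairA, pairB]);
    console.log('ok');
    return responses.map((r) => r.status); // return something so callers see a value
  },
};

// server.js
const funding = require('./scripts/2_fundAccounts');

(async () => {
  console.log(await funding.foo()); // e.g. [ 200, 200 ] instead of undefined
})();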
