Mocking a custom node module using jest.mock method - node.js

So, I am new to mocking a module using jest.mock()
So here is the scenario: I have created a node module and uploaded it to a private npm registry, from where I can install that module in my app.
Suppose the name of the module is #a/lp-mod and this is the index.js of that module:
export const lpr = async () => {
// some code that does something and returns some data back
}
export default {
lpr
}
Now suppose I need to mock this module (#a/lp-mod) and return some static data whenever the lpr function gets called in the context of a test case.
So here is the code I have written:
> proj_directory
|---->__mocks__
|  |--->#a
|  |  |--->lp-mod --> this directory has index.js with mock data
|---->node_modules
|  |--->#a
|  |  |--->lp-mod
|  |  |  |---> src
|  |  |  |  |---> index.js
|---->test
|  |--->1.test.js
node_modules/#a/lp-mod/src/index.js
// some npm imports like axios
export const lpr = async () => {
// has some I/O calls, but let's just keep it simple
return Promise.resolve('I am original call')
}
export default {
lpr
}
__mocks__/#a/lp-mod/index.js
const p = jest.genMockFromModule('#a/lp-mod')
p.lpr = () => Promise.resolve('This is mocked call')
export default p
1.test.js
// before describe (test case) I wrote this
jest.mock('#a/lp-mod')
But I am getting undefined when the file under test imports the original #a/lp-mod module inside the test context.
My expectation is that it should pick up the mocked module and return the data from there while I am testing my app.
Please shed some light, and bear with me in case some info is missing; let me know if anything is unclear.
Happy coding :)
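For reference, a manual mock under __mocks__ usually has to mirror every export shape the consuming code uses; a minimal sketch for this module (not a verified fix, just the named and default exports from the question mirrored with jest.fn) could look like:
// __mocks__/#a/lp-mod/index.js -- sketch only
// expose the mocked lpr both as a named export and on the default export,
// so that import { lpr } and a default import both resolve to the mock
export const lpr = jest.fn(() => Promise.resolve('This is mocked call'))
export default { lpr }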

Related

why can't you mock a re-exported primitive value?

I'm trying to change the value of a primitive config object during tests. One of my files under test re-exports a primitive that is conditional on the config values.
I'm finding that when the value is wrapped in a function, then mocking it and asserting on it works perfectly.
However when the value is re-exported as a primitive, the value is not mocked, and is undefined.
Simplified example:
config.ts
export const config = {
environment: 'test'
};
app.ts
import { config } from './config';
export const app = () => config.environment;
export const environment = config.environment;
app.spec.ts
import { app, environment } from './app';
import * as config from './config';
jest.mock('./config', () => ({
config: {},
}));
beforeEach(() => {
jest.resetAllMocks();
});
const mockConfig = config.config as jest.Mocked<typeof config.config>;
test('app', () => {
mockConfig.environment = 'prod';
expect(app()).toEqual('prod');
});
test('environment', () => {
mockConfig.environment = 'nonprod';
expect(environment).toEqual('nonprod');
});
The first test passes, but the second test "environment" fails. Why?
✕ environment (3 ms)
● environment
expect(received).toEqual(expected) // deep equality
Expected: "nonprod"
Received: undefined
19 | test('environment', () => {
20 | mockConfig.environment = 'nonprod';
> 21 | expect(environment).toEqual('nonprod');
| ^
22 | });
23 |
at Object.<anonymous> (config/app.spec.ts:21:29)
The problem is related to when each value is read. When app.ts is evaluated, the line export const environment = config.environment runs immediately and takes a snapshot of config.environment; at that moment the mocked config object is still empty, so the snapshot is undefined and stays undefined no matter what the tests assign later.
The same does not happen with the app function, because it reads config.environment only when it is called, and by that time the test has already set the mocked value.
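A minimal way to sidestep that snapshot, sticking to the files from the question, is to re-export a function that reads the config lazily instead of the primitive itself (a sketch; getEnvironment is a made-up name):
// app.ts -- sketch: defer the read so it happens after the test has populated the mocked config
import { config } from './config';
export const app = () => config.environment;
export const getEnvironment = () => config.environment; // read at call time, not at import time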

global modules in node.js

I have made a custom error class for throwing errors in node.js. I want to use this class in my entire project. The problem is that wherever I need it, I first have to require it in that file before I can use the class, which is quite tedious. Does anyone have an idea how to make it global so that all modules can use it without requiring it in every file?
This is a custom error class file "cti_error.js"
'use strict';
class CTIError extends Error {
constructor (status,message,details='') {
super(message)
this.name = status
Error.captureStackTrace(this, this.constructor);
this.status = status || 500;
this.details = details?details:message;
this.reason = message;
}
}
module.exports = CTIError;
My project structure:-
my_project
|
|____utility
| |
| |____cti_error.js
|
|____routes
| |
| |_____product.js
| |_____defects.js
|
|
|_____server.js
The solution I know is to require the custom error class as below in every file where I want to throw an error:
const cti_error = require('../../utility/cti_error.js');
throw new cti_error(403,"wrong details");
Any idea how to use cti_error without requiring it in every file?
You can assign it to Node.js's global object.
Like this:
global.CTIError = CTIError;
Now you can access the CTIError anywhere as:
new CTIError()
Your linter might tell you that CTIError is not declared though.
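A minimal sketch of that approach, assuming the assignment happens once at startup (for example near the top of server.js) before the route files run:
// server.js -- sketch: register the class on the global object once
const CTIError = require('./utility/cti_error.js');
global.CTIError = CTIError;
// routes/product.js -- afterwards no require is needed
throw new CTIError(403, 'wrong details');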

Any way to restrict export within a package in nodejs?

Is there a way or pattern in NodeJS to share functions within the modules of a package only and not allow them to be shared from another package?
E.g if Package A has file1.js, file2.js and index.js. index.js uses functions from file1 and file2.
Package B uses Package A. It seems like all modules exported from file1 and file2 are also available to Package B. Can it be restricted to those exported from index.js of Package A only?
In short, is there a support for something like a protected scope?
As AZ_ said: only export what you want to export.
example:
// file1
export const foo = () => { console.log("foo"); }
// file2
import {foo} from "./file1"
export const bar = () => {
foo();
console.log("bar");
}
// index
import {bar} from "./file2"
// the package will only export bar
export { bar }
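Seen from Package B, only what index re-exports is then reachable through the package's entry point (a sketch; "package-a" is a made-up package name and assumes its main field points at index):
// Package B -- sketch
import { bar } from "package-a"; // resolves via Package A's index
bar(); // logs "foo" then "bar"
// foo is not re-exported by index, so it is not part of the entry point,
// though a deep import like "package-a/file1" could still reach it unless restricted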

Dynamically exporting functions in firebase

I'm having the typical (according to many posts) issue with cold boot times in cloud functions. A solution that seemed promising suggests importing / exporting only the function actually being executed, as can be seen here:
https://github.com/firebase/functions-samples/issues/170#issuecomment-323375462
if (!process.env.FUNCTION_NAME || process.env.FUNCTION_NAME === 'sendFollowerNotification') {
exports.sendFollowerNotification = require('./sendFollowerNotification');
}
That is a JavaScript example, but I'm using TypeScript. I've tried a number of variations, and while some build, in the end I'm always stuck with my function not being exported and deploy warning me that I'm going to delete the existing function.
This is one of the numerous attempts:
if (!process.env.FUNCTION_NAME || process.env.FUNCTION_NAME === 'generateInviteURL') {
  import('./invite_functions').then((mod) => {
    console.log("mod follows");
    console.log(mod);
    exports.generateInviteURL = functions.https.onRequest(mod.generateInviteURL);
  }).catch((err) => { console.log("Trying to import/export generateInviteURL ", err); });
}
As mentioned, at deploy time I get a warning about the function being deleted.
I was able to "avoid" that message with something like this:
console.log ("Function name: ", process.env.FUNCTION_NAME);
function dummy_generateInviteURL (req, res) { ; }
exports.generateInviteURL = functions.https.onRequest( dummy_generateInviteURL );
if (!process.env.FUNCTION_NAME || process.env.FUNCTION_NAME === 'generateInviteURL') {
  console.log("Doing the good import");
  import('./invite_functions').then((mod) => {
    console.log("mod follows");
    console.log(mod);
    exports.generateInviteURL = functions.https.onRequest(mod.generateInviteURL);
  }).catch((err) => { console.log("Trying to import/export generateInviteURL ", err); });
}
console.log ("Exported");
console.log (exports.generateInviteURL);
The idea of course being that an empty function would always be exported, but would be replaced with the real one if that's the one being called.
In that case logs look like this:
generateInviteURL Function name: generateInviteURL generateInviteURL
generateInviteURL Exported generateInviteURL
{ [Function: cloudFunction] __trigger: { httpsTrigger: {} } }
So the first part looks promising (the environment variable is defined), then the import does something (enters the then block, never the catch), but the exported variable is not replaced.
I'm not sure if this is a TypeScript problem, a firebase problem, or a developer problem - probably I'm just missing something obvious.
So the question - how can I avoid importing/exporting anything I don't need for each specific function?
You can stick to the original index.js, with minor changes. I have tried a few times and came up with a solution. I am including a sample dir structure and two TypeScript files (index.ts and another for your custom function). With this you will never have to change index.ts to modify or add functions.
Directory Structure
+ functions
|
-- + src
|  |
|  -- index.ts
|  |
|  -- + get
|     |
|     -- status.f.ts
|
-- package.json (auto generated)
|
-- package-lock.json (auto generated)
|
-- tsconfig.json (auto generated)
|
-- tslint.json (auto generated)
|
-- + lib (auto generated)
|
-- + node_modules (auto generated)
src/index.ts
import * as glob from "glob";
import * as camelCase from "camelcase";
const files = glob.sync('./**/*.f.js', { cwd: __dirname, ignore: './node_modules/**'});
for(let f=0,fl=files.length; f<fl; f++){
const file = files[f];
const functionName = camelCase(file.slice(0, -5).split('/').join('_')); // Strip off '.f.js'
if (!process.env.FUNCTION_NAME || process.env.FUNCTION_NAME === functionName) {
exports[functionName] = require(file);
}
}
src/get/status.f.ts
import * as functions from 'firebase-functions';
exports = module.exports = functions.https.onRequest((req, res) => {
res.status(200).json({'status':'OK'})
})
Once you have created the above files, install the npm packages 'glob' and 'camelcase',
then try to deploy the Firebase functions.
Firebase will deploy a function named 'getStatus'.
NOTE that the name of the function is the camel-case version of the folder names plus the file name where the function exists, so you can export only one function per .f.ts file.
EDIT
I have updated the dir structure. Note that index.ts and all the subsequent files and folders reside within the parent folder 'src'.
I've been having similar issues, and I wrote a pretty cool solution that worked for me.
I decided to release it as an open-source package - I've never done that before so if anyone can check it out, contribute or give feedback that would be great.
The package is called better-firebase-functions - https://www.npmjs.com/package/better-firebase-functions
All you have to do is import and run it. Two lines. Pretty much everything else is automated.
It will pick up the default exports of all function files in your directory and deploy them as properly named functions.
Here is an example:
import { exportFunctions } from 'better-firebase-functions';
exportFunctions({
__filename, // standard node var (leave as is).
exports, // standard node var (leave as is).
functionDirectoryPath: './myFuncs', // define root functions folder
// relative to this file.
searchGlob: '**/*.js' // file search glob pattern.
});
The only other thing you need to do is export your functions.https.onRequest()... as a default export from every file that contains a cloud function that you want to deploy. So:
export default functions.https.onRequest()...
/// OR
const func1 = functions.https.onRequest()...
export default func1;
And that's it!
UPDATE: Answer edited to reflect newer version of package.

Jasmine spy on exported function on NodeJS

I've had trouble spying on an exported function in a NodeJS (v9.6.1) app using Jasmine.
The app is written in TypeScript, transpiled with tsc into a dist folder to be executed as JavaScript.
App
I have a Foo utils file (foo.utils.ts) which exports functions:
import {readFileSync} from "fs";
export function getData(filePath: string){
const data = readFileSync(filePath)
// various checks happen here.
return data;
}
And a Bar class in a bar.ts file:
import {getData} from "../utils/foo.utils";
export class Bar {
public do(){
// ... does stuff
const data = getData(filePath);
// etc.
}
}
Test
Now I'm trying to spyOn the exported getData method to check if it has been called and to return a mocked value. I don't want to read a file in my unit test (or even use the real getData method at all).
The bar.spec.ts file for testing:
describe("Bar", ()=>{
let bar: Bar;
beforeEach(function() {
bar = new Bar();
})
it(" should do stuff", ()=>{
const spy = spyOn(???, "getData").and.returnValue([]);
bar.do();
expect(spy).toHaveBeenCalled();
})
});
Problems
As it is a NodeJS app, I've been trying to use global as the object to spy on, but I get the error:
Error: : getAttachementData() method does not exist
I've also tried to add
import * as fooutils from ".../src/utils/foo.utils";
And spy on fooutils, but it still goes through the real exported function (and crashes when it tries to read the file...).
Now I'm kinda lost. From what I've found it's not really possible to mock an exported function (even though they should be added to the global object).
I thought about refactoring the utils file to create a utils class which exports static methods and spying on them.
Questions
Am I doing something wrong?
Is it possible to spy on (and replace) an exported function?
Would using static methods in a class (rather than exported functions) work?
Any other way to do that?
You can wrap getData in an object { getData } and write your unit test as follows.
import { getData } from "../utils/foo.utils";
...
it(" should do stuff", () => {
const spy = spyOn({ getData }, "getData").and.returnValue([]);
bar.do();
expect(spy).toHaveBeenCalled();
});
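On the static-method idea from the question: spying on a class's static method does work in Jasmine, because the test and the consumer reference the same class object (a sketch; FooUtils is a made-up name, and Bar would have to call FooUtils.getData for the spy to be hit):
// foo.utils.ts -- sketch
export class FooUtils {
  public static getData(filePath: string) {
    // the real implementation would read and check the file here
    return [];
  }
}
// bar.spec.ts
const spy = spyOn(FooUtils, "getData").and.returnValue([]);
bar.do();
expect(spy).toHaveBeenCalled();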
