NestJS Mocking Injected Connection Imported from Different Module

I've been getting this error all day long:
Nest can't resolve dependencies of the ClubsService (ClubsApiService, AuthApiService, ClubFollowersRepo, ClubsRepo, ClubPrivacyRepo, ?). Please make sure that the argument DatabaseConnection at index [5] is available in the ClubsModule context.
Potential solutions:
- If DatabaseConnection is a provider, is it part of the current ClubsModule?
- If DatabaseConnection is exported from a separate @Module, is that module imported within ClubsModule?
@Module({
  imports: [ /* the Module containing DatabaseConnection */ ]
})
I figured out that the problem is that I have not mocked the MongoDB connection. The error is quite clear: the @InjectConnection() in ClubsService should be mocked (see below).
ClubsService:
@Injectable()
export class ClubsService {
  constructor(
    private readonly clubsApiService: ClubsApiService,
    private readonly authApiService: AuthApiService,
    private readonly clubFollowersRepo: ClubFollowersRepo,
    private readonly clubsRepo: ClubsRepo,
    private readonly clubPrivacyRepo: ClubPrivacyRepo,
    @InjectConnection() private readonly connection: Connection, // <--- THIS GUY
  ) {}
  // ...
}
The problem is that the test file I am executing lives in a different module than ClubsService. So in that other module (let's call it YModule), I have this piece of code:
YModule:
import { Test, TestingModule } from '@nestjs/testing';
import { getConnectionToken } from '@nestjs/mongoose';
import { MongoMemoryServer } from 'mongodb-memory-server';
import { Connection, connect } from 'mongoose';
import { ClubsModule } from '../clubs/clubs.module';

describe('YService.spec.ts in YModule', () => {
  let mongod: MongoMemoryServer;
  let mongoConnection: Connection;

  beforeAll(async () => {
    mongod = await MongoMemoryServer.create();
    const uri = mongod.getUri();
    mongoConnection = (await connect(uri)).connection;
  });

  beforeEach(async () => {
    const module: TestingModule = await Test.createTestingModule({
      providers: [
        // ...
      ],
      imports: [ClubsModule], // <--- ClubsModule is not a provider, but an imported module
    })
      .overrideProvider(getConnectionToken())
      .useValue(mongoConnection)
      .compile();
  });
});
This approach with getConnectionToken() won't work, because I have to mock a connection coming from the imported ClubsModule, not one of the providers listed directly in the testing module.
How would you mock a connection injected in a different module that you imported?
Thanks a lot! :)

As Jay McDoniel mentioned in the post comment, you should not import modules in your unit testing file but mock the needed dependencies instead. Why is that? Consider the example from the question above:
ClubsModule has the connection dependency, and yes, it can be replaced with an in-memory database server, but that replacement should be done within ClubsModule itself (the clubs folder), not outside in other modules.
What you really want to do outside ClubsModule, say in YModule (the y folder), is to mock every provider that ClubsModule exports and that you use within YModule's test file.
This makes sense: ClubsModule-specific dependencies should be tested only within ClubsModule, and everywhere else they should simply be mocked.
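For illustration, here is a minimal sketch of what such an in-module test could look like inside the clubs folder, reusing the mongodb-memory-server setup and the getConnectionToken() override from the question (the other providers are assumed to be mocked as needed; this is not the author's actual test):
// clubs.service.spec.ts -- a sketch
import { Test, TestingModule } from '@nestjs/testing';
import { getConnectionToken } from '@nestjs/mongoose';
import { MongoMemoryServer } from 'mongodb-memory-server';
import { Connection, connect } from 'mongoose';
import { ClubsService } from './clubs.service';

describe('ClubsService', () => {
  let mongod: MongoMemoryServer;
  let mongoConnection: Connection;
  let service: ClubsService;

  beforeAll(async () => {
    mongod = await MongoMemoryServer.create();
    mongoConnection = (await connect(mongod.getUri())).connection;
  });

  afterAll(async () => {
    await mongoConnection.close();
    await mongod.stop();
  });

  beforeEach(async () => {
    const module: TestingModule = await Test.createTestingModule({
      providers: [
        ClubsService,
        // ...mocks for ClubsApiService, AuthApiService, ClubFollowersRepo, ClubsRepo, ClubPrivacyRepo
        { provide: getConnectionToken(), useValue: mongoConnection }, // satisfies @InjectConnection()
      ],
    }).compile();

    service = module.get<ClubsService>(ClubsService);
  });
});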
I originally imported ClubsModule because I wanted to use the repository (a provider) that ClubsModule exports. But then I realized that I don't want to test the repository's functions here; they are already tested inside ClubsModule, so there is no need to do that twice. Instead, it is a good idea to simply mock the repository.
Code Example:
y.service.spec.ts:
import { Test, TestingModule } from '@nestjs/testing';
import { YService } from './y.service'; // <--- For illustration; a provider within YModule
import { ClubsRepo } from '../clubs/clubs.repo'; // <--- The real repository provider from a different module (ClubsModule)

describe('y.service.spec.ts in YModule', () => {
  let service: YService;

  const clubsRepo = { // <--- Mock of the ClubsRepo functions used within this test file
    insertMany: () => Promise.resolve([]),
    deleteMany: () => Promise.resolve(),
  };

  beforeEach(async () => {
    const module: TestingModule = await Test.createTestingModule({
      providers: [
        YService, // <--- The provider under test, which lives in YModule
        ClubsRepo, // <--- The real repository provider from the import statement above
      ],
    })
      .overrideProvider(ClubsRepo) // <--- Overriding the repository provider
      .useValue(clubsRepo) // <--- ...with the mock 'clubsRepo' (const above)
      .compile();

    service = module.get<YService>(YService); // <--- Unlike ClubsRepo, this provider resides within this module
  });

  it('example', () => {
    // ...
    jest.spyOn(clubsRepo, 'insertMany'); // <--- Spying on "insertMany" of the mocked clubsRepo defined above
    // ...
  });
});
The reason for importing ClubsRepo into the test file y.service.spec.ts is that y.service.ts (the actual provider in YModule) uses the functions of ClubsRepo. In that case, don't forget to import ClubsModule in y.module.ts too.
y.module.ts:
import { ClubsModule } from '../clubs/clubs.module';

@Module({
  imports: [
    // ...
    ClubsModule, // <--- Don't forget this line
    // ...
  ],
  providers: [
    // ...
  ],
})
export class YModule {}
That's it, happy testing! :)

Related

Nestjs common repo dependency in conflicts with Project repo dependency

I'm working with a home-grown monorepo structure with NestJS and legacy code. The NestJS parts of the monorepo depend on a common folder in the root that is imported into each Nest project via "commonPackage": "file:../common" in the package.json file.
The issue I'm experiencing is that the common folder's install of @nestjs/config conflicts with the consuming project's install of the same package. I've been using a workaround of importing the necessary code from commonPackage/node_modules/@nestjs/config, however that uses the common folder's .env file instead of the consuming project's .env.
I have no runtime dependencies in the common package, and I've set @nestjs/config as a peer dependency with a version range of ^1. However, when attempting to import the consuming project's config service
(i.e. import { ConfigService } from '@nestjs/config'; and not the above), I get an error about an internal property not matching, like below.
src/app.module.ts:16:26 - error TS2345: Argument of type '(config: ConfigService) => ConnectionOptions' is not assignable to parameter of type '(config: ConfigService<Record<string, unknown>>) => ConnectionOptions'.
Types of parameters 'config' and 'config' are incompatible.
Type 'ConfigService<Record<string, unknown>>' is not assignable to type 'ConfigService<Record<string, unknown>, false>'.
Types have separate declarations of a private property 'internalConfig'.
16 MysqlModule.register(sqlConfig),
~~~~~~~~~
[3:47:23 PM] Found 1 error. Watching for file changes.
The workaround I worked out was to simply export the config service type I was using from my internal module; however, I now think it should be possible to pass in the config service when registering the module.
So my current solution is to export the config service type from the module that uses it in the common repo:
export declare type SpecConfig = ConfigService;
Use this when defining factories that get put into that module.
*** One caveat is that you will not be able to specify a custom config file with this method, as the import is handled in your register logic. ***
Another option is to add the config service as a dependency of the module registration, but I have yet to work that out (a rough sketch of that idea follows after the module code below).
For example, this is my dynamic module that was causing the issue.
import { DynamicModule, Module, Provider } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import * as mysql from 'mysql2';
import { MysqlConnectionService } from './mysql-connection/mysql-connection.service';

@Module({})
export class MysqlModule {
  static register(...options: MysqlModuleOptions[]): DynamicModule {
    const providers: Provider<any>[] = options.map(({ config: connectionConfig, name }) => {
      const config = {
        provide: `${name}-mysql-config`,
        useFactory: connectionConfig,
        inject: [ConfigService],
      };
      const provider = {
        provide: `${name}-mysql`,
        useFactory: (config: mysql.ConnectionOptions) => new MysqlConnectionService(config),
        inject: [`${name}-mysql-config`],
      };
      return [config, provider];
    }).reduce((acc, val) => acc.concat(val), []);

    return {
      module: MysqlModule,
      imports: [ConfigModule.forRoot({ isGlobal: true })],
      providers: [...providers],
      exports: [...providers],
    };
  }
}

export type MysqlConfigFunc = (config: ConfigService) => mysql.ConnectionOptions;
export type MsqlConfigService = ConfigService;

export class MysqlModuleOptions {
  name: string;
  config: MysqlConfigFunc;
}
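For what it's worth, the "pass the config service in at registration time" idea could look roughly like this: the consuming project hands its own ConfigService class to the register call, so the factory is resolved against the consumer's @nestjs/config install rather than the common package's copy. This is only a sketch under that assumption (registerWith is a made-up name), not a tested solution:
// mysql.module.ts (in the common package) -- hypothetical variant of register()
import { DynamicModule, Module, Provider, Type } from '@nestjs/common';
import * as mysql from 'mysql2';
import { MysqlConnectionService } from './mysql-connection/mysql-connection.service';
// MysqlModuleOptions as defined above

@Module({})
export class MysqlModule {
  static registerWith(configServiceRef: Type<unknown>, ...options: MysqlModuleOptions[]): DynamicModule {
    const providers: Provider<any>[] = options.flatMap(({ config: connectionConfig, name }) => [
      {
        provide: `${name}-mysql-config`,
        useFactory: connectionConfig,
        inject: [configServiceRef], // the consumer's ConfigService class, not the common package's copy
      },
      {
        provide: `${name}-mysql`,
        useFactory: (config: mysql.ConnectionOptions) => new MysqlConnectionService(config),
        inject: [`${name}-mysql-config`],
      },
    ]);

    return {
      module: MysqlModule,
      providers,
      exports: providers,
    };
  }
}

// app.module.ts (in the consuming project) -- illustrative usage
// import { ConfigModule, ConfigService } from '@nestjs/config'; // the consumer's own install
// imports: [
//   ConfigModule.forRoot({ isGlobal: true }),
//   MysqlModule.registerWith(ConfigService, sqlConfig),
// ],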

How to pass a #liaoliaots/nestjs-redis redis connection to global guard constructor

I'm new to NestJS and have some misunderstandings with the @liaoliaots/nestjs-redis (https://github.com/liaoliaots/nestjs-redis) package. For example, I have a guard with the following constructor:
import { InjectRedis } from '@liaoliaots/nestjs-redis';
import { Redis } from 'ioredis';

@Injectable()
export class SomeGuard implements CanActivate {
  constructor(@InjectRedis() redis: Redis) {}
  // ...
}
And then I want that guard to be global:
//main.ts
...
app.useGlobalGuards(new SomeGuard(/* what??? */));
...
So that's the problem: what do I need to pass? @InjectRedis does weird things :)
Thanks for responding.
Instead of app.useGlobalGuards, use this other approach:
// ...
import { Module } from '@nestjs/common'
import { APP_GUARD } from '@nestjs/core'

@Module({
  // ...
  providers: [
    {
      provide: APP_GUARD,
      useClass: SomeGuard,
    },
  ],
})
export class AppModule {}
This is cleaner and helps you avoid polluting your bootstrap function. It also lets Nest resolve that Redis dependency for you. Otherwise you'd need to fetch the dependency yourself and pass it to new SomeGuard using
const redis = app.get(getRedisToken())
https://docs.nestjs.com/guards#binding-guards
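For completeness, a minimal sketch of that manual alternative in main.ts, assuming the default (unnamed) Redis client and that getRedisToken is used as shown above; file paths are illustrative:
// main.ts -- sketch of the manual useGlobalGuards() variant
import { NestFactory } from '@nestjs/core';
import { getRedisToken } from '@liaoliaots/nestjs-redis';
import { Redis } from 'ioredis';
import { AppModule } from './app.module';
import { SomeGuard } from './some.guard'; // illustrative path

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  const redis = app.get<Redis>(getRedisToken()); // grab the client Nest already created
  app.useGlobalGuards(new SomeGuard(redis)); // construct the guard by hand
  await app.listen(3000);
}
bootstrap();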

How to dynamically inject providers in NestJs

My team is trying to identify a way to do dependency injection dynamically, so that there is a bulk insertion point without the need to spell out and import each module.
@Module({
  // ...
  providers: [
    WorkerService,
    WorkerResolver,
    Worker2Service,
    Worker2Resolver,
    // ...
  ],
})
What I want to achieve:
const allModules = ... // logic here to include all my resolvers, or all my services

@Module({
  // ...
  providers: [
    ...allModules,
  ],
})
You can use the glob package to dynamically find the worker files and then use NestJS's dynamic module feature to load them.
Suppose all of your worker files are stored in a directory named workers and have the extension .worker.ts:
import { DynamicModule, Module, Provider } from '@nestjs/common';
import * as glob from 'glob';

@Module({})
export class WorkerModule {
  static forRootAsync(): DynamicModule {
    return {
      module: WorkerModule,
      imports: [WorkerCoreModule.forRootAsync()],
    };
  }
}

@Module({})
export class WorkerCoreModule {
  static async forRootAsync(): Promise<DynamicModule> {
    // Feel free to change the path if your structure is different
    const workersPath = glob.sync('src/**/workers/*.worker.ts');
    const workersRelativePathWithoutExt = workersPath
      // Replace src, because you are probably running the code
      // from the dist folder
      .map((path) => path.replace('src/', './../'))
      .map((path) => path.replace('.ts', ''));

    const workerProviders: Provider<any>[] = [];
    const importedModules = await Promise.all(
      workersRelativePathWithoutExt.map((path) => import(path)),
    );
    importedModules.forEach((modules) => {
      // Might be different if you are using a default export instead
      const worker = modules[Object.keys(modules)[0]];
      workerProviders.push({
        provide: worker.name,
        useValue: worker,
      });
    });

    return {
      module: WorkerCoreModule,
      providers: [...workerProviders],
      // You can omit exports if the providers are meant to be used
      // only in this module
      exports: [...workerProviders],
    };
  }
}
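Once the dynamic module exists, it can be registered once in the root module. A minimal sketch (the module name AppModule and the file path are assumptions):
// app.module.ts -- illustrative registration of the dynamic worker module
import { Module } from '@nestjs/common';
import { WorkerModule } from './worker/worker.module';

@Module({
  imports: [
    // Pulls in every discovered *.worker.ts provider in one place
    WorkerModule.forRootAsync(),
  ],
})
export class AppModule {}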
Now suppose you have a simple worker class at the path src/anyModule/workers/simple.worker.ts; you can use it like this:
class WorkersService {
  constructor(@Inject('SimpleWorker') simpleWorker: SimpleWorker) {}
  // ...
}
If you want to omit @Inject('SimpleWorker') and have workers injected automatically, like regular NestJS services, then you need to make these changes to WorkerCoreModule:
workerProviders.push({
  provide: worker,
  useClass: worker,
});
However, in order for this to work, you need to make sure that your worker classes are decorated with @Injectable().
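With the class itself used as the provider token, the constructor can then rely on the type alone. Roughly (a sketch; the import path mirrors the example above):
import { Injectable } from '@nestjs/common';
import { SimpleWorker } from '../anyModule/workers/simple.worker'; // path from the example above

@Injectable()
export class WorkersService {
  // No @Inject() needed: the class is now the provider token
  constructor(private readonly simpleWorker: SimpleWorker) {}
}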

Programatically declare RabbitMQ consumers in NestJS / Node.js?

I am using a NestJS application to consume a RabbitMQ queue.
Each message can be processed no matter the order, so I'm wondering what would be the best practise to declare new consumers for the same queue.
Expected behaviour: The queue is processed by this service, which is using several consumers
Queue: [1,2,3,4,5,6, ...N];
In NestJS you can use the @RabbitSubscribe decorator to assign a function to process the data. What I want could be achieved by simply duplicating (and renaming) the decorated function, so that this second function would also be called to process data from the queue:
@RabbitSubscribe({
  // ...
  queue: 'my-queue',
})
async firstSubscriber(data) {
  // 1, 3, 5...
}

@RabbitSubscribe({
  // ...
  queue: 'my-queue',
})
async secondSubscriber(data) {
  // 2, 4, 6...
}
I am aware that I could duplicate the project and scale horizontally, but I'd prefer doing this in the same process.
How could I declare subscribers programmatically to get this same behaviour, so I could process the data with more concurrency?
You will benefit from using the @golevelup/nestjs-rabbitmq package, as it supports consuming different message queues, and more if your app is hybrid.
First install it:
npm i @golevelup/nestjs-rabbitmq
Then your NestJS app structure should look like this:
src/
  main.ts
  app.module.ts
  someHttpModule1/
    someHttpModule1.controller.ts
    someHttpModule1.module.ts
    someHttpModule1.service.ts
    ...
  someHttpModule2/
    someHttpModule2.controller.ts
    someHttpModule2.module.ts
    someHttpModule2.service.ts
    ...
  ...
  events/   // the events module is designed for consuming messages from RabbitMQ
    events.module.ts
    someEventConsumerModule1/
      someEventConsumerModule1.module.ts
      someEventConsumerModule1.service.ts
    someEventConsumerModule2/
      someEventConsumerModule2.module.ts
      someEventConsumerModule2.service.ts
    ...
In the src/app.module.ts file:
// module imports
import { SomeHttpModule1 } from './someHttpModule1/someHttpModule1.module'
import { SomeHttpModule2 } from './someHttpModule2/someHttpModule2.module'
import { EventsModule } from './events/events.module'
// and other necessary modules

@Module({
  imports: [
    SomeHttpModule1,
    SomeHttpModule2,
    EventsModule,
    // and other dependent modules
  ],
  controllers: [],
  providers: [],
})
export class AppModule {}
And in your events.module.ts file:
// imports
import { RabbitMQModule } from '@golevelup/nestjs-rabbitmq'
import { SomeEventConsumerModule1 } from './someEventConsumerModule1/someEventConsumerModule1.module'
import { SomeEventConsumerModule2 } from './someEventConsumerModule2/someEventConsumerModule2.module'
// and so on

@Module({
  imports: [
    RabbitMQModule.forRoot(RabbitMQModule, {
      exchanges: [
        {
          name: 'amq.direct',
          type: 'direct', // check out the docs for more information on exchange types
        },
      ],
      uri: 'amqp://guest:guest@localhost:5672', // the default login and password are guest; RabbitMQ listens locally on port 5672 over AMQP
      connectionInitOptions: { wait: false },
    }),
    SomeEventConsumerModule1,
    SomeEventConsumerModule2,
    // ... and other dependent consumer modules
  ],
})
export class EventsModule {}
And below is an example of one consumer module (someEventConsumerModule1.module.ts):
// imports
import { SomeEventConsumerModule1Service } from './someEventConsumerModule1.service'
// ...

@Module({
  imports: [
    // other modules, if their providers are injected here
  ],
  providers: [
    SomeEventConsumerModule1Service,
  ],
})
export class SomeEventConsumerModule1 {}
And in your service file, put your business logic for handling messages:
// imports
import { RabbitSubscribe } from '@golevelup/nestjs-rabbitmq'
import { Injectable } from '@nestjs/common'
import { ConsumeMessage, Channel } from 'amqplib' // for type safety you will need to install this package first
// ... and so on

@Injectable()
export class SomeEventConsumerModule1Service {
  constructor(
    // other module services, if they need to be injected
  ) {}

  @RabbitSubscribe({
    exchange: 'amq.direct',
    routingKey: 'direct-route-key', // up to you
    queue: 'queueNameToBeConsumed',
    errorHandler: (channel: Channel, msg: ConsumeMessage, error: Error) => {
      console.log(error)
      channel.reject(msg, false) // use an error handler, otherwise the app will crash in an unintended way
    },
  })
  public async onQueueConsumption(msg: {}, amqpMsg: ConsumeMessage) {
    const eventData = JSON.parse(amqpMsg.content.toString())
    // do something with eventData
    console.log(`EventData: ${eventData}, successfully consumed!`)
  }

  // ... and so on, in the same way
}

Stub an export from a native ES Module without babel

I'm using AVA + Sinon to build my unit tests. Since I need ES6 modules and I don't like Babel, I'm using .mjs files all over my project, including the test files. I use the "--experimental-modules" argument to start my project, and I use the "esm" package in the tests. The following is my AVA config and the test code.
"ava": {
"require": [
"esm"
],
"babel": false,
"extensions": [
"mjs"
]
},
// test.mjs
import test from 'ava';
import sinon from 'sinon';
import { receiver } from '../src/receiver';
import * as factory from '../src/factory';
test('pipeline get called', async t => {
  const stub_factory = sinon.stub(factory, 'backbone_factory');
  t.pass();
});
But I get the error message:
TypeError {
message: 'ES Modules cannot be stubbed',
}
How can I stub an ES6 module without babel?
According to John-David Dalton, the creator of the esm package, it is only possible to mutate the namespaces of *.js files - *.mjs files are locked down.
That means Sinon (and all other software) is not able to stub these modules - exactly as the error message points out. There are two ways to fix the issue here:
Just rename the files' extension to .js to make the exports mutable. This is the least invasive, as the mutableNamespace option is on by default for esm. This only applies when you use the esm loader, of course.
Use a dedicated module loader that proxies all the imports and replaces them with one of your liking.
The tech stack agnostic terminology for option 2 is a link seam - essentially replacing Node's default module loader. Usually one could use Quibble, ESMock, proxyquire or rewire, meaning the test above would look something like this when using Proxyquire:
// assuming that `receiver` uses `factory` internally
// comment out the imports - we'll use proxyquire
// import * as factory from '../src/factory';
// import { receiver } from '../src/receiver';
import sinon from 'sinon';
import proxyquire from 'proxyquire';

const factory = { backbone_factory: sinon.stub() };
const receiver = proxyquire('../src/receiver', { './factory': factory });
Modifying the proxyquire example to use Quibble or ESMock (both support ESM natively) should be trivial.
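For instance, a rough ESMock equivalent of the test from the question might look like this (only a sketch: the '.mjs' paths and the final assertion are assumptions, and ESMock's loader has to be registered with the test runner):
// test.mjs -- sketch of the same test with esmock instead of proxyquire
import test from 'ava';
import sinon from 'sinon';
import esmock from 'esmock';

test('pipeline get called', async t => {
  const backbone_factory = sinon.stub();
  // Load receiver with its './factory' import replaced by the stub
  const { receiver } = await esmock('../src/receiver.mjs', {
    '../src/factory.mjs': { backbone_factory },
  });
  receiver(); // exercise the code under test (details depend on the real API)
  t.true(backbone_factory.called);
});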
Sinon needs to evolve with the times or be left behind (ESM is becoming the de facto standard now with Node 12), as it is turning out to be a giant pain to use due to its many limitations.
This article provides a workaround (actually four, but I only found one to be acceptable). In my case, I was exporting functions from a module directly and getting this error: ES Modules cannot be stubbed
export function abc() {
}
The solution was to put the functions into a class and export that instead:
export class Utils {
  abc() {
  }
}
Notice that the function keyword is removed in the method syntax.
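With the class in place, the method can then be stubbed through the prototype; a minimal sketch, assuming the class above is exported from a file like './utils.mjs' (the path is illustrative):
import sinon from 'sinon';
import { Utils } from './utils.mjs'; // illustrative path to the class above

const stub = sinon.stub(Utils.prototype, 'abc').returns('stubbed');

const utils = new Utils();
utils.abc(); // 'stubbed'

stub.restore(); // put the original method back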
Happy Coding - hope Sinon makes it in the long run, but it's not looking good given its excessive rigidity.
Sticking with the question's headline, "Stub an export from a native ES Module without babel", here's my take, using Mocha and esmock:
(credits: @oligofren certainly put me on the right path…)
package.json:
"scripts": {
...
"test": "mocha --loader=esmock",
"devDependencies": {
"esmock": "^2.1.0",
"mocha": "^10.2.0",
TestDad.js (a class)
import { sonBar } from './testSon.js'

export default class TestDad {
  constructor() {
    console.log(purple('constructing TestDad, calling...'))
    sonBar()
  }
}
testSon.js (a 'util' library)
export const sonFoo = () => {
  console.log(`Original Son 'foo' and here, my brother... `)
  sonBar()
}

export const sonBar = () => {
  console.log(`Original Son bar`)
}

export default { sonFoo, sonBar }
esmockTest.js
import esmock from 'esmock'

describe.only(autoSuiteName(import.meta.url),
  () => {
    it('Test 1', async() => {
      const TestDad = await esmock('../src/commands/TestDad.js', {
        '../src/commands/testSon.js': {
          sonBar: () => { console.log('STEPSON Bar') }
        }
      })
      // eslint-disable-next-line no-new
      new TestDad()
    })

    it('Test 2', async() => {
      const testSon = await esmock('../src/commands/testSon.js')
      testSon.sonBar = () => { console.log('ANOTHER STEPSON Bar') }
      testSon.sonFoo() // still original
      testSon.sonBar() // different now
    })
  })
Regarding Test 1:
Working nicely, the import is bent as desired.
Regarding Test 2:
Bending a single function to do something else is not a problem.
(But then there is not much test value in calling your very own function that you just defined, is there?)
Enclosed function calls within the module (i.e. from sonFoo to sonBar) remain what they are; they are indeed a closure, still pointing to the original function.
By the way, I also tested this: no better results with sinon.callsFake() (it would have been surprising if there were…).
