Using multiple socket adapters with NestJS - node.js

I'm working on a Node.js application with NestJS. I need to communicate with 2 other apps.
The first one over WebSockets (Socket.io) and the other one over TCP sockets with the net module.
Is it possible to use two gateways with specific adapters, one based on Socket.io and the other one on the net module, or do I have to split this application?

You don't need to split the application.
You can define your module as:
@Module({
  providers: [
    MyGateway,
    MyService,
  ],
})
export class MyModule {}
with the gateway being in charge of the web sockets channel
import { SubscribeMessage, WebSocketGateway } from '@nestjs/websockets'
import { Socket } from 'socket.io'
...

@WebSocketGateway()
export class MyGateway {
  constructor(private readonly myService: MyService) {}

  @SubscribeMessage('MY_MESSAGE')
  public async sendMessage(socket: Socket, data: IData): Promise<IData> {
    socket.emit(...)
  }
}
and the service being in charge of the TCP channel
import { Client, ClientProxy, Transport } from '@nestjs/microservices'
...

@Injectable()
export class MyService {
  @Client({
    options: { host: 'MY_HOST', port: MY_PORT },
    transport: Transport.TCP,
  })
  private client: ClientProxy

  public async myFunction(): Promise<IData> {
    return this.client
      .send<IData>({ cmd: 'MY_MESSAGE' }, {}) // send(pattern, payload)
      .toPromise()
      .catch(error => {
        throw new HttpException(error, error.status)
      })
  }
}
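To tie the two channels together, the gateway can simply delegate to the service; here is a minimal sketch, assuming a response event name that is not part of the original answer:

@SubscribeMessage('MY_MESSAGE')
public async sendMessage(socket: Socket, data: IData): Promise<IData> {
  // forward the WebSocket message to the TCP microservice via the injected service
  const result = await this.myService.myFunction();
  // push the microservice response back to the connected Socket.io client
  socket.emit('MY_MESSAGE_RESULT', result); // hypothetical event name
  return result;
}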

Related

Cannot start two instances of Microservice using package @golevelup/nestjs-rabbitmq

I would like to build a publish/subscribe message pattern between microservices using NestJS and RabbitMQ. The problem is that the built-in NestJS RabbitMQ microservices transport does not support the pub/sub pattern, even though it is easy to start multiple instances of a microservice to test random queue messages. So I tried to use @golevelup/nestjs-rabbitmq to implement the feature. The problem with this package is that it seems to use port 3000 by default, and I don't know where and how I could change that port, so I couldn't start multiple consumer instances to test the pattern.
// Module Subscriber
import { RabbitMQModule } from '@golevelup/nestjs-rabbitmq';
import { Module } from '@nestjs/common';
import { SomeEventConsumerModule1Module } from './some-event-consumer-module1/some-event-consumer-module1.module';

@Module({
  imports: [
    RabbitMQModule.forRoot(RabbitMQModule, {
      exchanges: [
        {
          name: 'amq.topic',
          type: 'topic', // check out the docs for more information on exchange types
        },
      ],
      uri: 'amqp://guest:guest@localhost:5672', // default login and password is guest, listening locally on port 5672 over AMQP
      // connectionInitOptions: { wait: false },
      channels: {
        'channel-1': {
          prefetchCount: 15,
          default: true,
        },
        'channel-2': {
          prefetchCount: 2,
        },
      },
    }),
    SomeEventConsumerModule1Module,
  ],
})
export class EventsModule {}
Here is the subscriber service that gets messages from the publisher.
// Service Subscriber
// imports
import { RabbitSubscribe } from '@golevelup/nestjs-rabbitmq';
import { Injectable } from '@nestjs/common';
import { ConsumeMessage, Channel } from 'amqplib'; // for type safety you will need to install this package first
// ... and so on

@Injectable()
export class SomeEventConsumerModule1Service {
  constructor() {} // other module services, if they need to be injected

  @RabbitSubscribe({
    exchange: 'amq.direct',
    routingKey: 'direct-route-key', // up to you
    queue: 'queueNameToBeConsumed',
    errorHandler: (channel: Channel, msg: ConsumeMessage, error: Error) => {
      console.log(error);
      channel.reject(msg, false); // use an error handler, otherwise the app will crash in an unintended way
    },
  })
  public async onQueueConsumption(msg: {}, amqpMsg: ConsumeMessage) {
    const eventData = JSON.parse(amqpMsg.content.toString());
    // do something with eventData
    console.log(
      `EventData: ${
        eventData.bookName
      }, successfully consumed! ${amqpMsg.content.toString()}`,
    );
  }
  // ... and so on
}
And here is the publisher code:
// app module
import { Module } from '@nestjs/common';
import { ClientsModule, Transport } from '@nestjs/microservices';
import { AppController } from './app.controller';
import { AppService } from './app.service';

@Module({
  imports: [
    ClientsModule.register([
      {
        name: 'GREETING_SERVICE',
        transport: Transport.RMQ,
        options: {
          urls: ['amqp://localhost:5672'],
          queue: 'queueNameToBeConsumed',
        },
      },
    ]),
  ],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}
// App service
import { Inject, Injectable } from '@nestjs/common';
import { ClientProxy } from '@nestjs/microservices';

@Injectable()
export class AppService {
  constructor(@Inject('GREETING_SERVICE') private client: ClientProxy) {}

  async testEvent() {
    this.client.emit('book-created', {
      bookName: 'The Way Of Kings',
      author: 'Brandon Sanderson',
    });
  }
}
This is the error message displayed when trying to start the second instance:
ERROR [Server] Error: listen EADDRINUSE: address already in use 127.0.0.1:3000
Error: listen EADDRINUSE: address already in use 127.0.0.1:3000
at Server.setupListenHandle [as _listen2] (node:net:1380:16)
at listenInCluster (node:net:1428:12)
at GetAddrInfoReqWrap.doListen (node:net:1567:7)
at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:85:8)
I don't know why port 3000 is being used by the service or how to change it.

Socket connection from NextJS to a node backend

I am trying to implement a basic socket connection from my NextJS client side (running on localhost:3000) to my NestJS server (running on localhost:3003).
The server code looks like this:
ChatGateway.ts
import {
  SubscribeMessage,
  WebSocketGateway,
  OnGatewayInit,
  WebSocketServer,
  OnGatewayConnection,
  OnGatewayDisconnect,
} from '@nestjs/websockets';
import { Logger } from '@nestjs/common';
import { Socket, Server } from 'socket.io';

@WebSocketGateway()
export class ChatGateway implements OnGatewayInit, OnGatewayConnection, OnGatewayDisconnect {
  @WebSocketServer() server: Server;
  private logger: Logger = new Logger('ChatGateway');

  @SubscribeMessage('msgToServer')
  handleMessage(client: Socket, payload: string): void {
    console.log(payload);
    this.server.emit('msgToClient', payload);
  }

  afterInit(server: Server) {
    this.logger.log('Init');
  }

  handleDisconnect(client: Socket) {
    this.logger.log(`Client disconnected: ${client.id}`);
  }

  handleConnection(client: Socket, ...args: any[]) {
    this.logger.log(`Client connected: ${client.id}`);
    this.server.emit('msgToClient', "payload");
  }
}
ChatModule.ts
import { Module } from '@nestjs/common';
import { TypeOrmModule } from '@nestjs/typeorm';
import { ChatGateway } from "./chat.gateway";

@Module({
  imports: [],
  controllers: [],
  providers: [ChatGateway],
})
export class ChatModule {}
AppModule.ts
@Module({
  imports: [TypeOrmModule.forRoot(), NewsletterModule, AuthModule, UsersModule, ListingsModule, ChatModule]
})
export class AppModule {
  constructor(private connection: Connection) {}
}
But when I try to connect to the socket from my client side
import { io } from "socket.io-client";
import { useEffect } from "react";

function Chat() {
  const socket = io("http://127.0.0.1:3003");

  useEffect(() => {
    console.log("chat useEffect")
    socket.emit('msgToServer', "message")
  }, [])

  socket.on('msgToClient', (message) => {
    console.log(message)
  })
  // ...
}
I am not getting any errors, but also nothing happens when I emit or try to receive events from the server.
Even the server console doesn't log the emit events. The only thing that happens on the server is that the client gets connected and disconnected all the time, without me doing anything.
Any idea why I can't connect to the sockets, and why the server is constantly connecting and disconnecting even if I disable the socket connection from the client side?
Thanks!
The Socket.io client needs to be version 2. Versions 3 and 4 are breaking changes and don't communicate with a v2 server. Once Nest v8 hits, Socket.io v4 will be used by default.
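If that is the cause, pinning the client package to the v2 line (assuming npm is used) is enough:

npm install socket.io-client@2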

NestJS - Communication between 2 microservices

I built an HTTP app and 2 microservices using the TCP protocol.
This is my application diagram.
// Http App/app.service.ts
constructor() {
  this.accountService = ClientProxyFactory.create({
    transport: Transport.TCP,
    options: {
      host: 'localhost',
      port: 8877,
    },
  });
  this.friendService = ClientProxyFactory.create({
    transport: Transport.TCP,
    options: {
      host: 'localhost',
      port: 8080,
    },
  });
}
I tried to send a message from the Account Service to the Friend Service with @MessagePattern().
A ClientProxy is set up in each service, but it doesn't work.
I read the official @nestjs/microservices documentation, but I don't know which approach is appropriate.
Is there a right way to send messages from one microservice to another microservice?
You need to set up a message broker, something like RabbitMQ or Kafka. For example, for RabbitMQ, enter the command below to create a RabbitMQ container:
docker run -it --rm --name rabbitmq -p 0.0.0.0:5672:5672 -p 0.0.0.0:15672:15672 -d rabbitmq:3-management
Then pass RabbitMQ options to your main.ts bootstrap function:
import { Logger } from '@nestjs/common';
import { NestFactory } from '@nestjs/core';
import { MicroserviceOptions, Transport } from '@nestjs/microservices';

async function bootstrap() {
  const logger = new Logger('Bootstrap');
  const rabbitmqPort = 5672;
  const rabbitmqHost = '127.0.0.1';
  const app = await NestFactory.create(AppModule);
  app.connectMicroservice<MicroserviceOptions>({
    transport: Transport.RMQ,
    options: {
      urls: [
        `amqp://${rabbitmqHost}:${rabbitmqPort}`,
      ],
      queue: 'myqueue',
      queueOptions: {
        durable: false,
      },
    },
  });
  app
    .startAllMicroservices(() => {
      logger.log('Microservice is listening!');
    })
    .listen(3000, () => {
      logger.log('Api Server is listening on 3000');
    });
}
bootstrap();
For receiving messages:
@MessagePattern('my_pattern')
async myController(
  @Payload() data: MyDto,
): Promise<MyType> {
  return await this.accountService.myFunction(data);
}
Now when a client sends a message on myqueue with the my_pattern pattern, the data the client sent arrives through the @Payload() decorator.
For sending messages on any queue you need to add the RabbitMQ configuration to your application module, e.g. account.module.ts, assuming that you want to send a message to the FriendService:
const rabbitmqPort = 5672;
const rabbitmqHost = '127.0.0.1';

@Module({
  imports: [
    ClientsModule.registerAsync([
      {
        name: 'Friend',
        useFactory: () => ({
          transport: Transport.RMQ,
          options: {
            urls: [
              `amqp://${rabbitmqHost}:${rabbitmqPort}`,
            ],
            queue: 'friend_queue',
            queueOptions: {
              durable: false,
            },
          },
        }),
      },
    ]),
  ],
  controllers: [AccountController],
  providers: [AccountService],
})
export class AccountModule {}
And then inject the Friend client into your service constructor like this:
@Inject('Friend')
private friendClient: ClientProxy,
Send messages like this:
const myVar = await this.friendClient.send('Some_pattern', {SOME DATA}).toPromise();
Set up all of the above configuration for both of your microservices and it should work.
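For completeness, a rough sketch of what the matching handler on the Friend microservice could look like (the controller name and return value here are assumptions, not from the original answer):

import { Controller } from '@nestjs/common';
import { MessagePattern, Payload } from '@nestjs/microservices';

@Controller()
export class FriendController {
  // Handles messages sent by the Account service with this pattern
  @MessagePattern('Some_pattern')
  async handleSomePattern(@Payload() data: Record<string, unknown>) {
    // process the incoming data and return whatever the caller awaits
    return { received: true };
  }
}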

Nestjs - connect bull-board in a normal controller

I created an app using nest.js and bull.
I added the bull-board package to monitor my queues, but according to the documentation, the only way to add it to the app is to mount it as middleware:
In main.ts:
app.use('/admin/queues', bullUI);
Is there any way to add bullUI in a normal Nest controller, after JWT auth? Like:
@UseGuards(JwtAuthGuard)
@Get("queues")
activate() {
  return UI
}
You can use any Express middleware like this inside controllers, but some cases may cause errors, e.g. serving static files combined with guard exceptions.
@UseGuards(JwtAuthGuard)
@Get("queues/*")
activate(@Req() req, @Res() res) {
  bullUI(req, res)
}
I've got this working via a middleware consumer, so something like this:
import { router } from 'bull-board';

@Module({
  imports: [
    NestBullModule.forRoot({ redis }),
  ],
  providers: [],
})
export class BullModule {
  configure(consumer: MiddlewareConsumer): void {
    consumer
      .apply(router)
      .forRoutes('/admin/queues');
  }
}
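If the board still needs to sit behind authentication (as the question asks), the same consumer can chain an auth middleware in front of the bull-board router; a sketch, where MyJwtAuthMiddleware is a hypothetical NestMiddleware and not part of the original answer:

configure(consumer: MiddlewareConsumer): void {
  consumer
    .apply(MyJwtAuthMiddleware, router) // runs the hypothetical auth middleware before bull-board
    .forRoutes('/admin/queues');
}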
I'd like to extend the original answer of @JCF, mainly because it works and is much easier to understand.
I am not using the default Bull integration (@nestjs/queue), but an improved version of BullMQ from the anchan828 repo, with NestJS decorators; I guess in both cases the result will be the same.
The queue.module file:
import { MiddlewareConsumer, Module } from '@nestjs/common';
import { Queue } from 'bullmq';
import { router, setQueues, BullMQAdapter } from 'bull-board';
// BullModule and BullQueueInject come from the anchan828 BullMQ integration package

@Module({
  imports: [
    BullModule.forRoot({
      options: {
        connection: {
          host: redisConfig.host,
          port: redisConfig.port,
        },
      },
    }),
    /** DI all your queues and the Redis connection */
    BullModule.registerQueue('yourQueueName'),
  ],
  controllers: [],
  providers: [],
})
export class QueueModule {
  constructor(
    @BullQueueInject('yourQueueName')
    private readonly queueOne: Queue,
  ) {
    /** Add queues with an adapter, one by one */
    setQueues([new BullMQAdapter(this.queueOne, { readOnlyMode: false })]);
  }

  configure(consumer: MiddlewareConsumer): void {
    consumer
      .apply(router)
      .forRoutes('/admin/queues');
  }
}
Then just add it to the parent AppModule via imports, like this (I am not sure that the Redis connection is needed here in the parent AppModule):
@Module({
  imports: [
    BullModule.forRoot({
      options: {
        connection: {
          host: redisConfig.host,
          port: redisConfig.port,
        },
      },
    }),
    QueueModule,
  ],
  controllers: [],
  providers: [],
})
export class AppModule {}
Run main.js and visit localhost:8000/admin/queues.

Caching return value from a service method

I am using NestJS, have just installed the cache-manager module, and am trying to cache a response from a service call.
I register the cache module in a sample module (sample.module.ts):
import { CacheInterceptor, CacheModule, Module } from '@nestjs/common';
import { SampleService } from './sample.service';
import { APP_INTERCEPTOR } from '@nestjs/core';
import * as redisStore from 'cache-manager-redis-store';

@Module({
  imports: [
    CacheModule.register({
      ttl: 10,
      store: redisStore,
      host: 'localhost',
      port: 6379,
    }),
  ],
  providers: [
    SampleService,
    {
      provide: APP_INTERCEPTOR,
      useClass: CacheInterceptor,
    },
  ],
  exports: [SampleService],
})
export class SampleModule {}
Then in my service (sample.service.ts):
@Injectable()
export class SampleService {
  @UseInterceptors(CacheInterceptor)
  @CacheKey('findAll')
  async findAll() {
    // Make external API call
  }
}
Looking at redis I can see that nothing is cached for the service method call. If I use the same approach with a controller, then everything works fine and I can see the cached entry in my redis database. I am thinking that there is no way out of the box to cache individual service method calls in nestjs.
Reading the documentation it seems that I am only able to use this approach for controllers, microservices and websockets, but not ordinary services?
Correct, it is not possible to use the cache the same way for services as for controllers.
This is because the magic happens in the CacheInterceptor and Interceptors can only be used in Controllers.
However, you can inject the cacheManager into your service and use it directly:
import { CACHE_MANAGER, Inject, Injectable } from '@nestjs/common';

@Injectable()
export class SampleService {
  constructor(@Inject(CACHE_MANAGER) protected readonly cacheManager) {}

  async findAll() {
    // `key` and `ttl` stand for whatever cache key and TTL you choose
    const value = await this.cacheManager.get(key)
    if (value) {
      return value
    }
    const response = // ... make the external API call
    await this.cacheManager.set(key, response, ttl)
    return response
  }
}
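Depending on the cache-manager version, the get/set pair can also be collapsed into its wrap() helper, which only calls the factory on a cache miss; a rough sketch, where callExternalApi and the exact ttl option shape are assumptions:

async findAll() {
  // wrap() returns the cached value if present, otherwise runs the callback and caches its result
  return this.cacheManager.wrap('findAll', () => this.callExternalApi(), { ttl: 10 });
}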
