Not receiving messages in the reply queue that were sent to RabbitMQ using amqplib and processed by NestJS - node.js

So I'm using NestJS (v8) with the RabbitMQ transport (Transport.RMQ) to listen for messages.
My NestJS code looks something like this:
// main.ts
const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule, {
  transport: Transport.RMQ,
  options: {
    urls: ['amqp://localhost:5672'],
    queue: 'my-queue',
    replyQueue: 'my-reply-queue',
  },
});
// my.controller.ts
import { Controller } from '@nestjs/common';
import { MessagePattern } from '@nestjs/microservices';

@Controller()
export class MyController {
  @MessagePattern('something')
  do(data: { source: string }): { source: string } {
    console.log(data);
    data.source += ' | MyController';
    return data;
  }
}
And in the Node.js application, I use amqplib to send a message to the NestJS application and receive the response.
This is the code of the Node.js application:
const queueName = 'my-queue';
const replyQueueName = 'my-reply-queue';

const amqplib = require('amqplib');

async function run() {
  const conn = await amqplib.connect('amqp://localhost:5672');
  const channel = await conn.createChannel();
  await channel.assertQueue(queueName);
  await channel.assertQueue(replyQueueName);

  // Consumer: listen for messages on the reply queue
  // (noAck so the demo doesn't leave unacknowledged messages behind)
  await channel.consume(replyQueueName, (msg) => console.log(msg.content.toString()), { noAck: true });

  // Publisher: send a message to the queue
  channel.sendToQueue(
    queueName,
    Buffer.from(
      JSON.stringify({
        pattern: 'something',
        data: { source: 'node-application' },
      })
    ),
    { replyTo: replyQueueName }
  );
}

run().catch(console.error);
When I run the Node.js and NestJS applications, NestJS receives the message from the Node.js publisher, but the Node.js consumer is never called with the reply.

The fix was to add an `id` key to the payload that the Node.js application sends:
// ...
// Publisher: send a message to the queue
channel.sendToQueue(
  queueName,
  Buffer.from(
    JSON.stringify({
      // Add the `id` key here so the Node.js consumer will get the message in the reply queue
      id: '',
      pattern: 'something',
      data: { source: 'node-application' },
    })
  ),
  { replyTo: replyQueueName }
);
// ...
Detailed explanation (in the NestJS source code)
This is because the handleMessage function in the server-rmq.ts file checks whether the `id` property of the incoming message is undefined:
// https://github.com/nestjs/nest/blob/026c1bd61c561a3ad24da425d6bca27d47567bfd/packages/microservices/server/server-rmq.ts#L139-L141
public async handleMessage(
  message: Record<string, any>,
  channel: any,
): Promise<void> {
  // ...
  if (isUndefined((packet as IncomingRequest).id)) {
    return this.handleEvent(pattern, packet, rmqContext);
  }
  // ...
}
And there is no logic for sending messages to the reply queue in the handleEvent function; it only handles the event. In other words, a message without an `id` is treated as a fire-and-forget event, while a message with an `id` is treated as a request that gets a reply.
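As a side note: when several requests share one reply queue, the replies need to be matched back to their requests. A minimal sketch of that in plain amqplib, assuming the NestJS RMQ server copies the correlationId property of the request onto its reply (the startReplyConsumer/requestReply helpers are mine, not part of any library):

const { randomUUID } = require('crypto');

// Map of in-flight requests: correlationId -> resolve function
const pending = new Map();

async function startReplyConsumer(channel, replyQueue) {
  await channel.consume(
    replyQueue,
    (msg) => {
      const resolve = pending.get(msg.properties.correlationId);
      if (resolve) {
        pending.delete(msg.properties.correlationId);
        resolve(JSON.parse(msg.content.toString()));
      }
    },
    { noAck: true }
  );
}

function requestReply(channel, queue, replyQueue, pattern, data) {
  const correlationId = randomUUID();
  const reply = new Promise((resolve) => pending.set(correlationId, resolve));
  channel.sendToQueue(
    queue,
    // `id` makes Nest treat this as a request (see handleMessage above)
    Buffer.from(JSON.stringify({ id: correlationId, pattern, data })),
    { replyTo: replyQueue, correlationId }
  );
  return reply;
}

Call startReplyConsumer once at startup; each requestReply call then resolves with the reply carrying its own correlationId.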

Related

NestJs, RabbitMq, CQRS & BFF: Listening for event inside the bff

I'm about to implement a microservice architecture with the CQRS design pattern. The microservices communicate via RMQ.
Additionally, I'm adding a BFF for my Application UI.
In this scenario, the BFF needs to listen to certain domain events.
For instance: after the user sends a request to the BFF, the BFF calls a method of a microservice, which invokes an asynchronous event.
The event's result will go back to the BFF and then to the user.
I'm thinking of different ways I might be able to implement this and I came up with this concept:
// BFF Application: sign-up.controller.ts
import { Controller, Post, Body } from '@nestjs/common';
import { Client, ClientProxy, Transport } from '@nestjs/microservices';
import { v4 as uuid } from 'uuid';

@Controller('signup')
export class SignupController {
  @Client({ transport: Transport.RMQ, options: { urls: ['amqp://localhost:5672'], queue: 'signup_request' } })
  client: ClientProxy;

  @Post()
  async signup(@Body() body: any) {
    // Generate a unique identifier for the request
    const requestId = uuid();
    // Send the request with the unique identifier
    const response = await this.client.send<any>({ cmd: 'signup', requestId }, body).toPromise();
    // Wait for the SignUpEvent to be emitted before sending the response
    return new Promise((resolve, reject) => {
      this.client.subscribe<any>('signup_response', (response: any) => {
        // Check the unique identifier to ensure the event corresponds to the original request
        if (response.requestId === requestId) {
          resolve(response);
        }
      });
    });
  }
}
After sending the request with the unique identifier, the microservice will execute a sign up command:
// Microservice Application: sign-up.handler.ts
import { CommandHandler, ICommandHandler, EventBus } from '@nestjs/cqrs';
import { SignUpCommand } from './commands/sign-up.command';
import { SignUpEvent } from './events/sign-up.event';

@CommandHandler(SignUpCommand)
export class SignUpCommandHandler implements ICommandHandler<SignUpCommand> {
  constructor(private readonly eventBus: EventBus) {}

  async execute(command: SignUpCommand) {
    // Validating the user/account aggregate
    // ...
    // Emit the SignUpEvent with the unique identifier
    this.eventBus.publish(new SignUpEvent(user, command.requestId));
  }
}
Now the Event Handler gets called:
// Microservice Application: signed-up.handler.ts
import { EventsHandler, IEventHandler } from '@nestjs/cqrs';
import { SignUpEvent } from './events/sign-up.event';
import { Client, ClientProxy, Transport } from '@nestjs/microservices';

@EventsHandler(SignUpEvent)
export class SignUpEventHandler implements IEventHandler<SignUpEvent> {
  @Client({ transport: Transport.RMQ, options: { urls: ['amqp://localhost:5672'], queue: 'signup_response' } })
  client: ClientProxy;

  async handle(event: SignUpEvent) {
    // Persist the user
    const user = await this.persistUser(event.user);
    // Generate access and refresh tokens
    const tokens = this.generateTokens(user);
    // Emit the SignUpResponse event with a unique identifier
    await this.client.emit('signup_response', { user, tokens, requestId: event.requestId });
  }
}
Is this a valid way to implement this type of behaviour?
Thank you in advance.
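Side note: NestJS's ClientProxy.send() already implements request/response over RMQ; the framework correlates the reply with the request, so the synchronous part of this flow needs no manual requestId bookkeeping. A minimal sketch, assuming the microservice exposes a @MessagePattern({ cmd: 'signup' }) handler that returns the final result:

// BFF Application: sign-up.controller.ts (sketch)
import { Controller, Post, Body } from '@nestjs/common';
import { Client, ClientProxy, Transport } from '@nestjs/microservices';
import { firstValueFrom } from 'rxjs';

@Controller('signup')
export class SignupController {
  @Client({ transport: Transport.RMQ, options: { urls: ['amqp://localhost:5672'], queue: 'signup_request' } })
  client: ClientProxy;

  @Post()
  async signup(@Body() body: any) {
    // send() is request/response: the handler's return value comes back
    // here once the microservice has finished processing
    return firstValueFrom(this.client.send({ cmd: 'signup' }, body));
  }
}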

How to set up NestJS microservices integration/e2e testing KafkaJS consumer (MessagePattern/EventPattern handler)

I'm proving out an integration test with NestJS/KafkaJS.
I have everything implemented, except that the handler on the event listener (consumer) for the topic I'm emitting to is not being called.
I read somewhere that you can't consume a message until the consumer's GROUP_JOIN event has completed; I'm not sure if this is right, and/or how I could get my e2e test to wait for this to happen.
Here's the setup of the e2e test -
describe('InventoryController(e2e)', () => {
  let app: INestApplication
  let client: ClientKafka

  beforeAll(async () => {
    const moduleFixture: TestingModule = await Test.createTestingModule({
      imports: [InventoryModule, KafkaClientModule],
    }).compile()

    app = moduleFixture.createNestApplication()
    await app.connectMicroservice({
      transport: Transport.KAFKA,
      options: {
        client: {
          clientId: 'test_clientid',
          brokers: process.env.KAFKA_BROKERS.split(' '), // []
          ssl: true,
          sasl: {
            mechanism: 'plain',
            username: process.env.KAFKA_CLUSTER_APIKEY,
            password: process.env.KAFKA_CLUSTER_SECRET,
          },
        },
        consumer: {
          groupId: 'test_consumerids',
        },
      },
    })
    await app.startAllMicroservices()
    await app.init()
    client = moduleFixture.get<ClientKafka>('test_name')
    await client.connect()
    await app.listen(process.env.port || 3000)
  })

  afterAll(async () => {
    await app.close()
    await client.close()
  })

  it('/ (GET)', async () => {
    return request(app.getHttpServer()).get('/inventory/kafka-inventory-test')
  })

  it('Emits a message to a topic', async () => {
    await client.emit('inventory-test', { foo: 'bar' })
  })
})
The client is emitting the message fine.
In my controller I have the event handler for the topic 'inventory-test':
@EventPattern('inventory-test')
async consumeInventoryTest(
  // eslint-disable-next-line
  @Payload() inventoryMessage: any,
  @Ctx() context: KafkaContext,
): Promise<void> {
  console.log('inventory-test consumer')
}
I have also logged the microservice with the app.getMicroservices() method and can see that the messageHandlers map contains 'inventory-test', which maps to a function:
server: ServerKafka {
  messageHandlers: Map(1) { 'inventory-test' => [Function] }
The message handler also works when I run the app locally.
I've been searching Google a lot, as well as the docs for both KafkaJS and Nest; there isn't a lot of info out there.
Thanks for any advice/help!
I actually finally solved this: you need to await a new promise after your client emits a message, to give the handler time to read it.
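For example, the emit test above could become the following (the 5-second delay is arbitrary; KafkaJS also emits consumer lifecycle events such as GROUP_JOIN, which you could wait for instead):

it('Emits a message to a topic', async () => {
  await client.emit('inventory-test', { foo: 'bar' })
  // Give the consumer time to join the group and process the message
  // before afterAll tears the app down
  await new Promise((resolve) => setTimeout(resolve, 5000))
})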

NestJs @Sse - event is consumed only by one client

I tried the sample SSE application provided with nest.js (28-SSE), and modified the sse endpoint to send a counter:
@Sse('sse')
sse(): Observable<MessageEvent> {
  return interval(5000).pipe(
    map((_) => ({ data: { hello: `world - ${this.c++}` } } as MessageEvent)),
  );
}
I expect every client listening to this SSE to receive the message, but when opening multiple browser tabs I can see that each message is consumed by only one browser, so if I have three browsers open, each of them sees only some of the counter values (screenshot omitted).
How can I get the expected behavior?
To achieve the behavior you're expecting, you need to create a separate stream for each connection and push data to each stream as you wish.
One possible minimalistic solution is below:
import { Controller, Get, MessageEvent, OnModuleDestroy, OnModuleInit, Res, Sse } from '@nestjs/common';
import { readFileSync } from 'fs';
import { join } from 'path';
import { Observable, ReplaySubject } from 'rxjs';
import { map } from 'rxjs/operators';
import { Response } from 'express';

@Controller()
export class AppController implements OnModuleInit, OnModuleDestroy {
  private stream: {
    id: string;
    subject: ReplaySubject<unknown>;
    observer: Observable<unknown>;
  }[] = [];
  private timer: NodeJS.Timeout;
  private id = 0;

  public onModuleInit(): void {
    this.timer = setInterval(() => {
      this.id += 1;
      this.stream.forEach(({ subject }) => subject.next(this.id));
    }, 1000);
  }

  public onModuleDestroy(): void {
    clearInterval(this.timer);
  }

  @Get()
  public index(): string {
    return readFileSync(join(__dirname, 'index.html'), 'utf-8').toString();
  }

  @Sse('sse')
  public sse(@Res() response: Response): Observable<MessageEvent> {
    const id = AppController.genStreamId();
    // Clean up the stream when the client disconnects
    response.on('close', () => this.removeStream(id));
    // Create a new stream
    const subject = new ReplaySubject();
    const observer = subject.asObservable();
    this.addStream(subject, observer, id);
    return observer.pipe(map((data) => ({
      id: `my-stream-id:${id}`,
      data: `Hello world ${data}`,
      event: 'my-event-name',
    }) as MessageEvent));
  }

  private addStream(subject: ReplaySubject<unknown>, observer: Observable<unknown>, id: string): void {
    this.stream.push({
      id,
      subject,
      observer,
    });
  }

  private removeStream(id: string): void {
    this.stream = this.stream.filter(stream => stream.id !== id);
  }

  private static genStreamId(): string {
    return Math.random().toString(36).substring(2, 15);
  }
}
You can make a separate service for it to keep things cleaner and push stream data from different places, but as an example showcase this results in every connected client receiving every event (screenshot omitted).
This behaviour is correct. Each SSE connection is a dedicated socket and handled by a dedicated server process. So each client can receive different data.
It is not a broadcast-same-thing-to-many technology.
How can I get the expected behavior?
Have a central record (e.g. in an SQL DB) of the desired value you want to send out to all the connected clients. Then have each of the SSE server processes watch or poll that central record and send out an event each time it changes.
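A minimal sketch of that idea, where each SSE stream polls a shared record once per second and only emits when the value changes (readCentralValue is a hypothetical stand-in for your DB or Redis read):

import { Controller, MessageEvent, Sse } from '@nestjs/common';
import { interval, Observable } from 'rxjs';
import { distinctUntilChanged, map, switchMap } from 'rxjs/operators';

// Hypothetical accessor for the central record; replace with a real
// DB/Redis lookup
async function readCentralValue(): Promise<number> {
  return Date.now(); // stand-in value
}

@Controller()
export class CounterController {
  @Sse('sse')
  sse(): Observable<MessageEvent> {
    // Every connection polls the same shared record, so all clients
    // converge on the same centrally stored value
    return interval(1000).pipe(
      switchMap(() => readCentralValue()),
      distinctUntilChanged(), // emit only when the central value changes
      map((value) => ({ data: { hello: `world - ${value}` } } as MessageEvent)),
    );
  }
}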
You just have to generate a new observable from the same subject for each SSE connection:
private events: Subject<MessageEvent> = new Subject();

constructor() {
  timer(0, 1000).pipe(takeUntil(this.destroy)).subscribe(async (index: any) => {
    let event: MessageEvent = {
      id: index,
      type: 'test',
      retry: 30000,
      data: { index: index },
    } as MessageEvent;
    this.events.next(event);
  });
}

@Sse('sse')
public sse(): Observable<MessageEvent> {
  return this.events.asObservable();
}
Note: I'm skipping the rest of the controller code.
Regards,

angular-nestjs socket.io doesn't update if page doesn't refresh

I am using Angular and NestJS with socket.io to create a chat application.
When I started saving data in my DB, my Angular side stopped updating messages unless the page is refreshed.
I couldn't find a way to pass the user id to the server gateway in order to search the DB for only that user's chats.
Angular service
sendMessage(chat: Chat): void {
  this.socket.emit('sendMessage', chat)
} // using it when a message is sent

getNewMessage(): Observable<Chat[]> {
  return this.socket.fromEvent<any>('lastChats');
} // using it onInit

sendId(userId: number): void {
  this.socket.emit('loadMessages', userId);
} // using it onInit
NestJS Gateway
@WebSocketGateway({ cors: { origin: ['http://localhost:4200'] } })
export class ChatGateway implements OnGatewayConnection, OnGatewayDisconnect {
  constructor(
    @InjectRepository(Chat) private chatRepository: Repository<Chat>,
  ) { }

  @WebSocketServer()
  server: Server

  async handleConnection(client: any, ...args: any[]) {
    const chats = await this.chatRepository.createQueryBuilder('chat')
      .innerJoinAndSelect('chat.ally', 'ally')
      .innerJoinAndSelect('chat.talent', 'talent')
      .getMany();
    this.server.emit('lastChats', chats)
  }

  handleDisconnect(client: any, ...args: any[]) {
    console.log("Disconnected")
  }

  @SubscribeMessage('sendMessage')
  async handleMessage(socket: Socket, chat: Chat) {
    if (chat.messagesJSON && chat.messagesJSON.trim() !== '') {
      await this.chatRepository.save(chat)
      this.server.emit('newChat', chat)
    }
  }

  @SubscribeMessage('loadMessages')
  async handleLoad(socket: Socket, id: number) {
    const chats = await this.chatRepository.createQueryBuilder('chat')
      .innerJoinAndSelect('chat.ally', 'ally')
      .innerJoinAndSelect('chat.talent', 'talent')
      .where('ally = :user OR talent = :user', { user: id })
      .getMany();
    this.server.emit('lastChats', chats)
  }
}
To send from Angular to NestJS you can use:
Angular:
this.socket = io(environment.SOCKET_URL, {
  extraHeaders: { Authorization: localStorage.getItem('token') },
});
NestJS, in handleConnection:
const jwt = socket.handshake.headers.authorization
I think you can replace the token with the userId.
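A minimal sketch of how the gateway could use that header, replacing the handleConnection method in the gateway above; this assumes the Angular client puts its user id in the Authorization header, and emits the result only to the connecting socket instead of broadcasting with this.server.emit:

async handleConnection(client: Socket) {
  // Read the value the Angular client sent in the handshake headers
  const userId = Number(client.handshake.headers.authorization);

  // Load only this user's chats
  const chats = await this.chatRepository.createQueryBuilder('chat')
    .innerJoinAndSelect('chat.ally', 'ally')
    .innerJoinAndSelect('chat.talent', 'talent')
    .where('ally = :user OR talent = :user', { user: userId })
    .getMany();

  // Emit to the connecting client only, not to every connected socket
  client.emit('lastChats', chats);
}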

Unable to use ActiveMQ priority messages using STOMP protocol in nodejs

I have an application which sends messages to a queue, and another application which subscribes to the queue and processes them. I want OTP messages to be given higher priority than other messages, hence I am trying to use ActiveMQ message priority to achieve this.
This is the code for the ActiveMQ connection, using the STOMP protocol in Node.js with the stompit library:
const serverPrimary = {
  host: keys.activeMQ.host,
  port: keys.activeMQ.port,
  ssl: ssl,
  connectHeaders: {
    host: '/',
    login: keys.activeMQ.username,
    passcode: keys.activeMQ.password,
    'heart-beat': '5000,5000',
  },
}

connManager = new stompit.ConnectFailover(
  [serverPrimary, serverFailover],
  reconnectOptions,
)

connManager.on('error', function (e) {
  const connectArgs = e.connectArgs
  const address = connectArgs.host + ':' + connectArgs.port
  logger.error({ error: e, customMessage: address })
})

channelPool = new stompit.ChannelPool(connManager)
Code for sending a message:
const pushMessageToAMQ = (queue, message) => {
  const queues = Object.values(activeMQ.queues)
  if (!queues.includes(queue)) {
    _mqLog(mqLogMessages.unknownQueue + queue)
    return
  }
  // Priority header is set
  const header = {
    destination: queue,
    priority: 7,
  }
  // If message is not a string
  if (typeof message !== 'string') message = JSON.stringify(message)
  // Logging message before sending
  _mqLog(
    mqLogMessages.sending,
    { service: services.amq },
    { header: header, message: message },
  )
  // Sending message to amq
  _sendMessageToAMQ(header, message, error => {
    if (error) {
      _mqError(error, mqLogMessages.sendingError, { service: services.amq })
    }
  })
}

const _sendMessageToAMQ = (headers, body, callback) => {
  channelPool.channel((error, channel) => {
    if (error) {
      callback(error)
      return
    }
    channel.send(headers, body, callback)
  })
}
Here's the code for subscribing to the queue in the second application:
const amqSubscribe = (queue, callback, ack = 'client-individual') => {
  log({ customMessage: 'Subscribing to ' + queue })
  const queues = Object.values(activeMQ.queues)
  if (!queues.includes(queue)) {
    return
  }
  channelPool.channel((error, channel) => {
    let header = {
      destination: queue,
      ack: ack,
      'activemq.prefetchSize': 1,
    }
    // Check for error
    if (error) {
      _mqError(error, mqLogMessages.baseError, header)
    } else {
      channel.subscribe(
        header,
        _synchronisedHandler((error, message, next) => {
          // Check for error
          if (error) {
            _mqError(error, mqLogMessages.subscriptionError, header)
            next()
          } else {
            // Read message
            message.readString('utf-8', function (error, body) {
              if (error) {
                _mqError(error, mqLogMessages.readError, header)
                next()
              } else {
                // Message read successfully, call the callback
                callback(body, () => {
                  // Acknowledgment callback
                  channel.ack(message)
                  next()
                })
              }
            })
          }
        }),
      )
    }
  })
}
Activemq.xml:
<policyEntries>
  <policyEntry queue=">" prioritizedMessages="true" useCache="false" expireMessagesPeriod="0" queuePrefetch="1" />
  .......
I tried pushing messages with different priorities and turned on the second application (i.e. the one which subscribes to the messages) after all the messages had been pushed to the queue. However, the messages were consumed in the same order in which they were sent; the priority didn't change anything. Is there something that I am missing?
Do I have to add something on the consumer end for it to work?
Support for priority is disabled by default in ActiveMQ "Classic" (used by Amazon MQ). As the documentation states:
...support [for message priority] is disabled by default so it needs to be enabled using per destination policies through xml configuration...
You need to set prioritizedMessages="true" in the policyEntry for your queue, e.g.:
<destinationPolicy>
  <policyMap>
    <policyEntries>
      <policyEntry queue=">" prioritizedMessages="true"/>
      ...
To be clear, this is configured on the broker (i.e. not the client) in activemq.xml, and it applies to every kind of client.
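As a quick way to verify the broker configuration, here is a sketch that publishes a few messages with different priorities and then subscribes; with prioritizedMessages="true" in place, the priority 9 message should be delivered first (host and credentials are placeholders):

const stompit = require('stompit')

const connectOptions = {
  host: 'localhost', // placeholder broker address
  port: 61613,
  connectHeaders: { host: '/', login: 'admin', passcode: 'admin' },
}

stompit.connect(connectOptions, (error, client) => {
  if (error) return console.error(error.message)

  // Publish three messages with different priorities
  for (const priority of [1, 4, 9]) {
    const frame = client.send({ destination: '/queue/priority-test', priority })
    frame.write(`message with priority ${priority}`)
    frame.end()
  }

  // Consume: higher-priority messages should now arrive first
  client.subscribe({ destination: '/queue/priority-test', ack: 'auto' }, (err, message) => {
    if (err) return console.error(err.message)
    message.readString('utf-8', (readErr, body) => {
      if (!readErr) console.log(body)
    })
  })
})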
