How to create rooms with NestJS and socket.io - node.js

I'm trying to create a room on my NestJS backend but can't find any information on this subject. You can find the docs here; they don't seem to cover creating rooms.
import {
  SubscribeMessage,
  WebSocketGateway,
  WebSocketServer,
  WsResponse,
} from '@nestjs/websockets';
import { Client, Server } from 'socket.io';

@WebSocketGateway({ namespace: 'story' })
export class StoryEventsGateway {
  @WebSocketServer()
  server: Server;

  @SubscribeMessage('createRoom')
  createRoom(client: Client, data: string): WsResponse<unknown> {
    return { event: 'roomCreated', data };
  }
}

By changing client: Client to socket: Socket you're able to use the Socket object you're used to from socket.io.
Here is the edited function:
import { Socket } from 'socket.io';
import { WsResponse } from '@nestjs/websockets';

createRoom(socket: Socket, data: string): WsResponse<unknown> {
  socket.join('aRoom');
  socket.to('aRoom').emit('roomCreated', { room: 'aRoom' });
  return { event: 'roomCreated', data: { room: 'aRoom' } };
}

With the latest NestJS update you can use this code, where the room name can be sent from the front-end and will be passed into the data parameter:
@SubscribeMessage('createRoom')
createRoom(@MessageBody() data: string, @ConnectedSocket() client: Socket) {
  client.join(data, (err) => {
    if (err) {
      this.logger.error(err);
    }
  });
}

The issue I had came from the wrong import.
import { Socket } from 'socket.io-client'; // wrong
import { Socket } from 'socket.io'; // correct

@SubscribeMessage('room')
joinRoom(socket: Socket, roomId: string) {
  socket.join(roomId);
}
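To show how the joined room can then be used, here is a hedged sketch of broadcasting into a room after sockets have joined it (the gateway name and the 'messageToRoom' / 'roomMessage' event names are made up):

import { SubscribeMessage, WebSocketGateway } from '@nestjs/websockets';
import { Socket } from 'socket.io';

@WebSocketGateway()
export class RoomsGateway {
  @SubscribeMessage('room')
  joinRoom(socket: Socket, roomId: string) {
    socket.join(roomId);
  }

  // Hypothetical event: send a message to every other socket in the room.
  @SubscribeMessage('messageToRoom')
  sendToRoom(socket: Socket, payload: { roomId: string; text: string }) {
    // socket.to(room) targets room members except the sender
    socket.to(payload.roomId).emit('roomMessage', payload.text);
  }
}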

Related

NestJS, RabbitMQ, CQRS & BFF: Listening for events inside the BFF

I'm about to implement a microservice architecture with the CQRS design pattern. The microservices communicate via RMQ.
Additionally, I'm adding a BFF for my application UI.
In this scenario, the BFF needs to listen to certain domain events.
For instance: after the user sends a request to the BFF, the BFF calls a method of a microservice which invokes an asynchronous event.
The event's result will go back to the BFF and then to the user.
I'm thinking of different ways I might implement this, and I came up with this concept:
// BFF Application: sign-up.controller.ts
import { Controller, Post, Body } from '@nestjs/common';
import { Client, ClientProxy, Transport } from '@nestjs/microservices';
import { v4 as uuid } from 'uuid';

@Controller('signup')
export class SignupController {
  @Client({ transport: Transport.RMQ, options: { urls: ['amqp://localhost:5672'], queue: 'signup_request' } })
  client: ClientProxy;

  @Post()
  async signup(@Body() body: any) {
    // Generate a unique identifier for the request
    const requestId = uuid();
    // Send the request with the unique identifier
    const response = await this.client.send<any>({ cmd: 'signup', requestId }, body).toPromise();
    // Wait for the SignUpEvent to be emitted before sending the response
    return new Promise((resolve, reject) => {
      this.client.subscribe<any>('signup_response', (response: any) => {
        // Check the unique identifier to ensure the event corresponds to the original request
        if (response.requestId === requestId) {
          resolve(response);
        }
      });
    });
  }
}
After sending the request with the unique identifier, the microservice will execute a sign up command:
// Microservice Application: sign-up.handler.ts
import { CommandHandler, ICommandHandler } from '@nestjs/cqrs';
import { SignUpCommand } from './commands/sign-up.command';
import { SignUpEvent } from './events/sign-up.event';
import { EventBus } from '@nestjs/cqrs';

@CommandHandler(SignUpCommand)
export class SignUpCommandHandler implements ICommandHandler<SignUpCommand> {
  constructor(private readonly eventBus: EventBus) {}

  async execute(command: SignUpCommand) {
    // Validating the user/account aggregate
    // ...
    // Emit the SignUpEvent with the unique identifier
    this.eventBus.publish(new SignUpEvent(user, command.requestId));
  }
}
Now the Event Handler gets called:
// Microservice Application: signed-up.handler.ts
import { EventsHandler, IEventHandler } from '@nestjs/cqrs';
import { SignUpEvent } from './events/sign-up.event';
import { Client, ClientProxy, Transport } from '@nestjs/microservices';

@EventsHandler(SignUpEvent)
export class SignUpEventHandler implements IEventHandler<SignUpEvent> {
  @Client({ transport: Transport.RMQ, options: { urls: ['amqp://localhost:5672'], queue: 'signup_response' } })
  client: ClientProxy;

  async handle(event: SignUpEvent) {
    // Persist the user
    const user = await this.persistUser(event.user);
    // Generate access and refresh tokens
    const tokens = this.generateTokens(user);
    // Emit the SignUpResponse event with a unique identifier
    await this.client.emit('signup_response', { user, tokens, requestId: event.requestId });
  }
}
Is this a valid way to implement this type of behaviour?
Thank you in advance.

WebSocket rooms not working in NestJS

I'm creating a simple chat app with rooms and I've run into a problem: it seems like socket.io cannot join the room. I have MongoDB schemas for users, messages and rooms. As the room id I use the id of the rooms schema, and then I want to join the rooms with one event. For example, I log in from the React app, then by user id I find all rooms that contain my id, and I want to join them all.
As the second step, I want to send a message, targeting the room I want to send it to with .to(roomId).emit(..., ...),
but all of these attempts are useless; it does not work.
Here is the NestJS gateway code:
import { Logger } from '@nestjs/common';
import {
  WebSocketGateway,
  WebSocketServer,
  SubscribeMessage,
  OnGatewayDisconnect,
  OnGatewayInit,
  OnGatewayConnection,
  MessageBody,
  ConnectedSocket,
  WsResponse,
} from '@nestjs/websockets';
import { Socket, Server } from 'socket.io';
import { UserService } from 'src/user/user.service';
import { ChatService } from './chat.service';
import { CreateChatDto } from './dto/create-chat.dto';
import { UpdateChatDto } from './dto/update-chat.dto';

@WebSocketGateway({
  cors: {
    origin: '*',
  },
})
export class ChatGateway
  implements OnGatewayConnection, OnGatewayDisconnect, OnGatewayInit
{
  constructor(
    private readonly chatService: ChatService,
    private userService: UserService,
  ) {}

  private readonly logger: Logger = new Logger(ChatGateway.name);

  @WebSocketServer() server: Server;

  afterInit(client: Socket) {
    this.logger.log('Initialized SocketGateway');
  }

  handleConnection(client: Socket) {
    this.logger.log(`[connection] from client (${client.id})`);
  }

  handleDisconnect(client: Socket) {
    this.logger.log(`[disconnection] from client (${client.id})`);
  }

  // this works when the user logs in
  @SubscribeMessage('setup')
  async handleJoinRoom(client: Socket, @MessageBody() userData) {
    // here I'm getting all rooms that contain my user id
    const rooms = await this.userService.findAll(userData);
    await this.logger.log(
      `[joinWhiteboard] ${userData}(${client.id}) joins ${userData}`,
    );
    await rooms.map((item) => {
      client.join(item._id.toString());
      // joining all rooms that I'm in
      client.to(item._id.toString()).emit('joined');
    });
  }

  @SubscribeMessage('new message')
  create(client: Socket, @MessageBody() recievedMessage) {
    this.logger.log(
      `[sent a new message] from (${client.id}) to ${recievedMessage.chatRoomId}`,
    );
    // sending message to the room
    client
      .to(recievedMessage.chatRoomId)
      .emit('message recieved', recievedMessage);
  }
}
The full code (including the React part) is in my GitHub; please feel free to look if you need to see it:
https://github.com/Code0Breaker/chat

NestJS @Sse - event is consumed only by one client

I tried the sample SSE application provided with nest.js (28-SSE), and modified the sse endpoint to send a counter:
@Sse('sse')
sse(): Observable<MessageEvent> {
  return interval(5000).pipe(
    map((_) => ({ data: { hello: `world - ${this.c++}` } } as MessageEvent)),
  );
}
I expect that each client that is listening to this SSE will receive the message, but when opening multiple browser tabs I can see that each message is consumed only by one browser, so if I have three browsers open I get the following:
How can I get the expected behavior?
To achieve the behavior you're expecting, you need to create a separate stream for each connection and push data to each stream as you wish.
One possible minimalistic solution is below:
import { Controller, Get, MessageEvent, OnModuleDestroy, OnModuleInit, Res, Sse } from '@nestjs/common';
import { readFileSync } from 'fs';
import { join } from 'path';
import { Observable, ReplaySubject } from 'rxjs';
import { map } from 'rxjs/operators';
import { Response } from 'express';

@Controller()
export class AppController implements OnModuleInit, OnModuleDestroy {
  private stream: {
    id: string;
    subject: ReplaySubject<unknown>;
    observer: Observable<unknown>;
  }[] = [];

  private timer: NodeJS.Timeout;
  private id = 0;

  public onModuleInit(): void {
    this.timer = setInterval(() => {
      this.id += 1;
      this.stream.forEach(({ subject }) => subject.next(this.id));
    }, 1000);
  }

  public onModuleDestroy(): void {
    clearInterval(this.timer);
  }

  @Get()
  public index(): string {
    return readFileSync(join(__dirname, 'index.html'), 'utf-8').toString();
  }

  @Sse('sse')
  public sse(@Res() response: Response): Observable<MessageEvent> {
    const id = AppController.genStreamId();
    // Clean up the stream when the client disconnects
    response.on('close', () => this.removeStream(id));
    // Create a new stream
    const subject = new ReplaySubject();
    const observer = subject.asObservable();
    this.addStream(subject, observer, id);
    return observer.pipe(map((data) => ({
      id: `my-stream-id:${id}`,
      data: `Hello world ${data}`,
      event: 'my-event-name',
    }) as MessageEvent));
  }

  private addStream(subject: ReplaySubject<unknown>, observer: Observable<unknown>, id: string): void {
    this.stream.push({
      id,
      subject,
      observer,
    });
  }

  private removeStream(id: string): void {
    this.stream = this.stream.filter(stream => stream.id !== id);
  }

  private static genStreamId(): string {
    return Math.random().toString(36).substring(2, 15);
  }
}
You could extract this into a separate service to make it cleaner and push stream data from different places, but as an example showcase this would result as shown in the screenshot below.
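If you do extract it into a service, a rough sketch of what that could look like (the service and method names here are hypothetical):

import { Injectable } from '@nestjs/common';
import { Observable, ReplaySubject } from 'rxjs';

// Hypothetical service: one ReplaySubject per connected client, so data can be
// pushed to all open SSE streams from anywhere in the application.
@Injectable()
export class SseStreamService {
  private streams = new Map<string, ReplaySubject<unknown>>();

  addStream(id: string): Observable<unknown> {
    const subject = new ReplaySubject<unknown>();
    this.streams.set(id, subject);
    return subject.asObservable();
  }

  removeStream(id: string): void {
    this.streams.delete(id);
  }

  broadcast(data: unknown): void {
    this.streams.forEach((subject) => subject.next(data));
  }
}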
This behaviour is correct. Each SSE connection is a dedicated socket and handled by a dedicated server process. So each client can receive different data.
It is not a broadcast-same-thing-to-many technology.
How can I get the expected behavior?
Have a central record (e.g. in an SQL DB) of the desired value you want to send out to all the connected clients.
Then have each of the SSE server processes watch or poll that central record
and send out an event each time it changes.
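As a rough illustration of that idea (readCurrentValue() is a hypothetical accessor for the central record), each server process could poll the record and only emit when the value changes:

import { MessageEvent } from '@nestjs/common';
import { from, interval, Observable } from 'rxjs';
import { concatMap, distinctUntilChanged, map } from 'rxjs/operators';

// Hypothetical accessor for the central record (e.g. a row in an SQL DB).
declare function readCurrentValue(): Promise<string>;

// Poll the shared record once a second and emit an SSE event only when it changes,
// so every connected client observes the same sequence of values.
export function sharedValueStream(): Observable<MessageEvent> {
  return interval(1000).pipe(
    concatMap(() => from(readCurrentValue())),
    distinctUntilChanged(),
    map((value) => ({ data: { value } } as MessageEvent)),
  );
}

The @Sse() handler would then simply return sharedValueStream() for every connection.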
You just have to generate a new observable from the same subject for each SSE connection:
private events: Subject<MessageEvent> = new Subject();

constructor() {
  timer(0, 1000).pipe(takeUntil(this.destroy)).subscribe(async (index: any) => {
    let event: MessageEvent = {
      id: index,
      type: 'test',
      retry: 30000,
      data: { index: index },
    } as MessageEvent;
    this.events.next(event);
  });
}

@Sse('sse')
public sse(): Observable<MessageEvent> {
  return this.events.asObservable();
}
Note: I'm skipping the rest of the controller code.
Regards,

How do you implement a gRPC server in TypeScript?

I am trying to use @grpc/proto-loader to do dynamic code generation of the protobuf files to implement a simple server, but in TypeScript.
I've gotten as far as:
import { Server, loadPackageDefinition, ServerCredentials } from "grpc";
import { loadSync } from "@grpc/proto-loader";

const packageDefinition = loadSync(__dirname + "/protos/ArtifactUpload.proto");
const protoDescriptor = loadPackageDefinition(packageDefinition);

const impl = {
};

const server = new Server();
server.addService(protoDescriptor.ArtifactUpload, impl);
server.bind('0.0.0.0:50051', ServerCredentials.createInsecure());
server.start();
So I have two problems:
In the JavaScript examples they use protoDescriptor.XXX.service; however, there's no service property on protoDescriptor.ArtifactUpload.
If I try to add implementation methods to impl, it also fails to compile.
Since the JavaScript example works, I am thinking that approaches along the lines of adding a new property to a TypeScript object may be able to add the necessary service type. However, I haven't had luck so far.
My Protobuf is
syntax = "proto3";
service ArtifactUpload {
rpc SignedUrlPutObject (UploadRequest) returns (SignedUrlPutObjectResponse) {}
}
message UploadRequest {
string message = 1;
}
message SignedUrlPutObjectResponse {
string reply = 1;
}
[Updated on 14 May 2021]: TypeScript generation via @grpc/proto-loader is now released with version 0.6.0! I've updated my example here to reflect this. You can now install the latest version of proto loader with npm i @grpc/proto-loader which will contain the TS generation script. The instructions below are still valid.
You can use the proto-loader to generate types.
First, install the proto-loader:
npm i @grpc/proto-loader
You can then generate the types like so:
./node_modules/.bin/proto-loader-gen-types --longs=String --enums=String --defaults --oneofs --grpcLib=@grpc/grpc-js --outDir=proto/ proto/*.proto
Here's the proto file I use for this example:
syntax = "proto3";
package example_package;
message ServerMessage {
string server_message = 1;
}
message ClientMessage {
string client_message = 1;
}
service Example {
rpc unaryCall(ClientMessage) returns (ServerMessage) {}
rpc serverStreamingCall(ClientMessage) returns (stream ServerMessage) {}
rpc clientStreamingCall(stream ClientMessage) returns (ServerMessage) {}
rpc bidirectionalStreamingCall(stream ClientMessage) returns (stream ServerMessage) {}
}
Once the types are generated, you can consume them like so:
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import { ProtoGrpcType } from './proto/example';
import { ClientMessage } from './proto/example_package/ClientMessage';
import { ExampleHandlers } from './proto/example_package/Example';
import { ServerMessage } from './proto/example_package/ServerMessage';

const host = '0.0.0.0:9090';

const exampleServer: ExampleHandlers = {
  unaryCall(
    call: grpc.ServerUnaryCall<ClientMessage, ServerMessage>,
    callback: grpc.sendUnaryData<ServerMessage>
  ) {
    if (call.request) {
      console.log(`(server) Got client message: ${call.request.clientMessage}`);
    }
    callback(null, {
      serverMessage: 'Message from server',
    });
  },

  serverStreamingCall(
    call: grpc.ServerWritableStream<ClientMessage, ServerMessage>
  ) {
    call.write({
      serverMessage: 'Message from server',
    });
  },

  clientStreamingCall(
    call: grpc.ServerReadableStream<ClientMessage, ServerMessage>
  ) {
    call.on('data', (clientMessage: ClientMessage) => {
      console.log(
        `(server) Got client message: ${clientMessage.clientMessage}`
      );
    });
  },

  bidirectionalStreamingCall(
    call: grpc.ServerDuplexStream<ClientMessage, ServerMessage>
  ) {
    call.write({
      serverMessage: 'Message from server',
    });
    call.on('data', (clientMessage: ClientMessage) => {
      console.log(
        `(server) Got client message: ${clientMessage.clientMessage}`
      );
    });
  },
};

function getServer(): grpc.Server {
  const packageDefinition = protoLoader.loadSync('./proto/example.proto');
  const proto = (grpc.loadPackageDefinition(
    packageDefinition
  ) as unknown) as ProtoGrpcType;
  const server = new grpc.Server();
  server.addService(proto.example_package.Example.service, exampleServer);
  return server;
}

if (require.main === module) {
  const server = getServer();
  server.bindAsync(
    host,
    grpc.ServerCredentials.createInsecure(),
    (err: Error | null, port: number) => {
      if (err) {
        console.error(`Server error: ${err.message}`);
      } else {
        console.log(`Server bound on port: ${port}`);
        server.start();
      }
    }
  );
}
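For completeness, a minimal client-side sketch against the same generated types (host, message text and error handling are just examples):

import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import { ProtoGrpcType } from './proto/example';

const packageDefinition = protoLoader.loadSync('./proto/example.proto');
const proto = (grpc.loadPackageDefinition(
  packageDefinition
) as unknown) as ProtoGrpcType;

// The loaded package exposes the service as a typed client constructor.
const client = new proto.example_package.Example(
  '0.0.0.0:9090',
  grpc.credentials.createInsecure()
);

client.unaryCall({ clientMessage: 'Message from client' }, (err, response) => {
  if (err) {
    console.error(`(client) Error: ${err.message}`);
  } else if (response) {
    console.log(`(client) Got server message: ${response.serverMessage}`);
  }
});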
I've created various examples of how to use gRPC with TypeScript here: https://github.com/badsyntax/grpc-js-typescript
I got it working in the end as follows:
In package.json I had the following:
{
  ...
  "scripts": {
    "start": "node index.js",
    "build": "pbjs -t static-module -w commonjs -o protos.js protos/*.proto && pbts -o protos.d.ts protos.js && tsc"
  },
  "dependencies": {
    "@grpc/proto-loader": "^0.5.5",
    "google-protobuf": "^3.13.0",
    "grpc": "^1.24.4",
    "typescript": "^4.0.5"
  },
  "devDependencies": {
    "@types/node": "^14.14.7",
    "protobufjs": "^6.10.1"
  }
}
import { Server, loadPackageDefinition, ServerCredentials, GrpcObject, ServiceDefinition, handleUnaryCall } from "grpc";
import { ISignedUrlPutObjectResponse, IUploadRequest, SignedUrlPutObjectResponse } from "./protos";
import { loadSync } from "@grpc/proto-loader";

const packageDefinition = loadSync(__dirname + "/protos/ArtifactUpload.proto");

interface IArtifactUpload {
  signedUrlPutObject: handleUnaryCall<IUploadRequest, ISignedUrlPutObjectResponse>;
}

interface ServerDefinition extends GrpcObject {
  service: any;
}

interface ServerPackage extends GrpcObject {
  [name: string]: ServerDefinition;
}

const protoDescriptor = loadPackageDefinition(packageDefinition) as ServerPackage;

const server = new Server();
server.addService<IArtifactUpload>(protoDescriptor.ArtifactUpload.service, {
  signedUrlPutObject(call, callback) {
    console.log(call.request.message);
    console.log(callback);
    callback(null, SignedUrlPutObjectResponse.create({ reply: "hello " + call.request.message }));
  }
});
server.bind('0.0.0.0:50051', ServerCredentials.createInsecure());
server.start();
I use protobufjs to build some of the typings, though they are mostly unused as it is not fully compatible with gRPC. However, it does save time with the request and response typings.
I still needed to create the server typings and apply them to the protoDescriptor. Repeating it here for emphasis:
interface IArtifactUpload {
  signedUrlPutObject(call: ServerUnaryCall<IUploadRequest>, callback: ArtifactUpload.SignedUrlPutObjectCallback): void;
}

interface ServerDefinition extends GrpcObject {
  service: any;
}

interface ServerPackage extends GrpcObject {
  [name: string]: ServerDefinition;
}
I used any for the service, as it was the only option that allowed me to avoid putting in anything specific to IArtifactUpload. Ideally the typing for GrpcObject, which at present is
export interface GrpcObject {
  [name: string]: GrpcObject | typeof Client | ProtobufMessage;
}
should try to provide an object that represents the server.
I linked my solution to https://github.com/protobufjs/protobuf.js/issues/1017#issuecomment-725064230 in case there's a better way that I am missing.

How to use connection as standalone object with types?

Non-working code, just to illustrate what I'm trying to achieve.
Some connection file:
import { ConnectionManager } from 'typeorm';

const c = new ConnectionManager();
// use ormconfig.conf file
export const connection = c.createAndConnect();

Using it in some model:
@Entity()
@Table("annual_incomes")
export class AnnualIncome {
  @PrimaryGeneratedColumn()
  id: number;

  @Column({ length: 75 })
  variant: string;

  @Column("int")
  sort: number;

  @Column()
  is_active: boolean;
}
Later somewhere in the code, I want to get the connection with all its methods, something like:
import { connection } from 'someconnection';
import { AnnualIncome } from 'entities';

// some code here
api.get('/incomes', async (ctx) => {
  ctx.body = await connection.getRepository(AnnualIncome).find();
});
Usually, I'm getting an error from tsc that the .getRepository() method was not found on connection. However, if I do something like:
import { connection } from 'someconnection';
import { AnnualIncome } from 'entities';

// some code here
api.get('/incomes', async (ctx) => {
  ctx.body = await connection.then(async (connection) => {
    return await connection.getRepository(AnnualIncome).find();
  });
});
the above code works with definitions, and tsc does not complain about non-existing methods.
I'd like to avoid the extra connection.then() wrapper and get a plain connection with all methods defined on the <Connection> type.
Just use the createConnection method to create your connection when you bootstrap your application. Later you can access your connection from anywhere using the getConnection() method:
import { AnnualIncome } from 'entities';
import { createConnection, getConnection } from 'typeorm';

// somewhere in your app, better where you bootstrap express and other things
createConnection(); // read config from ormconfig.json or pass them here

// some code here
api.get('/incomes', async (ctx) => {
  ctx.body = await getConnection().getRepository(AnnualIncome).find();
});
You can also simply use the getRepository method, which is likewise available from anywhere:
import { AnnualIncome } from 'entities';
import { getRepository } from 'typeorm';

// some code here
api.get('/incomes', async (ctx) => {
  ctx.body = await getRepository(AnnualIncome).find();
});
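One detail worth noting: createConnection() returns a Promise, so it's safest to await it during bootstrap before the app starts serving requests. A minimal sketch (the bootstrap function is just an example):

import { createConnection } from 'typeorm';

async function bootstrap() {
  // reads ormconfig by default; await so repositories are ready before any request arrives
  await createConnection();
  // ...register routes / start the HTTP server here
}

bootstrap().catch((err) => {
  console.error('Failed to initialize the database connection', err);
  process.exit(1);
});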
