ZeroMQ retransmitting pub-sub model - node.js

I'm trying to use the ZeroMQ JavaScript bindings to implement a re-transmitting pub-sub backbone.
Something to the effect of:
inbound modules announce themselves to the backbone; the backbone subscribes to them and re-transmits on its own publisher socket
outbound modules subscribe to the backbone
I'm running into an issue, though, with pub sockets needing to be used from a single thread.
My current code was something like this:
async function listen(name: string, sub: zmq.Subscriber, pub: zmq.Publisher) {
  let id = 0;
  for await (const [topic, msg] of sub) {
    console.log(`BACKBONE | ${name} | received a message id: ${++id} related to: ${topic.toString()} containing message: ${msg.toString()}`);
    await pub.send([topic, msg]);
  }
}
This gets instantiated for each of the inbound modules, but then of course the instances clash on pub.send.

I wrote a queue to serialize the socket access, which resolved the problem.
import * as zmq from "zeromq"

export class PublisherQueue {
  private queue: Buffer[][] = []

  constructor(private publisher: zmq.Publisher) { }

  private static toBuffer(value: Buffer | string): Buffer {
    if (value instanceof Buffer) {
      return value;
    } else {
      return Buffer.from(value);
    }
  }

  send(topic: Buffer | string, msg: Buffer | string) {
    this.queue.push([PublisherQueue.toBuffer(topic), PublisherQueue.toBuffer(msg)]);
  }

  async run() {
    while (true) {
      if (this.queue.length > 0) {
        let msg = this.queue.shift();
        if (msg !== undefined) {
          await this.publisher.send(msg);
        }
      } else {
        await new Promise(resolve => setTimeout(resolve, 50));
      }
    }
  }
}
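For comparison, here is a minimal sketch of an alternative serializer that avoids the 50 ms polling delay by chaining every send onto the previous one; the ChainedPublisher class is my own suggestion, not part of the original code:

import * as zmq from "zeromq";

// Serializes access to a single Publisher by chaining each send onto the
// promise of the previous one, so at most one send is ever in flight.
export class ChainedPublisher {
  private last: Promise<void> = Promise.resolve();

  constructor(private publisher: zmq.Publisher) { }

  send(topic: Buffer | string, msg: Buffer | string): Promise<void> {
    const next = this.last.then(() => this.publisher.send([topic, msg]));
    // Keep the chain alive even if a send fails; the caller still sees
    // the rejection through the returned promise.
    this.last = next.catch(() => { });
    return next;
  }
}

ZeroMQ also has a canonical answer to the re-transmitting backbone pattern as a whole: an XSUB/XPUB proxy that forwards both messages and subscriptions between many publishers and many subscribers, which may remove the need for a hand-rolled re-transmit loop entirely.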

Related

How to add JSON data to JetStream streams?

I have code written with node-nats-streaming that I'm trying to convert to the newer JetStream. Part of the code looks like this:
import { Message, Stan } from 'node-nats-streaming';
import { Subjects } from './subjects';

interface Event {
  subject: Subjects;
  data: any;
}

export abstract class Listener<T extends Event> {
  abstract subject: T['subject'];
  abstract queueGroupName: string;
  abstract onMessage(data: T['data'], msg: Message): void;
  private client: Stan;
  protected ackWait = 5 * 1000;

  constructor(client: Stan) {
    this.client = client;
  }

  subscriptionOptions() {
    return this.client
      .subscriptionOptions()
      .setDeliverAllAvailable()
      .setManualAckMode(true)
      .setAckWait(this.ackWait)
      .setDurableName(this.queueGroupName);
  }

  listen() {
    const subscription = this.client.subscribe(
      this.subject,
      this.queueGroupName,
      this.subscriptionOptions()
    );
    subscription.on('message', (msg: Message) => {
      console.log(`Message received: ${this.subject} / ${this.queueGroupName}`);
      const parsedData = this.parseMessage(msg);
      this.onMessage(parsedData, msg);
    });
  }

  parseMessage(msg: Message) {
    const data = msg.getData();
    return typeof data === 'string'
      ? JSON.parse(data)
      : JSON.parse(data.toString('utf8'));
  }
}
Searching through the docs, it seems I can do something like the following:
import { connect } from "nats";

const nc = await connect(); // connect to a NATS server first; missing from the snippet as posted
const jsm = await nc.jetstreamManager();
const cfg = {
  name: "EVENTS",
  subjects: ["events.>"],
};
await jsm.streams.add(cfg);
But it seems only the name and subjects options are available, while my original code needs a data property that can carry JSON objects. Is there a way to convert this code to JetStream, or do I have to change the logic of the whole application as well?
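For what it's worth, here is a minimal sketch of how JSON is typically carried over JetStream in nats.js: the stream configuration only declares which subjects are captured, and the JSON object travels in the message payload, encoded with a codec. The subject and payload below are made-up examples:

import { connect, JSONCodec } from "nats";

const nc = await connect();
const js = nc.jetstream();
const jc = JSONCodec();

// The stream routes by subject; the JSON object rides in the payload.
await js.publish("events.ticket.created", jc.encode({ id: "123", title: "concert" }));

// On the consuming side, decode the payload back into an object:
// const data = jc.decode(msg.data);

So the data property from the old Event interface maps to the encoded message payload, not to a stream configuration option.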

NestJS | gRPC: How to handle remote error from client side

Remote Server
@Catch(RpcException)
export class RpcExceptionHandler implements RpcExceptionFilter<RpcException> {
  catch(exception: RpcException, host: ArgumentsHost): Observable<any> {
    return throwError(exception.getError());
  }
}

@UseFilters(new RpcExceptionHandler())
@GrpcMethod('AppController', 'Accumulate')
async accumulate(numberArray: INumberArray, metadata: any): Promise<ISumOfNumberArray> {
  throw new RpcException({
    code: 5,
    message: 'Data Not Found'
  })
}
Client code
@Get('add')
async getSumc(@Query('data') data: number[]) {
  try {
    let ata = await this.grpcService.accumulate({ data });
    return ata;
  } catch (err) {
    // logic here if error comes
    return err;
  }
}
Proto definition:
syntax = "proto3";

package app;

// Declare a service for each controller you have
service AppController {
  // Declare an rpc for each method that is called via gRPC
  rpc Accumulate (NumberArray) returns (SumOfNumberArray);
}

// Declare the types used above
message NumberArray {
  repeated double data = 1;
}

message SumOfNumberArray {
  double sum = 1;
}
When an error occurs, it does not reach the catch block; it just shows up as a server error. I want to catch the error whenever the remote throws one.
Try this one (the gRPC client method returns an Observable, so the error only reaches your try/catch once it is converted to a promise and awaited):
@Get('add')
async getSumc(@Query('data') data: number[]) {
  try {
    let ata = await this.grpcService.accumulate({ data }).toPromise();
    return ata;
  } catch (e) {
    throw new RpcException(e);
  }
}
Example here
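As a sketch of an alternative, the error can also be handled inside the RxJS pipeline with catchError instead of converting to a promise; the service call is taken from the question, while the operator wiring (RxJS 7 style) is my assumption:

import { catchError } from 'rxjs/operators';
import { throwError } from 'rxjs';

// Map the remote error inside the observable chain instead of try/catch.
const result$ = this.grpcService.accumulate({ data }).pipe(
  catchError((err) => throwError(() => new RpcException(err))),
);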

NestJS @Sse - event is consumed only by one client

I tried the sample SSE application provided with nest.js (28-SSE), and modified the sse endpoint to send a counter:
@Sse('sse')
sse(): Observable<MessageEvent> {
  return interval(5000).pipe(
    map((_) => ({ data: { hello: `world - ${this.c++}` } } as MessageEvent)),
  );
}
I expect every client listening to this SSE endpoint to receive each message, but when opening multiple browser tabs I can see that each message is consumed by only one of them, so with three browsers open every browser sees only a subset of the messages.
How can I get the expected behavior?
To achieve the behavior you're expecting, you need to create a separate stream for each connection and push the data to each of those streams as you wish.
One possible minimalistic solution is below:
import { Controller, Get, MessageEvent, OnModuleDestroy, OnModuleInit, Res, Sse } from '@nestjs/common';
import { readFileSync } from 'fs';
import { join } from 'path';
import { Observable, ReplaySubject } from 'rxjs';
import { map } from 'rxjs/operators';
import { Response } from 'express';

@Controller()
export class AppController implements OnModuleInit, OnModuleDestroy {
  private stream: {
    id: string;
    subject: ReplaySubject<unknown>;
    observer: Observable<unknown>;
  }[] = [];
  private timer: NodeJS.Timeout;
  private id = 0;

  public onModuleInit(): void {
    this.timer = setInterval(() => {
      this.id += 1;
      this.stream.forEach(({ subject }) => subject.next(this.id));
    }, 1000);
  }

  public onModuleDestroy(): void {
    clearInterval(this.timer);
  }

  @Get()
  public index(): string {
    return readFileSync(join(__dirname, 'index.html'), 'utf-8').toString();
  }

  @Sse('sse')
  public sse(@Res() response: Response): Observable<MessageEvent> {
    const id = AppController.genStreamId();
    // Clean up the stream when the client disconnects
    response.on('close', () => this.removeStream(id));
    // Create a new stream
    const subject = new ReplaySubject();
    const observer = subject.asObservable();
    this.addStream(subject, observer, id);
    return observer.pipe(map((data) => ({
      id: `my-stream-id:${id}`,
      data: `Hello world ${data}`,
      event: 'my-event-name',
    }) as MessageEvent));
  }

  private addStream(subject: ReplaySubject<unknown>, observer: Observable<unknown>, id: string): void {
    this.stream.push({
      id,
      subject,
      observer,
    });
  }

  private removeStream(id: string): void {
    this.stream = this.stream.filter(stream => stream.id !== id);
  }

  private static genStreamId(): string {
    return Math.random().toString(36).substring(2, 15);
  }
}
You can make this cleaner by extracting it into a separate service and pushing stream data from different places; as an example showcase, though, this is enough to have every connected client receive every event.
This behaviour is correct. Each SSE connection is a dedicated socket and handled by a dedicated server process. So each client can receive different data.
It is not a broadcast-same-thing-to-many technology.
How can I get the expected behavior?
Have a central record (e.g. in an SQL DB) of the desired value you want to send out to all the connected clients. Then have each of the SSE server processes watch or poll that central record and send out an event each time it changes.
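A rough sketch of that idea, where readSharedRecord is a placeholder assumption for whatever central store is used:

import { MessageEvent } from '@nestjs/common';
import { interval, from, Observable } from 'rxjs';
import { switchMap, distinctUntilChanged, map } from 'rxjs/operators';

// Poll the shared record once a second and emit an SSE event only when
// the value actually changes. readSharedRecord(): Promise<string> is assumed.
function sharedEvents(readSharedRecord: () => Promise<string>): Observable<MessageEvent> {
  return interval(1000).pipe(
    switchMap(() => from(readSharedRecord())),
    distinctUntilChanged(),
    map((value) => ({ data: { value } } as MessageEvent)),
  );
}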
You just have to generate a new observable of the same subject for each SSE connection:
private events: Subject<MessageEvent> = new Subject();
private destroy: Subject<void> = new Subject(); // completes the timer on teardown; missing from the snippet as posted

constructor() {
  timer(0, 1000).pipe(takeUntil(this.destroy)).subscribe((index: any) => {
    let event: MessageEvent = {
      id: index,
      type: 'test',
      retry: 30000,
      data: { index: index }
    } as MessageEvent;
    this.events.next(event);
  });
}

@Sse('sse')
public sse(): Observable<MessageEvent> {
  return this.events.asObservable();
}
Note: I'm skipping the rest of the controller code.
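To see the broadcast in action, a minimal browser-side snippet (how it is served is up to you) can be opened in several tabs; since every connection gets an observable of the same subject, each tab receives every event:

// Browser side: listen for the named 'test' events emitted above.
const source = new EventSource('/sse');
source.addEventListener('test', (e) => {
  console.log('received', JSON.parse(e.data).index);
});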

Nestjs/microservice does not create an observable?

I was trying to use streams with grpc.
But when I use the code published below, it says that subscribe is not a function.
It seems like nestjs/microservice does not create an observable from a stream.
The method used in the docs seems to be outdated.
My code - proto file:
message Rating {
  string id = 1;
  uint32 value = 2;
  string comment = 3;
  string type = 4;
}

service RatingsRpcService {
  rpc Test (stream GetRatingRequest) returns (stream Rating);
}

message GetRatingRequest {
  string id = 1;
}

message Ratings {
  repeated Rating items = 1;
}
And the controller file:
@GrpcStreamMethod('RatingsRpcService')
test(msg: Observable<any>, metadata: any): Observable<any> {
  const subject = new Subject();
  msg.subscribe({
    next: (item: any) => {
      subject.next({ whatever: 'value' });
    },
    error: (err: any) => console.log(err),
    complete: () => subject.complete()
  });
  return subject.asObservable();
}
And the error I get: TypeError: msg.subscribe is not a function
Did I miss something?
Okay, I guess I found the issue.
It's kind of strange, but changing the parameter type from Observable fixed the error.
I mean this one: messages: any instead of messages: Observable<any> or messages: Observable<RatingInterface>.
Please find the code below:
@GrpcStreamMethod('RatingsRpcService')
async findByIdStream(messages: any, metadata: any): Promise<Observable<RatingInterface>> {
  const subject = new Subject<RatingInterface>();
  messages.subscribe({
    next: async (dto: GetRatingDto) => {
      const item = await this.ratingsService.findById(dto.id);
      subject.next(item);
    },
    error: (err: any) => {
      throw new RpcException('Could not process stream.')
    },
    complete: () => subject.complete()
  });
  return subject.asObservable();
}
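For completeness, a hedged sketch of what the client side of this bidirectional stream could look like with Nest's ClientGrpc; the interface shape and the injected client are assumptions derived from the proto above:

import { ClientGrpc } from '@nestjs/microservices';
import { Observable, ReplaySubject } from 'rxjs';

interface RatingsRpcService {
  // For a bidirectional stream, the client passes an Observable of requests.
  test(requests: Observable<{ id: string }>): Observable<{ id: string; value: number }>;
}

function callTest(client: ClientGrpc): void {
  const svc = client.getService<RatingsRpcService>('RatingsRpcService');
  const requests = new ReplaySubject<{ id: string }>();
  requests.next({ id: 'abc' });
  requests.complete();
  svc.test(requests).subscribe((rating) => console.log(rating));
}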

NodeJS streams not awaiting async

I have run into an issue when testing NodeJS streams. I can't seem to get my project to wait for the output from the Duplex and Transform streams after running a stream.pipeline, even though it is returning a promise. Perhaps I'm missing something, but I believe that the script should wait for the function to return before continuing. The most important part of the project I'm trying to get working is:
// Message system is a duplex (read/write) stream
export class MessageSystem extends Duplex {
  constructor() {
    super({ highWaterMark: 100, readableObjectMode: true, writableObjectMode: true });
  }

  public _read(size: number): void {
    // Note: calling this.read() from inside _read() reads from this stream
    // itself, which is the likely source of the null chunks mentioned in the
    // edit below.
    var chunk = this.read();
    console.log(`Received ${chunk}`);
    this.push(chunk);
  }

  public _write(chunk: Message, encoding: string,
    callback: (error?: Error | null | undefined, chunk?: Message) => any): void {
    if (chunk.data === null) {
      callback(new Error("Message.Data is null"));
    } else {
      callback();
    }
  }
}
export class SystemStream extends Transform {
  public type: MessageType = MessageType.Global;
  public data: Array<Message> = new Array<Message>();

  constructor() {
    super({ highWaterMark: 100, readableObjectMode: true, writableObjectMode: true });
  }

  public _transform(chunk: Message, encoding: string,
    callback: TransformCallback): void {
    if (chunk.single && (chunk.type === this.type || chunk.type === MessageType.Global)) {
      console.log(`Adding ${chunk}`);
      this.data.push(chunk);
      chunk = new Message(chunk.data, MessageType.Removed, true);
      callback(undefined, chunk); // TODO: Is this correct?
    } else if (chunk.type === this.type || chunk.type === MessageType.Global) { // Ours and global
      this.data.push(chunk);
      callback(undefined, chunk);
    } else { // Not ours
      callback(undefined, chunk);
    }
  }
}
export class EngineStream extends SystemStream {
  public type: MessageType = MessageType.Engine;
}

export class IOStream extends SystemStream {
  public type: MessageType = MessageType.IO;
}
let ms = new MessageSystem();
let es = new EngineStream();
let io = new IOStream();
let pipeline = promisify(Stream.pipeline);

async function start() {
  console.log("Running Message System");
  console.log("Writing new messages");
  ms.write(new Message("Hello"));
  ms.write(new Message("world!"));
  ms.write(new Message("Engine data", MessageType.Engine));
  ms.write(new Message("IO data", MessageType.IO));
  ms.write(new Message("Order matters in the pipe, even if Global", MessageType.Global, true));
  ms.end(new Message("Final message in the stream"));
  console.log("Piping data");
  await pipeline(
    ms,
    es,
    io
  );
}

Promise.all([start()]).then(() => {
  console.log(`Engine Messages to parse: ${es.data.toString()}`);
  console.log(`IO Messages to parse: ${io.data.toString()}`);
});
Output should look something like:
Running message system
Writing new messages
Hello
world!
Engine Data
IO Data
Order Matters in the pipe, even if Global
Engine messages to parse: Engine Data
IO messages to parse: IO Data
Any help would be greatly appreciated. Thanks!
Note: I posted this with my other account, and not this one that is my actual account. Apologies for the duplicate.
Edit: I initially had the repo private, but have made it public to help clarify the answer. More usage can be found on the feature/inital_system branch. It can be run with npm start when checked out.
Edit: I've put my custom streams here for verbosity. I think I'm on a better track than before, but now I'm getting a "null" object received down the pipeline.
As the documentation states, stream.pipeline is callback-based and doesn't return a promise.
It has a custom promisified version that can be accessed with util.promisify:
const pipeline = util.promisify(stream.pipeline);
...
await pipeline(...);
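On Node 15 and later, the same promise-based pipeline also ships ready-made in the stream/promises module, so the promisify step can be skipped:

import { pipeline } from "stream/promises";

// Built-in promise version; equivalent to util.promisify(stream.pipeline).
await pipeline(ms, es, io);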
After some work over the past couple of days, I've found my answer. The issue was my implementation of the Duplex stream. I have since changed the MessageSystem to a Transform stream, which is easier to manage and work with.
Here is the product:
export class MessageSystem extends Transform {
  constructor() {
    super({ highWaterMark: 100, readableObjectMode: true, writableObjectMode: true });
  }

  public _transform(chunk: Message, encoding: string,
    callback: TransformCallback): void {
    try {
      let output: string = chunk.toString();
      callback(undefined, output);
    } catch (err) {
      callback(err);
    }
  }
}
Thank you to @estus for the quick reply and check. Again, I found my answer in the API all along!
An archived repository of my findings can be found in this repository.
