How to use AsyncLocalStorage for an Observable? - node.js

I'd like to use AsyncLocalStorage in a NestJS interceptor:
export interface CallHandler<T = any> {
  handle(): Observable<T>;
}
export interface NestInterceptor<T = any, R = any> {
  intercept(context: ExecutionContext, next: CallHandler<T>): Observable<R> | Promise<Observable<R>>;
}
The interceptor function gets a next CallHandler that returns an Observable.
I cannot use run in this case (the run callback exits immediately, before the callHandler.handle() observable has finished):
intercept(context: ExecutionContext, callHandler: CallHandler): Observable<any> | Promise<Observable<any>> {
  const asyncLocalStorage = new AsyncLocalStorage();
  const myStore = { some: 'data' };
  return asyncLocalStorage.run(myStore, () => callHandler.handle());
}
See broken replit-example
The solution I came up with is this:
const localStorage = new AsyncLocalStorage();

export class MyInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, callHandler: CallHandler): Observable<any> | Promise<Observable<any>> {
    const resource = new AsyncResource('AsyncLocalStorage', { requireManualDestroy: true });
    const myStore = { some: 'data' };
    localStorage.enterWith(myStore);
    return callHandler.handle().pipe(
      finalize(() => resource.emitDestroy())
    );
  }
}
See working replit example
This seems to work fine, but I am not sure if this is really correct - and it looks messy and error-prone. So I wonder:
Is this correct at all?
Is there a better/cleaner way to handle this?

Below is the solution I came up with. My understanding of the problem is that the run function needs to receive a callback that fully encapsulates the execution of the handler. However, the intercept function is expected to return an observable that has not yet been triggered, so if you wrap the observable itself in the run callback, nothing has actually executed inside run yet.
My solution, below, is to return a new observable that, when triggered, is responsible for triggering (i.e. subscribing to) the call handler itself. As a result, the promise we create in the run call can fully encapsulate the handle function and its async callbacks.
Here is the general functionality in a stand-alone function so that you can see it all together:
intercept(context: ExecutionContext, next: CallHandler<any>): Observable<any> {
  return new Observable((subscribe) => {
    asyncStorage.run({}, () => new Promise<void>((resolve) => {
      next.handle().subscribe(
        (result) => {
          subscribe.next(result);
          subscribe.complete();
          resolve();
        },
        (error) => {
          subscribe.error(error);
          resolve();
        }
      );
    }));
  });
}
Next, I took that concept and integrated it into my interceptor below.
export class RequestContextInterceptor implements NestInterceptor {
  constructor(
    private readonly requestContext: RequestContext,
    private readonly localStorage: AsyncLocalStorage<RequestContextData>
  ) {}

  intercept(context: ExecutionContext, next: CallHandler<any>): Observable<any> {
    const contextData = this.requestContext.buildContextData(context);
    return new Observable((subscribe) => {
      void this.localStorage.run(contextData, () => this.runHandler(next, subscribe));
    });
  }

  private runHandler(next: CallHandler<any>, subscribe: Subscriber<any>): Promise<void> {
    return new Promise<void>((resolve) => {
      next.handle().subscribe(
        (result) => {
          subscribe.next(result);
          subscribe.complete();
          resolve();
        },
        (err) => {
          subscribe.error(err);
          resolve();
        }
      );
    });
  }
}
It's worth noting that the Promise created inside the run call has no rejection path. This is intentional: any error is passed on to the observable that wraps the promise, so the outer observable still succeeds or errors depending on what the inner observable does, while the wrapping promise always resolves regardless.
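For completeness, here is a minimal sketch of how such an interceptor and its AsyncLocalStorage instance could be provided in a module; the import paths are assumptions, and RequestContext/RequestContextData are just the names used in the snippet above:

// app.module.ts (sketch; import paths assumed)
import { Module } from '@nestjs/common';
import { APP_INTERCEPTOR } from '@nestjs/core';
import { AsyncLocalStorage } from 'async_hooks';
import { RequestContext } from './request-context'; // assumed path
import { RequestContextInterceptor } from './request-context.interceptor'; // assumed path

@Module({
  providers: [
    RequestContext,
    // Share a single AsyncLocalStorage instance via DI so the interceptor
    // and any downstream provider read from the same store.
    { provide: AsyncLocalStorage, useValue: new AsyncLocalStorage() },
    // Register the interceptor globally.
    { provide: APP_INTERCEPTOR, useClass: RequestContextInterceptor },
  ],
})
export class AppModule {}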

Here is a solution for cls-hooked:
return new Observable(observer => {
  namespace.runAndReturn(async () => {
    namespace.set("some", "data")
    next.handle()
      .subscribe(
        res => observer.next(res),
        error => observer.error(error),
        () => observer.complete()
      )
  })
})
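For context, the snippet above assumes an existing namespace. A minimal sketch of the surrounding cls-hooked setup (the namespace name and the downstream reader are illustrative assumptions):

import { createNamespace, getNamespace } from 'cls-hooked';

// Create the namespace once, at module scope.
const namespace = createNamespace('request-context');

// Anywhere downstream of the runAndReturn callback (i.e. inside the active
// context), the stored value can be read back from the same namespace:
function readSomewhereDownstream() {
  const ns = getNamespace('request-context');
  return ns && ns.get('some'); // 'data' while the request context is active
}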

Here is our current solution to the problem: we create an observable that simply forwards all emissions from the callHandler to our subscriber. The important part is that we subscribe inside the localStorage.run callback.
const localStorage = new AsyncLocalStorage();

export class MyInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, callHandler: CallHandler): Observable<any> | Promise<Observable<any>> {
    const myStore = { some: 'data' };
    return new Observable((subscriber) => {
      const subscription = localStorage.run(myStore, () => {
        /**
         * - run the handler function in the run callback, so that myStore is set
         * - subscribe to the handler and pass all emissions of the callHandler to our subscriber
         */
        return callHandler.handle().subscribe(subscriber);
      });
      /**
       * return an unsubscribe method
       */
      return () => subscription.unsubscribe();
    });
  }
}
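To illustrate what this buys you, here is a minimal sketch of a provider running downstream of the interceptor reading the store for the current request; it assumes the module-scoped localStorage instance above is exported so it can be imported here, and the service name is made up:

// some-downstream.service.ts (sketch; names and export assumed)
import { Injectable } from '@nestjs/common';
import { localStorage } from './my.interceptor'; // assumes the instance above is exported

@Injectable()
export class SomeDownstreamService {
  doWork() {
    // Because the handler was subscribed inside localStorage.run(), getStore()
    // returns the store set by the interceptor for the current request.
    const store = localStorage.getStore() as { some: string } | undefined;
    return store?.some; // 'data'
  }
}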

Related

Binding interceptor to NestJS microservice method

I just created a simple interceptor to override every error thrown by my application, just like the one on Nest's documentation:
@Injectable()
export class ErrorsInterceptor implements NestInterceptor {
  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    return next
      .handle()
      .pipe(
        catchError(err => throwError(() => new ApplicationException())),
      );
  }
}
And although exceptions caused by HTTP requests are indeed caught by that interceptor, I just can't make it work with RPC requests (like KafkaJS and events).
Just like the documentation, I've bound it in my app.module:
{
  provide: APP_INTERCEPTOR,
  useClass: ErrorsInterceptor,
}
I know I'm probably missing something; can someone clarify why what I'm doing is not working and how to make it work?
Edit: I forgot to mention that I made it work with @UseInterceptors() above my controller's method, but I'd like to make it work without it.
Edit 2: I have a hybrid application; this is what my main looks like (as asked by Jay):
async function bootstrap() {
  const app = await NestFactory.create<NestExpressApplication>(AppModule, {
    logger: WinstonModule.createLogger(winstonTransports),
  });
  app.connectMicroservice<MicroserviceOptions>(kafkaConfig);
  const logger = app.get<winston.Logger>(WINSTON_MODULE_NEST_PROVIDER);
  app.useLogger(logger);
  app.enableCors(corsConfig);
  await app.startAllMicroservices();
  await app.listen(env.PORT);
}
When working with hybrid applications you need to add { inheritAppConfig: true } to the connectMicroservice() method as a second parameter as described in the docs. This means your main.ts should be
async function bootstrap() {
  const app = await NestFactory.create<NestExpressApplication>(AppModule, {
    logger: WinstonModule.createLogger(winstonTransports),
  });
  app.connectMicroservice<MicroserviceOptions>(kafkaConfig, { inheritAppConfig: true });
  const logger = app.get<winston.Logger>(WINSTON_MODULE_NEST_PROVIDER);
  app.useLogger(logger);
  app.enableCors(corsConfig);
  await app.startAllMicroservices();
  await app.listen(env.PORT);
}

Assign route dynamically Node/Express

I need to dynamically assign a new route, but for some reason it refuses to work.
When I send a request in Postman, it just keeps waiting for a response.
The whole picture of what I am doing is the following:
I've got a controller with a decorator on one of its methods
@Controller()
export class Test {
  @RESTful({
    endpoint: '/product/test',
    method: 'post',
  })
  async testMe() {
    return {
      type: 'hi'
    }
  }
}
export function RESTful({ endpoint, method, version }: { endpoint: string, version?: string, method: HTTPMethodTypes }) {
  return function (target: any, propertyKey: string, descriptor: PropertyDescriptor): void {
    const originalMethod = descriptor.value
    Reflect.defineMetadata(propertyKey, {
      endpoint,
      method,
      propertyKey,
      version
    }, target)
    return originalMethod
  }
}

export function Controller() {
  return function (constructor: any) {
    const methods = Object.getOwnPropertyNames(constructor.prototype)
    Container.set(constructor)
    for (let action of methods) {
      const route: RESTfulRoute = Reflect.getMetadata(action, constructor.prototype)
      if (route) {
        const version: string = route.version ? `/${route.version}` : '/v1'
        Container.get(Express).injectRoute((instance: Application) => {
          instance[route.method](`/api${version}${route.endpoint}`, async () => {
            return await Reflect.getOwnPropertyDescriptor(constructor, route.propertyKey)
            // return await constructor.prototype[route.propertyKey](req, res)
          })
        })
      }
    }
  }
}
Is it possible to dynamically set the route in this way?
I mainly use GraphQL, but sometimes I need a RESTful API too, so I want to solve this with that decorator.
In order for the response to finish, there must be a res.end() or res.json(...) or similar. But I cannot see that anywhere in your code.
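As a minimal sketch (building on the commented-out line in the question, and assuming constructor.prototype[route.propertyKey] is the decorated method; Request and Response come from express), the handler passed to injectRoute could invoke the method and send its return value as JSON:

// Inside injectRoute, replacing the handler from the question (sketch):
instance[route.method](`/api${version}${route.endpoint}`, async (req: Request, res: Response) => {
  // Invoke the decorated controller method and end the response explicitly;
  // without res.json()/res.end() Express keeps the client waiting.
  const result = await constructor.prototype[route.propertyKey](req, res);
  res.json(result);
});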

Nestjs - Use applyDecorators inside createParamDecorator

I'm using Nestjs decorators and am trying to make the most of custom decorators. I'm trying to write my own custom #Body param decorator that validates and applies multiple decorators at the same time.
Does anyone know if the below is possible? I'm having difficulty getting the second argument of the pipes' transform call (metadata: ArgumentMetadata) to be populated.
export const MyParamDecorator = <T>(myDto: T) => {
  return createParamDecorator(
    (data: unknown, ctx: ExecutionContext) => {
      applyDecorators( // also get SetMeta and Pipes to validate DTO
        SetMetadata('thisWorks', true),
        UsePipes(CustomValidationPipe, OtherPipe), // + add MyDTO - type T somehow..
      );
      return doAsyncWork()
    },
  )();
}

@Controller('users')
export class UsersController {
  @Patch(':id')
  update(@MyParamDecorator() asyncWork: Promise<any>) { // <------ Promise<any> is a custom async operation that will be handled. (So I can't type the DTO here..)
    return reqBody;
  }
}
I ran across this question because I needed a similar answer. Hopefully what I've found is helpful.
Part 1. You can do async processing in a decorator.
And, it would resolve for you, so your controller would look like:
update(@MyParamDecorator() asyncWork: any) {
Notice that Promise<any> is just any.
Part 2. You can get ArgumentMetadata using an enhancer.
Here is a quick example, let's assume METADATA__PARAM_TYPE is myType.
param-type.enhancer.ts
export const paramTypeEnhancer: ParamDecoratorEnhancer = (
  target: Record<string, unknown>,
  propertyKey: string,
  parameterIndex: number,
): void => {
  const paramTypes = Reflect.getOwnMetadata('design:paramtypes', target, propertyKey);
  const metatype = paramTypes[parameterIndex];
  Reflect.defineMetadata(METADATA__PARAM_TYPE, metatype, target[propertyKey]);
};
my-decorator.ts
import { paramTypeEnhancer } from './param-type.enhancer';

export const MyDecorator = createParamDecorator(
  async (data: unknown, ctx: ExecutionContext): Promise<any> => {
    const metatype = Reflect.getOwnMetadata(METADATA__PARAM_TYPE, ctx.getHandler());
    const argument: ArgumentMetadata = {
      type: 'custom',
      data: undefined,
      metatype: metatype,
    };
    // Do processing here... You can return a promise.
  },
  [paramTypeEnhancer],
);
See this gist for a full annotated version: https://gist.github.com/josephdpurcell/fc04cfd428a6ee9d7ffb64685e4fe3a6
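A short usage sketch (controller and DTO names are assumptions, and emitDecoratorMetadata must be enabled so design:paramtypes is emitted): because the enhancer reads design:paramtypes, the metatype it records is simply the declared type of the decorated parameter:

// users.controller.ts (sketch; names assumed)
@Controller('users')
export class UsersController {
  @Patch(':id')
  update(@MyDecorator() body: UpdateUserDto) {
    // The enhancer stored UpdateUserDto (read from design:paramtypes) under
    // METADATA__PARAM_TYPE on this handler, so the decorator can build its
    // ArgumentMetadata with metatype = UpdateUserDto.
    return body;
  }
}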

NestJS interceptor - handle error inside .pipe()

I'd like to implement a NestJS interceptor that creates and writes an Elasticsearch entry before the request handler is hit and updates this entry with error/success info after the handler has finished. To do this I am using:
@Injectable()
export class ElasticsearchInterceptor implements NestInterceptor {
  constructor(private readonly elasticSearchService: ElasticsearchService) {}

  async intercept(_context: ExecutionContext, next: CallHandler): Promise<Observable<any>> {
    const elasticSearchPayload = new ElasticsearchPayloadBuilder()
      .setOperation(...)
      .build();
    const elasticSearchEntry = await this.elasticSearchService.writeEntry(elasticSearchPayload);
    return next
      .handle()
      .pipe(
        catchError(err => {
          elasticSearchPayload.status = err.status;
          return throwError(err);
        }),
        tap(() => {
          elasticSearchPayload.status = 'success';
        }),
        finalize(() => {
          this.elasticSearchService.updateEntry(elasticSearchEntry.id, elasticSearchPayload);
        }),
      );
  }
}
As long as the updateEntry call resolves, this works fine, but if it fails, it results in an unhandled rejection. I'd like to make sure that the error is caught and thrown. I tried converting the updateEntry promise into a new Observable using
finalize(() => {
  return from(this.elasticSearchService.updateEntry(elasticSearchEntry.id, elasticSearchPayload))
    .pipe(
      catchError(err => throwError(err)));
}));
but this does not solve the issue. How can I prevent the unhandled rejection and return the error from updateEntry?
finalize will simply invoke the provided callback during the teardown phase (e.g. after complete or error from the source, or unsubscribe from the consumer), which is why I think it is not working this way.
With that being said, this would be my approach:
const main$ = next
  .handle()
  .pipe(
    catchError(err => {
      elasticSearchPayload.status = err.status;
      return throwError(err);
    }),
    tap(() => {
      elasticSearchPayload.status = 'success';
    }),
  );

const elastic$ = from(this.elasticService/* ... */).pipe(
  // might want to ignore elements and receive only errors
  ignoreElements(),
  catchError(err => throwError(err)),
);

return concat(main$, elastic$);

Operators invoked multiple times for merged Observable although only one source emits

I have a function in which I call onSpecificData() on a Manager instance and subscribe to it in order to update my application's state (I manage state on the server side as well).
The problem is that in SomeManager's implementation of onSpecificData() I merge 3 different Observables using the merge() operator, and for some reason the whole operator chain of the underlying Observable gets invoked once per merged source, even though only one of the sources emits a value.
SomeManager.ts
export class DerivedManager implements Manager {
  private driver: SomeDriver;

  constructor(...) {
    this.driver = new SomeDriver(...);
  }

  public onSpecificData(): Observable<DataType> {
    return merge(
      this.driver.onSpecificData(Sources.Source1).map((value) => {
        return {source1: value};
      }),
      this.driver.onSpecificData(Sources.Source2).map((value) => {
        return {source2: value};
      }),
      this.driver.onSpecificData(Sources.Source3).map((value) => {
        return {source3: value};
      })
    );
  }
}
Manager.ts
export type DataType = Partial<{value1: number, value2: number, value3: number}>;
export interface Manager {
  onSpecificData(): Observable<DataType>;
}
SomeDriver.ts
export const enum Sources {
  Source1,
  Source2,
  Source3,
}

export class SomeDriver extends Driver {
  private static specificDataId = 1337; // some number
  private handler: Handler;

  constructor(...) {
    super(...);
    this.handler = new Handler(this.connection, ...);
    // ...
  }

  // ...

  onSpecificData(source: Sources): Observable<number> {
    return this.handler
      .listenToData<SpecificDataType>(
        SomeDriver.specificDataId,
        (data) => data.source === source
      ).map((data) => data.value);
  }
}
Driver.ts
export abstract class Driver {
  protected connection: Duplex;

  constructor(...) {
    // init connection, etc...
  }

  public abstract onSpecificData(source: number);

  // some implementations and more abstract stuff...
}
Handler.ts
export class Handler {
  private data$: Observable<Buffer>;

  constructor(private connection: Duplex, ...) {
    this.data$ = Observable.fromEvent<Buffer>(connection as any, 'data');
  }

  listenToData<T>(dataId: number, filter?: (data: T) => boolean) {
    return this.data$
      .map((data) => {
        // decode and transform
      })
      .filter((decodedData) => !decodedData.error && decodedData.value.id)
      .do((decodedData) => {
        console.log(`Got ${decodedData.value.id}`);
      })
      .map((decodedData) => decodedData.value.value as T)
      .filter(filter || (() => true));
  }
}
And finally, subscribe()-ing:
export default function(store: Store<State>, manager: Manager) {
  // ...
  manager.onSpecificData()
    .subscribe((data) => {
      // update state according to returned data
    });
}
As you can see, there is only 1 underlying Observable (data$) but apparently the operator chain in listenToData<T>() is invoked 3 times for each value emitted by it. I already know this is because of SomeManager#onSpecificData()'s merge of those 3 Observables, but I don't know why this happens. I want it to be invoked once for each value.
Help will be much appreciated.
I solved this in a "hacky" way, in my opinion. I replaced data$ with a Subject, created an observable from the stream's 'data' event, moved all the shared logic into that observable, and emit the decoded values through the subject, like so:
export class Handler {
  private dataSrc = new Subject<DecodedData>();

  constructor(private connection: Duplex, ...) {
    Observable.fromEvent<Buffer>(connection as any, 'data')
      .map((data) => {
        // decode and transform
      })
      .filter((decodedData) => !decodedData.error)
      .do((decodedData) => {
        console.log(`Got ${decodedData.value.id}`);
      })
      .subscribe((decodedData) => {
        this.dataSrc.next(decodedData);
      });
  }

  listenToData<T>(dataId: number, filter?: (data: T) => boolean) {
    return this.dataSrc
      .filter((decodedData) => decodedData.value.id === dataId)
      .map((decodedData) => decodedData.value.value as T)
      .filter(filter || (() => true));
  }
}
Not exactly the solution I was looking for, but it works. If anyone has a better solution, which better suits the "Rx way" to do stuff, I'd love to hear it.
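One more idiomatic option, offered only as a sketch on top of the question's RxJS 5-style code: keep data$ as an observable but apply share() after the common decoding steps, so all three merged subscriptions reuse a single subscription to the 'data' event and the decode/log operators run once per emission:

export class Handler {
  private data$: Observable<DecodedData>;

  constructor(private connection: Duplex, ...) {
    this.data$ = Observable.fromEvent<Buffer>(connection as any, 'data')
      .map((data) => {
        // decode and transform (elided, as in the question)
      })
      .filter((decodedData) => !decodedData.error)
      .do((decodedData) => console.log(`Got ${decodedData.value.id}`))
      // share() multicasts the decoded stream, so the operators above are not
      // re-invoked for every merged subscriber.
      .share();
  }

  listenToData<T>(dataId: number, filter?: (data: T) => boolean) {
    return this.data$
      .filter((decodedData) => decodedData.value.id === dataId)
      .map((decodedData) => decodedData.value.value as T)
      .filter(filter || (() => true));
  }
}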
