I am using bunyan as the logging service in my Express/Node.js application. I want per-user/per-request logging so that the flow of a request can be tracked down easily, which will make it easier to debug and to keep a check on response times.
I am thinking of creating a child logger on every request, attaching user info to it, and then attaching that logger to the request object. My application is roughly MVC, so my models don't know about the request object, and most of the logging is done in the models. With this approach I would have to pass the child logger as a parameter to every function call, and it may then need to be passed on to the other adapters that the models call. Is this the right way to do this? The codebase is really vast, so every function would need to be updated. Can you suggest a better way to achieve the same result with fewer changes to the code?
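To illustrate, this is roughly the approach I have in mind (a minimal sketch, assuming Express; the x-user-id header and the request-id generation are placeholders, not my real setup):

import bunyan from 'bunyan';
import express from 'express';
import { randomUUID } from 'crypto';

const log = bunyan.createLogger({ name: 'my-app' });
const app = express();

// Create a per-request child logger carrying user/request context
// and hang it off the request object for controllers to use.
app.use((req: any, _res, next) => {
  req.log = log.child({
    reqId: randomUUID(),            // placeholder request id
    user: req.headers['x-user-id'], // hypothetical user header
  });
  req.log.info('request started');
  next();
});

// The pain point: a model function never sees req, so the child
// logger has to be threaded through as an extra parameter.
function findOrders(userId: string, logger: typeof log) {
  logger.info({ userId }, 'loading orders');
  // ... query the database ...
}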
This is my context:
A Node.js application deployed as a Lambda function, offering an API.
I'd like to reuse the same application to offer different content depending on a request header.
So, with this "x-custom-header" header sent through with the request, I'd like to load a different configuration and produce a specific response payload.
When I get the request, I'd like to store this header value on the process object:
process.env.mycustomHeader = requestHeader["x-custom-header"];
so that I can load the right configuration object in any file of the app, like:
const config = getConfig(process.env.mycustomHeader);
Now: this is working fine in a single-request test, but I'm worried about (and investigating) the behaviour of this process object in Node.js with Lambda.
Can I be sure that this object's life and scope are limited to a single request? Is there no way that multiple different requests might conflict when accessing this process? And couldn't Lambda reuse the same Node.js process, messing up my idea/setup?
Any better proposal to solve this use case is welcome.
If you construct the object inside the Lambda handler method, it will be created on each invocation. Lambda can reuse the same Node.js process across invocations of a warm container, so module-scope state, including anything you put on process.env, may persist from one request to the next; state created inside the handler lives only for that invocation.
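A minimal sketch of that, reusing the header name and the getConfig idea from the question (the getConfig body here is a made-up stand-in):

import type { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';

// Stand-in for the question's getConfig helper.
function getConfig(headerValue: string | undefined) {
  return { variant: headerValue ?? 'default' };
}

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // Constructed inside the handler: scoped to this invocation only, so a
  // reused (warm) process cannot leak it into the next request. Note that
  // header casing may vary depending on how API Gateway delivers it.
  const config = getConfig(event.headers['x-custom-header']);
  return { statusCode: 200, body: JSON.stringify(config) };
};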
I was going through the NestJS docs, and there's this image:
https://docs.nestjs.com/pipes
Filters seem more oriented towards the client side, and pipes towards controllers, but to me both seem similar.
What are the differences between pipes and filters, and what are their respective common use cases?
Pipes are meant to consume incoming data from the request, be it URL or query parameters or a request body, and run validations and/or transformations on it to ensure it is the shape your server expects. Nest has some built-in utilities like the ValidationPipe to help with this.
Filters (AKA exception filters), on the other hand, are meant for catching errors that happened during the execution of a request and handling them: sending the error back to the client in a nice format, taking care of sending back the proper error codes, and running any other error-handling logic you have (like possibly reporting to a monitoring service). Nest has a built-in ExceptionFilter that manages this nicely, but you can always create your own to handle the logic differently.
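To make the contrast concrete, here is a minimal sketch of each (the PositiveIntPipe and HttpErrorFilter names and the validation rule are invented for illustration):

import {
  ArgumentsHost,
  BadRequestException,
  Catch,
  ExceptionFilter,
  HttpException,
  PipeTransform,
} from '@nestjs/common';

// Pipe: validates/transforms incoming data before the handler runs.
export class PositiveIntPipe implements PipeTransform<string, number> {
  transform(value: string): number {
    const n = parseInt(value, 10);
    if (Number.isNaN(n) || n <= 0) {
      throw new BadRequestException('id must be a positive integer');
    }
    return n;
  }
}

// Filter: catches errors thrown during the request and shapes the response.
@Catch(HttpException)
export class HttpErrorFilter implements ExceptionFilter {
  catch(exception: HttpException, host: ArgumentsHost) {
    const res = host.switchToHttp().getResponse();
    res.status(exception.getStatus()).json({
      statusCode: exception.getStatus(),
      message: exception.message,
    });
  }
}

The pipe sits where data enters, e.g. @Param('id', PositiveIntPipe), while the filter is registered with @UseFilters(HttpErrorFilter) (or globally) to shape whatever errors escape the handlers.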
I wrote a server-side application that is divided into several microservices, all of them currently written in Node.js. I'm using winston as my logging library, and I'm also using ELK to monitor my logs. Recently, I discovered I can't monitor my logs comfortably: I need a way to view in Kibana a request all the way through my microservices. I mean, I want to view all the logs from the moment the request entered the first microservice until it was sent back from the last microservice it went through. I don't have a unique id for my requests or for the entities being sent, so I need to generate a new unique id for each request, but I don't want to add the generated id to each method in my application. Is there an elegant solution to do this without changing all my logs? Thanks a lot.
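For reference, the per-request id generation I have in mind would look something like this in Express middleware (the uuid package, the x-request-id header, and the child-logger shape are assumptions for illustration):

import express from 'express';
import winston from 'winston';
import { v4 as uuidv4 } from 'uuid';

const logger = winston.createLogger({
  format: winston.format.json(),
  transports: [new winston.transports.Console()],
});

const app = express();

// Reuse an incoming id (forwarded by an upstream service) or mint a new one,
// and bind it to a child logger so every line logged through req.log carries
// the same requestId and can be correlated in Kibana.
app.use((req: any, _res, next) => {
  const requestId = req.headers['x-request-id'] || uuidv4();
  req.log = logger.child({ requestId });
  next();
});

The open problem is exactly the one above: getting at req.log (or the id) deep inside my methods without passing it into every call.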
I'm using Hapi.js's Good and GoodFile for access logs by logging 'response' events. How can I customize the object that gets logged? Specifically, I do not want to include the 'query' property on response events when the statusCode is 404.
good-file is not configurable in that fashion. It logs the JSON representation of the complete request. At present, if you want to control how the output is formatted, you will have to write your own custom reporter. Documentation can be found here.
You may also be able to fork good-file and just make some changes to the way the output is formatted.
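As a starting point, a custom reporter might look something like this, assuming good's stream-based reporter interface (an object-mode transform) and the event/statusCode/query fields that good puts on 'response' events; the class name is invented:

import { Transform } from 'stream';

// Receives good's event objects and emits the serialized lines
// that end up in the log file.
class RedactedResponseReporter extends Transform {
  constructor() {
    super({ objectMode: true });
  }

  _transform(
    event: any,
    _encoding: string,
    callback: (err?: Error | null, data?: any) => void
  ) {
    if (event.event === 'response' && event.statusCode === 404) {
      // Drop the query property on 404 responses before serializing.
      const { query, ...rest } = event;
      return callback(null, JSON.stringify(rest) + '\n');
    }
    callback(null, JSON.stringify(event) + '\n');
  }
}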
I'm trying to create an event-logger system that logs database events. I initially created this system to run on the front end (sending more than one request to the API), but I have decided that it would be much better to do this all on the back end. I would like to trigger a second event when a database request is made, so that when a user creates/modifies/deletes a document, the system records that event along with some info that goes along with it.
I am struggling with how to add this to my Node/Mongo API and am wondering what the best practice is. I've read about event emitters; however, I'm not sure if this would be the best way to trigger the second event. In addition to that, I'm not sure how to pass info through the emitter to the second mongoose request.
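For the emitter route, this is the shape I'm picturing: the arguments to emit are how the info travels to the listener, which performs the second mongoose request (the AuditLog model and the event name are hypothetical):

import { EventEmitter } from 'events';
import mongoose from 'mongoose';

// Hypothetical audit model for the second write.
const AuditLog = mongoose.model('AuditLog', new mongoose.Schema({
  action: String,
  userId: String,
  documentId: String,
  at: Date,
}));

const auditEvents = new EventEmitter();

// The emit arguments arrive here; this is where the second request runs.
auditEvents.on(
  'documentChanged',
  async (info: { action: string; userId: string; documentId: string }) => {
    await AuditLog.create({ ...info, at: new Date() });
  }
);

// After the primary create/modify/delete succeeds:
// auditEvents.emit('documentChanged', { action: 'create', userId, documentId: doc.id });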
Any guidance would be appreciated.
Looks like I overlooked the next() command.
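In case it helps anyone else, here is a minimal sketch of where that next() call belongs, assuming the fix was in mongoose middleware (a guess at the context): a hook that declares a next parameter must call it, or the operation hangs.

import mongoose from 'mongoose';

const docSchema = new mongoose.Schema({ title: String });

// A post-save hook that takes `next` must call it; forgetting to do so
// leaves mongoose waiting and stalls the request.
docSchema.post('save', function (doc, next) {
  // record the audit event here ...
  next();
});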