AQL logs with ArangoDB in a custom logger - Node.js

I want to capture the logs of each AQL query or operation run through the arangojs SDK for ArangoDB.
I know ArangoDB keeps logs in its web UI, but I just want the DB operation logs that my own code produces, and to attach them to a custom logger or simply to console.log.
Here are the things I want to log:
Full Query
Variables used in the query
Total time it took for the query to run
Error, if occurred
Is there any middleware or callback hook available that I can inject into the arangojs methods?
PS: I'm using arangojs with Node.js and GraphQL.
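To my knowledge arangojs does not expose a query middleware hook, so a common workaround is a thin wrapper around db.query that times the call and forwards everything to a logger. Below is a minimal sketch, assuming the stock arangojs db.query(queryString, bindVars) signature; the name loggedQuery and the shape of the log entries are made up for illustration:

```javascript
// Sketch of a logging wrapper around arangojs's db.query(query, bindVars).
// There is no built-in hook, so we wrap the call ourselves; `loggedQuery`
// and the log-entry fields are hypothetical names, not part of arangojs.
async function loggedQuery(db, query, bindVars, logger = console) {
  const start = Date.now();
  try {
    const cursor = await db.query(query, bindVars);
    // Log the full query text, bind variables, and elapsed time.
    logger.log({ query, bindVars, ms: Date.now() - start, status: 'OK' });
    return cursor;
  } catch (err) {
    // Log the failure with the same context, then re-throw so callers
    // still see the error.
    logger.error({ query, bindVars, ms: Date.now() - start, error: err.message });
    throw err;
  }
}
```

Call loggedQuery(db, 'FOR u IN users RETURN u', {}) wherever you would call db.query; swapping console for a Winston or Pino instance is then a one-line change.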

Related

Couchbase Java SDK subdocument mutation

Hi, I'm running a Couchbase Server 7.1.3 Docker container and using version 3.3.4 of the Couchbase Java SDK. I am facing an issue while performing a subdocument mutation using the upsert command.
When I use the collection.mutateIn() method to perform a subdocument mutation with the upsert command, I expect to get the result of the operation as a MutateInResult object. But when I call MutateInResult.contentAs(0, String.class) to see the response, I get an "Index 0 is invalid" error. The operation itself is successful: in the DB I can see that the desired JSON path has the updated value.
P.S.: I only have a single mutation to perform, which is the upsert command.
Can someone please help if I am missing something here?

How can I change the name of a task that Celery sends to a backend?

I have built a queue system using Celery that accepts web requests and executes some tasks to act on those requests. I'm using Redis as the backend for Celery, but I imagine this question would apply to all backends.
Celery is returning the task name as celery-task-meta-<task ID> and storing it in the backend. This is meaningless to me. How can I change the name of the result that Celery sends to Redis? I have searched through all of Celery's documentation trying to figure out how to do this.
The Redis CLI monitor shows that Celery is using the SETEX command with the following input:
"SETEX" "celery-task-meta-dd32ded3-00aa-4884-8b21-42f8332e7fac"
"86400" "{\"status\": \"SUCCESS\", \"result\": {\"mode\": \"staging\",
\"123\": 50}, \"traceback\": null, \"children\": [], \"task_id\":
\"dd32ded3-00aa-4884-8b21-42f8332e7fac\", \"date_done\":
\"2019-05-09T16:44:12.826951\", \"parent_id\":
\"2e99d958-cd5a-4700-a7c2-22c90f387f28\"}"
The "result": {...} that you can see in the SETEX command above is what the task returns. I would like the SETEX to be more along the lines of:
"SETEX" "mode-staging-123-50-SUCCESS" "{...}", so that when I view all my keys in Redis, the name of the key is informational to me.
Here's another example of keys in my Redis cache that are meaningless to me (screenshot omitted).
You can't change this. The task key is created by the ResultConsumer class, which the Redis backend uses. ResultConsumer then delegates creation of the task key to the BaseKeyValueStoreBackend class. The get_key_for_task method, which actually creates the key, uses a hardcoded task_keyprefix set to celery-task-meta-. So, to change the behaviour, you would have to subclass these classes. There's no configuration option for it.

MongoDB transaction vs firebase transaction

Question1:
Are MongoDB transactions available in Node.js, or are they only available for Python and Java at the moment?
Java
try (ClientSession clientSession = client.startSession()) {
    clientSession.startTransaction();
    collection.insertOne(clientSession, docOne);
    collection.insertOne(clientSession, docTwo);
    clientSession.commitTransaction();
}
Python
with client.start_session() as s:
    s.start_transaction()
    collection_one.insert_one(doc_one, session=s)
    collection_two.insert_one(doc_two, session=s)
    s.commit_transaction()
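For comparison, transactions are also available in Node.js: the official mongodb driver added session/transaction support in version 3.1, against MongoDB 4.0+ replica sets. A minimal sketch mirroring the Java and Python snippets above (the helper name insertBoth is hypothetical):

```javascript
// Sketch, assuming the official 'mongodb' Node driver >= 3.1 and a
// MongoDB >= 4.0 replica set. `insertBoth` is a made-up helper name.
async function insertBoth(client, collectionOne, collectionTwo, docOne, docTwo) {
  const session = client.startSession();
  try {
    session.startTransaction();
    // Both inserts run inside the same transaction via the session option.
    await collectionOne.insertOne(docOne, { session });
    await collectionTwo.insertOne(docTwo, { session });
    await session.commitTransaction();
  } catch (err) {
    // Roll back both inserts together if either fails.
    await session.abortTransaction();
    throw err;
  } finally {
    session.endSession();
  }
}
```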
Question2:
If two users call the same API at the same time, updating the same document, will a transaction solve the race condition? That is, will it wait for user 1's read-and-update to finish before processing user 2's read-and-update?
Previously, when using Firebase Cloud Functions, I was able to solve this issue using transactions. Not sure if it's the same thing.

Log all Operations of MongoDB 3.6 on Windows 10

I want to see ALL queries which are processed by my local MongoDB instance.
I tried to set db.setProfilingLevel(2) but I still only get access information, not the queries themselves.
Does anybody know how I can log EVERY query?
db.setProfilingLevel(2) causes the MongoDB profiler to collect data for all operations.
Perhaps you are expecting the profiler documents to turn up in the MongoDB server logs? If so, then bear in mind that the profiler output is written to the system.profile collection in whichever database profiling has been enabled.
More details in the docs but the short summary is:
// turn up the logging
db.setProfilingLevel(2)
// ... run some commands
// find all profiler documents, most recent first
db.system.profile.find().sort( { ts : -1 } )
// turn down the logging
db.setProfilingLevel(0)

NodeJS/Express: ECONNRESET when doing multiples requests using Sequelize/Epilogue

I'm building a webapp using the following the architecture:
a postgresql database (called DB),
a NodeJS service (called DBService) using Sequelize to manipulate the DB and Epilogue to expose a REST interface via Express,
a NodeJS service called Backend serving as a backend and using DBService through REST calls
an AngularJS website called Frontend using Backend
Here are the version I'm using:
PostgreSQL 9.3
Sequelize 2.0.4
Epilogue 0.5.2
Express 4.13.3
My DB schema is quite complex, containing 36 tables, some of which contain a few hundred records. The DB is not written to very often; it is mostly read.
But recently I created a script in Backend to make a complete check-up of the data contained in the DB: basically this script retrieves all data from all tables and does some basic checks on it. Currently the script only reads from the database.
In order to achieve my script I had to remove the pagination limit of Epilogue by using the option pagination: false (see https://github.com/dchester/epilogue#pagination).
But now when I launch my script I randomly get this kind of error:
The request failed when trying to retrieve a uniquely associated objects with URL:http://localhost:3000/CallTypes/178/RendererThemes.
Code : -1
Message : Error: connect ECONNRESET 127.0.0.1:3000
The error appears randomly during the script execution: it's not always this URL that is returned, and not even always the same tables or relations. The error message before the code is a custom message returned by Backend.
The URL is a reference to the DBService but I don't see any error in it, even using logging: console.log in Sequelize and DEBUG=express:* to see what happens in Express.
I tried to put some setTimeout calls in my Backend script to slow it down, without any real change. I also tried to tweak different values, like the PostgreSQL max_connections limit (I set it to 1000 connections) and Sequelize's maxConcurrentQueries and pool settings, but without success yet.
I did not find where I can customize the pool connection of Express, maybe it should do the trick.
I assume that the error comes from DBService, from the Express configuration or somewhere in the configuration of the DB (either in Sequelize/Epilogue or even in the postgreSQL server itself), but as I did not see any error in any log I'm not sure.
Any idea to help me solve it?
EDIT
After further investigation I may have found the answer, which is very similar to How to avoid a NodeJS ECONNRESET error?: I'm using my own RestClient object to make my HTTP requests, and this object was built as a singleton with this method:
var NodeRestClient : any = require('node-rest-client').Client;
...
static getClient() {
    if (RestClient.client == null) {
        RestClient.client = new NodeRestClient();
    }
    return RestClient.client;
}
Then I was always using the same object to make all my requests, and when the process was too fast it created collisions... So I just removed the if (RestClient.client == null) test, and for now it seems to work.
If there is a better way to manage this, by closing requests or managing a pool, feel free to contribute :)