Mongooplog alternative - node.js

As we all know, the mongooplog tool is going to be removed in upcoming releases. I need help with the following issue:
I was planning to create a listener using mongooplog which would read any kind of activity on MongoDB and generate a trigger according to that activity, which would then hit another server. Now that mongooplog is going away, can anyone suggest an alternative I can use in this case, and how to use it?
I got this warning when trying to use mongooplog. Please let me know if you have any further questions.
warning: mongooplog is deprecated, and will be removed completely in a future release
PS: I am using the Node.js framework to implement the listener. I have not written any code yet, so I have no code to share.

The deprecation message you are quoting only refers to the mongooplog command-line tool, not the general approach of tailing the oplog. The mongooplog tool can be used for some types of data migrations, but isn't the right approach for a general purpose listener or to wrap in your Node.js application.
You should continue to create a tailable cursor to follow oplog activity. Tailable cursors are supported directly by the MongoDB drivers. For an example using Node.js see: The MongoDB Oplog & Node.js.
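For reference, a minimal sketch of tailing the oplog with a tailable cursor is shown below. It assumes a replica set running locally and a recent version of the official mongodb driver (v5+ for the object-style Timestamp constructor); the connection string is a placeholder.

```js
// Minimal sketch: tailing the replica set oplog with a tailable cursor.
// Assumes a local replica set and MongoDB Node.js driver v5+; the
// connection string is a placeholder.
const { MongoClient, Timestamp } = require('mongodb');

async function tailOplog() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const oplog = client.db('local').collection('oplog.rs');

  // Start from "now" so only new activity is reported.
  const now = new Timestamp({ t: Math.floor(Date.now() / 1000), i: 0 });
  const cursor = oplog.find(
    { ts: { $gt: now } },
    { tailable: true, awaitData: true }
  );

  for await (const entry of cursor) {
    // entry.op: 'i' (insert), 'u' (update), 'd' (delete); entry.ns is the
    // namespace ("db.collection"); entry.o is the operation document.
    console.log(entry.ns, entry.op, entry.o);
  }
}

tailOplog().catch(console.error);
```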
You may also want to watch/upvote SERVER-13932: Change Notification Stream API in the MongoDB issue tracker, which is a feature suggestion for a formal API (rather than relying on the internal oplog format used by replication).

Related

How to listen to DB changes in Azure Cosmos DB from Node App

Azure Cosmos DB provides a change feed feature.
One can listen to DB changes and run business logic in response.
Is there a way this can be achieved in a Node App?
There is a change feed processor library, but I didn't come across any Node SDK to use it.
If anyone can provide a few pointers on how this can be achieved, that would be great.
There is currently no equivalent of the Change Feed processor library for Node.
However, the Node.js SDK allows you to query the change feed itself manually. You can find an example here: https://learn.microsoft.com/en-us/azure/cosmos-db/change-feed#can-i-read-change-feed-using-javascript
This, however, means that you will have to write the polling process yourself, and you won't simply be notified when there is a change. You will have to keep the previous and next states and compare what's new and what's not.
You can also achieve automated change feed processing using the Azure Functions trigger, which can be used with Node.js (thanks, Matias). You can find more info on that here: https://learn.microsoft.com/en-us/azure/cosmos-db/change-feed#using-azure-functions
This link also has a Cosmos DB trigger binding example in JavaScript (double thanks, Matias): https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb#trigger---javascript-example
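To make the Azure Functions route concrete, here is a minimal sketch of a Cosmos DB-triggered function in JavaScript. The database, collection, and connection setting names are placeholders for your own configuration.

```js
// Minimal sketch of a Cosmos DB-triggered Azure Function in JavaScript.
// The database, collection, and connection setting names below are
// placeholders for your own configuration.
//
// Paired function.json for this index.js:
// {
//   "bindings": [{
//     "type": "cosmosDBTrigger",
//     "name": "documents",
//     "direction": "in",
//     "connectionStringSetting": "CosmosDBConnection",
//     "databaseName": "mydb",
//     "collectionName": "mycoll",
//     "createLeaseCollectionIfNotExists": true
//   }]
// }

// The runtime invokes this function with a batch of changed documents.
module.exports = async function (context, documents) {
  if (documents && documents.length > 0) {
    context.log(`Processing ${documents.length} changed document(s)`);
    for (const doc of documents) {
      // React to the change here (e.g. call another service, update a cache).
      context.log('Changed document id:', doc.id);
    }
  }
};
```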

Can I use a different db as a Wolkenkit read model?

I like MongoDB OK, but I was thinking about just using Postgres as the read model and querying it with GraphQL. Do I have to write an adapter to do that? If so, where should I look to start?
As always, it depends 😉
Short answer: No, you can't.
Long answer: Yes, theoretically changing the read model database is possible, as wolkenkit uses an adapter-based approach. Right now MongoDB is the only one implemented, but it would be possible to write an adapter for whatever datastore you want to use.
Basically, the place to start is the wolkenkit-broker, which is the public API server for wolkenkit, and which also handles reading models. At the center of this there is the so-called modelStore, which acts as an abstraction layer over the specific implementation, such as the modelStoreMongoDb adapter.
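To illustrate the adapter idea in general terms, here is a hypothetical sketch of what a Postgres-backed read model store could look like. This is not wolkenkit's actual modelStore interface; the method names, table layout, and queries are invented for illustration.

```js
// Hypothetical sketch of an adapter-style read model store backed by
// Postgres. This is NOT wolkenkit's actual modelStore interface; method
// names, table layout, and queries are invented for illustration.
const { Pool } = require('pg');

class PostgresModelStore {
  constructor(pool) {
    this.pool = pool;
  }

  // Return all read model documents for a model. A real adapter would
  // translate the generic query language into SQL here.
  async read(modelName) {
    const { rows } = await this.pool.query(
      `SELECT payload FROM ${modelName}` // modelName must be validated, never user input
    );
    return rows.map(row => row.payload);
  }

  // Upsert a denormalized read model document when an event updates it.
  async updated(modelName, id, payload) {
    await this.pool.query(
      `INSERT INTO ${modelName} (id, payload) VALUES ($1, $2)
       ON CONFLICT (id) DO UPDATE SET payload = $2`,
      [id, payload]
    );
  }
}

// The broker would then talk to this store instead of the MongoDB one.
const store = new PostgresModelStore(
  new Pool({ connectionString: 'postgres://localhost/readmodel' })
);
```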
GraphQL, again, is currently not supported out of the box. We use our own approach, implemented in the tailwind module. The place to start here is the HTTP server API.
Please note that I am one of the developers of wolkenkit, so please take my answer with a grain of salt.

Transaction mongodb

I need to write to two different MongoDB collections using an 'all or nothing' process. FYI, I use Node.js on my backend side.
As far as I know, MongoDB provides atomicity at the single-document level, but it does not when we need to write to multiple collections.
So I'd like to know a way of emulating a transaction in Node.js/MongoDB, in order to avoid writing to one collection if the other write failed, and also to have the possibility of doing a 'rollback' if the second process fails.
Thank you guys!
Starting from version 4.0, MongoDB will add support for multi-document transactions. Transactions in MongoDB will be like transactions in relational databases.
For details visit this link:
https://www.mongodb.com/blog/post/multi-document-transactions-in-mongodb?jmp=community
I wrote a library that implements the two-phase commit approach described in the MongoDB docs. It might help in this scenario: Fawn - Transactions for MongoDB.
Multi-document transactions have been introduced in MongoDB 4.0!
https://docs.mongodb.com/manual/core/transactions
In MongoDB (prior to 4.0) there is no way to fully implement transactions at the database level. However, there are some mechanisms, such as two-phase commits, which provide some transaction-like functionality. You can read about them in the documentation.
Since MongoDB 4.0, transactions are supported. Very little change is needed in your current code to support them. There's a new section in the documentation fully dedicated to the subject.
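As a rough sketch of what that looks like with the Node.js driver (MongoDB 4.0+ running as a replica set; the collection names and write operations are illustrative):

```js
// Minimal sketch of a multi-document transaction with the Node.js driver.
// Requires MongoDB 4.0+ and a replica set; collection names and writes
// below are illustrative.
const { MongoClient } = require('mongodb');

async function transferAllOrNothing() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const db = client.db('mydb');
  const session = client.startSession();

  try {
    await session.withTransaction(async () => {
      // Both writes commit together, or neither does.
      await db.collection('orders').insertOne(
        { item: 'book', qty: 1 },
        { session }
      );
      await db.collection('inventory').updateOne(
        { item: 'book' },
        { $inc: { qty: -1 } },
        { session }
      );
    });
  } finally {
    await session.endSession();
    await client.close();
  }
}

transferAllOrNothing().catch(console.error);
```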

How to bootstrap/initialize CouchDB at first run?

I can't find any information on initializing a CouchDB database. What's the best method of initializing and creating the map and view functions for CouchDB at deployment?
I have a Node server which will access a CouchDB instance. Should I just make the HTTP calls necessary to create the proper logic on CouchDB from my Node server, or is there a better way of handling the initialization of the db?
EDIT: Also, are there any good open source projects that I can take examples from?
I'm not sure your question is clear. Remember that CouchDB is schemaless, so, at startup, there probably isn't anything (i.e., documents) on which to base view functions.
If you mean a helper to setup a design document with attachments and the like, in addition to the other answers, have a look at Kanso (http://kan.so). If you're comfortable with Node, you'll find it friendly.
If, on the other hand, you're looking for something to analyze existing docs in a CouchDB and guess at good views, I haven't come across that yet.
One possibility would be to use erica.
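For the "just make the HTTP calls" approach mentioned in the question, a minimal sketch of idempotently creating a design document at startup could look like this. The database name, view function, and credentials are placeholders; it uses the global fetch available in Node 18+.

```js
// Minimal sketch: idempotently creating a CouchDB design document at app
// startup via the HTTP API. Database name, view function, and credentials
// are placeholders. Requires Node 18+ for the global fetch.
const COUCH = 'http://localhost:5984';
const DB = 'mydb';
const AUTH = 'Basic ' + Buffer.from('admin:password').toString('base64');
const headers = { Authorization: AUTH, 'Content-Type': 'application/json' };

const designDoc = {
  _id: '_design/app',
  views: {
    by_type: {
      map: 'function (doc) { if (doc.type) { emit(doc.type, null); } }'
    }
  }
};

async function bootstrap() {
  // Create the database; CouchDB answers 412 if it already exists.
  await fetch(`${COUCH}/${DB}`, { method: 'PUT', headers });

  // Fetch the current revision so the PUT below updates instead of conflicting.
  const existing = await fetch(`${COUCH}/${DB}/${designDoc._id}`, { headers });
  if (existing.ok) {
    designDoc._rev = (await existing.json())._rev;
  }

  await fetch(`${COUCH}/${DB}/${designDoc._id}`, {
    method: 'PUT',
    headers,
    body: JSON.stringify(designDoc)
  });
}

bootstrap().catch(console.error);
```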

Translation/localization workflow for a Node.js/Express app

What setup do you use for localization in your nodejs/express app?
Right now I'm using i18n-node in my project, but I'm not happy with the storage in the JSON files. I'd like to have the translations stored in a database.
I found a promising module named dialect. It can store the translations in MongoDB, and there's also a module from the same author which enables you to manage the translations via a web interface (dialect-http).
Unfortunately, the dialect module doesn't seem to work with the latest stable versions of Node. The problem has been known for two months, but since nothing has been updated since then, I guess the module isn't actively maintained anymore.
I think using a Redis DB for storing the translations would also make sense. I don't know if there's a module for that.
Maybe you guys have some hints or know of any good modules?
Why don't you just fork i18n-node and override the read and write functions with your own persistence mechanism?
https://github.com/mashpie/i18n-node/blob/master/i18n.js#L235
It seems like you could easily persist the JSON data within a Redis key instead of a JSON file with a few changes, as sketched below.
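A minimal sketch of that idea, assuming the node redis client (v4+); the key naming scheme and catalogue shape are invented for illustration, not i18n-node's actual internals:

```js
// Minimal sketch: persisting a locale's translation catalogue as JSON in a
// Redis key instead of a file. The key naming scheme and catalogue shape
// are assumptions for illustration, not i18n-node's actual internals.
const { createClient } = require('redis');

const client = createClient({ url: 'redis://localhost:6379' });

async function writeCatalogue(locale, catalogue) {
  // e.g. the key "i18n:de" holds the equivalent of de.json
  await client.set(`i18n:${locale}`, JSON.stringify(catalogue));
}

async function readCatalogue(locale) {
  const raw = await client.get(`i18n:${locale}`);
  return raw ? JSON.parse(raw) : {};
}

async function main() {
  await client.connect();
  await writeCatalogue('de', { Hello: 'Hallo', Goodbye: 'Auf Wiedersehen' });
  console.log(await readCatalogue('de'));
  await client.quit();
}

main().catch(console.error);
```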
I would suggest you use lingua. Here's an example =) http://www.jmanzano.es/blog/?p=647
Another option besides lingua might be http://i18next.com/node, which comes with backends for Redis, MongoDB, or CouchDB (and the filesystem, of course!)
