I have a Node.js app that polls a MongoDB database every x seconds to show data changes.
I think there must be a better way to do it, using the full capabilities of the Node.js environment ... what is the most efficient way to be notified of MongoDB data changes and show them using Node?
You can use a tailable cursor
http://docs.mongodb.org/manual/tutorial/create-tailable-cursor/
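For example, a minimal sketch with the official mongodb Node.js driver, assuming a capped collection named events in a test database (tailable cursors only work on capped collections):

```js
// Sketch: tail a capped collection and react to new documents as they arrive.
// Database/collection names are examples; adjust to your setup.
const { MongoClient } = require('mongodb');

async function tailEvents() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const events = client.db('test').collection('events'); // must be a capped collection

  // tailable + awaitData keeps the cursor open and waits briefly for new docs
  const cursor = events.find({}, { tailable: true, awaitData: true });

  for await (const doc of cursor) {
    console.log('new document:', doc); // e.g. push to clients over WebSocket
  }
}

tailEvents().catch(console.error);
```

This avoids the fixed polling interval: the cursor blocks until new data is written, so changes show up as soon as they are inserted.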
Related
I am testing PostgreSQL with a Node.js backend server, using the pg npm module to query the database. The issue I am having is that when I run a particular query directly against the Postgres table using the query tool in pgAdmin 4, the data is fetched within 5 seconds. But when the same query is requested from the backend through my Node.js server, the work is split between parallel workers and a client backend waiting on IPC: MessageQueueSend, and it runs for almost 17 minutes before returning the data.
I can't understand why the same query is fast in the query tool but delayed when it comes from my server. Is there a way to increase the priority of queries coming from the backend so they run as if they were issued inside pgAdmin? I noticed in pg_stat_activity that there is an application value for the query when using the query tool, but when the same query comes from the Node.js server the application value is null. I don't understand why it's like this. I have been searching every community for an answer for the past 5 days, and there is no question or answer for this. Any help will be appreciated. Thanks in advance.
I tried running the query from the backend, but it's split across IPC processes and the result comes back after 17 minutes, while the same query takes only 5 seconds to return a result in the pgAdmin query tool.
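This doesn't explain the plan difference by itself, but to make the backend's queries identifiable in pg_stat_activity (the missing application value), you can set application_name on the connection. A minimal sketch with the pg module; the name node-backend, the credentials and the table are just examples:

```js
// Label connections from the Node.js backend so they show up with an
// application value in pg_stat_activity, like pgAdmin's query tool does.
const { Pool } = require('pg');

const pool = new Pool({
  connectionString:
    'postgres://user:password@localhost:5432/mydb?application_name=node-backend',
});

async function run() {
  // The slow query can now be found and inspected by application_name.
  const res = await pool.query('SELECT count(*) FROM my_table');
  console.log(res.rows[0]);
  await pool.end();
}

run().catch(console.error);
```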
We have an architecture problem in our project. The project requires sharding, since we need almost unlimited scalability for some of its services.
Currently we use Node.js + MongoDB (Mongoose) and MySQL (TypeORM). Data is separated into databases through a simple 'DB Locator'. So the Node process needs connections to a lot of DBs (up to 1000).
Example request flow (sketched below):
HTTP request from a client with a Shop ID;
Get the DB IP address/credentials from the 'DB Locator' service by Shop ID;
Create a connection to that shop's specific database;
Perform DB queries.
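A minimal sketch of that flow with Express and mongoose.createConnection (assuming Mongoose 6+); lookupShopDb stands in for the 'DB Locator' service and is hypothetical, and error handling is omitted:

```js
// Sketch of the request flow: resolve the shop's DB, connect, query, respond.
const express = require('express');
const mongoose = require('mongoose');

const app = express();

app.get('/shops/:shopId/orders', async (req, res) => {
  // 1-2. Resolve DB address/credentials by Shop ID via the DB Locator (hypothetical helper)
  const { uri } = await lookupShopDb(req.params.shopId);

  // 3. Create a connection to that shop's database
  const conn = await mongoose.createConnection(uri).asPromise();

  // 4. Perform DB queries through models bound to this connection
  const Order = conn.model('Order', new mongoose.Schema({ total: Number }));
  res.json(await Order.find().lean());

  // Approach #1 below: close the connection once the response is sent
  await conn.close();
});

app.listen(3000);
```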
We tried to implement it in two ways:
Create a connection for each request and close it on response.
Problems:
we can't use the connection after the response (this is the main problem, because sometimes we need to run asynchronous actions afterwards);
it works slower;
Keep all connections open.
Problems:
we reach the simultaneous-connection limit or other limits;
memory leaks.
Which way is better? How can we avoid the described problems? Maybe there is a better solution?
Solution #1 worked perfectly for us in PHP, since PHP runs a single process per request and simply drops the connection when the process ends. As we know, Express is pure JS code running in V8 and is not process-based.
It would be great to close unused connections automatically, but I can't find an option to do that.
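One workaround for the "keep all connections open" variant is a small cache with an idle timeout, so connections that haven't been used for a while are closed automatically. A sketch, assuming Mongoose 6+ and a known URI per shop; the timeout values are arbitrary:

```js
// Cache connections per shop and reap the ones that have been idle too long.
const mongoose = require('mongoose');

const connections = new Map(); // shopId -> { conn, lastUsed }
const IDLE_MS = 5 * 60 * 1000; // close after 5 minutes without use

async function getShopConnection(shopId, uri) {
  let entry = connections.get(shopId);
  if (!entry) {
    const conn = await mongoose.createConnection(uri).asPromise();
    entry = { conn, lastUsed: Date.now() };
    connections.set(shopId, entry);
  }
  entry.lastUsed = Date.now();
  return entry.conn;
}

// Periodically close connections that have not been used recently
setInterval(async () => {
  const now = Date.now();
  for (const [shopId, { conn, lastUsed }] of connections) {
    if (now - lastUsed > IDLE_MS) {
      connections.delete(shopId);
      await conn.close();
    }
  }
}, 60 * 1000).unref();

module.exports = { getShopConnection };
```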
The short answer: stop using MongoDB with Mongoose 😏
Longer answer:
MongoDB is a document-oriented DBMS. The main use case is when you have loosely structured data that you have to store but don't need to query too heavily. There is lazy indexing, dynamic typing and many more things that keep you from using it like an RDBMS, but it is great as storage for logs or any serialized data.
The worst part here is Mongoose. This is the library that makes you feel like your trash box is a wonderful world with relations, virtual fields and many other things that should not be in a document-oriented DBMS. Also, there is a lot of legacy code from previous versions that causes trouble with connection management.
You already use TypeORM, which can work instead of Mongoose, with some restrictions, for sure.
Connection management then works exactly the same way as with MySQL.
Here is some more data: https://github.com/typeorm/typeorm/blob/master/docs/mongodb.md#defining-entities-and-columns
In this case you may use your TypeORM Repository as a transparent client that will initialize connections and close them, or keep them alive, on demand.
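A minimal sketch of TypeORM talking to MongoDB from plain JavaScript via EntitySchema (assuming the 0.3.x DataSource API; the entity, database name and URL are just examples):

```js
// Use TypeORM's MongoDB driver instead of Mongoose; the repository manages
// the underlying connection for you.
const { DataSource, EntitySchema } = require('typeorm');

const Log = new EntitySchema({
  name: 'Log',
  columns: {
    id: { type: String, objectId: true, primary: true },
    message: { type: String },
  },
});

const dataSource = new DataSource({
  type: 'mongodb',
  url: 'mongodb://localhost:27017/shop_123', // would come from the DB Locator
  entities: [Log],
});

async function main() {
  await dataSource.initialize();

  const repo = dataSource.getMongoRepository(Log);
  await repo.save({ message: 'hello' });
  console.log(await repo.find());

  await dataSource.destroy(); // or keep it alive and reuse it on demand
}

main().catch(console.error);
```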
I am dividing the load on my database and want to retrieve data from ES and write data to MongoDB. Can I sync them in real time? I have checked the Transporter library, but I want this to happen in real time.
There are several ways to achieve that:
Use your own application server: whenever you insert a new document into MongoDB, put it into ES as well at the same time. That way you maintain consistency with minimal latency (see the sketch after this list).
Use Logstash. It has near-real-time pipelining capabilities.
Use the MongoDB river for Elasticsearch. It's a plugin used for data synchronization between MongoDB and Elasticsearch.
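A minimal sketch of the first option, dual-writing from the application server; database, collection and index names are examples, and it assumes the official @elastic/elasticsearch client (8.x):

```js
// Write each new document to MongoDB and mirror it into Elasticsearch.
const { MongoClient } = require('mongodb');
const { Client } = require('@elastic/elasticsearch');

const es = new Client({ node: 'http://localhost:9200' });
const mongo = new MongoClient('mongodb://localhost:27017');

async function saveProduct(doc) {
  await mongo.connect();
  const result = await mongo.db('shop').collection('products').insertOne(doc);

  // Reuse Mongo's _id as the ES document id so both stores stay in step
  await es.index({
    index: 'products',
    id: result.insertedId.toString(),
    document: doc,
  });

  return result.insertedId;
}

module.exports = { saveProduct };
```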
I have two Node.js applications, each running Mongoose on a different machine. There is a single MongoDB database running on the first, and the second connects to it and adds documents periodically. I'm trying to add a hook to the creation of these documents so the server running the database is aware that the other server has added data. I tried using the Schema.post() method, but it doesn't seem to work since there are two separate instances of Mongoose. Is this true, or am I just implementing it incorrectly? I can get the hook to fire if the document is created on the same server, but not the other.
So my thought is to add the hook to MongoDB directly, instead of Mongoose, but I'm not sure how to go about doing that. Am I on the right track?
That is true; Schema.post() only works in the same process. You either need to use a library that tails MongoDB's oplog (like mongo-oplog), implement it yourself using a message queue (or pub/sub) that all instances are connected to (like Redis, RabbitMQ, etc.), or use a database that supports this natively. PostgreSQL supports this with its NOTIFY feature, for example.
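A minimal sketch of the mongo-oplog approach; it only works when MongoDB runs as a replica set (the oplog lives in the local database), and mydb.documents is an example namespace:

```js
// Tail the oplog and react when the other server inserts documents.
const MongoOplog = require('mongo-oplog');

const oplog = MongoOplog('mongodb://127.0.0.1:27017/local', { ns: 'mydb.documents' });

oplog.tail();

oplog.on('insert', doc => {
  // Fires on this server whenever any client (including the other app) inserts
  console.log('inserted:', doc.o);
});

oplog.on('error', err => console.error(err));
```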
In my application I have a default database and other databases I have to connect to depending on the client's request. With Mongoose in Node, as far as I understand, there is an application-wide pool of connections, and if I change the database it changes for all subsequent requests, which I think could cause some problems. What is the best way to switch databases with Mongoose?
Mongoose 3.7.1 (unstable) supports switching databases.
Otherwise you'll need to create a separate connection instance for each database.
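For reference, a sketch of both options on a modern Mongoose (6+), where switching is done with connection.useDb(); the database and model names are examples:

```js
const mongoose = require('mongoose');

async function main() {
  await mongoose.connect('mongodb://localhost:27017/defaultDb');

  // Option A: switch databases on the same underlying connection pool.
  // useCache avoids creating a new Connection object on every call.
  const clientDb = mongoose.connection.useDb('client_42', { useCache: true });
  const User = clientDb.model('User', new mongoose.Schema({ name: String }));
  console.log(await User.find().lean());

  // Option B: a completely separate connection instance per database.
  const other = await mongoose
    .createConnection('mongodb://localhost:27017/client_43')
    .asPromise();
  const OtherUser = other.model('User', new mongoose.Schema({ name: String }));
  console.log(await OtherUser.countDocuments());
}

main().catch(console.error);
```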