I have a MongoDB database, Node.js server and User Interface.
The connection between the three is working and data is being sent between them.
I am creating a real-time web application, and I am using setTimeout() on the client's webpage to request data from the database every 4 seconds.
But the data that comes back is the same every time and is not updating from the database.
It is acting as if there is only one MongoDB session, sending the same set of data collected when the server starts.
In the Node.js server code, I made sure that a connection is opened and closed once the query has completed, but it still returns the same set of data every time and none of the new data that has arrived in the database. The only way I can get it to pick up fresh data is to restart the server so it reconnects to the database.
Would using something like socket.io be better for the real-time requests, or is it possible to do real-time data just with setTimeout() on the client side and normal MongoDB collection queries?
Is there a way to constantly refresh the connection to the database?
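The symptom described above (same data on every poll until the server restarts) usually means the query result was captured once at startup and reused, rather than the query being re-run per request. A minimal runnable sketch of that trap and its fix, with a plain object standing in for the MongoDB collection so it runs anywhere (all names here are hypothetical, not from the question):

```javascript
// A fake collection stands in for MongoDB; find() returns a fresh
// snapshot each call, like re-running collection.find().toArray().
const collection = {
  docs: [{ temp: 20 }],
  find() {
    return this.docs.slice();
  },
};

// Anti-pattern: query once when the server starts and reuse the result.
const cachedAtStartup = collection.find();
function staleHandler() {
  return cachedAtStartup; // every 4-second poll sees the same array
}

// Fix: run the query inside the request handler, once per poll.
function freshHandler() {
  return collection.find(); // each poll sees current data
}
```

The thing to repeat per poll is the query, not the connection: a single long-lived MongoDB client can serve every request, as long as each request executes its own find().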
Related
I am trying to add functionality to my web app where, whenever a new friend request arrives in the database (MongoDB), I get a notification from the backend (Node.js) to my frontend (React.js).
I researched this functionality and learned about socket.io, but the solutions I found that used socket.io seemed like brute force to me:
they were querying the database inside socket.emit().
If I keep querying the database every 4-5 seconds, is that really a good approach? Doesn't it put load on the database?
What is the right way to do this?
What I have tried so far is looking for a better solution than querying the database over and over until I get an update, but I have had no luck.
The best approach is to connect the frontend with the backend using WebSocket/socket.io and, as soon as you add a new object, have the server push the data to the frontend. You don't have to run a database query every 4-5 seconds. Write a server push event in your data.save() function, so as soon as you create a new object, the backend sends the data to the frontend.
I'm working on an API that sends messages contained in a SQL Server database.
The problem is that I implemented it in a way that every 10 seconds the API performs a query for messages not yet sent.
I would like to optimize this by having SQL Server notify the application every time my table receives an insert, so that the application can then query the messages to be sent.
For that I'm using Node.js and importing Sequelize.
I also think it's important to comment that the inserts of this table are made by another application.
If your infrastructure has a message broker set up (it probably doesn't, but ask your operations crew) you can use an event notification for this.
Otherwise you're stuck polling your server, as you described in your question. That is a workable strategy, but it's worth the effort to make sure the polling query is very efficient.
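An efficient polling pass usually means selecting only undelivered rows via an indexed flag column and marking them sent in the same pass. A runnable sketch of that shape, with an in-memory array standing in for the SQL Server table (names are hypothetical; in Sequelize the fetch would be something like `Message.findAll({ where: { sent: false } })` against an indexed `sent` column):

```javascript
// In-memory stand-in for the messages table.
const outbox = [
  { id: 1, body: 'hello', sent: false },
  { id: 2, body: 'already delivered', sent: true },
];

function fetchUnsent() {
  // Stands in for: SELECT ... WHERE sent = 0 (hitting an index)
  return outbox.filter((m) => !m.sent);
}

function deliverBatch() {
  const batch = fetchUnsent();
  for (const m of batch) m.sent = true; // stands in for UPDATE ... SET sent = 1
  return batch.length; // messages delivered this pass
}
```

With the flag indexed, a poll that finds nothing new is a cheap index seek rather than a table scan, which is what makes frequent polling tolerable.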
I have several clients. Each client has multiple data acquisition stations. Each station has multiple sensors. I need to store this data on a MongoDB server (using mongoose and Node.js). So, I thought about the organization that follows.
Each client has its own database inside the MongoDB server.
Within this database, each station has its own collection.
Data is sent to the MongoDB server via an MQTT broker (Node.js). So, depending on which client the broker receives the message from, I need to create a new connection to the MongoDB server (using mongoose.createConnection).
I'm not sure if this is the best alternative. I don't know if creating multiple different connections will slow down the system.
What do you think?
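One pattern worth considering for the setup above is caching one connection per client instead of calling mongoose.createConnection on every MQTT message, since opening a connection is expensive. A runnable sketch with a counter-backed factory standing in for mongoose (all names are assumptions):

```javascript
// Cache of per-client connections, keyed by client id.
const connections = new Map();
let opened = 0; // counts how many real connections were created

function createConnection(uri) {
  // Stand-in for mongoose.createConnection(uri).
  opened += 1;
  return { uri };
}

function connectionFor(clientId) {
  if (!connections.has(clientId)) {
    connections.set(clientId, createConnection(`mongodb://host/${clientId}`));
  }
  return connections.get(clientId); // reused for every later message
}
```

This way the number of open connections is bounded by the number of clients, not by message volume.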
I want to send a ws message to the client when my SQL Server database table values are updated or a new row is inserted (real-time data) for that client. How can I achieve this? I'm using Node.js and mssql for the DB. What I tried before was checking the DB every second and sending the records to the clients. But the problem is that when more users are connected I get a database timeout error, because I'm checking the DB every second for every user. What can I do now? How can I solve this?
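The timeout described above comes from running one query per user per second. A common fix is to poll once per tick and fan the result out to every connected socket, so database load is constant regardless of user count. A runnable sketch with plain objects standing in for `ws` sockets and the mssql query (all names are assumptions):

```javascript
// Each entry would be a WebSocket connection in a real server.
const clients = new Set();

function broadcast(rows) {
  const payload = JSON.stringify(rows);
  for (const client of clients) client.send(payload);
}

function pollTick(queryFn) {
  const rows = queryFn(); // ONE database query per tick, for everyone
  broadcast(rows);        // then N cheap in-memory sends
  return rows.length;
}
```

In the real server, `pollTick` would run on an interval with `queryFn` executing the mssql query once, instead of scheduling a query inside each user's connection handler.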
I am developing an Angular Web App that receives its data from a nodejs/express API.
This API runs mongoose, which connects to MongoLab (the free account).
When receiving data, I experience a response time > 500ms for small data sets (1.5kB) and > 1s for "large" data sets (hundreds of kB).
This is clearly already too much, and I am afraid it will get even worse as my DB grows.
The current process is as follow:
Client goes to mysite.com/discover
Server sends the Angular app
Client does an ajax request to mysite.com/api/collections
Server connects to MongoLab, receives data
Server sends the data back to the client
This process is very fast in local development (local node, local MongoDB) (<20ms) but takes so much time when put online. I investigated what was taking so much time and I found two equal contributions:
API response time
MongoLab response time
The MongoDB query takes no time (<1ms).
The Question
What are my options to reduce this response time? Is it possible to store the data locally and use MongoLab as a "copy" (which would remove the MongoLab latency in most cases)? If so, would you suggest temporary disk storage, a MongoDB replica, ...?
What I tried
I migrated my MongoLab DB to match the physical location of my server (a VM on DigitalOcean); it improved things by about 50ms, not much more.
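One of the options raised in the question, keeping a local copy in front of MongoLab, can be sketched as a small in-memory TTL cache. This is a hedged sketch only: the 5-second TTL and every name here are assumptions, and a replica set or disk store would be the heavier-weight alternatives:

```javascript
const cache = new Map();
const TTL_MS = 5000; // assumed freshness window

function cached(key, fetchFn, now = Date.now()) {
  const hit = cache.get(key);
  if (hit && now - hit.at < TTL_MS) return hit.value; // serve from memory
  const value = fetchFn(); // would be the round-trip to MongoLab
  cache.set(key, { value, at: now });
  return value;
}
```

For read-heavy endpoints like /api/collections this removes the MongoLab round-trip from most requests, at the cost of data being up to TTL_MS stale.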
Thanks a lot