I am writing a REST API which has to provide a kind of real-time communication between users. Let's say I have a db.orders collection and an API endpoint GET /order/{id}. This endpoint should wait for a change in the order document; for example, it should return data only when order.status is 'ready'. I know how to do long-polling, but I have no idea how to detect when data appears or changes in the db. It would be easy if there were only one app instance; then I could do this in memory, something like this:
var queue = [];

// GET /order/{id}
function (req, res, next) {
  var data = getDataFromDb(req.params.id);
  if (data && data.status === 'ready') {
    res.send(data);
    return;
  }
  // not ready yet: park the request until the order changes
  queue.push({ id: req.params.id, req: req, res: res, next: next });
}

// POST /order/{id}
function (req, res, next) {
  req.params.data.status = 'ready';
  saveToDb(req.params.data);
  // release the long-poll request that is waiting for this order
  var item = findInQueue(queue, req.params.id);
  if (item) item.res.send(req.params.data);
}
The first handler waits for the data to have status 'ready' and the second sets the status to 'ready'. It's just pseudocode and many things are missing (a timeout, for example).
The problem is when I want to run many instances of such an app: I need some messaging mechanism that allows the instances to communicate with each other in near real time.
I read about Redis Pub/Sub, but I am not sure if I can use it in this way...
I am using Node.js + Restify + MongoDB for now.
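What I have in mind for Redis Pub/Sub is roughly this (just a sketch with the node-redis client; the channel name and the queue/findInQueue wiring are carried over from my pseudocode above):

const redis = require('redis');

const pub = redis.createClient();
const sub = redis.createClient(); // a subscribing connection cannot run other commands

async function setup() {
  await pub.connect();
  await sub.connect();

  // every app instance listens for "order became ready" events
  await sub.subscribe('order:ready', (message) => {
    const order = JSON.parse(message);
    // release the long-poll response parked on *this* instance, if any
    const item = findInQueue(queue, order.id);
    if (item) item.res.send(order);
  });
}

// in the POST /order/{id} handler, after saveToDb():
// pub.publish('order:ready', JSON.stringify(data));

Would that be a reasonable approach?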
You are looking for the oplog. It's a special capped collection where all operations on the database are stored. To enable it on a single server, start mongod as a one-member replica set:
mongod --dbpath=./data --oplogSize=100 --replSet test
Then connect to the server with the mongo console and run:
rs.initiate()
Still in the console, run:
use local
show collections
Notice the collection oplog.rs; it contains all the operations that have been applied to the server. If you are using Node.js, you can listen for changes in the following way:
var local = db.db("local");
var stream = local.collection("oplog.rs")
  .find({}, { tailable: true, awaitData: true })
  .stream();

stream.on('data', function(doc) {
  // doc.op is the operation type (i = insert, u = update, d = delete)
  // and doc.ns is the namespace ("db.collection") it was applied to
});
For each operation on MongoDB you'll receive a doc, from which you can establish whether something you are interested in changed state.
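For example, a minimal sketch of watching the orders collection for the status flip (the database name mydb is an assumption, and the exact shape of the update document in doc.o depends on your MongoDB version):

stream.on('data', function(doc) {
  // doc.ns is "database.collection"; we only care about orders
  if (doc.ns !== 'mydb.orders') return;

  if (doc.op === 'u' && doc.o && doc.o.$set && doc.o.$set.status === 'ready') {
    var orderId = doc.o2._id; // for updates, doc.o2 identifies the target document
    // wake up any long-poll request waiting on orderId
  }
});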
Okay, so I have a Node.js/Express app with an endpoint that allows users to receive notifications by opening a connection to it:
var practitionerStreams = [] // list of all the streams opened by practitioner users to the backend
async function notificationEventsHandler(req, res){
  const headers = {
    'Content-Type': 'text/event-stream',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache'
  }
  const practEmail = req.headers.practemail
  console.log("PRACT EMAIL", practEmail)
  const data = await ApptNotificationData.findAll({
    where: {
      practEmail: practEmail
    }
  })
  res.writeHead(200, headers)
  res.write(`data:${JSON.stringify(data)}\n\n`)
  // create a new stream
  const newPractStream = {
    practEmail: practEmail,
    res
  }
  // add the new stream to the list of streams
  practitionerStreams.push(newPractStream)
  req.on('close', () => {
    console.log(`${practEmail} Connection closed`);
    // drop only this practitioner's stream, comparing against the outer practEmail
    practitionerStreams = practitionerStreams.filter(pract => pract.practEmail !== practEmail);
  });
  return res
}
async function sendApptNotification(newNotification, practEmail){
  // find the stream belonging to the practitioner email we want
  // and write the new notification to it; other streams are left untouched
  practitionerStreams.forEach((stream) => {
    if (stream.practEmail === practEmail) {
      stream.res.write(`data:${JSON.stringify(newNotification)}\n\n`)
    }
  })
}
Basically, when the user connects it takes the response object (which will stay open), puts it in an object along with a unique email, and writes to it later in sendApptNotification.
But obviously this is slow for a full app. How exactly do I replace this with Redis? Would I still have a response object that I write to, or would that be replaced with a Redis stream that I can subscribe to on the frontend? I also assume I would store all my streams on Redis as well.
Edit: from the examples I've seen, people are writing events from Redis to the response object.
Thank you in advance
If you want to use Redis Streams as a notification system, you can follow this official guide:
https://redis.com/blog/how-to-create-notification-services-with-redis-websockets-and-vue-js/
To get this data in real time you need to create a WebSocket connection. I prefer to point you to an official guide instead of writing it all out, because of the quality of that guide; it's great for understanding how to build this, but of course you'll need to adapt it to your situation.
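For reference, the core Redis Streams calls look roughly like this with the node-redis client (the stream name and fields are illustrative; the guide above shows the full wiring to WebSockets):

const { createClient } = require('redis');

async function demo() {
  const client = createClient();
  await client.connect();

  // append a notification entry to the stream
  await client.xAdd('notifications', '*', { userId: '1', message: 'message1' });

  // read everything from the beginning; '$' plus a BLOCK option would tail new entries instead
  const entries = await client.xRead({ key: 'notifications', id: '0' });
  console.log(JSON.stringify(entries, null, 2));
}

demo();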
However, as I've said in the comments, I believe it's simpler to poll an API endpoint like /api/v1/notifications with setInterval in your frontend code, making a request every 5 seconds, for example. If you prefer a real-time notification system, I think you need to understand why you need it, so you can change your system later if necessary. Basically, it's a trade-off you have to make!
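A minimal sketch of that polling approach on the frontend (the endpoint path and the 5-second interval are just the examples from above):

// ask the backend for unread notifications every 5 seconds
setInterval(async () => {
  const response = await fetch('/api/v1/notifications');
  const notifications = await response.json();
  // update the notification badge/list in the UI here
  console.log(notifications);
}, 5000);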
For my example, imagine two tables in a relational database, one called Users and the other Notifications.
The tables of this example:
UsersTable
id  name
1   andrew
2   mark

NotificationTable
id  message   userId  isRead
1   message1  1       true
2   message2  1       false
3   message3  2       false
The endpoint in this example returns all cached notifications that haven't been read by the user. If the cache doesn't exist, it fetches the data from the database, puts it in the cache, and returns it to the user; on the next API call you'll get the result from the cache. There are some points left to complete in this example: the database query that gets the notifications, and the cache expiration configuration. One more important thing: if you want the cached notifications to stay up to date, you need to create a middleware and trigger it in the parts of your code that notify users, updating both the database and the cache. But I think you can fill in these points.
const redis = require('redis');
const redisClient = redis.createClient();
redisClient.connect();

app.get('/notifications', async (request, response) => {
  const userId = request.user.id;
  const cacheResult = await redisClient.get(`user:${userId}:notifications`);
  if (cacheResult) return response.send(JSON.parse(cacheResult));
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(`user:${userId}:notifications`, JSON.stringify(notifications));
  response.send(notifications);
})
Besides that, there's another way: you can simply use only Redis, or only the database, to manage these notifications. A relational database with the correct indexes will return results as fast as you'd expect; you only have to think about how many notifications you'll accumulate.
I want to make a progress bar that tells the user where my backend is in the process of fetching data from the API. But it seems like every time I send a response it ends the request. How can I avoid this, and what should I google to learn more? I didn't find anything online.
React:
const { data, error, isError, isLoading } = useQuery('posts', fetchPosts)
if (isLoading) return <p>Loading..</p>
return <>{data && <p>{data}</p>}</>
Express:
app.get("api/v1/testData", async (req, res) => {
try {
const info = req.query.info
const sortByThis = req.query.sortBy;
if (info) {
let yourMessage = "Getting Data";
res.status(200).send(yourMessage);
const valueArray = await fetchData(info);
yourMessage = "Data retrived, now sorting";
res.status(200).send(yourMessage);
const sortedArray = valueArray.filter((item) => item.value === sortByThis);
yourMessage = "Sorting Done now creating geojson";
res.status(200).send(yourMessage);
createGeoJson(sortedArray)
res.status(200).send(geojson);
}
else { res.status(400) }
} catch (err) { console.log(err) res.status(500).send }
}
You can only send one response to a request in HTTP.
In case you want status updates over HTTP, the client needs to poll the server, i.e. periodically request status updates. Keep in mind, though, that every request has to be processed on the server side and takes away resources that are then not available for other (more important) requests from other clients, so don't poll too frequently.
If you want to support long-running operations over HTTP, have a look at the long-running operation API design pattern.
Alternatively, you could use a WebSocket connection to push updates from the server to the client. I assume your computation on the backend will not take minutes and that you want to update the client in real time, so WebSockets will probably be the best option for you. Once established, a WebSocket connection has considerably less overhead than sending large HTTP requests and responses between client and server.
Have a look at this thread, which discusses the abovementioned and other possibilities.
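To illustrate the WebSocket option, here is a minimal sketch using the ws package (fetchData and createGeoJson are the question's functions, called here without their query parameters; the port, the placeholder filter value, and the message shape are made up for the example):

const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', async (socket) => {
  // each stage pushes a progress message instead of ending an HTTP response
  socket.send(JSON.stringify({ status: 'Getting Data' }));
  const valueArray = await fetchData();

  socket.send(JSON.stringify({ status: 'Data retrieved, now sorting' }));
  const sortedArray = valueArray.filter((item) => item.value === 'someValue');

  socket.send(JSON.stringify({ status: 'Sorting done, now creating geojson' }));
  const geojson = createGeoJson(sortedArray);

  // the final message carries the actual result
  socket.send(JSON.stringify({ status: 'done', data: geojson }));
});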
I built a simple API endpoint with Node.js using Sails.js.
When someone accesses my API endpoint, the server starts waiting for data, and whenever new data appears, it broadcasts it using sockets. Each client should receive their own stream of data based on their input.
var Cap = require('cap').Cap;

collect: function (req, res) {
  var ip = req.param("ip");
  var c = new Cap(),
      device = Cap.findDevice(ip);
  c.on('data', function(myData) {
    sails.sockets.blast('message', {"host": myData});
  });
}
The response does not complete (I never send a res.json(); what actually happens is that the browser keeps loading, but the above functionality works).
2 Problems:
I'm trying to subscribe and unsubscribe to this API endpoint from my client (using RxJS). When I subscribe, I start to receive data via sockets, but I can't unsubscribe from the API endpoint (the browser expects the request to be completed).
Each client should subscribe to his own socket room based on the request IP parameter (see updated code). Currently it blasts the message to everyone.
How can I create a stream/service-like API endpoint with Sails.js that will emit new data to each user based on his input?
My goal is to be able to subscribe / unsubscribe to this API endpoint from each client.
Revised Answer
Let's assume your API endpoint is defined in config/routes.js like this:
...
'get /collect': 'SomeController.collectSubscribe',
'delete /collect': 'SomeController.collectUnsubscribe',
Since each Cap instance is tied to one device, we need one instance for each subscription. Instead of using the sails join/leave methods, we keep track of Cap instances in memory and just broadcast to the request socket's id. This works because Sails sockets are subscribed to their own ids by default.
In api/controllers/SomeController.js:
// In order for the `Cap` instances to persist after `collectSubscribe` finishes, we store them all in an object, keyed by the socket they were created for.
var Cap = require('cap').Cap;
var caps = {/* req.socket.id: <instance of Cap>, */};
module.exports = {
...
collectSubscribe: function(req, res) {
if (!res.isSocket) return res.badRequest("I need a websocket! Help!");
if (!!caps[req.socket.id]) return res.badRequest("Dude, you are already subscribed.");
caps[req.socket.id] = new Cap();
var c = caps[req.socket.id]; // remember that `c` is a reference to our new `Cap`, not a copy.
var device = Cap.findDevice(req.param('ip')); // findDevice is a static method on Cap
c.open(device, ...);
c.on('data', function(myData) {
sails.sockets.broadcast(req.socket.id, 'message', {host: myData});
});
return res.ok();
},
collectUnsubscribe: function(req, res) {
if (!res.isSocket) return res.badRequest("I need a websocket! Help!");
if (!caps[req.socket.id]) return res.badRequest("I can't unsubscribe you unless you actually subscribe first.");
caps[req.socket.id].removeAllListeners('data');
delete caps[req.socket.id];
return res.ok();
}
}
Basically, it goes like this: when a browser request triggers collectSubscribe, a new Cap instance listens to the provided IP. When the browser triggers collectUnsubscribe, the server retrieves that Cap instance, tells it to stop listening, and then deletes it.
Production considerations: please be aware that the list of Caps is NOT PERSISTENT (since it is stored in memory and not a DB)! So if your server is turned off and rebooted (due to a lightning storm, etc.), the list will be cleared; but considering that all websocket connections will be dropped anyway, I don't see any need to worry about this.
Old Answer, Kept for Reference
You can use sails.sockets.join(req, room) and sails.sockets.leave(req, room) to manage socket rooms. Essentially you have a room called "collect", and only sockets joined in that room will receive a sails.sockets.broadcast(room, eventName, data).
More info on how to use sails.sockets here.
In api/controllers/SomeController.js:
collectSubscribe: function(req, res) {
if (!res.isSocket) return res.badRequest();
sails.sockets.join(req, 'collect');
return res.ok();
},
collectUnsubscribe: function(req, res) {
if (!res.isSocket) return res.badRequest();
sails.sockets.leave(req, 'collect');
return res.ok();
}
Finally, we need to tell the server to broadcast messages to our 'collect' room.
Note that this only needs to happen once, so you can do this in a file under the config/ directory.
For this example, I'll put this in config/sockets.js
module.exports = {
// ...
};
c.on('data', function(myData) {
var eventName = 'message';
var data = {host: myData};
sails.sockets.broadcast('collect', eventName, data);
});
I am assuming that c is accessible here; if not, you could define it as sails.c = ... to make it globally accessible.
I have a Node / Express based server that uses MongoDB / Mongoose as its data store. The clients are iOS apps that mostly access data (posts) using a REST API on the server. However, a client user can also start a chat on individual posts in the app, which requires that the user find out immediately when a new message is posted for that specific chat/post. This is handled by opening a websocket between the iOS app (SocketRocket) and the server (Einaros ws socket).
Each chat involves only a handful of the total number of users on the system (e.g. 2 - 5 of the thousands of active users), and there are many concurrent chats on different posts. When a new message is received by the server (via an HTTP POST), I need to figure out how to inform just the corresponding websockets of that message. Ideally I would like to iterate over only the 2-5 users connected to one post rather than the full list of all open sockets.
What would be the best way to store and access this list of sockets (for a post) without going through all of the open sockets, and blocking the event loop? Are there better ways to do this without using sockets at all, since I only care about the notification from server to client that a new message is available? I debated using long polling as well but ran into issues when trying to store the Express response object.
I'm thinking that the incoming message handler would look something like below where I can access the list of sockets for each story, but storing sockets in the database feels strange to me.
app.post('/api/message', isLoggedInAPI, function(req, res) {
  var story_id = req.body.story_id;
  var text = req.body.message;
  var message = {text: text, sender: req.user._id};
  Story.findByIdAndUpdate(story_id, {$push: {messages: message}}, {safe: true, upsert: true})
    .execAsync()
    .then(function(story) {
      if (!story) return res.status(404).end();
      res.status(200).json({count: story.messages.length});
      // notify every socket that is waiting on this story
      for (var i = 0; i < story.pendingResponses.length; i++) {
        var socket = story.pendingResponses[i];
        // Send message to socket
      }
      // clear the pending list now that everyone has been notified
      return story.update({$set: {pendingResponses: []}}, {safe: true, upsert: true})
        .execAsync();
    })
    .catch(function(err) {
      res.status(501).send(err).end();
    });
});
I was able to solve this by doing the following. When I get a new socket, I store it in a map keyed by the user_id, which is unique for each connection.
var clients = {};
// store the socket under the user ID (you could store `res` for long polling the same way)
clients[req.user.id] = socket;
I store the user_id in an array of pendingResponses for each post / Story, at the same time.
And in my POST api,
for (var i=0; i< story.pendingResponses.length; i++)
{
id = story.pendingResponses[i];
do something with clients[id] to talk to thesocket
}
This means that for each POST, the event loop will look through pending sockets for that specific story only. I do worry about the scenario of what happens if there are many users connected to the same story, in which case this loop becomes problematic.
Scaling this as the number of users grows requires some sort of messaging queue (such as ZeroMQ, RabbitMQ, or Redis Pub/Sub), so that messages from one server instance can be shared with users connected to other instances.
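A minimal sketch of the Redis Pub/Sub variant (the channel name and message shape are assumptions; clients is the user-id-to-socket map from above):

const redis = require('redis');

const pub = redis.createClient();
const sub = redis.createClient(); // subscriber connections are dedicated

async function setupMessaging() {
  await pub.connect();
  await sub.connect();

  // every instance receives every story message, but delivers only
  // to the sockets it actually holds for that story's pending users
  await sub.subscribe('story:message', (raw) => {
    const { pendingResponses, message } = JSON.parse(raw);
    pendingResponses.forEach((userId) => {
      const socket = clients[userId];
      if (socket) socket.send(JSON.stringify(message));
    });
  });
}

// in the POST /api/message handler, instead of writing to sockets directly:
// pub.publish('story:message', JSON.stringify({ pendingResponses: story.pendingResponses, message }));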
I am using Node.js for server-side development and Backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to Node.js, but I can't merge the results with each other because of the asynchronous execution of Node.js. What I have done so far sends lots of GET requests to Node.js to fetch the data from all the tables, and because of this the performance of my site has become slower. Please help if anyone has any idea.
I would create a method which aggregates the results from each of the requests and sends the response back. Basically each of your three async db calls would pass their data to the same method. That method would check to see if it had all of the data it needed to complete the request, and if it did, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
  var results = {};
  db.getUsers(function(data) {
    aggregate('users', data);
  });
  db.getPosts(function(data) {
    aggregate('posts', data);
  });
  db.getComments(function(data) {
    aggregate('comments', data);
  });

  function aggregate(name, data) {
    results[name] = data;
    if (results.users && results.posts && results.comments) {
      res.send(results);
    }
  }
}
This is greatly simplified; you should of course also check for errors and timeouts on the db calls, but this will let you wait for all the async commands to complete before sending the data.
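If your database layer can return promises (an assumption here; getUsers/getPosts/getComments are the names from the example above), the same aggregation collapses into a Promise.all:

async function handleRequest(req, res) {
  try {
    // run all three queries concurrently and wait for every result
    const [users, posts, comments] = await Promise.all([
      db.getUsers(),
      db.getPosts(),
      db.getComments()
    ]);
    res.send({ users, posts, comments });
  } catch (err) {
    // one rejection fails the whole aggregate
    res.status(500).send(err);
  }
}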