Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 6 years ago.
Looking to use a message queue in a small web app I'm building with Node.js. I looked at Resque, but I'm not sure it's appropriate. The goal is to push notifications to clients, based on backend and other client actions, using Socket.IO. I could do this with just Socket.IO, but I thought a proper message queue would make this cleaner and I wouldn't have to reinvent the wheel.
What are the options out there?
You could use Redis with the lightning-fast node_redis client. It even has built-in pub/sub semantics.
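A minimal sketch of that approach, assuming the classic callback-style node_redis API (v3 and earlier) and a made-up "notifications" channel; it only opens a connection when a Redis URL is actually configured:

```javascript
// Pub/sub with node_redis: one client subscribes, another publishes.
// The channel name and message shape are illustrative assumptions.
function startNotifier(url) {
  const redis = require('redis'); // npm install redis (v3-era callback API)
  const sub = redis.createClient(url);
  const pub = redis.createClient(url);

  sub.on('message', (channel, message) => {
    // Here you would forward the payload to connected Socket.IO clients.
    console.log('received on %s: %s', channel, message);
  });
  sub.subscribe('notifications');

  // Anywhere else in the app (or in another process entirely):
  pub.publish('notifications', JSON.stringify({ event: 'user-signup' }));
  return { pub, sub };
}

// Only connect when a Redis instance is actually reachable.
if (process.env.REDIS_URL) {
  startNotifier(process.env.REDIS_URL);
}
```

With Socket.IO in the picture, each web process would hold a subscriber and emit to its own connected clients whenever a message arrives.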
You could use the node STOMP client. This would let you integrate with a variety of message queues including:
ActiveMQ
RabbitMQ
HornetQ
I haven't used this library before, so I can't vouch for its quality. But STOMP is a pretty simple protocol so I suspect you can hack it into submission if necessary.
Another option is to use beanstalkd with Node. beanstalkd is a very fast "task queue" written in C; it's a good fit if you don't need the feature flexibility of the brokers listed above.
Shameless plug: I'm working on Bokeh, a simple, scalable, and blazing-fast task queue built on ZeroMQ. It supports pluggable data stores for persisting tasks; in-memory, Redis, and Riak are currently supported. Check it out.
Here are a couple of recommendations I can make:
node-amqp: A RabbitMQ client that I have successfully used in combination with Socket.IO to make a real-time multi-player game and chat application amongst other things. Seems reliable enough.
zeromq.node: If you want to go down the non-brokered route, this might be worth a look. It's more work to implement functionality, but you're more likely to get lower latency and higher throughput.
Take a look at node-busmq. It's a production-grade, highly available, and scalable message bus backed by Redis.
I wrote this module for our global cloud and it's currently deployed in our production environment in several datacenters around the world. It supports named queues, peer-to-peer communication, guaranteed delivery and federation.
For more information on why we created this module you can read this blog post: All Aboard The Message Bus
I recommend trying Kestrel. It's as fast and simple as beanstalkd but supports fanout queues, and it speaks the memcached protocol. It's built in Scala and used at Twitter.
kue is the only message queue you would ever need
You might want to have a look at Redis Simple Message Queue for Node.js, which uses Redis and offers most of the features of Amazon's SQS.
Look at node-queue-lib. Perhaps it will be enough for you. It supports Node.js and browsers, has two delivery strategies (broadcast and round-robin), and is pure JavaScript.
Quick example:
var Queue = require('node-queue-lib/queue.core');

var queue = new Queue('Queue name', 'broadcast');

// subscribe to 'Queue name' messages
queue.subscribe(function (err, subscriber) {
    subscriber.on('error', function (err) {
        // handle subscriber errors here
    });
    subscriber.on('data', function (data, accept) {
        console.log(data);
        accept(); // acknowledge the message
    });
});

// publish a message
queue.publish('test');
How about Azure Service Bus? It supports Node.js.
I used Kue with Socket.IO like you described.
I stored the socket ID with the job and could then retrieve it when the job completed.
Kue is based on Redis and has good examples on GitHub.
Something like this:
jobs.process('YourQueuedJob', 10, function (job, done) {
    doTheJob(job, done);
});

function doTheJob(job, done) {
    var socket = io.sockets.sockets[job.data.socketId];
    try {
        socket.emit('news', { status: 'completed', task: job.data.task });
    } catch (err) {
        io.sockets.emit('news', { status: 'fail', task: job.data.task, socketId: job.data.socketId });
    }
    done(); // signal Kue that the job is finished
}
You might also want to check out ewd-qoper8: https://github.com/robtweed/ewd-qoper8
I'm building a chat app in React Native with a Node.js backend, using Google Cloud Platform.
I'm using websockets to create a continuous connection between the app and the backend. Because users can send messages to specific clients, I'm storing the sockets in Node.js:
var sockets = {}

io.on('connection', socket => {
    console.log('user connected')
    let userId = socket.handshake.query.userId
    sockets[userId] = socket

    socket.on('message', msgData => {
        let msg = JSON.parse(msgData)
        sockets[msg.userId].emit('message', JSON.stringify(msg.message))
    })

    socket.on('disconnect', () => {
        console.log('user disconnected')
        delete sockets[userId]
    })
})
Please note that this is a simplified example.
The problem is: I'm planning on having multiple instances in different regions behind a load balancer. When you connect to a specific instance, other instances can't reach the sockets object. So when 2 different users are connected to 2 different instances, they can't chat with each other.
To solve this, I was thinking of storing the sockets in a redis cache (cloud memorystore). But the redis instance must be in the same region as the VM instance. But like I said, I have multiple VM instances in multiple regions.
My questions are:
1) Is this solution the best way to go? Or are there any other possibilities, like just storing the sockets in a database?
2) How can I solve the issue of not being able to connect VM instances to a redis instance when they are not in the same region. Should I create a redis instance for each region I use (asia-east1, europe-north1, us-central1), and mirror those 3 redis instances so they all have the same content?
If, on the other hand, you have a totally different approach, please let me know! I'm still learning Node.js and Google Cloud Platform, and I'm open to new input.
Edit: All instances (instance groups) are of course in the same VPC.
Edit 2: What if I create a VM in the same region as the redis instance, and use this as a proxy? Would there be any performance issues?
Edit 3: I got it working by creating a proxy server using haproxy. The instance is located in the same region as the redis instance. One question: will there be any performance issues? And is this really the way to go?
Focusing on your first question, I would say that this architecture is not the best way to implement a chat application. Google Cloud Platform provides a very strong messaging service, Pub/Sub. By using this service, all the issues regarding load balancing, concurrency, connections, and efficiency are solved by default.
Here you can find a really nice article about how to create a chat application with Cloud Pub/Sub. It is C#-based, but the idea is the same using the Node.js client libraries.
The general schema of how Pub/Sub works: publishers send messages to a topic, and subscribers receive them through subscriptions to that topic.
The architecture of this app will have the following advantages:
One-to-one (direct) and one-to-many messaging functionality
A transmission method that does not require a full server to be developed
In case you do not want to use Pub/Sub, I still think you will need a centralized server application that can communicate with the users, process their messages, and route them to the proper destination and back.
Regarding your second question, that may work, but I think it would affect performance and, more importantly, the clarity of the system itself. Something like that would be a nightmare to maintain and debug.
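One aside on the Redis idea from the question (an addition, not part of the answer above): live socket objects can't be serialized into Redis, but the socket.io Redis adapter lets every instance broadcast through Redis pub/sub, so an emit reaches clients connected to other instances. A sketch, assuming the pre-v3 socket.io-redis package and a per-user room in place of the local sockets object:

```javascript
// Cross-instance Socket.IO delivery via the Redis adapter (socket.io-redis).
// Host/port and the userId query parameter mirror the question's setup.
function setupCrossInstanceChat(io) {
  const redisAdapter = require('socket.io-redis'); // npm install socket.io-redis
  io.adapter(redisAdapter({ host: '127.0.0.1', port: 6379 }));

  io.on('connection', socket => {
    const userId = socket.handshake.query.userId;
    socket.join(userId); // a per-user room replaces the local sockets object

    socket.on('message', msgData => {
      const msg = JSON.parse(msgData);
      // Delivered even if the recipient is connected to another instance:
      io.to(msg.userId).emit('message', JSON.stringify(msg.message));
    });
  });
}
```

Each instance still only holds its own connections; Redis just carries the emits between them, which sidesteps the shared-sockets problem entirely.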
I am new to the JavaScript world. I need to design a system where a React UI calls a Node server REST API, which in turn triggers a Python REST API hosting a long-running (at least 5 minutes) optimization model. We don't intend to make a blocking call; instead we want to notify the React UI with a push notification (observer pattern), similar to the Stack Overflow page showing how many new questions, answers, or comments have been posted, which can then be refreshed and reviewed.
Sifting through RxJS and Socket.IO, what I understood is that we can do the following:
Make an async call using an Observable. On the server, Socket.IO is used to subscribe and notify the client, handling communication between the Node server and the React client; a similar push notification system at the Python server level notifies the Node server. The Node server also executes a DB operation once the optimization job is complete.
Another option is implementing a queuing service between Node and Python, which provides more durability and resilience, processing the data with a pub/sub model.
Our overall requirements are not enterprise-oriented; we have low concurrency, and resilience is not a critical factor. Can someone verify whether the design options above are a correct consideration, or whether there are simpler options that would meet our push notification use case?
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 4 years ago.
I'm using a Node.js server and Google Cloud. What exactly is GCloud's Pub/Sub? Does it work like a socket or TCP? My server uses socket.io and node-ipc for communication; can GCloud's Pub/Sub be an alternative?
PubSub is a messaging service. These allow asynchronous communication between two applications; one "publishes a message" to a message service and then some other process reads that message from the message service at a later time - seconds, minutes, or hours later. The application that published the message does not need to "stay connected".
That's really useful for scalable and reliable communication between applications, but quite different from socket-based communication, which is point-to-point between a client and a server process. Implementing request/response-style communication is difficult over a messaging service; "send and forget" is the usual model. As @komarkovich noted, a message can also be received by many applications if that is appropriate.
Google Cloud Pub/Sub is an asynchronous publish/subscribe messaging service.
A publisher creates and sends messages to a topic. A subscriber creates a subscription to a topic to receive messages from it. Communication can be one-to-many, many-to-one, or many-to-many.
Pub/Sub has two endpoints:
Publisher: Any application that can make HTTPS requests to googleapis.com.
Subscriber:
Pull subscriber: Also any application that can make HTTPS requests to googleapis.com.
Push subscriber: Webhook endpoints that can accept POST requests over HTTPS.
You can check the Cloud Pub/Sub Client Libraries and review the example for Node.js to get you started using Google Cloud Pub/Sub API.
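A short sketch of both endpoints with the Node.js client library (@google-cloud/pubsub). The topic and subscription names are made up, and running this for real requires a GCP project and credentials, so the code below only defines the wiring:

```javascript
// Publisher and pull subscriber with the Cloud Pub/Sub Node.js client.
// 'my-topic' and 'my-subscription' are assumed names; create them in GCP first.
function wirePubSub() {
  const { PubSub } = require('@google-cloud/pubsub'); // npm install @google-cloud/pubsub
  const pubsub = new PubSub();

  // Publisher: any app that can make HTTPS requests to googleapis.com.
  async function publish(text) {
    await pubsub.topic('my-topic').publish(Buffer.from(text));
  }

  // Pull subscriber: messages are streamed and must be acknowledged.
  const subscription = pubsub.subscription('my-subscription');
  subscription.on('message', message => {
    console.log('received:', message.data.toString());
    message.ack();
  });

  return publish;
}
```

Unlike a socket, the publisher and subscriber never connect to each other; the service buffers messages until a subscriber acknowledges them.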
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I am making a chat app and would like to add the functionality of showing an online/offline symbol next to users. How can I do this reliably with a minimum number of server requests and database writes?
One way I found was to update a lastSeenAt field in the user document every time that user requests a page and use this to indicate whether the user is online/offline. Another way is to ping the server from the client side at fixed intervals of time and then update the lastSeenAt field.
Both these ways would require a lot of database writes and/or server requests. Is there a way to do this more efficiently?
You have to push data from your server to the clients upon change, with something like WebSockets or server-sent events. socket.io is a popular tool for implementing such functionality.
With socket.io, or WebSockets specifically, you can track whether a client is connected and keep a flag in the database that is updated whenever a client connects or disconnects. (Keep in mind that a client might have multiple connections, from multiple devices or even browser tabs, so if one connection disconnects the user might still be online.)
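The connection-counting idea can be sketched in plain JavaScript (names here are illustrative): a user counts as online while at least one of their connections is open, and the database flag only needs a write when the first connection opens or the last one closes:

```javascript
// Tracks open connections per user; a user is online while count > 0.
class PresenceTracker {
  constructor() {
    this.counts = new Map(); // userId -> number of open connections
  }
  // Returns true only when this is the user's first connection
  // (i.e. the moment the user comes online and the DB flag needs a write).
  connect(userId) {
    const n = (this.counts.get(userId) || 0) + 1;
    this.counts.set(userId, n);
    return n === 1;
  }
  // Returns true only when the user's last connection closed
  // (i.e. the moment the user goes offline).
  disconnect(userId) {
    const n = (this.counts.get(userId) || 0) - 1;
    if (n <= 0) {
      this.counts.delete(userId);
      return true;
    }
    this.counts.set(userId, n);
    return false;
  }
  isOnline(userId) {
    return this.counts.has(userId);
  }
}
```

Hook connect/disconnect into socket.io's 'connection' and 'disconnect' events, and only touch the database when one of them returns true.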
Assuming you will have multiple stateless web servers (common practice), once a client connects or disconnects and you want to notify other interested clients, you should use a pub/sub pattern to notify the other servers, which will in turn notify their connected clients. There are many implementations, such as Redis, ZeroMQ, AWS SNS, and GCP Cloud Pub/Sub. You could even use MongoDB with tailable cursors as your pub/sub.
However, this might mean a lot of persistent connections to your server, which could hurt your scalability. If it proves too expensive, fall back to your lastSeenAt approach with some sane polling interval. You can find out which works better for your setup by running some experiments.
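If you do fall back to lastSeenAt, the write volume can be capped by throttling: only persist when the previous write is older than some interval. A sketch with a hypothetical injected saveLastSeen function standing in for the real database call:

```javascript
// Returns a touch(userId) function that records activity but only writes
// to the database at most once per minIntervalMs per user.
function makeLastSeenUpdater(saveLastSeen, minIntervalMs = 60 * 1000) {
  const lastWritten = new Map(); // userId -> timestamp of last DB write
  return function touch(userId, now = Date.now()) {
    const prev = lastWritten.get(userId);
    if (prev !== undefined && now - prev < minIntervalMs) {
      return false; // skipped: written recently enough
    }
    lastWritten.set(userId, now);
    saveLastSeen(userId, now); // e.g. db.users.updateOne({_id: userId}, ...)
    return true;
  };
}
```

Call touch(userId) on every request or ping; only a fraction of calls will reach the database, and a user shows as offline once lastSeenAt is older than your chosen threshold.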
If database writes and requests worry you, you can always throw more servers at the problem. You could have a microservice specifically for this, so your main server wouldn't be affected much, and a dedicated database as well. If performance worries you, use an in-memory database.
You can also add some caching in your servers with simple time-based invalidation, or sync the online/offline data between your servers with a pub/sub pattern.
I would suggest keeping your setup as simple as possible and running experiments to see how it handles your expected traffic.
Look into Express.io, where you can combine HTTP calls with sockets (sockets being the best method for real-time communication).
In my current project, we are trying to implement some functionality using Node-RED, for experiments and for exploring new technologies.
The functionality is as follows: the BadgeReader publishes its data to Proximity using publish-subscribe (easily implemented with MQTT in Node-RED). The Proximity component receives data from BadgeReader and uses that data to interact with ProfileDB in request/response mode. Now, my question is: how can we implement request/response interaction in Node-RED? (Note: request/response can be implemented over MQTT, but this question is about dedicated request/response functionality in Node-RED.)
All the available database nodes will allow you to send a query and receive a reply before moving on to the next node in the flow.
There is also the http-request node, which does the same for an HTTP call to a remote service.
You can't do this with the Node-RED MQTT nodes because they either start a flow or end a flow. MQTT is asynchronous, and publishers should be totally decoupled from subscribers, so there is no way to know whether a message ever reaches a subscriber, and therefore no way to handle error cases or timeouts properly. While it is possible to do request/response with MQTT, it is not best suited to this task.
If you want to do this with MQTT or something else, you may have to look at writing your own node; there is no generic request/response capability built into Node-RED.
P.S. Given your flow of questions over the last few days, you should probably look at the Node-RED mailing list here:
https://groups.google.com/forum/#!forum/node-red
It will be better suited to answering your questions than Stack Overflow.