How to make two Node.js servers communicate with each other over RabbitMQ? - node.js

I want to create two servers in Node.js and have them communicate full-duplex with each other over RabbitMQ. I am new to message brokers and event-driven development; I just want one server to serve the API to the front-end and the other to be just a chat server. Is that even a good approach?

Working directly with the broker from the clients is a bad idea. Typically, a gateway is added between the clients and the broker as an abstraction layer. That way it is easier to change the broker later (for example, from RabbitMQ to Kafka), and you do not need to duplicate the client <-> broker logic in different languages. As an example, see reddwarf: a simple demo service is service and the client is client.
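For reference, the client <-> broker logic that such a gateway would wrap is only a few lines per server with the amqplib package. A minimal sketch, assuming the queue names and connection URL below (the second server runs the same code with the two queue names swapped):

// Minimal sketch of one side of the full-duplex link, using the amqplib package.
// Queue names and the connection URL are assumptions for illustration.
const amqp = require("amqplib");

async function main() {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();

  // Queue this server consumes from (the other server publishes here) ...
  await ch.assertQueue("api.to.chat", { durable: true });
  // ... and the queue this server publishes to (the other server consumes it).
  await ch.assertQueue("chat.to.api", { durable: true });

  ch.consume("api.to.chat", (msg) => {
    if (msg !== null) {
      console.log("received:", msg.content.toString());
      ch.ack(msg);
    }
  });

  ch.sendToQueue("chat.to.api", Buffer.from(JSON.stringify({ text: "hello" })));
}

main().catch(console.error);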

Related

chat application, peer-to-peer communication

I am in the process of developing a chat application using JavaScript. When sending messages from one client to another client, do I have to send them through a server, or can I send them directly peer-to-peer, using something like WebSockets?
Welcome to the stage of life where you see the importance of design patterns.
You can start shaping a solution with the mediator pattern and the proxy pattern over WebSockets.
Whether you need a server or not is up to your design.
Technology-wise, HTML5 and JS offer a number of APIs you can go through and build on.
Start digging into WebSockets, Server-Sent Events and Web Workers.
The server will give you the flexibility of record-keeping while acting as a mediator. Alternatively, you can come up with a pure P2P design in which every node (user) notifies the other users of its details (IP address) so communication can be established. Remember that for a WebSocket to work, the client needs to know which address to connect to; maybe a few fixed master nodes can provide that. Then you can use observables for polling and other features. Take a look at the BitTorrent protocol for design inspiration.
Get creative and start designing.
There are many ways to do it. I recommend this scheme:
Peer <---> custom websocket server <---> Peer;
For that I would use Node.js with Socket.IO.
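A minimal sketch of that relay server with Socket.IO, assuming the port and event names below (each peer emits a "relay" event carrying the target peer's socket id):

// Socket.IO acting as the mediator between peers; port and event names are assumptions.
const { Server } = require("socket.io");

const io = new Server(3000, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  // A peer asks the server to relay a payload to another peer, addressed by socket id.
  socket.on("relay", ({ to, payload }) => {
    io.to(to).emit("message", { from: socket.id, payload });
  });
});

// On the peer side (browser or Node, with socket.io-client):
//   const socket = io("http://localhost:3000");
//   socket.emit("relay", { to: otherPeerId, payload: "hi" });
//   socket.on("message", console.log);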

How to configure MassTransit in an unreliable network environment?

I'm trying to get my head around MassTransit in combination with RabbitMQ.
The basic concepts are working in a test project, but what I need is the following:
My system will have one or more servers that react to real-life events (telephony). These events will, by means of MassTransit and RabbitMQ, be translated into messages that will be picked up by one or more receivers via a separate server set up as the RabbitMQ host. So far so good.
However, I cannot assume that I always have a connection between the publishing and the host machines. Just assume that the publishing server keeps consuming the real-life events but, for a while, cannot publish its messages.
So, the question is: does MassTransit have some kind of mechanism to store messages locally until the connection is re-established?
Or should I install RabbitMQ on every publishing server as well, in order to publish to a local exchange? Then I would have to make the exchanges synchronize themselves after a reconnect.
You probably have to implement a store-and-forward policy. Instead of publishing your message directly through MassTransit and RabbitMQ, you can store it in a persistent repository (a local database) and delegate to some other process the job of publishing, through MassTransit, the messages stored earlier. This approach is often referred to as "client high availability". It does not replace standard server-side HA (high availability) like the one implemented by RabbitMQ, but it is a good approach in a distributed system like the one you described, because it helps a lot in server-failure scenarios (e.g. an issue on the RabbitMQ server causes the loss of messages that you still have in some client's store, so you can have them processed again).
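The store-and-forward idea itself is stack-agnostic. A rough sketch of the two halves, written in Node.js only because the rest of this page is Node-centric (the outbox table, the drain interval and the publish wrapper are assumptions, not MassTransit API):

// Store-and-forward sketch: the event handler only writes to a local outbox,
// and a separate loop drains the outbox to the broker whenever it is reachable.
const Database = require("better-sqlite3");
const db = new Database("outbox.db");
db.exec("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, body TEXT)");

// Called by the telephony event handler; it never talks to the broker directly.
function store(message) {
  db.prepare("INSERT INTO outbox (body) VALUES (?)").run(JSON.stringify(message));
}

// Forwarder: publishes the oldest stored messages and deletes them on success.
async function drain(publish) {
  const rows = db.prepare("SELECT id, body FROM outbox ORDER BY id LIMIT 100").all();
  for (const row of rows) {
    try {
      await publish(JSON.parse(row.body)); // publish() wraps your real bus/broker call
      db.prepare("DELETE FROM outbox WHERE id = ?").run(row.id);
    } catch (err) {
      break; // broker unreachable: keep the rows and retry on the next tick
    }
  }
}

// Wire-up, with publishToBroker being whatever actually publishes to the bus:
// setInterval(() => drain(publishToBroker).catch(() => {}), 5000);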

What is the best way to communicate between two servers?

I am building a web app which has two parts. One part uses a real-time connection between the server and the client, and the other does some CPU-intensive work to provide relevant data.
I am implementing the real-time communication in Node.js and the CPU-intensive part in Python/Java. What is the best way for the Node.js server to take part in duplex communication with the other server?
For a basic solution you can use Socket.IO if you are already using it and know how it works; it will get the job done, since it allows communication between a client and a server where the client can itself be another server written in a different language.
If you want a more robust solution with additional options and controls, or one that can handle higher traffic throughput (though that shouldn't be an issue if you are ultimately just sending it over the relatively slow internet), look at something like ØMQ (ZeroMQ). It is a messaging library that gives you more control and many communication patterns beyond simple request-response.
Whichever you set up, I would recommend using your CPU-intensive server as the stable end (the "server") and your web server(s) as the client(s), assuming that you are using a single machine for the CPU-intensive tasks and running several Node.js instances to take advantage of multiple cores on your web server. This simplifies your communication, since you want a single point to connect to.
If you foresee needing multiple CPU servers, you will want to set up a routing server that can route between multiple web servers and multiple CPU servers, and in that case I would recommend the extra work of learning ØMQ.
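A minimal sketch of the Socket.IO variant, with the CPU-intensive server as the stable end and the web server connecting out to it as a client (ports, event names and the stand-in compute function are assumptions):

// cpu-server.js -- the stable end: accepts jobs and answers via acknowledgements.
const { Server } = require("socket.io");

const heavyCompute = (payload) => ({ ...payload, done: true }); // stand-in for the real work
const io = new Server(4000);

io.on("connection", (socket) => {
  socket.on("job", (payload, ack) => {
    ack(heavyCompute(payload)); // reply goes straight back to the requesting web server
  });
});

// web-server.js -- connects out to the CPU server like any other Socket.IO client.
const { io: connect } = require("socket.io-client");
const cpu = connect("http://localhost:4000");

cpu.emit("job", { n: 42 }, (result) => {
  console.log("result from CPU server:", result);
});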
You can use the http.request method Node provides to make a curl-style request from within your Node code.
The http.request method is also commonly used for implementing authentication against an API.
You can put your callback in the response handler of the request, and when the response data arrives in Node, send it back to the user.
Meanwhile, in the background, the Java/Python server can serve Node's request and handle the CPU-intensive task.
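A small sketch of that http.request approach, with the hostname, port, path and payload as assumptions:

// Node web server calling the Java/Python backend over plain HTTP.
const http = require("http");

const req = http.request(
  {
    hostname: "localhost",
    port: 8000,
    path: "/compute",
    method: "POST",
    headers: { "Content-Type": "application/json" },
  },
  (res) => {
    let body = "";
    res.on("data", (chunk) => (body += chunk));
    res.on("end", () => {
      // This is where you would send the backend's answer back to the user.
      console.log("backend replied:", body);
    });
  }
);

req.on("error", (err) => console.error("request failed:", err));
req.end(JSON.stringify({ task: "heavy", n: 42 }));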
I maintain a node.js application that intercommunicates among 34 tasks spread across 2 servers.
In your case, for communication between the web server and the app server you might consider mqtt.
I use mqtt for this kind of communication. There are mqtt clients for most languages, including node/javascript, python and java. In my case I publish JSON messages to mqtt 'topics', and any task that has subscribed to a topic receives its data when something is published to it. If you google "pub sub", "mqtt" and "mosquitto" you'll find lots of references and examples. Mosquitto (now an Eclipse project) is only one of a number of mqtt brokers available. Another very good broker, written in Java, is called hivemq.
This is a very simple, reliable solution that scales well. In my case literally millions of messages reliably pass through mqtt every day.
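A minimal sketch of the Node side with the mqtt npm package and a local Mosquitto broker (broker URL and topic names are assumptions; the Python/Java task would subscribe to jobs/requests and publish to jobs/results with its own MQTT client):

// Web server side: publish job requests, subscribe to results.
const mqtt = require("mqtt");
const client = mqtt.connect("mqtt://localhost:1883");

client.on("connect", () => {
  client.subscribe("jobs/results");
  client.publish("jobs/requests", JSON.stringify({ id: 1, n: 42 }));
});

client.on("message", (topic, payload) => {
  console.log(topic, JSON.parse(payload.toString()));
});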
You must be looking for Socket.IO:
Socket.IO enables real-time bidirectional event-based communication.
It works on every platform, browser or device, focusing equally on reliability and speed.
Sockets have traditionally been the solution around which most realtime systems are architected, providing a bi-directional communication channel between a client and a server.

how to distribute socket.io

I'm using Node.js and Socket.IO to provide a chat in my business app, but I want to distribute the deployment so I can run as many chat servers as I want to balance the traffic load.
I tried the load-balancing approach with nginx, but that only balances the traffic; the Socket.IO servers don't talk to each other, so a chat message sent by user A on server S1 won't reach user B on server S2.
Is there any tool or approach to do this?
Thanks in advance.
===== EDIT =====
Here is the architecture of the app.
The main app front-end, in PHP CodeIgniter, let's tag it as PHPCI.
The chat app back-end, in Node.js and Socket.IO, let's tag it as CHAT.
The chat model data, in Redis, let's tag it as REDIST.
So what I have now is PHPCI -> CHAT -> REDIST. That works just fine.
What I need is to distribute the application so I can have as many PHPCI, CHAT or REDIST instances as I want, for example:
PHPCI1 CHAT1
PHPCI2 -> -> REDIST1
PHPCI3 CHAT2
Where the numbers represent instances, not different apps.
So a user A connected to PHPCI1 can send a message to a user B connected to PHPCI3.
I think some queue in the middle, at the CHAT level, could handle this, something like RabbitMQ, so that Socket.IO is only used to deliver the messages to the client.
If you're distributing the server load (and that's a requirement), I'd suggest adding a designated chat data server (usually an in-memory database or message queue) to handle chat state and message passing across edge servers.
Redis Pub/Sub is ideal for this purpose, and can scale up to crazy levels on even a low-end machine. The Redis Cookbook has a chapter on precisely this use case.
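One way to wire that up today is the official Redis adapter for Socket.IO, which uses Redis Pub/Sub underneath so that an emit on CHAT1 also reaches sockets connected to CHAT2. A sketch, assuming the current Socket.IO v4 packages and a local Redis (port and URL are assumptions):

// Each CHAT instance runs this; Redis Pub/Sub fans every emit out to all instances.
const { Server } = require("socket.io");
const { createAdapter } = require("@socket.io/redis-adapter");
const { createClient } = require("redis");

async function start() {
  const pubClient = createClient({ url: "redis://localhost:6379" });
  const subClient = pubClient.duplicate();
  await Promise.all([pubClient.connect(), subClient.connect()]);

  const io = new Server(3000, { adapter: createAdapter(pubClient, subClient) });

  io.on("connection", (socket) => {
    socket.on("chat message", (msg) => {
      io.emit("chat message", msg); // reaches users on any CHAT instance, not just this one
    });
  });
}

start().catch(console.error);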
If you set up the server side of your chat app correctly, you shouldn't have to distribute the socket.io client yourself. Since the socket.io client code runs in the browser and doesn't require any separate installation (other than the resources downloaded from the web page), it works automatically: the files required to run socket.io are downloaded to users when they are correctly included in the page (just like with jQuery). If you are using Node.js and socket.io to make an Android app, the files should be bundled with your application when you distribute it, not separately.
In addition, if you wish to use two separate socket.io servers, you should be able to establish communication between them by having one connect to the other in much the same way a client connects to a server, but with a special parameter that lets the other server know a server is connecting, so it can respond accordingly and keep a reference to it.

Let a Node.js WAMP.IO server subscribe to a clients publish in purpose to measure latency

I know that the intended use case of a Pub/Sub pattern is to let the clients "multicast" directly to each other, with the server's work kept transparent. But there have been a few occasions where I'd like my server to react to a client's publish. Basically I'd like a server.Subscribe('event:ident', callback). Before starting to implement this, I assume I'm not the first to run into this limitation. How do you folks solve it?
In this latest case I'd like to measure the latency between two clients. So I let the first client publish a message, which the second client (subscribed to it) will respond to ASAP. Obviously the traffic will pass through the server, so I'd like the server to also respond, letting me separate the latency from the first client to the server from the latency from the server to the second client.
Do you see any pitfalls with this approach? (Except that I'm breaking the strict Pub/Sub pattern.)
Note that I'm using the WAMP.IO lib (implementing the WAMP protocol). I'm not talking about a Windows, Apache, PHP and MySQL server!
For WAMPv1, an (ugly) solution is to break the Pub/Sub pattern and have the client send the message as an RPC, which the server then publishes as a PubSub message.
In WAMPv2 (under development), a server will also be able to subscribe to topics.
