Simultaneous gRPC clients sync/async server - rpc

I'm just curious:
Does a sync gRPC server support connections from multiple clients?
If not, do the async ones?
And is a combination of async server/sync client even possible?

Yes, a synchronous gRPC server supports multiple connected clients out of the box. I have personally tested with up to 2000 simultaneously connected clients to a microservice written in Go exposing a single API.

Related

Can we use gRPC instead of socket.io (WebSocket)? - Node.js gRPC

I am implementing a gRPC server in Node.js. gRPC is great. I want to know: is it possible to send a message from the gRPC server to the client (not request/response)?
I know we have bidirectional, full-duplex communication in gRPC, which is similar to the functionality we have in socket.io (WebSocket). How can we push messages to a single client? Is it possible to keep track of gRPC clients, or even better, get a heartbeat?
gRPC servers cannot initiate connections to clients. Your best bet is to initiate bidirectional streaming from the client, as you said. If your client applications also ran gRPC servers, the application server could initiate connections to them, but that may be a heavy-handed solution.
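As a sketch, the client-initiated bidirectional stream can be declared in the service definition roughly like this (the service and message names here are hypothetical, purely for illustration):

```proto
syntax = "proto3";

service Notifier {
  // The client opens the stream; once it is open, the server can push
  // Notification messages to that client at any time.
  rpc Subscribe (stream ClientEvent) returns (stream Notification);
}

message ClientEvent {
  string client_id = 1;  // lets the server track which client this is
}

message Notification {
  string payload = 1;
}
```

The client keeps the Subscribe stream open for its whole session; periodic ClientEvent messages on that stream can double as the heartbeat asked about above.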

Does pusher support bi-directional communication? If yes then how to implement it in Node.js?

I can see only one-way communication in the Pusher docs, i.e. from server to client. How can I do it from client to server with Node.js?
Pusher Channels does not support bidirectional transport. If you need to send data from your client to your server you will have to use another solution such as a POST request.
Channels does offer webhooks which can be triggered by certain events in the application and could be consumed by your server if they fit your requirements. However, webhooks are designed to keep you informed of certain events within your application rather than as a means of communication between client and server.

What is the best way to communicate between two servers?

I am building a web app which has two parts. One part uses a real-time connection between the server and the client, and the other part does some CPU-intensive work to provide relevant data.
I am implementing the real-time communication in Node.js and the CPU-intensive part in Python/Java. What is the best way for the Node.js server to participate in duplex communication with the other server?
For a basic solution you can use Socket.IO, if you are already using it and know how it works. It will get the job done, since it allows communication between a client and a server where the client can be a different server in a different language.
If you want a more robust solution with additional options and controls or which can handle higher traffic throughput (though this shouldn't be an issue if you are ultimately just sending it through the relatively slow internet) you can look at something like ØMQ (ZeroMQ). It is a messaging queue which gives you more control and lots of different communications methods beyond just request-response.
When you set either one up, I would recommend using your CPU-intensive server as the stable end (the server) and your web server(s) as the client(s), assuming that you are using a single machine for the CPU-intensive tasks and running several Node.js server instances to take advantage of multiple cores for your web tier. This simplifies your communication, since you want a single point to connect to.
If you foresee needing multiple CPU servers, you will want to set up a routing server that can route between multiple web servers and multiple CPU servers, and in that case I would recommend the extra work of learning ØMQ.
You can use Node's http.request method to make an HTTP call (the equivalent of a curl request) from your code.
The http.request method is also commonly used when implementing an authentication API.
Put your callback in the request's success handler, and when the response data arrives in Node you can send it back to the user.
Meanwhile, in the background, the Java/Python server can serve Node's requests for the CPU-intensive task.
I maintain a node.js application that intercommunicates among 34 tasks spread across 2 servers.
In your case, for communication between the web server and the app server you might consider mqtt.
I use MQTT for this kind of communication. There are MQTT clients for most languages, including Node/JavaScript, Python, and Java. In my case I publish JSON messages on MQTT 'topics', and any task that has subscribed to a 'topic' receives its data when it is published. If you google "pub sub", "mqtt" and "mosquitto" you'll find lots of references and examples. Mosquitto (now an Eclipse project) is only one of a number of MQTT brokers available. Another very good broker, written in Java, is called HiveMQ.
This is a very simple, reliable solution that scales well. In my case literally millions of messages reliably pass through mqtt every day.
You must be looking for Socket.IO:
Socket.IO enables real-time bidirectional event-based communication.
It works on every platform, browser or device, focusing equally on reliability and speed.
Sockets have traditionally been the solution around which most
realtime systems are architected, providing a bi-directional
communication channel between a client and a server.

Use Apache Thrift for two-way communication?

Is it possible to implement two-way communication between client and server with Apache Thrift? That is, not only to be able to make RPCs from client to server, but also the other way around? In my project I have the requirement that the server must also push some data to the client without the client asking for it first.
There are two ways to achieve this with Thrift.
If both ends are more or less peers and you connect them through sockets or pipes, you simply set up a server and a client on both ends and you're pretty much done. This does not work in all cases, however, especially with HTTP.
If you connect server and client through HTTP or a similar channel, there is a technique called "long polling". It basically requires the client to call the server as usual, but the call will only return when the server has some data to send back to the client. After receiving the data, the client starts another call if it's still interested in more data.
As Denis pointed out, depending on your exact use case, you might want to consider using a MQ system. Note that it is still possible to use Thrift to de/serialize the messages into and from the queues. The contrib folder has some examples that show how to use Thrift with ZMQ, Rebus and some others.
You are better off using queues for that, e.g. ZeroMQ.

Using Node.js as a client to a WebSocket server. Is it still event-driven?

I would like to code a performance test creating hundreds of concurrent WebSocket connections to a vendor's server, which will randomly send/receive messages and then hang up.
Can I use WebSockets with Node.js to do this, and will this be considered event-driven?
To be clear, I am not building a websocket server. I basically want to use nodejs as a client to connect to an outside websocket server but create hundreds of concurrent connections that it will respond to.
Thanks.
Yes you can use node.js and yes it will be event driven. All network I/O in node.js is exclusively asynchronous and event-driven.
Be aware of substack's rant in the hyperquest README about the Node.js core http module's connection pooling, which you will want to make sure you bypass. Presumably you'll be using a helper library such as socket.io or SockJS. Just check that it will create the number of connections you want to a single server without any pooling or throttling.