I currently have a socket.io server spawned by a Node.js web API server.
The UI runs separately and connects to the API via WebSocket. This is mostly used for notifications and connectivity status checks.
However, the API also acts as a gateway for several micro services. One of these is responsible for computing the data the UI needs to render some charts. This operation is long-running and, for several reasons, the computation only starts when a request is received.
In a nutshell, the UI sends a REST request to the API, and the API currently uses gRPC to forward the request to the micro service. This is bad because it blocks both the API and the UI.
To avoid this blocking, the socket server on the API should be able to relay both the UI request and the "computation ended" event emitted by the micro service; this way nothing would be blocked. Eventually this could allow the gRPC server on the micro service to be removed.
Is this something achievable with socket.io?
If not is the only way for the API to spawn a secondary socket connection to the micro service for each one received by the UI?
Is this a bad idea?
I hope this is clear, thanks.
I actually ended up not using socket.io. However, this can still be done with it if the API spawns a server and has the different services connect as clients; https://socket.io/docs/rooms-and-namespaces/ can be used.
This way messages can be "relayed", and even broadcast from the server to both sides when something happens.
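For reference, a minimal sketch of that relay, assuming socket.io v4, a server on port 3000, and made-up event and room names ("compute-request", "computation-ended", "ui", "compute-service"):

```js
// api.js: the API spawns the socket.io server; UI and services connect as clients
const { Server } = require("socket.io");
const io = new Server(3000);

io.on("connection", (socket) => {
  // Hypothetical handshake: each client declares its role and joins a room for it.
  socket.on("register", (role) => socket.join(role));

  // Relay the UI's request to the compute service...
  socket.on("compute-request", (params) =>
    io.to("compute-service").emit("compute-request", params));

  // ...and relay the "computation ended" event back to every connected UI.
  socket.on("computation-ended", (result) =>
    io.to("ui").emit("computation-ended", result));
});
```

```js
// compute-service.js: the micro service is just another socket.io client
const { io } = require("socket.io-client");
const socket = io("http://localhost:3000");

socket.on("connect", () => socket.emit("register", "compute-service"));
socket.on("compute-request", async (params) => {
  const result = await runLongComputation(params); // placeholder for the real work
  socket.emit("computation-ended", result);
});
```

Nothing blocks here: the request handler returns immediately, and the result travels back as an event whenever the computation finishes.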
I started implementing an HTTP ping health monitor as a private project with React and Node.js. My idea was a monitor that runs on an interval, sends an axios request to the server to receive all the URLs, and returns the results to the server, which are later shown on the client side.
I don't want to use a REST API to transfer data between the monitor and the server, and I want to show the results live on the client side.
MONITOR <--> SERVER <--> CLIENT
What should I use instead of a REST API to communicate between the monitor and the server? I know socket.io is fine for communicating between the client and the server, but it is not so good for scaling.
What would be good and fast for transferring data in this specific project, and not too hard to implement?
Thanks!
You can work with Server-Sent Events in Node.js; they are a way of receiving events from the server. On the client you can use EventSource to open a connection to the server and begin receiving events from it.
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
Take a look at this tutorial from DigitalOcean:
How To Use Server-Sent Events in Node.js to Build a Realtime App
Also take a look at Socket.io:
https://socket.io/
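For a concrete starting point, here is a minimal SSE sketch, assuming a plain Node.js http server on port 3000 and an illustrative payload:

```js
// server.js: a Server-Sent Events endpoint
const http = require("http");

http.createServer((req, res) => {
  if (req.url !== "/events") return res.end();
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });
  // Push a (made-up) monitoring result every 5 seconds.
  const timer = setInterval(() => {
    res.write(`data: ${JSON.stringify({ url: "https://example.com", up: true })}\n\n`);
  }, 5000);
  req.on("close", () => clearInterval(timer));
}).listen(3000);
```

On the React side, the browser's built-in EventSource consumes it:

```js
const source = new EventSource("http://localhost:3000/events");
source.onmessage = (e) => console.log(JSON.parse(e.data));
```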
I'm new to working with sockets and have a small system design question:
I have two separate Node processes for a web app: one is a simulator that is constantly running, and the second is an API server. Both share the same MongoDB database, and we have a React app running for the client, served by the API server.
I'm looking to implement socket.io for real-time notifications and so I've set up a simple connection between the api and client.
My problem is that while the simulator runs, there are some events that I also want to trigger push notifications for, so my question is how to hook that into everything.
The file hierarchy is like:
app/
simulator/
api/
client/
I saw this article for communication between node processes and I currently have 3 solutions in mind:
1. Leave the hierarchy as it is and install the socket.io package inside simulator as well. I'm not sure if sockets work this way, but can both simulator and api connect to the same socket?
2. Move the simulator file into the api file to fork as a child process, so that the two processes can communicate via parent/child messaging. simulator will message api, which will then emit updates through the socket to client.
3. Leave the hierarchy as is and communicate via node-ipc. Same situation as above, with simulator messaging api first before api emits that to client.
If option 1 is possible, that seems like the best solution to me. It seems like extra work to add an additional layer of messaging for options 2 and 3.
Leave the hierarchy as it is and install the socket.io package inside simulator as well. I'm not sure if sockets work this way, but can both simulator and api connect to the same socket?
The client would have to create a separate socket.io connection to the simulator process; it can then receive data from the API server over one connection and from the simulator over the other. You need two separate, independent socket.io connections from the client because the simulator and the API server cannot share the same socket unless they are in the same process.
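For completeness, that two-connection setup would look roughly like this on the client (ports and event names are made up):

```js
// client: two independent socket.io connections
import { io } from "socket.io-client";

const apiSocket = io("http://localhost:3000"); // API server
const simSocket = io("http://localhost:3001"); // simulator process

apiSocket.on("notification", (msg) => console.log("from API:", msg));
simSocket.on("sim-event", (msg) => console.log("from simulator:", msg));
```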
Move the simulator file into the api file to fork as a child process, so that the two processes can communicate via parent/child messaging. simulator will message api, which will then emit updates through the socket to client.
This is really one instance of a broader option: the simulator communicates with the API server, sending it data that the API server then forwards to the client over the single socket.io connection the client made to the API server.
There are lots of different ways for the simulator process to communicate with the API server.
Since it's already an API server, you can just make an API for this (probably non-public). The simulator calls an API to send data to the client. The API server receives that data and sends it to the client.
As you suggest, if the simulator is run from the API server as a child process, then you can use the parent/child messaging built into Node.js. Note, you don't have to move the simulator files into the API project at all. You can just use child_process to launch the simulator as another Node.js app from another project; you just have to know the path to that other project.
You can use any other communication mechanism you want between the simulator process and the API server process: there could be a socket.io connection between them, you could use several forms of IPC, etc.
If option 1 is possible, that seems like the best solution to me.
Your option 1 is not possible, as separate processes can't use the same socket.io connection.
It seems like extra work to add an additional layer of messaging for options 2 and 3.
My first two options above (the internal API and parent/child messaging) are not much code in each server. You're doing interprocess communication; you should expect to write some code to enable that, but it's not hard at all.
If the lifetimes of the simulator server and the API server are always tied together (they have no independent uses), then I'd probably do the child-process approach, where the API server launches the simulator and then uses parent/child messaging to communicate with it. You do NOT have to combine sources to do this.
The child_process module can run the simulator process by just knowing what directory it is located in.
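A minimal sketch of that wiring, assuming the simulator's entry point is at ../simulator/index.js (path, event names, and the existing `io` socket.io instance are illustrative):

```js
// api/server.js: launch the simulator and relay its messages to clients
const { fork } = require("child_process");
const path = require("path");

// fork() starts another Node.js app and gives us a built-in IPC channel to it.
const simulator = fork(path.join(__dirname, "../simulator/index.js"));

simulator.on("message", (event) => {
  io.emit("notification", event); // `io` is the existing socket.io server instance
});
```

```js
// simulator/index.js: report events to the parent over the IPC channel
process.send({ type: "simulation-step", payload: {} });
```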
Otherwise, I'd probably make a small web server on a non-public port in the API server and have the simulator just send data to that other web server. I often refer to this as a control port. It's a way of "controlling or diagnosing" the API server internals and can only be accessed from within the private network and/or with credentials. The reason I'd use a separate web server (in the same nodejs app as the API server) is to make it easy to secure so it can't be accessed from the outside world like the regular public APIs can. You just put the internal web server on a port that is not exposed to the outside world.
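One hedged sketch of such a control port, assuming Express and an internal port (both illustrative):

```js
// inside the API server process: a second, internal-only web server
const express = require("express");
const control = express();
control.use(express.json());

// The simulator POSTs data here; the API relays it over the public socket.io server.
control.post("/push", (req, res) => {
  io.emit("notification", req.body); // `io` is the public-facing socket.io instance
  res.sendStatus(204);
});

// Bind to localhost only so it is never reachable from the outside world.
control.listen(4000, "127.0.0.1");
```

The simulator then just POSTs to http://127.0.0.1:4000/push whenever it has something for the clients.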
You should check the Socket.IO docs about adapters and emitters. These allow sockets to be reached from different Node processes and help with scalability.
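As a sketch of that approach: assuming the API server registers @socket.io/redis-adapter on its socket.io instance, the simulator can emit through the API's sockets with @socket.io/redis-emitter without holding any socket connection itself:

```js
// simulator side: emit to clients connected to another process, via Redis
const { createClient } = require("redis");
const { Emitter } = require("@socket.io/redis-emitter");

const redisClient = createClient();
redisClient.connect().then(() => {
  const emitter = new Emitter(redisClient);
  emitter.emit("notification", { text: "simulation finished" });
});
```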
In my Node.js server app I'm providing a service to my JS client that does some handling of remote APIs.
It might very well happen that two different clients request the same information. Say client 1 requests information; then, before client 1's request is fully handled (the remote APIs haven't returned their responses yet), client 2 requests the same data. What I'd want is to wait for client 1's data to be ready and then write it to both client 1 and client 2.
This seems like a very common issue, and I was wondering if there is any library or built-in support in connect or express that handles it.
You might not want to use HTTP for providing the data to the client. Reasons:
If the remote API takes a long time to process, you risk the client request timing out, or the browser repeating the request.
You will have to share some state between requests, which is not a good practice.
Have a look at websockets (socket.io would be a place to start). With them you can push data from the server to the client. In your scenario, clients perform the request to the server, which returns 202 (Accepted); when the remote API responds, the server pushes the data to the clients over the websocket.
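A rough sketch of that flow, which also coalesces duplicate in-flight requests so the remote API is only called once. It assumes an existing Express `app` and socket.io `io`, made-up event and room names, and only works within a single Node process:

```js
// server: reply 202 immediately, push the result over socket.io when ready
const pending = new Map(); // key -> Promise of the remote API result

io.on("connection", (socket) => {
  // The client announces which result it is waiting for and joins a room for it.
  socket.on("wait-for", (key) => socket.join(`waiting:${key}`));
});

app.get("/data/:key", (req, res) => {
  const { key } = req.params;
  if (!pending.has(key)) {
    // Only the first request triggers the slow remote call; later ones reuse it.
    const p = fetchFromRemoteApi(key); // placeholder for the remote API handling
    pending.set(key, p);
    p.then((data) => {
      io.to(`waiting:${key}`).emit("data-ready", data); // both clients receive it
      pending.delete(key);
    });
  }
  res.sendStatus(202); // accepted; the data will arrive over the websocket
});
```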
I am working on a WebRTC application where a P2P connection is established between a customer and free agents. The agents are fetched using an AJAX call in the application. I want to scale the application so that agents running on any node server can communicate, and agent status updates (available, busy, unavailable) can be performed.
My problem statement is that the application is running on port 8040 and the agents service is running on port 8088, with the application making AJAX calls to fetch the data. What can best be done to scale the agents, or any ideas on how to scale the application?
I followed https://github.com/rajaraodv/redispubsub using Redis pub/sub, but my problem is not resolved, as the agents are being updated and fetched on another node using AJAX calls.
You didn't give enough info... but to scale your Node.js app you need a central place that holds all the needed info and can itself scale; Redis can scale easily, and you can also try socket.io, etc.
Once you have your Redis cluster, for example, you need to make all your Node.js servers communicate with the Redis server. That way all your node servers will have access to the same info; it is then up to you to send the right info to the right clients.
Message Bus approach:
The AJAX call is sent to one of the Node.js servers. If the message doesn't find its destination on that server, it is sent to the next one, and so on. The signaling server must therefore distribute the received message to all the other nodes in the cluster by establishing a message bus.
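A minimal Redis pub/sub sketch of such a bus, where every node subscribes to one shared channel (the channel name is illustrative; redis v4 client):

```js
// run on every node server in the cluster
const { createClient } = require("redis");

async function joinBus(onAgentUpdate) {
  const pub = createClient();
  const sub = pub.duplicate(); // a subscribing connection cannot also publish
  await Promise.all([pub.connect(), sub.connect()]);

  // Every node receives every update, so they all converge on the same agent state.
  await sub.subscribe("agent-status", (msg) => onAgentUpdate(JSON.parse(msg)));

  // Call the returned function whenever an agent changes status on this node.
  return (update) => pub.publish("agent-status", JSON.stringify(update));
}
```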
I have an SPA (Backbone on the client and Node.js on the server). All communication in both directions goes through websockets. Now I wonder: is this a good idea? What are the cons compared to this approach: the client sends data to the server via a REST API, and the server sends data to the client via websockets?
thanks.
UPD:
I have websockets in any case, because my app is a multi-room chat.
Even if you only consider RPC ("Remote Procedure Calls"), REST is less capable than WebSocket.
REST, since it runs over HTTP, cannot pipeline RPCs. Each HTTP connection can only serve 1 RPC synchronously. And browsers limit the number of parallel HTTP connections to a given origin.
With RPC over WebSocket, you can fire off 100 RPCs pipelined, and process RPC returns asynchronously as they come in.
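To make the pipelining concrete, here is a bare sketch over a raw browser WebSocket using correlation ids, assuming a server that replies with the same id (no WAMP, just the idea):

```js
const ws = new WebSocket("ws://localhost:8080");
const inflight = new Map(); // id -> resolve callback
let nextId = 0;

function call(method, params) {
  return new Promise((resolve) => {
    const id = nextId++;
    inflight.set(id, resolve);
    ws.send(JSON.stringify({ id, method, params })); // no waiting between calls
  });
}

ws.onmessage = (e) => {
  const { id, result } = JSON.parse(e.data);
  inflight.get(id)(result); // returns may arrive in any order
  inflight.delete(id);
};

// 100 RPCs go out back-to-back on one connection, processed as replies come in.
ws.onopen = () =>
  Promise.all(Array.from({ length: 100 }, (_, i) => call("square", [i])))
    .then(console.log);
```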
Then, with WebSocket, you can have server-initiated notifications as well. E.g. you can have full-flavored Publish & Subscribe.
WAMP ("The Web Application Messaging Protocol") runs over WebSocket and was designed exactly for this: SPAs that need 2 messaging patterns in 1 protocol - RPC and PubSub.
Disclaimer: I am the original author of WAMP and work for Tavendo.
If the server needs an indeterminate amount of time to prepare the data, it may be a good idea, though.
Basically, there is no reason to use websockets (socket.io) for a REST API.
Because of what a REST API stands for, you don't have to keep a connection established, nor wait for someone's action the way a broadcasting server does.
EDIT, answering the comment:
Even if you already use websockets, it doesn't mean you can't handle normal request/response.
A REST API with websockets is like:
GET request -> server response -> client tries io.connect(); -> connection established -> server sends data to the client through the websocket
and a normal REST API is like:
GET request -> server response
Which do you choose?