Streaming data from multiple API calls using Node.js and socket.io

I am new to socket.io. My backend Node.js server makes multiple async calls to REST APIs, collects and parses the responses, and sends the combined result back to an Angular 4 client over a plain HTTP request. The problem is that it takes a while for the server to collect and parse the data from every API before the client sees anything. I would therefore like to stream results to the client as soon as each one arrives. Can this be done efficiently with WebSockets and socket.io, or is there another way to do it?
I know the client could make the async calls itself, but I want to keep this logic on the server rather than the client.
Note: I do not have real-time data.
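This kind of incremental delivery is a natural fit for socket.io. A minimal sketch of the idea, assuming hypothetical event names (`api-data`, `api-error`, `api-done`) and upstream fetchers; the core is that each promise emits as soon as it settles instead of waiting for `Promise.all`:

```javascript
// Sketch: instead of waiting for every upstream API before responding,
// emit each result over the socket as soon as its promise settles.
// Event names and fetchers are assumptions, not a fixed API.
function streamResults(fetchers, emit) {
  // fetchers: array of () => Promise<data>; emit: (event, payload) => void
  const tasks = fetchers.map((fetch, i) =>
    fetch()
      .then((data) => emit('api-data', { source: i, data }))
      .catch((err) => emit('api-error', { source: i, message: String(err) }))
  );
  // Tell the client when everything has arrived so it can stop waiting.
  return Promise.all(tasks).then(() => emit('api-done', {}));
}

// Hypothetical wiring with socket.io and two example endpoints:
// io.on('connection', (socket) => {
//   streamResults(
//     [() => fetch('https://api-a.example/data').then((r) => r.json()),
//      () => fetch('https://api-b.example/data').then((r) => r.json())],
//     (event, payload) => socket.emit(event, payload)
//   );
// });
```

The Angular client would listen for `api-data` events and render each partial result as it arrives, using `api-done` to know the batch is complete.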

Related

How to handle the dataflow on Node and Next.js

A few questions about how to handle this dataflow. Note that I'm new to Node and Next.js, but I have the basics down and am trying to think ahead, since I've read that WebSockets and Node are good for handling live data.
We have an external API from a game. Multiple moderators/clients log in to our system so they can use that API. Right now each of those clients hits the API several times per second, which isn't a nice dataflow.
The main GET functions from that API are chat, for which live data/push would be best (or else a 1-second refresh), and the server info with the players, which could do with a 5-second refresh. Or is it just as easy to fetch that data on the same 1-second/live-data cycle as the chat, in the ownGET flow I describe next? That's the first question, sneaked in.
Then the ownGET flow.
I was thinking of building my own API that makes the requests to the game API, since the game API doesn't support WebSockets. The idea is that my own API polls the game API once a second, and the clients listen to my own API over a WebSocket.
What I think the WebSocket can do is send a push notification to the client when there is new data; on that notification, the client sends a GET request to my own API. That avoids one GET per second per client against my own API.
Just a quick check: is this even possible? And is this the right way of thinking, or could it be optimized even further? Thanks in advance!
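It is possible, and you can go one step further than a notification: since the server already has the data, it can push the data itself when it changes. A rough sketch under assumptions (1-second interval, a hypothetical `fetchChat` for the game API, a `broadcast` standing in for `io.emit`):

```javascript
// Sketch of the "own API" idea: one server-side loop polls the game API,
// and connected clients get a push only when something actually changed.
// fetchChat, the event name, and the interval are assumptions.
function makeChangeDetector() {
  let last;
  return (data) => {
    const snapshot = JSON.stringify(data);
    if (snapshot === last) return false; // nothing new, stay quiet
    last = snapshot;
    return true;
  };
}

function startPolling(fetchChat, broadcast, intervalMs = 1000) {
  const changed = makeChangeDetector();
  const timer = setInterval(async () => {
    try {
      const data = await fetchChat(); // one upstream request per tick, total
      if (changed(data)) broadcast('chat', data); // push the data itself
    } catch (err) {
      // ignore transient upstream errors; the next tick retries
    }
  }, intervalMs);
  return () => clearInterval(timer); // call to stop polling
}
```

With this shape, N clients cost the game API one request per second regardless of N, and clients don't need a follow-up GET because the push already carries the payload.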

How to use the Traccar API to get real-time location automatically without setInterval or sockets

I'm using Node.js and Express. While getting the live location over a WebSocket, the service sometimes errors out or disconnects. I tried the setInterval function, but having the server poll every 5 seconds is not good. I want real-time data without a request from the client side. How?
You need a server-push solution, e.g. WebSocket, SSE, etc. It does not need to send the real data, just a notification that the client should refresh because there is new data for it, or perhaps a hyperlink to follow. After the client gets the notification, it can make HTTP requests.

Node.js built-in support for handling requests for the same data

In my Node.js server app I'm providing a service to my JS clients that wraps some remote APIs.
It may well happen that two different clients request the same information. Say client 1 requests information, and before that request is fully handled (the remote APIs haven't returned their responses yet), client 2 requests the same data. What I'd want is to wait for client 1's data to be ready and then write it to both client 1 and client 2.
This seems like a very common issue, and I was wondering whether there is any library or built-in support in Connect or Express for it.
You might not want to use HTTP for providing the data to the client. Reasons:
If the remote API takes a long time to process, you risk the client request timing out, or the browser repeating the request.
You will have to share some state between requests which is not a good practice.
Have a look at websockets (socket.io would be a place to start). With them you can push data from the server to the client. In your scenario, clients make the request to the server, which returns 202 Accepted, and when the remote API responds, the server pushes the data to the clients over the websocket.
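Whichever transport you choose, the deduplication itself is small enough that no library is needed: keep a map of in-flight promises keyed by the request, and let later callers await the same promise as the first. A sketch, where `doFetch` and the key scheme are assumptions:

```javascript
// Sketch of coalescing identical in-flight requests: the first caller
// for a key triggers the remote API; callers arriving before it settles
// piggyback on the same promise and all receive the one response.
// doFetch and the key format are assumptions, not a fixed API.
const inFlight = new Map();

function fetchShared(key, doFetch) {
  if (inFlight.has(key)) return inFlight.get(key); // join the pending call
  const promise = doFetch(key).finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return promise;
}
```

The `.finally` cleanup matters: once the response (or error) has been delivered, the key is removed so a later request triggers a fresh upstream call instead of serving stale data forever.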

Streaming an API using request module

The API endpoint I need to access only offers a live streaming option, but what I need is a regular, non-streaming API. Can I achieve this using the request Node module?
You can hook up to the stream on your server and store the data that arrives in the stream locally on the server in a database; then, when a REST request comes in for some data, you look in your local database and satisfy the request from there (the traditional, non-streaming way).
Other than that, I can't figure out what else you might be trying to do. You cannot "turn a streaming API into a non-streaming one". They just aren't even close to the same thing. A streaming API is like subscribing to a feed of information: you don't make a request, new data is just sent to you when it's available. With a typical non-streaming API, a client makes a specific request and the server responds with data for that specific request.
Here's a discussion of the Twitter streaming API that might be helpful: https://dev.twitter.com/streaming/overview
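The first paragraph's approach (stream in, cache, serve out) can be sketched with an in-memory cache standing in for the database; the record limit, stream wiring, and route are assumptions:

```javascript
// Sketch of the answer's approach: subscribe to the streaming feed once,
// keep the most recent records in memory (a real database in production),
// and let ordinary non-streaming REST handlers read from that store.
// The limit, feed wiring, and route below are assumptions.
function createStreamCache(limit = 100) {
  const records = [];
  return {
    // Wire this to the streaming source, e.g. feed.on('data', cache.push)
    push(record) {
      records.push(record);
      if (records.length > limit) records.shift(); // drop the oldest
    },
    // Called from a regular request/response REST handler
    latest(n = 10) {
      return records.slice(-n);
    },
  };
}

// Hypothetical wiring with Express:
// const cache = createStreamCache(500);
// feed.on('data', (rec) => cache.push(rec));
// app.get('/api/latest', (req, res) => res.json(cache.latest(20)));
```

Note the trade-off this makes explicit: the REST side only ever sees what the server happened to capture, so the cache (or database) size and retention policy define how far back a client can query.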

RESTful backend and socket.io to sync

Today I had the idea for the following setup: create a Node.js server with Express and socket.io. With Express I would build a RESTful API backed by MongoDB, and Backbone.js or similar would connect the client to that REST API.
Now, every time the MongoDB data I am interested in changes, socket.io would fire an event to the client carrying a cursor to the data that changed. The client would then trigger the appropriate AJAX requests against the REST API to fetch the new data where it needs it.
So the socket.io connection would act as a synchronization trigger. It would be there for the entire visit and could also manage sessions that way. All the payload would be sent over HTTP.
Pros:
REST API for use with other clients than web
Auth could be done entirely over socket.io, only sending a token along with REST requests.
Use the benefits of REST.
Would also play nicely with a pub/sub service like Redis's.
Cons:
Greater overhead than using pure socket.io.
What do you think? Are there any great disadvantages I did not think of?
I agree with @CharlieKey: you should send the updated data rather than re-requesting it.
This is exactly what Tower is doing:
save some data: https://github.com/viatropos/tower/blob/development/src/tower/model/persistence.coffee#L77
insert into mongodb (cursor is a query/persistence abstraction): https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L29
notify sockets: https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L68
emit updated records to client: https://github.com/viatropos/tower/blob/development/src/tower/server/net/connection.coffee#L62
The disadvantage of using sockets as a trigger to re-request with Ajax is that every connected client will have to fetch the data, so if 100 people are on your site there are going to be 100 HTTP requests every time the data changes, whereas you could just reuse the socket connections.
I think pushing the updated data with the socket.io event would be better than re-requesting the latest. Even better, you could push only the modified pieces of data, decreasing the amount sent over the wire. Overall, though, an interesting idea.
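"Only push the modified pieces" can be as simple as diffing the new document against the old one before emitting. A sketch, where the event name and the `io.emit` wiring are assumptions (and the diff is shallow, so nested objects would need a deeper comparison):

```javascript
// Sketch of pushing only the changed fields: compare the new document to
// the previous version and emit just the delta over the socket, instead
// of asking every client to re-fetch. Event name is an assumption, and
// this shallow diff ignores nesting and deleted keys for brevity.
function shallowDiff(prev, next) {
  const patch = {};
  for (const key of Object.keys(next)) {
    if (prev[key] !== next[key]) patch[key] = next[key];
  }
  return patch;
}

// Hypothetical use after a MongoDB write:
// const patch = shallowDiff(oldDoc, newDoc);
// if (Object.keys(patch).length) {
//   io.emit('doc:patch', { id: newDoc._id, patch });
// }
```

Each client then merges the patch into its local model, which keeps the 100-clients case at one socket broadcast instead of 100 HTTP round trips.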
I'd look into Now.js since it does pretty much exactly what you need.
It creates a namespace which is shared among the client and server. The server can call functions on the client directly and vice versa.
That is, if you insist on your current infrastructure decision to use MongoDB and Node.js; otherwise there is CouchDB, a full web server and document database with sophisticated replication mechanisms built in.
