I am using Twitter's Streaming API in my NodeJS application. I am starting and stopping the server on my development machine. My question is: does the API fetch new data every time? Does it return duplicate data when I start and stop the NodeJS server?
Every Twitter request fetches new data; however, depending on the specific request, the new data may be the same as or different from the old data.
There is nothing called a "live search API." You may be referring to Twitter's Streaming API; if that's the case, then you would not receive duplicate data.
Related
A few questions about how to handle this dataflow. Note that I'm new to Node and Next.js, but I've done some of the basics and am now trying to think ahead, because I've read that WebSockets and Node are good for handling live data.
We have an external API from a game. We have multiple moderators/clients who log in to our system so they can use that API. Right now each of those clients hits that API multiple times per second, which isn't a nice dataflow.
The main GET data from that API is the chat, so ideally that would be live data/push, or otherwise a 1-second refresh. There is also the server info with the players, which could do with a 5-second refresh. Or is it just as easy to fetch that data with the same 1-second/live-data flow as the chat, via the own-API setup I have in mind (described next)? That's the first question, sneaked in.
Then the own-GET flow.
I was thinking of building my own API that makes the requests to the game API. The game API doesn't support a WebSocket, so the idea is to have my own API request the game API once a second, and then have the clients listen to my own API over a WebSocket.
What I think the WebSocket can do is send a push notification to a client when there is new data; on that notification the client sends a GET request to my own API, instead of every client polling my own API once a second.
Just a quick check: is this even possible? And is this the right way of thinking, or could it be optimized even better? Thanks in advance!
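For what it's worth, here is a minimal sketch of the proxy idea described above, assuming Socket.io and a hypothetical game chat endpoint (`GAME_CHAT_URL` is a placeholder). The proxy polls the game API once per second and pushes updates to all connected clients, so the clients never hit the game API themselves. Note that instead of sending just a notification and having each client do a follow-up GET, you can push the new data itself in the WebSocket event and skip the extra round trip.

```js
// Minimal sketch, not production code: one polling loop on the server,
// pushed to clients over Socket.io. GAME_CHAT_URL is a placeholder.
const http = require('http');
const express = require('express');
const { Server } = require('socket.io');
const fetch = require('node-fetch'); // node-fetch v2, or the built-in fetch on Node 18+

const app = express();
const server = http.createServer(app);
const io = new Server(server);

const GAME_CHAT_URL = 'https://game.example.com/api/chat'; // hypothetical endpoint

let lastPayload = null;

async function pollChat() {
  try {
    const res = await fetch(GAME_CHAT_URL);
    const data = await res.json();
    const serialized = JSON.stringify(data);
    if (serialized !== lastPayload) { // only push when something actually changed
      lastPayload = serialized;
      io.emit('chat:update', data);   // push the data itself, not just a "go fetch" ping
    }
  } catch (err) {
    console.error('poll failed', err);
  }
}

setInterval(pollChat, 1000); // chat: 1-second poll
// A second interval (e.g. 5000 ms) could poll the server/player info the same way.

server.listen(3000);
```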
I am new to Socket.io. I have a backend server implemented in Node.js that makes multiple async calls to REST APIs and sends the responses back to a client built with Angular 4. I currently follow the plain HTTP request/response approach. The issue is that it takes a while for the server to collect and parse the data from each of the multiple APIs before sending anything back to the client. I therefore want to implement streaming, so that data is returned to the client as soon as it arrives from any source. Can this be done efficiently with WebSockets and Socket.io, or is there another way to do it?
I know we could make the async calls from the client, but I want to keep this logic on the server rather than the client.
Note: I do not have real-time data.
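One way to do this with Socket.io, sketched below under assumptions (hypothetical upstream URLs and made-up event names such as `load-dashboard` and `partial-result`): fire all upstream requests in parallel and emit each response to the client the moment it resolves, rather than waiting for all of them to finish.

```js
// Minimal sketch: emit each upstream API response as soon as it arrives.
const http = require('http');
const { Server } = require('socket.io');
const fetch = require('node-fetch'); // or the built-in fetch on Node 18+

const server = http.createServer();
const io = new Server(server, { cors: { origin: '*' } });

const UPSTREAM_APIS = [ // hypothetical endpoints
  'https://api-one.example.com/data',
  'https://api-two.example.com/data',
  'https://api-three.example.com/data',
];

io.on('connection', (socket) => {
  socket.on('load-dashboard', () => {
    UPSTREAM_APIS.forEach(async (url) => {
      try {
        const res = await fetch(url);
        const data = await res.json();
        socket.emit('partial-result', { source: url, data }); // stream each piece separately
      } catch (err) {
        socket.emit('partial-error', { source: url, message: err.message });
      }
    });
  });
});

server.listen(3000);
```

On the Angular side, socket.io-client would connect, emit `load-dashboard`, and subscribe with `socket.on('partial-result', ...)`, updating the view incrementally as each piece arrives.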
I have a Lambda function that runs every 5 minutes and queries for new tweets from a particular user ID. It uses the REST API at the moment and works pretty well.
Would using the Streaming API have been a better approach than running a Node.js Lambda function every 5 minutes? Is there a way to use the Streaming API on Lambda with Node.js, or on some other code-hosting service?
Check the code in the following repo; it connects to the Twitter API stream, monitors some keywords, responds, and inserts into DynamoDB.
https://github.com/PBXDom/Twitter-Marketing-Nodejs
We use this code in our company to monitor keywords on Twitter and find related leads.
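The repo itself is the reference; purely as an illustration of the general pattern it describes (a long-lived stream connection plus DynamoDB inserts), a hedged sketch using the `twitter` npm package and the AWS SDK v2 might look like this, with the table name and tracked keywords as placeholders:

```js
// Sketch of the pattern only, not the repo's actual code.
const Twitter = require('twitter');
const AWS = require('aws-sdk');

const client = new Twitter({
  consumer_key: process.env.TWITTER_CONSUMER_KEY,
  consumer_secret: process.env.TWITTER_CONSUMER_SECRET,
  access_token_key: process.env.TWITTER_ACCESS_TOKEN_KEY,
  access_token_secret: process.env.TWITTER_ACCESS_TOKEN_SECRET,
});

const dynamo = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

// Open a filtered stream for the keywords to monitor.
const stream = client.stream('statuses/filter', { track: 'keyword1,keyword2' });

stream.on('data', (tweet) => {
  // Store each matching tweet so it can be processed or replied to later.
  dynamo.put(
    {
      TableName: 'tweets', // placeholder table name
      Item: {
        id: tweet.id_str,
        text: tweet.text,
        user: tweet.user && tweet.user.screen_name,
      },
    },
    (err) => { if (err) console.error('dynamo put failed', err); }
  );
});

stream.on('error', (err) => console.error('stream error', err));
```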
The API endpoint I need to access only offers a live streaming option, but what I need is a regular, non-streaming API. Can I achieve this using the request Node module?
You can hook up to the stream on your server, store the data that arrives locally in a database, and then, when a REST request comes in, look in your local database and satisfy the request from there (the traditional, non-streaming way).
Other than that, I can't figure out what else you might be trying to do. You cannot "turn a streaming API into a non-streaming one"; they just aren't the same kind of thing. A streaming API is like subscribing to a feed of information: you don't make a request, new data is simply sent to you whenever it's available. A typical non-streaming API is one where a client makes a specific request and the server responds with data for that specific request.
Here's a discussion of the Twitter streaming API that might be helpful: https://dev.twitter.com/streaming/overview
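A minimal sketch of that pattern, assuming the `twitter` npm package, Express, and an in-memory array standing in for a real database (the tracked keyword and route are placeholders):

```js
// Sketch: keep a streaming connection open, store what arrives,
// and serve ordinary REST requests from the stored data.
const express = require('express');
const Twitter = require('twitter');

const client = new Twitter({
  consumer_key: process.env.TWITTER_CONSUMER_KEY,
  consumer_secret: process.env.TWITTER_CONSUMER_SECRET,
  access_token_key: process.env.TWITTER_ACCESS_TOKEN_KEY,
  access_token_secret: process.env.TWITTER_ACCESS_TOKEN_SECRET,
});

// Stand-in for a real database table of received items.
const recentTweets = [];

// Long-lived connection to the streaming endpoint; persist whatever arrives.
const stream = client.stream('statuses/filter', { track: 'nodejs' });
stream.on('data', (tweet) => {
  recentTweets.push({ id: tweet.id_str, text: tweet.text });
  if (recentTweets.length > 1000) recentTweets.shift(); // cap memory use
});
stream.on('error', (err) => console.error('stream error', err));

// Ordinary request/response endpoint served from the locally stored data.
const app = express();
app.get('/tweets', (req, res) => {
  res.json(recentTweets.slice(-100)); // latest 100 stored items
});
app.listen(3000);
```

The key point is that the streaming connection stays open independently of the REST traffic; the REST endpoint only ever reads what has already been stored.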
I am developing a Twitter app that (on the backend) consumes Tweets, does some fairly intense processing, and then stores the data in a database for later use by the client. All of my servers run Node.js.
I am going to have a server connected to the Twitter Streaming API using nTwitter for Node.js. I want this server to pass the Tweets along to worker servers and distribute the load based on the Tweet ID (the last digit of the ID would be used).
Right now I am using Socket.io (and socket.io-client), which seems to run pretty well, and the WebSocket protocol seems well suited to this. Are there any reasons not to use Socket.io in this manner?
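As a rough illustration of the fan-out described above, here is a hedged sketch, assuming Socket.io on the ingest server and socket.io-client on the workers; the event names (`register`, `tweet`), host, and port are made up for the example.

```js
// Ingest-server sketch: receive tweets from the stream and route each one
// to a worker based on the last digit of its ID.
const http = require('http');
const { Server } = require('socket.io');

const httpServer = http.createServer();
const io = new Server(httpServer);
httpServer.listen(4000);

// Workers register themselves with the digits they are responsible for.
const workersByDigit = new Map(); // digit (string) -> worker socket

io.on('connection', (socket) => {
  socket.on('register', (digits) => {
    digits.forEach((d) => workersByDigit.set(String(d), socket));
  });
});

// Call this for every tweet that arrives from the Twitter stream.
function routeTweet(tweet) {
  const lastDigit = tweet.id_str.slice(-1);
  const worker = workersByDigit.get(lastDigit);
  if (worker) worker.emit('tweet', tweet);
}

// --- On each worker process (socket.io-client) ---
// const ioClient = require('socket.io-client');
// const socket = ioClient('http://ingest-host:4000');
// socket.emit('register', ['0', '1', '2']); // digits this worker owns
// socket.on('tweet', (tweet) => { /* heavy processing + DB write */ });
```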