What I want to achieve is having a React application receive data posted to a Node.js server. Currently, the Node server receives a POST from an external source with a list of items. When the Node server receives the data, I want to send it on to the React application, which will use it to display the listed items. How would I proceed to make this possible? Any advice is appreciated!
Thank you!
You can do this using WebSockets. It would look something like this:
Your app connects to your API and maintains a socket
The external source will POST to your API
API handles the POST request
Then the API passes some data on to the React app over the web socket
Your app consumes said data
Profit?
Socket.IO is a popular JavaScript library for WebSockets that also provides fallbacks (e.g. long polling) for environments where WebSockets are unavailable.
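A minimal sketch of that flow, assuming Express and Socket.IO v4 (the route and event names here are just placeholders):

// server.js -- receives the external POST and pushes it to connected clients
const express = require("express");
const http = require("http");
const { Server } = require("socket.io");

const app = express();
app.use(express.json());

const server = http.createServer(app);
const io = new Server(server, { cors: { origin: "*" } });

// 1. The external source POSTs its list of items here
app.post("/items", (req, res) => {
  // 2. Forward the payload to every connected React client
  io.emit("items", req.body);
  res.sendStatus(200);
});

server.listen(4000);

On the React side, with socket.io-client, you would subscribe to the same event and put the payload into component state:

import { io } from "socket.io-client";

const socket = io("http://localhost:4000");
socket.on("items", (items) => {
  // e.g. setItems(items) inside a useEffect hook
});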
Situation
I have two applications. The 1st App spins up its own Node.js server and then serves an HTML page with continuous data entries, similar to chat messages coming through one at a time. The 2nd App is a React app that should take the data coming from the 1st App into its own client, but these data entries will be put into graphs and such.
Restrictions: the 1st App must use a Node server that serves an HTML page containing the running data. The 2nd App uses a React frontend and a Socket.io server (with Express if necessary).
Problem
The problem is I am unsure whether I should create a Socket.io server and use CORS so that the server can listen to both client URLs (A = localhost:8080 and B = localhost:3333), then try to grab the data as it hits Client A and emit it to Client B, which React will then display properly.
What I've tried
I've created a bunch of test applications with chat apps and simple clocks sending the seconds in real time. But I just can't wrap my head around how to get these two clients connected through Socket.io. I've looked into implementing socket.io-client, but I am unsure if that will work.
The challenge is that Client A needs to stay simple (a Node server and an HTML page showing incoming data), while Client B takes that data and puts it into visual graphs.
I believe the solution may lie in using CORS and passing both URLs to the server of the 2nd Application:
const socketA = io("localhost:8080"); // Client A
const socketB = io("localhost:3333"); // Client B
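On the 2nd Application's server I imagined something along these lines (just a sketch; the port and event name are placeholders):

const { Server } = require("socket.io");

// Allow both client URLs, as described above
const io = new Server(3000, {
  cors: {
    origin: ["http://localhost:8080", "http://localhost:3333"], // Client A and Client B
  },
});

io.on("connection", (socket) => {
  // Client A would emit its data entries...
  socket.on("data-entry", (entry) => {
    // ...and the server would forward them on to Client B
    io.emit("data-entry", entry);
  });
});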
Any help is greatly appreciated on this approach.
I started to implement an HTTP ping health monitor as a private project with React and Node.js. My idea is a monitor that, on an interval, sends an axios request to the server to receive all the URLs, and then returns the results to the server, which will later be shown on the client side.
I don't want to use a REST API to transfer data between the monitor and the server, or to show it live on the client side.
MONITOR <--> SERVER <--> CLIENT
What should I use instead of a REST API to communicate between the monitor and the server? I know Socket.io is fine for communication between the client and the server, but it is not so good for scaling.
What would be a good, fast way to transfer data for this specific project that is not too hard to implement?
Thanks!
You can work with Server-Sent Events (SSE) in Node.js, which are a way of pushing events from the server. On the client you can then use EventSource to open a connection to the server and begin receiving events from it.
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
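A minimal sketch of the idea, assuming an Express route (the endpoint name and payload are placeholders):

// server.js -- SSE endpoint that streams monitor results to the browser
const express = require("express");
const app = express();

app.get("/events", (req, res) => {
  // These headers mark the response as an event stream
  res.set({
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });
  res.flushHeaders();

  // Push a result whenever the monitor reports one (here: a fake result every 5s)
  const timer = setInterval(() => {
    res.write(`data: ${JSON.stringify({ url: "https://example.com", up: true })}\n\n`);
  }, 5000);

  req.on("close", () => clearInterval(timer));
});

app.listen(4000);

On the client side, EventSource keeps the connection open and fires a handler for each message:

const source = new EventSource("http://localhost:4000/events");
source.onmessage = (event) => {
  const result = JSON.parse(event.data);
  // update your React state with the latest ping result
};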
Take a look at this tutorial from DigitalOcean:
How To Use Server-Sent Events in Node.js to Build a Realtime App
Also take a look at Socket.io:
https://socket.io/
I am creating a web app that uses React on the frontend and Node on the backend. On the backend, a webhook is used to get information about the status of a Twilio conference. Whenever the status changes, it gets posted to an endpoint on my backend. How do I get that information to the front end without constantly polling?
Twilio developer evangelist here.
You are looking for WebSockets, which are a constant connection between a front-end and back-end that you can send data over. If you are working with Node.js then you might find a library like Socket.io a helpful introduction to working with WebSockets. There is a tutorial for building a simple chat that will take you through the basics that you can then apply to this problem.
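A rough sketch of how that could look with Express and Socket.io (the webhook path and event name are examples, not Twilio's API):

// server.js -- forward Twilio status callbacks to the browser over a socket
const express = require("express");
const http = require("http");
const { Server } = require("socket.io");

const app = express();
app.use(express.urlencoded({ extended: false })); // Twilio posts form-encoded data

const server = http.createServer(app);
const io = new Server(server);

// Twilio hits this webhook whenever the conference status changes
app.post("/conference-status", (req, res) => {
  // Push the update to every connected React client
  io.emit("conference-status", req.body);
  res.sendStatus(200);
});

server.listen(3000);

On the React side you would connect with socket.io-client, listen for the conference-status event, and update state when it fires, with no polling involved.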
Is it best to make API calls directly to RabbitMQ from the frontend React Native app, or is it better to make an API call to a backend server endpoint, and bind/queue the messages there, in order to return a JSON response to the frontend once the message is consumed?
My plan is to make a React Native app that uploads large files to Digital Ocean Spaces and then stores other data in Firebase collections. I have a Node.js Express server running on the backend, and I'm wondering if it's best to queue RabbitMQ messages by going through the Express server first, or if I should just queue the messages to RabbitMQ directly from the frontend React Native app?
Here's an SO post with an example fetch() API call to RabbitMQ directly from a frontend React Native app, but I'm wondering how secure this is (because you need to pass user and password credentials in a JSON object), and if it's best to just send all messages to the backend Express server first. I suppose a lot of this may depend on app architecture, but my thinking is that it's best to queue, produce, and consume messages by first going through a third-party client library on the backend, using amqplib for example, especially since most RabbitMQ examples found online do this.
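Roughly what I have in mind on the Express side (just a sketch; the route and queue name are made up, and a real app would reuse the connection instead of opening one per request):

const express = require("express");
const amqp = require("amqplib");

const app = express();
app.use(express.json());

app.post("/uploads", async (req, res) => {
  // The RabbitMQ credentials stay on the server; the React Native app never sees them
  const conn = await amqp.connect("amqp://localhost");
  const channel = await conn.createChannel();
  await channel.assertQueue("uploads");
  channel.sendToQueue("uploads", Buffer.from(JSON.stringify(req.body)));
  await channel.close();
  await conn.close();
  res.json({ queued: true });
});

app.listen(3000);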
I built an app and I'm planning to make a real-time battle with Angular 2 and Laravel. For example, you hit the "attack" button, and your opponent sees his life going down in real time.
My app is built with:
Frontend: Angular 2
Backend: PHP Laravel 5.2
Now I'm researching and learning how to build my real-time battle component, and I've seen different guides and tutorials for it:
https://www.codetutorial.io/laravel-5-and-socket-io-tutorial/
http://4dev.tech/2016/02/creating-a-live-auction-app-with-angular-2-node-js-and-socket-io/
The first tutorial is about how to use Laravel 5 and Socket.io.
The second one is about how to use Angular 2 with Node.js and Socket.io.
(When I say real time, I mean that both users see the same thing happening on the screen.)
My backend and frontend are totally separate, and I have no Node.js setup anywhere in my app.
Both users need to see the actions happening during a battle in my app, and everything needs to go through my Laravel API and be shown via my Angular 2 battle component.
My question is:
What's the best approach to a real-time app (WebSockets, it seems) using Angular 2 and Laravel 5.2 to get the result I'm trying to achieve?
Laravel in this context is just templating and serving the client files, and acting as an interface in between the client and the socket.io server. It doesn't actually act as the socket.io server, and I don't believe it can.
So yes, you would still need something (node) to host the socket.io server to interact with the client, through PHP or otherwise. Personally, I'd skip Laravel/PHP altogether and just use node with koa/express/whatever to template your client (html/js/css/etc) files. Feels like an unnecessary abstraction to me.
The code below from socket.blade.php already has a connection to the actual socket.io server, so I don't see why the additional overhead of an HTTP POST through PHP/Laravel is a good idea. Security, perhaps, but you can handle that with the actual socket.io server as well.
var socket = io.connect('http://localhost:8890'); // talks straight to the socket.io server
socket.on('message', function (data) {
    // append each incoming message to the page
    $( "#messages" ).append( "<p>" + data + "</p>" );
});
For the real-time character of your use-case, websockets are definitely the way to go. The players that should get the updates should be in the same 'room', so you can broadcast changes more easily. For the other functionality you can either use websockets or regular API calls to your backend directly from your client-side app code with some kind of communication between your api and the socket server, e.g. through Redis.
TL;DR:
Option 1: All data goes through sockets; the Node server does the API calls and broadcasts changes to active players.
Option 2: Use the API from the app directly; use pub/sub queue foo for communication between Laravel and Node to broadcast changes to active players.
Option 1:
Angular frontend app
Set up websocket connection
Add triggers for game foo which send data over the socket connection and are handled by your Node server
Only talks to sockets
Node server
Serves frontend app
Handles socket connections, divide players per game
Handles socket calls and calls the Laravel API to do mutations on your data
Process action and broadcast changes to players in game X
Laravel REST API
Auth x
Default CRUD foo
Option 2:
Angular frontend app
Talks to api directly
Uses sockets to listen for updates
Node server
Serves frontend app
Handle websocket data
Listen on queue for published data from API
Broadcast changes to players in game x over socket
Laravel REST API
Auth
Crud
Mutation x triggers a publish in Redis or another queue, which the Node server can/should listen on
I'm sure there are more ways you can set this up; you just have to decide where you want what. Maybe introducing Redis is something you do not want; in that case your Node app will have more to do. If you do want to use something like Redis, you'll need to do API calls from either your frontend app or through the Node app anyway, combining the two options.
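As an illustration of option 2, the Node side could look something like this (a sketch, assuming the ioredis and Socket.IO packages and a made-up battle-updates channel):

// node server -- listen for mutations published by Laravel and push them to players
const { Server } = require("socket.io");
const Redis = require("ioredis");

const io = new Server(8890, { cors: { origin: "*" } });
const sub = new Redis();

// Laravel publishes to this channel whenever a mutation happens
sub.subscribe("battle-updates");

sub.on("message", (channel, message) => {
  const update = JSON.parse(message);
  // Broadcast only to the players in that game's room
  io.to(`game-${update.gameId}`).emit("battle-update", update);
});

io.on("connection", (socket) => {
  // The Angular client joins the room for its game
  socket.on("join-game", (gameId) => socket.join(`game-${gameId}`));
});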
If you are planning to use WebSockets, then there is less use for Laravel, as a single socket server is quite capable of handling all the data exchanged between the frontend and the backend. So if you don't mind changing your engine, you can try Meteor: https://www.meteor.com/