I built an app and I'm planning to add a real-time battle feature with Angular 2 and Laravel. For example, you hit the "attack" button, and your opponent sees his life going down in real time.
My app is built with:
Frontend: Angular 2
Backend: PHP Laravel 5.2
Now I'm searching and learning how to build my real-time battle component, and I've seen different guides and tutorials for it:
https://www.codetutorial.io/laravel-5-and-socket-io-tutorial/
http://4dev.tech/2016/02/creating-a-live-auction-app-with-angular-2-node-js-and-socket-io/
The first tutorial is about how to use Laravel 5 with socket.io.
The second one is about how to use Angular 2 with Node.js and socket.io.
When I say real time, I mean that both users see the same thing happening on the screen.
My Backend and Frontend are totally divided and I have no setup with NodeJS anywhere in my app.
Both users need to see the actions happening during a battle in my app, and it needs to go through my Laravel API and be shown via my Angular 2 battle component.
My question is:
What's the best approach to a real-time app (WebSockets, it seems) using Angular 2 and Laravel 5.2 to get the desired result of what I'm trying to achieve?
Laravel in this context is just templating and serving the client files, and acting as an interface in between the client and the socket.io server. It doesn't actually act as the socket.io server, and I don't believe it can.
So yes, you would still need something (node) to host the socket.io server to interact with the client, through PHP or otherwise. Personally, I'd skip Laravel/PHP altogether and just use node with koa/express/whatever to template your client (html/js/css/etc) files. Feels like an unnecessary abstraction to me.
The code below from socket.blade.php already has a connection to the actual socket.io server, so I don't see why the additional overhead of an HTTP POST through PHP/Laravel is a good idea. Security, perhaps, but you can handle that with the actual socket.io server as well.
// connect to the socket.io server (this runs in the browser, from socket.blade.php)
var socket = io.connect('http://localhost:8890');
socket.on('message', function (data) {
    // append each incoming message to the page
    $("#messages").append("<p>" + data + "</p>");
});
For the real-time character of your use case, websockets are definitely the way to go. The players that should get the updates should be in the same 'room', so you can broadcast changes more easily. For the other functionality you can either use websockets, or make regular API calls to your backend directly from your client-side app code, with some kind of communication between your API and the socket server, e.g. through Redis.
TL;DR:
1. All data through sockets; the node server does the API calls and broadcasts changes to active players.
2. Use the API from the app directly; use pub/sub queue foo for communication between Laravel and node to broadcast changes to active players.
Option 1:
Angular frontend app
- Sets up the websocket connection
- Adds triggers for game foo, which send data over the socket connection and are handled by your node server
- Only talks to sockets
Node server
- Serves the frontend app
- Handles socket connections, dividing players per game
- Handles socket calls and calls the Laravel API to do mutations on your data
- Processes actions and broadcasts changes to the players in game X
Laravel REST API
- Auth
- Default CRUD foo
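A minimal sketch of what the Option 1 node server could look like (the event names, the room naming and the API URL are assumptions, not something prescribed above):

var app = require('express')();
var server = require('http').createServer(app);
var io = require('socket.io')(server);
var request = require('request'); // any HTTP client will do

io.on('connection', function (socket) {
    // put the player in the room for their game
    socket.on('join', function (gameId) {
        socket.join('game:' + gameId);
    });

    // a game action: persist it through the Laravel API, then broadcast
    socket.on('attack', function (data) {
        request.post({
            url: 'http://localhost:8000/api/games/' + data.gameId + '/attack',
            json: data
        }, function (err, res, body) {
            if (err) return socket.emit('game:error', 'attack failed');
            io.to('game:' + data.gameId).emit('game:update', body);
        });
    });
});

server.listen(8890);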
Option 2:
Angular frontend app
- Talks to the API directly
- Uses sockets to listen for updates
Node server
- Serves the frontend app
- Handles websocket data
- Listens on a queue for data published by the API
- Broadcasts changes to the players in game X over the socket
Laravel REST API
- Auth
- CRUD
- A mutation on X triggers a publish in Redis or another queue, which the node server can/should listen on
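And a minimal sketch of the Option 2 node side, assuming Laravel publishes a JSON payload on a Redis channel after each mutation (the channel and event names are made up):

var server = require('http').createServer();
var io = require('socket.io')(server);
var redis = require('redis');
var sub = redis.createClient();

// rebroadcast whatever the Laravel API publishes after a mutation
sub.subscribe('game-events');
sub.on('message', function (channel, message) {
    var event = JSON.parse(message); // e.g. { gameId: 1, type: 'attack', ... }
    io.to('game:' + event.gameId).emit('game:update', event);
});

io.on('connection', function (socket) {
    socket.on('join', function (gameId) {
        socket.join('game:' + gameId);
    });
});

server.listen(8890);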
I'm sure there are more ways you can set this up; you just have to decide where you want what. Maybe introducing Redis is something you do not want; in that case your node app will have more to do. If you do want to use something like Redis, you'll either need to do the API calls from your frontend app, or choose to do them through the node app anyway, combining the two options.
If you are planning to use websockets, then there seems to be little use for Laravel, as a single socket connection is quite capable of handling all the data exchanged between the frontend and the backend. So if you don't mind changing your engine, you can try Meteor: https://www.meteor.com/
Related
I'm developing a React app and I need to trigger notifications in it for database changes. My backend is Node and I'm not sure how to achieve this task. Should I always be listening on the front end for backend notifications? I need to do it the way Facebook does. When I developed my Flutter apps I used Firebase streams to achieve this, but I don't know how to do this with React and Node and a PostgreSQL database.
You can use raw WebSockets or the socket.io library.
https://socket.io/
Old browsers don't support WebSockets; in that case you need to check repeatedly from the front end whether there is any notification from the backend, say when the database changes. This is called polling.
But socket.io falls back to polling automatically if the browser doesn't support WebSockets.
Socket.io is used by many applications, and it seems to solve your purpose. It is event based: once there is a database change in the backend, and you have set up a socket.io event emitter, your front end will receive the event via the socket.io client and your React app can notify the user.
From their website,
Socket.IO enables real-time, bidirectional and event-based communication.
It works on every platform, browser or device, focusing equally on reliability and speed.
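A rough sketch of that flow (the event name and ports are arbitrary): the node side emits an event whenever it applies a change to PostgreSQL, and the React side listens for it.

// server: call notifyClients() wherever you apply the database change
var io = require('socket.io')(3000);
function notifyClients(change) {
    io.emit('notification', change);
}

// client (React): socket.io-client falls back to polling on old browsers
import io from 'socket.io-client';
const socket = io('http://localhost:3000');
socket.on('notification', (change) => {
    // update component state / show the notification to the user
});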
I'm new to working with sockets and have a small system design question:
I have 2 separate node processes for a web app: one is a simulator that is constantly running, and the second is an API server. Both share the same MongoDB database, and we have a React app running for the client, served by the API server.
I'm looking to implement socket.io for real-time notifications, and so I've set up a simple connection between the API and the client.
My problem is that while the simulator runs, there are some events that I also want to trigger push notifications for, so my question is how to hook that into everything.
The file hierarchy is like:
app/
simulator/
api/
client/
I saw this article on communication between node processes, and I currently have 3 solutions in mind:
1. Leave the hierarchy as it is and install the socket.io package inside simulator as well. I'm not sure if sockets work this way, but can both simulator and api connect to the same socket?
2. Move the simulator file into the api file to fork it as a child process, so that the 2 processes can communicate via child/parent messaging. simulator will message api, which will then emit updates through the socket to the client.
3. Leave the hierarchy as is and communicate via node-ipc. Same situation as above, with simulator messaging api first before api emits the update to the client.
If option 1 is possible, that seems like the best solution to me. It seems like extra work to add an additional layer of messaging for options 2 and 3.
Leave the hierarchy as it is and install the socket.io package inside simulator as well. I'm not sure if sockets work this way, but can both simulator and api connect to the same socket?
The client would have to create a separate socket.io connection to the simulator process: it would then receive data from the API server over one connection and from the simulator over another. You need two separate, independent socket.io connections from the client, one to the API server and one to the simulator. The simulator and the API server cannot share the same socket unless they are in the same process.
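A minimal sketch of what that looks like from the client (both ports are made up):

// `io` comes from the socket.io-client script included in the page
var apiSocket = io('http://localhost:3000'); // API server
var simSocket = io('http://localhost:3001'); // simulator

apiSocket.on('api:update', function (data) { /* handle API events */ });
simSocket.on('sim:event', function (data) { /* handle simulator events */ });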
Move the simulator file into the api file to fork it as a child process, so that the 2 processes can communicate via child/parent messaging. simulator will message api, which will then emit updates through the socket to the client.
This is really part of a broader option: the simulator communicates with the API server and sends it data that the API server can then send to the client over the single socket.io connection that the client made to the API server.
There are lots of different ways for the simulator process to communicate with the API server:
1. Since it's already an API server, you can just make a (probably non-public) API for this. The simulator calls that API to send data, and the API server receives it and sends it on to the client.
2. As you suggest, if the simulator is run from the API server as a child process, then you can use the parent/child communication messaging built into node.js. Note that you don't have to move the simulator files into the API file at all. You can just use child_process to launch the simulator as another nodejs app from another project; you just have to know the path to that other project.
3. You can use any other communication mechanism you want between the simulator process and the API server process: a socket.io connection between them, several forms of IPC, etc...
If option 1 is possible, that seems like the best solution to me.
Your option 1 is not possible, as separate processes can't use the same socket.io connection.
It seems like extra work to add an additional layer of messaging for options 2 and 3.
My options #1 and #2 are not much code in each server. You're doing interprocess communication; you should expect to use some code to enable that. But it's not hard at all.
If the lifetimes of the simulator server and the API server are always tied together (they have no independent uses), then I'd probably do the child-process thing, where the API server launches the simulator, and then use parent/child messaging to communicate between them. You do NOT have to combine sources to do this.
The child_process module can run the simulator process by just knowing what directory it is located in.
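A minimal sketch of that setup, assuming a hypothetical path to the simulator's entry file:

// in the API server (port and path are illustrative)
const { fork } = require('child_process');
const io = require('socket.io')(3000);

// launch the simulator from its own project directory
const simulator = fork('../simulator/index.js');

// forward simulator events to connected socket.io clients
simulator.on('message', (event) => {
    io.emit('notification', event);
});

// in the simulator, wherever an interesting event happens:
// process.send({ type: 'someEvent', payload: data });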
Otherwise, I'd probably make a small web server on a non-public port in the API server and have the simulator just send data to that other web server. I often refer to this as a control port. It's a way of "controlling or diagnosing" the API server internals and can only be accessed from within the private network and/or with credentials. The reason I'd use a separate web server (in the same nodejs app as the API server) is to make it easy to secure so it can't be accessed from the outside world like the regular public APIs can. You just put the internal web server on a port that is not exposed to the outside world.
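A minimal sketch of such a control port inside the API server app (the port and the route are assumptions):

const express = require('express');
const io = require('socket.io')(3000); // the public socket.io server

// a second, internal-only express server: the control port
const internal = express();
internal.use(express.json());

// the simulator POSTs events here; we push them out over socket.io
internal.post('/internal/notify', (req, res) => {
    io.emit('notification', req.body);
    res.sendStatus(204);
});

// bind to localhost only so the outside world can't reach it
internal.listen(4000, '127.0.0.1');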
You should check the Socket.IO docs about adapters and emitters. These allow you to connect to sockets from different node processes, and help with scalability.
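For example, a sketch using the socket.io-redis adapter and socket.io-emitter packages (the event name is made up):

// in the process that owns the socket.io server
const io = require('socket.io')(3000);
const redisAdapter = require('socket.io-redis');
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));

// in any other node process (e.g. the simulator)
const emitter = require('socket.io-emitter')({ host: 'localhost', port: 6379 });
emitter.emit('notification', { text: 'hello from another process' });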
I'm building a full-stack web application. The backend is done in Express, the front end in Vue.
The application has some real-time features, and for this I am using socket.io. Socket.IO will have to keep track of some variables and react with different events to clients depending on the current state of those variables. I intend to use Redis to manage application state.
I would like to keep my API stateless and separate my socket server from my API server (they would run on different ports). I would like to do this because I feel it will be easier to manage, more scalable, and better for security.
For authorized actions, the client will send data to the API, the API will process the data and pass it to the socket server, and the socket server will emit changes to clients.
For unauthorized actions, I will use direct socket-to-client communication.
Is this a good way to structure my application?
I have already spent 3 days reading and watching tutorials about WebSockets, socket.io, node.js and so on.
Basically, I'm a Laravel developer and have just a basic idea about all the rest of the components.
Regretfully, after these 3 days I still don't have the whole step-by-step logic of implementing this architecture in mind. I will try to explain what I understood; please correct me.
So:
WebSockets is a bidirectional, continuous connection between client and server. It uses another port, and basically it is not an HTTP/S connection.
To make this kind of app, like I said, we need one more server, and I don't know why, but this is Node.js. On this Node.js server we should install socket.io (the server-side package) and Redis.
Then we need to add the client-side socket.io (probably via CDN).
On the Node.js server we create a server.js file where we require all the modules we need, like socket.io and Redis. We open a connection on a specific unused port (such as 6001). Then we run this node server.
On the front end we subscribe to this channel and define methods for emitting to and listening to the server.
Example:
User1 connects to a specific route; User2 as well. User1 types a message for User2; when he presses Submit, the message from User1 is sent to the Node.js server, where it is sent into Redis (yes? if yes, why?), and then Node.js decides what to do in this case and sends the message to the specific user, or broadcasts it to all users except the publisher.
Oh, it's hard even to explain that; too many steps and technologies involved.
Can someone please correct my logic? I really want to understand the whole process and the logic of using these components. Or please give me some useful articles and videos; maybe I haven't seen them. Thanks!
I suggest you read the official docs on how to build chat. Basically, what you will have in the end is 2 servers: 1 for your Laravel app and the other for chat (Socket.io). The key to this is using broadcasters and listening for events on both sides, frontend and backend.
Events are broadcast over "channels", which may be specified as public or private. Any visitor to your application may subscribe to a public channel without any authentication or authorization; however, in order to subscribe to a private channel, a user must be authenticated and authorized to listen on that channel.
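On the socket.io side, a private channel boils down to only letting authenticated, authorized users join a room. A minimal sketch (verifyWithLaravel is a hypothetical stand-in for a check against your Laravel app, e.g. an HTTP call to an auth endpoint):

var io = require('socket.io')(6001);

// hypothetical stand-in: ask the Laravel app whether this user's token
// may listen on this private channel
function verifyWithLaravel(channel, token, done) {
    done(Boolean(token)); // replace with a real HTTP call to your Laravel API
}

io.on('connection', function (socket) {
    socket.on('subscribe', function (channel, token) {
        verifyWithLaravel(channel, token, function (authorized) {
            if (authorized) socket.join(channel);
        });
    });
});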
I'm using nodejs and socket.io to deliver a chat in my business app, but I want to distribute the deployment so I can have as many chat servers as I want, to balance the traffic load.
I tried the load-balancing approach with nginx, but that only balances the traffic; the socket.io servers don't communicate with each other, so a chat message sent from user A on server S1 won't travel to user B on server S2.
Is there any tool or approach to do this?
Thanks in advance.
===== EDIT =====
Here is the architecture of the app.
The main app frontend on PHP CodeIgniter; let's tag it as PHPCI.
The chat app backend on NodeJS and Socket.IO; let's tag it as CHAT.
The chat model data on Redis; let's tag it as REDIST.
So what I have now is PHPCI -> CHAT -> REDIST. That works just fine.
What I need is to distribute the application so I can have as many PHPCI or CHAT or REDIST instances as I want, for example:
PHPCI1          CHAT1
PHPCI2   ->             ->   REDIST1
PHPCI3          CHAT2
where the numbers represent instances, not different apps.
So user A, connected to PHPCI1, can send a message to user B, connected on PHPCI3.
I think some queue in the middle, at the CHAT layer, could handle this; something like RabbitMQ, so that Socket.IO is only used to deliver the messages to the client.
If you're distributing the server load (and that's a requirement), I'd suggest adding a designated chat data server (usually an in-memory database or message queue) to handle chat state and message passing across edge servers.
Redis Pub/Sub is ideal for this purpose, and can scale up to crazy levels on even a low-end machine. The Redis Cookbook has a chapter on precisely this use case.
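A minimal sketch of what each CHAT instance would do (the channel and event names are made up; two Redis clients are needed because a subscribed connection cannot publish):

var io = require('socket.io')(3000);
var redis = require('redis');
var pub = redis.createClient();
var sub = redis.createClient();

// rebroadcast messages published by any CHAT instance, including this one
sub.subscribe('chat');
sub.on('message', function (channel, raw) {
    var msg = JSON.parse(raw);
    io.to(msg.room).emit('chat:message', msg);
});

io.on('connection', function (socket) {
    socket.on('chat:message', function (msg) {
        pub.publish('chat', JSON.stringify(msg)); // every instance sees it
    });
});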
If you set up the server side of your chat app correctly, you shouldn't have to distribute the socket.io client yourself. Since the socket.io client is browser-based and doesn't require any separately installed client-side code (other than the resources downloaded from the webpage), it works automatically. With a webpage, the files required to run socket.io are downloaded to users when they are correctly included (just like with jQuery). If you are using node.js and socket.io to make an Android app, the files should be included in your application when you distribute it, not separately.
In addition, if you wish to use two separate socket.io servers, you should be able to establish communication between the two by having one connect to the other in a similar manner to how a client connects to the server, but with a special parameter that lets the other server know that a server (not a browser) has connected, so it can respond and set a variable for that server.
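A minimal sketch of that server-to-server handshake (the query parameter is that "special parameter"; its name is made up):

// on server B: connect to server A as a client
var connect = require('socket.io-client');
var peer = connect('http://server-a:3000', { query: 'role=server' });

// on server A: detect that the peer is a server, not a browser
io.on('connection', function (socket) {
    var isServer = socket.handshake.query.role === 'server';
    if (isServer) {
        // relay chat traffic to the peer instead of treating it as a user
    }
});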