I'm working on creating a mock server for my Angular application. On the frontend I have a library for STOMP, and normally my frontend communicates with an API written in Java.
Additionally, I start a mock Node.js API which returns hard-coded JSON files when the remote server is down.
Now I'm trying to write a mock Node.js WebSocket server which will communicate with the Angular client when the remote server is down, but I would like to keep it simple.
I found the StompJs library, but it seems like it needs a STOMP message broker (like RabbitMQ?). That seems a bit complicated for a mock server. Is there any option to skip the broker step and keep it as simple as possible?
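If it helps, here is a minimal sketch (not a real broker) of a plain Node.js WebSocket server, using the ws package, that answers just enough STOMP frames for an @stomp/stompjs client to connect and receive a hard-coded message. The port, destination handling, and payload are assumptions for illustration only.

const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({
  port: 15674, // arbitrary port for the mock
  // echo back the STOMP subprotocol the client asks for
  handleProtocols: (protocols) => protocols.values().next().value,
});

wss.on('connection', (socket) => {
  socket.on('message', (raw) => {
    const frame = raw.toString();
    const command = frame.split('\n')[0];

    if (command === 'CONNECT' || command === 'STOMP') {
      // acknowledge the client's CONNECT with a CONNECTED frame
      socket.send('CONNECTED\nversion:1.2\n\n\0');
    } else if (command === 'SUBSCRIBE') {
      // read the subscription headers and immediately push a hard-coded JSON message
      const headers = Object.fromEntries(
        frame.split('\n\n')[0].split('\n').slice(1).map((h) => h.split(':'))
      );
      const body = JSON.stringify({ mock: true });
      socket.send(
        'MESSAGE\ndestination:' + headers.destination +
        '\nsubscription:' + headers.id +
        '\nmessage-id:1\ncontent-type:application/json\n\n' + body + '\0'
      );
    }
  });
});

The Angular client can then point its broker URL at ws://localhost:15674 while the real backend is down.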
I started implementing an HTTP ping health monitor as a private project with React and Node.js. The idea is that the monitor runs on an interval, sends an axios request to the server to receive all the URLs, pings them, and returns the results to the server, which are later shown on the client side.
I don't want to use a REST API to transfer data between the monitor and the server, and the results should show up live on the client side.
MONITOR <--> SERVER <--> CLIENT
What should I use instead of a REST API to communicate between the monitor and the server? I know socket.io is fine for communicating between the client and the server, but it is not so good for scaling.
What would be good and fast for transferring data for this specific project, and not too hard to implement?
Thanks!
You can work with Server-Sent Events in Node.js; that is a way of receiving events from the server. Then you can use EventSource to open a connection to the server and begin receiving events from it.
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
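For the monitor/server/client flow above, a minimal sketch of the server side with plain Node.js (no framework) could look like the following; the /events route, port, interval and payload are assumptions.

const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/events') {
    // SSE is just a long-lived HTTP response with this content type
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    });

    // push a new event to the connected client every 5 seconds
    const timer = setInterval(() => {
      res.write('data: ' + JSON.stringify({ status: 'up', at: Date.now() }) + '\n\n');
    }, 5000);

    req.on('close', () => clearInterval(timer));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);

On the client side, new EventSource('http://localhost:3000/events') keeps the connection open and fires a message event for every data: chunk the server writes.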
Take a look at this tutorial from DigitalOcean:
How To Use Server-Sent Events in Node.js to Build a Realtime App
Also take a look at Socket.io:
https://socket.io/
Is it best to make API calls directly to RabbitMQ from the frontend React Native app, or is it better to make an API call to a backend server endpoint, and bind/queue the messages there, in order to return a JSON response to the frontend once the message is consumed?
My plan is to make a React Native app that uploads large files to DigitalOcean Spaces and then stores other data in Firebase collections. I have a Node.js Express server running on the backend, and I'm wondering if it's best to queue RabbitMQ messages by going through the Express server first, or if I should just queue the messages to RabbitMQ directly from the frontend React Native app.
Here's an SO post with an example fetch() API call to RabbitMQ directly from a frontend React Native app, but I'm wondering how secure this is (because you need to pass user and password credentials in a JSON object), and whether it's best to just send all messages to the backend Express server first. I suppose a lot of this may depend on app architecture, but my thinking is that it's best to queue, produce, and consume messages by first going through a third-party client library on the backend, using amqplib for example, especially since most RabbitMQ examples found online do this.
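For what it's worth, the "go through the backend first" option described above could look roughly like this with Express and amqplib; the /uploads route, queue name, and payload are assumptions, the point being that the RabbitMQ credentials stay in the server's connection string and never reach the React Native app.

const express = require('express');
const amqp = require('amqplib');

const app = express();
app.use(express.json());

// connect once at startup; assumes the connection is up before requests arrive
let channel;
amqp.connect('amqp://localhost').then(async (conn) => {
  channel = await conn.createChannel();
  await channel.assertQueue('uploads');
});

app.post('/uploads', (req, res) => {
  // publish the request body to RabbitMQ on behalf of the mobile app
  channel.sendToQueue('uploads', Buffer.from(JSON.stringify(req.body)));
  res.status(202).json({ queued: true });
});

app.listen(3000);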
I built an app and I'm planning to add a real-time battle with Angular 2 and Laravel. For example, you hit the "attack" button, and your opponent sees his life going down in real time.
My app is built with:
Frontend: Angular 2
Backend: PHP Laravel 5.2
Now I'm researching how to build my real-time battle component,
and I found different guides and tutorials for it:
https://www.codetutorial.io/laravel-5-and-socket-io-tutorial/
http://4dev.tech/2016/02/creating-a-live-auction-app-with-angular-2-node-js-and-socket-io/
The first tutorial is about how to use Laravel 5 and Socket.IO.
The second one is about how to use Angular 2 with Node.js and Socket.IO.
(When I say real time, I mean that both users see the same thing happening on the screen.)
My backend and frontend are totally divided and I have no setup with Node.js anywhere in my app.
Both users need to see actions happening during a battle in my app, and it needs to go through my Laravel API and be shown via my Angular 2 battle component.
My question is:
What's the best approach to a real-time app (WebSockets, it seems) with Angular 2 and Laravel 5.2 to get the result I'm trying to achieve?
Laravel in this context is just templating and serving the client files, and acting as an interface in between the client and the socket.io server. It doesn't actually act as the socket.io server, and I don't believe it can.
So yes, you would still need something (node) to host the socket.io server to interact with the client, through PHP or otherwise. Personally, I'd skip Laravel/PHP altogether and just use node with koa/express/whatever to template your client (html/js/css/etc) files. Feels like an unnecessary abstraction to me.
The code below from socket.blade.php already has a connection to the actual socket.io server, so I don't see why the additional overhead of an HTTP POST through PHP/Laravel is a good idea. Security, perhaps, but you can handle that with the actual socket.io server as well.
var socket = io.connect('http://localhost:8890');
socket.on('message', function (data) {
    $("#messages").append("<p>" + data + "</p>");
});
For the real-time character of your use-case, websockets are definitely the way to go. The players that should get the updates should be in the same 'room', so you can broadcast changes more easily. For the other functionality you can either use websockets or regular API calls to your backend directly from your client-side app code with some kind of communication between your api and the socket server, e.g. through Redis.
TLDR:
Option 1: all data through sockets; the node server does the API calls and broadcasts changes to active players.
Option 2: use the API from the app directly; use pub/sub queue foo for communication between Laravel and node to broadcast changes to active players.
Option 1:
Angular frontend app
  - Sets up the websocket connection
  - Adds triggers for game foo which send data over the socket connection and are handled by your node server
  - Only talks to sockets
Node server
  - Serves the frontend app
  - Handles socket connections, divides players per game
  - Handles socket calls and calls the Laravel API to do mutations on your data
  - Processes the action and broadcasts changes to players in game X
Laravel REST API
  - Auth
  - Default CRUD foo
Option 2:
Angular frontend app
  - Talks to the API directly
  - Uses sockets to listen for updates
Node server
  - Serves the frontend app
  - Handles websocket data
  - Listens on the queue for data published by the API
  - Broadcasts changes to players in game X over the socket
Laravel REST API
  - Auth
  - CRUD
  - Mutation X triggers a publish in Redis or another queue, which the node server can/should listen on (see the sketch after this list)
I'm sure there are more ways you can set this up; you just have to decide where you want what. Maybe introducing Redis is something you do not want, and in that case your node app will have more to do. If you do want to use something like Redis, you'll need to do API calls from either your frontend app, or choose to do them through the node app anyway, combining the two options.
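As a rough sketch of the node side of option 2, assuming socket.io and the redis package; the 'game-events' channel name, room naming, and payload shape are made up for illustration:

const { Server } = require('socket.io');
const { createClient } = require('redis');

const io = new Server(3000, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  // the Angular client joins the room for its battle
  socket.on('join-game', (gameId) => socket.join('game:' + gameId));
});

(async () => {
  const sub = createClient();
  await sub.connect();

  // Laravel publishes after each mutation; relay it to the right room
  await sub.subscribe('game-events', (message) => {
    const { gameId, ...change } = JSON.parse(message);
    io.to('game:' + gameId).emit('game-update', change);
  });
})();

The Laravel side would then publish a JSON payload on that channel after each relevant mutation.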
If you are planning to use websockets, then there is less need for Laravel, as a single socket server is perfectly capable of handling all the data that will be exchanged between the frontend and the backend; so if you don't mind changing your engine you can try Meteor, https://www.meteor.com/
We are in the process of developing an app and we started with:
Spring SockJS Java server with a Servlet 3.0 container (async support)
SockJS JavaScript client.
To test the load aspects, I had written a client using the Bayeux API; this was for WebSockets and not for polling.
Since Node makes a strong case for async communication, we want to evaluate its load capacity against the Java server. From articles, I get the feeling that Node scales, but we want to keep Java unless there is a "significant" advantage to going the Node.js way. Most importantly, we want to try it out for our specific use case rather than relying on popular opinion, so we want to benchmark them on open connection handling over a period of time. My questions:
Is it possible to write a common client in Java which can connect to both the Java-based server and the Node one? It seems possible, because in the end it's HTTP and it shouldn't matter what exposes it, but I'm not finding the right client. Even Bayeux does not work for the non-WebSocket use case, although I used the same API to test the "WebSocket" connection for SockJS, CometD and the Node server.
What to do for non-WebSocket transports, for example long polling or streaming: is there any client which can be run against a server irrespective of whether it's CometD, SockJS or Node?
The SockJS client is a JavaScript client, so we can emulate connections from the browser. Is there a Java client for the server?
I'm new to JavaScript, so how is a load test written in JavaScript and how is it run (the socket one cannot be run from a browser)? Any sample code on GitHub?
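Not an answer to the Java-client part, but on the last point: a load test in JavaScript is just a Node.js script run from the command line (node load-test.js), not from a browser. A bare-bones sketch with the ws package, with the target URL and connection count as placeholders:

const WebSocket = require('ws');

const TARGET = 'ws://localhost:8080/echo/websocket'; // point at the server under test
const CONNECTIONS = 1000;
let open = 0;
let failed = 0;

// open a batch of raw WebSocket connections and keep them open
for (let i = 0; i < CONNECTIONS; i++) {
  const socket = new WebSocket(TARGET);
  socket.on('open', () => { open++; });
  socket.on('error', () => { failed++; });
  socket.on('close', () => { open--; });
}

// print a summary every 5 seconds to watch how the server holds the connections over time
setInterval(() => {
  console.log('open: ' + open + ', failed: ' + failed);
}, 5000);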
Is there any way to send data through sockets from Node.js to SignalR? I have a Node.js app that sends real-time information in JSON format. The other app is an MVC C# app that uses SignalR to send the data to the client via sockets. I want to send data from the Node.js app to SignalR, and have SignalR send that info to the client.
You might consider a better solution for internal communication between processes. SignalR is meant to be used between a .NET server and client, using different authentication, handshake, protocol and network layer methods, which is inefficient for internal server communication.
Take a look at ZeroMQ; it's a simple and very easy-to-use tool, meant especially for such cases. It has bindings for most languages, including .NET and node.js.
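As a sketch of what the Node.js side could look like with the zeromq package (v6 API), pushing JSON messages that the .NET process can then pull with a ZeroMQ binding such as NetMQ; the address, interval and payload are assumptions:

const zmq = require('zeromq');

async function run() {
  // PUSH/PULL pair: Node pushes, the .NET side pulls from the same address
  const sock = new zmq.Push();
  await sock.bind('tcp://127.0.0.1:5555');

  setInterval(() => {
    sock.send(JSON.stringify({ at: Date.now(), value: Math.random() }))
      .catch(console.error);
  }, 1000);
}

run();

The MVC app would read those messages and forward them to browsers through its SignalR hub.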
There is a JS client for the browser to communicate with a SignalR server.
http://www.nuget.org/packages/SignalR.Js
You can probably extract the JS file from it and run it from Node.js.
And standard Socket.IO will probably just work; you need to subscribe to the proper events and go.
If you want a Node.js client for SignalR that doesn't require jQuery, I started this one. It intentionally only supports WebSockets.
https://npmjs.org/package/signalr-client