I am currently using Electron to run two applications: React as the UI app and FastAPI as the local backend server. I am launching both from Electron using Node's child_process.spawn method. These processes run indefinitely by nature, so I am not sure how to determine that both applications are ready to use before allowing Electron to launch the renderer window that loads the React app.
For the backend server, I am thinking about implementing a method in Electron that keeps sending a GET request (an "is the server ready?" call) until the backend server responds; I am open to hearing better solutions for this. For React, however, I can't seem to find a direct way to do this.
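For what it's worth, polling a health endpoint like that is a common approach. A minimal sketch of such a poll in the Electron main process, assuming a hypothetical /health route on the FastAPI server (adjust URLs and routes to whatever you actually expose):

const http = require('http');

function waitForServer(url, retriesLeft, delayMs) {
    return new Promise(function (resolve, reject) {
        const attempt = function (left) {
            http.get(url, function (res) {
                res.resume(); // discard the body, only the status matters
                if (res.statusCode === 200) resolve();
                else retry(left);
            }).on('error', function () {
                retry(left); // server not accepting connections yet
            });
        };
        const retry = function (left) {
            if (left > 0) setTimeout(function () { attempt(left - 1); }, delayMs);
            else reject(new Error('server never became ready: ' + url));
        };
        attempt(retriesLeft);
    });
}

// In the main process, after spawning both children:
//   await waitForServer('http://127.0.0.1:8000/health', 50, 200); // FastAPI (hypothetical route)
//   await waitForServer('http://127.0.0.1:3000/', 50, 200);       // React dev server
//   mainWindow.loadURL('http://127.0.0.1:3000/');

The same poll covers the React side, since the dev server (or whatever serves the production build) also answers over HTTP once it's up.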
I'm working on a Vue.js application that retrieves some data with AJAX calls; in the dev environment I play with the application in the browser using the built-in server, mocking the external resources with an API stubber that runs on its own server. Since both pieces of software use Node, I'm wondering if there's a way to run both with a single command, serving the Vue.js application code and some static files for the mocked calls (which are not all GETs and require parameters, so sticking the JSON files in the app's public folder wouldn't work).
Edit: let me try to clarify. The Vue.js application will interact with a backend service that is still being developed. On my workstation I can play with the application by just running npm run serve, but since there's no backend service I can't try the most interesting bits. Right now I'm running saray alongside my application, serving some static JSON files that mock the server responses. It works fine, but I'm effectively running two separate HTTP servers, and since I'm new to the whole Vue, npm and JavaScript ecosystem, I was wondering if there's a better way, for instance serving the mock responses from the same (dev) server that serves the Vue application.
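If you're on the Vue CLI dev server (webpack-dev-server under the hood), one option is to register the mock routes on the dev server itself through its before hook, so a single npm run serve serves both the app and the mocks. A sketch, assuming webpack-dev-server 3 (newer versions renamed the hook to setupMiddlewares) and a made-up endpoint:

// vue.config.js
module.exports = {
    devServer: {
        before(app) {
            // `app` is the underlying Express instance, so non-GET verbs
            // and request parameters work fine.
            const express = require('express');
            app.use(express.json()); // JSON body parsing (Express 4.16+)

            // Made-up endpoint: answer POSTs with a canned JSON response.
            app.post('/api/login', function (req, res) {
                res.json({ token: 'fake-token', user: req.body.username });
            });
        },
    },
};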
I have a misunderstanding about Deepstream.io. It has both a Node.js SDK (which can make a client for deepstreamHub from my Node application) and a Node API, which allows installing deepstreamHub as an npm package. Why do we need both of these options, and when should I use each one?
For example, I have an existing real-time Node app that uses socket.io as the transport layer, and I want to rewrite the app and migrate away from socket.io. Which option should I use: install deepstream as a package in the existing app and call my app logic in RPC callbacks, or install a standalone server on the machine, then install deepstream.io-client-js in my app and likewise register my app logic as RPC callbacks using ds.rpc.provide? I do not understand the difference between the two approaches.
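For reference, the "npm package" option means embedding the server in your own process. A minimal sketch, assuming deepstream.io's Node API:

const Deepstream = require('deepstream.io');

// Runs the deepstream server inside this process; clients (including
// deepstream.io-client-js from the same app) connect to it as usual.
const server = new Deepstream();
server.start();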
It looks like there is some confusion between deepstreamHub [1] (the cloud platform) and deepstream [2] (the standalone server).
If you use deepstream (the server), you'll likely need to host it yourself on a cloud provider (AWS, Azure, etc.). With deepstreamHub you are given an application endpoint to connect to, plus a dashboard to manage your application's permissions, authentication methods and data.
Either way you'll need to install the deepstream.io-client-js module in your application; you can then connect to your server, or to your application endpoint if using deepstreamHub.
After this you can register app logic as RPC endpoints and proceed to write your app logic as normal.
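For example, a minimal sketch with deepstream.io-client-js, where the URL stands in for your own server or your deepstreamHub application endpoint and the RPC name is made up:

const deepstream = require('deepstream.io-client-js');

// Connect and authenticate (anonymously here), then expose app logic as an RPC.
const client = deepstream('localhost:6020').login();

client.rpc.provide('add-two', function (data, response) {
    response.send(data.a + data.b);
});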
[1] https://deepstreamhub.com/
[2] https://deepstreamhub.com/open-source/
I built an app and I'm planning to add a real-time battle using Angular 2 and Laravel. For example, you hit the "attack" button, and your opponent sees his life going down in real time.
My app is built with:
Frontend: Angular 2
Backend: PHP Laravel 5.2
Now I'm researching how to build my real-time battle component, and I've seen different guides and tutorials for it:
https://www.codetutorial.io/laravel-5-and-socket-io-tutorial/
http://4dev.tech/2016/02/creating-a-live-auction-app-with-angular-2-node-js-and-socket-io/
The first tutorial covers how to use Laravel 5 with socket.io. The second one covers how to use Angular 2 with Node.js and socket.io.
(When I say real time, I mean that both users see the same thing happening on the screen.)
My backend and frontend are completely separate, and I have no Node.js set up anywhere in my app.
Both users need to see actions happening during a battle in my app, and everything needs to go through my Laravel API and be shown via my Angular 2 battle component.
My question is: what's the best approach to a real-time app (websockets, it seems) using Angular 2 and Laravel 5.2 to get the result I'm trying to achieve?
Laravel in this context is just templating and serving the client files, and acting as an interface between the client and the socket.io server. It doesn't actually act as the socket.io server, and I don't believe it can.
So yes, you would still need something (node) to host the socket.io server to interact with the client, through PHP or otherwise. Personally, I'd skip Laravel/PHP altogether and just use node with Koa/Express/whatever to template your client (HTML/JS/CSS/etc.) files. Feels like an unnecessary abstraction to me.
The code below from socket.blade.php already has a connection to the actual socket.io server, so I don't see why the additional overhead of an HTTP POST through PHP/Laravel is a good idea. Security, perhaps, but you can handle that with the actual socket.io server as well.
var socket = io.connect('http://localhost:8890');
socket.on('message', function (data) {
    $("#messages").append("<p>" + data + "</p>");
});
For the real-time character of your use case, websockets are definitely the way to go. The players that should get the updates should be in the same 'room', so you can broadcast changes more easily. For the other functionality you can either use websockets or regular API calls to your backend directly from your client-side app code, with some kind of communication between your API and the socket server, e.g. through Redis.
TL;DR:
- Option 1: all data goes through sockets; the node server does the API calls and broadcasts changes to active players
- Option 2: the app uses the API directly, with pub/sub queue foo between Laravel and node to broadcast changes to active players
Option 1:
- Angular frontend app
  - Sets up the websocket connection
  - Adds triggers for game foo, which send data over the socket connection and are handled by your node server
  - Only talks to sockets
- Node server
  - Serves the frontend app
  - Handles socket connections, dividing players per game
  - Handles socket calls and calls the Laravel API to do mutations on your data
  - Processes actions and broadcasts changes to players in game X
- Laravel REST API
  - Auth
  - Default CRUD foo
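To make Option 1 concrete, here's a minimal sketch of the node side: the socket server receives a game action, asks the Laravel API to apply the mutation, then broadcasts the result. Endpoint and event names are made up:

const http = require('http').createServer();
const io = require('socket.io')(http);
const axios = require('axios');

io.on('connection', function (socket) {
    // Each client joins the room for its game so broadcasts stay scoped.
    socket.on('join-game', function (gameId) {
        socket.join('game:' + gameId);
    });

    socket.on('attack', function (action) {
        // Persist the action through the Laravel REST API...
        axios.post('http://localhost/api/games/' + action.gameId + '/attack', action)
            .then(function (res) {
                // ...then push the new state to everyone in that game.
                io.to('game:' + action.gameId).emit('game-update', res.data);
            });
    });
});

http.listen(8890);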
Option 2:
- Angular frontend app
  - Talks to the API directly
  - Uses sockets to listen for updates
- Node server
  - Serves the frontend app
  - Handles websocket data
  - Listens on a queue for data published by the API
  - Broadcasts changes to players in game X over the socket
- Laravel REST API
  - Auth
  - CRUD
  - Mutation X triggers a publish in Redis or another queue, which the node server can/should listen on
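And a sketch of Option 2's node side, where Laravel publishes each mutation to a Redis channel and node only relays it to the right room; channel and event names are made up:

const http = require('http').createServer();
const io = require('socket.io')(http);
const Redis = require('ioredis');

io.on('connection', function (socket) {
    socket.on('join-game', function (gameId) {
        socket.join('game:' + gameId);
    });
});

// Laravel publishes JSON like {"gameId": 7, "hp": 42} after each mutation.
const sub = new Redis();
sub.subscribe('game-updates');
sub.on('message', function (channel, message) {
    const update = JSON.parse(message);
    io.to('game:' + update.gameId).emit('game-update', update);
});

http.listen(8890);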
I'm sure there are more ways you can set this up; you just have to decide where you want what. Maybe introducing Redis is something you do not want; in that case your node app will have more to do. If you do want to use something like Redis, you'll need to do API calls from either your frontend app or through the node app anyway, combining the two options.
If you are planning to use websockets, then there seems to be little use for Laravel, as a single socket server is quite capable of handling all the data exchanged between the frontend and the backend. So if you don't mind changing your engine, you can try Meteor: https://www.meteor.com/
I have a Node.js server which only acts as a RESTful service. I have an AngularJS frontend which consumes the JSON sent by the Node server. I want to integrate socket.io into this application.
I tried https://github.com/btford/angular-socket-io but I get weird errors like "io is not defined" and "Unknown provider: $animateProvider". Are there any good examples covering this exact scenario?
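For reference, a minimal wiring of btford/angular-socket-io following its README (the app and factory names here are made up); "io is not defined" usually means the socket.io client script isn't loaded before the library's socket.js, and the $animateProvider error often points to mismatched Angular script versions:

angular.module('myApp', ['btford.socket-io'])
    .factory('mySocket', function (socketFactory) {
        // By default socketFactory() connects back to the server that
        // served the page.
        return socketFactory();
    })
    .controller('MainCtrl', function (mySocket) {
        mySocket.on('message', function (data) {
            console.log('got', data);
        });
    });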