React server-side render with Nashorn

Is it possible to server-side render a React app with Nashorn?
It works for a single thread: Nashorn renders my app in 700-1600ms, and Node.js does it in about the same time. BUT I need ~30 seconds to eval my JS bundle. As I understand it, I can't share bindings between threads, and the eval results are part of the bindings.
So I can't do server-side rendering without evaluating the script for every client. Is that correct? (Creating a pool is a bad idea - I don't know how many connections I will have: 1, 5, 10, 20...)

Related

How to make sure that a React application is ready to be loaded by a user?

I am currently working on using Electron to run two applications: React as the UI app and FastAPI as the local backend server. I am running both applications from Electron using Node's child_process.spawn method. These processes don't terminate on their own, and I am not sure how to be certain that both applications are ready to use so I can let Electron launch the renderer window that loads the React app.
For the backend server, I am thinking about implementing a method in Electron that keeps sending a GET request (an IsTheServerReady call) until the backend server responds. I am open to hearing better solutions for this. For React, however, I can't seem to find a direct way to do this.
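One way to implement that readiness check is a small retry loop in Electron's main process that polls the backend before the window is loaded. This is only a sketch: the /health URL, the 500 ms delay, the 60-retry budget and the mainWindow variable are assumptions, not part of the original setup.

// poll the local backend until it answers; any response counts as "ready"
const http = require('http');

function waitForServer(url, retriesLeft, done) {
  http.get(url, function (res) {
    res.resume();                                  // drain the body, we only care that it answered
    done(null);
  }).on('error', function (err) {
    if (retriesLeft <= 0) return done(err);        // give up once the retry budget is spent
    setTimeout(function () {
      waitForServer(url, retriesLeft - 1, done);
    }, 500);
  });
}

waitForServer('http://127.0.0.1:8000/health', 60, function (err) {
  if (err) throw err;
  mainWindow.loadURL('http://localhost:3000');     // hypothetical window and React URL
});

For the React side, the same loop can poll the dev-server URL, or you can listen for the window's webContents 'did-finish-load' event and only show the window once it fires.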

Concurrency handling in Express.js

Because of some issues, such as having SSR, SSG, and CSR beside each other, I decided to create my own SSR for React with Express. I'm using Redux and Saga, and I have several API calls to generate the data before rendering.
So I had to use several promises in my server-side renderer, such as waiting for Redux to finish all the API calls, or waiting for styles and scripts. I'm also using react-ssr-prepass, which walks through all my components (for dispatching the actions that are required in SSR).
So I have a lot of blocking work in my project.
For handling concurrency I started to use node cluster, so I'll have several Node processes on my server and that increases the concurrency capacity, but it's not the best solution because, under heavy load, even clustering won't be able to respond to all of the requests.
So I started to think about worker threads or child processes in Node.js: I'd make an instance of my server-side renderer for each request and do everything in the background, so concurrent requests won't wait for each other to finish.
But the issue is that in the child process or worker thread I can't use "import", since it's ES6.
So I have two questions:
First of all, is there any way to use ES6 in the child process? (I tried babel-esm-plugin but it doesn't support webpack 5.)
Second, is there any better idea than using a worker thread or child process to increase the concurrency capacity?
I found the solution to my first challenge: instead of running my renderer directly with the child process, I had to build it first, so I used webpack to produce a CommonJS output of it, then used that output in the child process, roughly as in the sketch below.
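To make that setup concrete, here is a minimal sketch with assumed names: a renderer entry at ./src/ssr-renderer.js, a built file at ./dist/renderer.cjs.js, and a simple message protocol between Express and the forked bundle.

// webpack.ssr.config.js - bundle the renderer as CommonJS so a plain child process can require it
module.exports = {
  target: 'node',
  mode: 'production',
  entry: './src/ssr-renderer.js',               // hypothetical renderer entry
  output: {
    path: __dirname + '/dist',
    filename: 'renderer.cjs.js',
    libraryTarget: 'commonjs2'
  }
};

// server.js - fork the built bundle per request so rendering happens off the main event loop
const { fork } = require('child_process');

app.get('*', (req, res) => {
  const child = fork('./dist/renderer.cjs.js');
  child.send({ url: req.url });                 // the bundle is assumed to listen for this message
  child.once('message', (html) => {             // ...and to reply with the rendered HTML
    res.send(html);
    child.kill();
  });
});

Forking a fresh process per request is expensive on its own, so in practice a small pool of reusable workers (or worker_threads) is usually preferred over spawning one each time.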
To increase performance even more, I used a combination of SSR and SSG: on each request I check whether a file mapped to the route exists on the server. If it doesn't, I use the SSR renderer output to create the file and serve the response to the user; on the next request, since the cached file exists, I serve that cached file instead of rendering the result again.
Finally, I set a cron job on the server to clear the cache every 10 minutes.
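A rough sketch of that per-route file cache; the cache directory and the renderToHtml helper are placeholders for whatever the real renderer exposes:

// check for a cached file per route; render and cache it only on a miss
const fs = require('fs');
const path = require('path');

app.get('*', async (req, res) => {
  const cacheFile = path.join(__dirname, 'cache', encodeURIComponent(req.url) + '.html');
  if (fs.existsSync(cacheFile)) {
    return res.send(fs.readFileSync(cacheFile, 'utf8'));  // SSG-style: serve the cached page
  }
  const html = await renderToHtml(req.url);               // placeholder for the SSR renderer call
  fs.writeFileSync(cacheFile, html);                      // cache it for subsequent requests
  res.send(html);
});

The 10-minute cache clear can then be an ordinary crontab entry on the server that removes everything under the cache directory.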

Angular2 + Laravel with Real time & WebSockets

I built an app and I'm planning to make a real-time battle with Angular 2 and Laravel. For example, you hit the "attack" button, and your opponent sees his life going down in real time.
My app is built with:
Frontend: Angular 2
Backend: PHP Laravel 5.2
Now I'm researching how to build my real-time battle component,
and I've seen different guides and tutorials for it:
https://www.codetutorial.io/laravel-5-and-socket-io-tutorial/
http://4dev.tech/2016/02/creating-a-live-auction-app-with-angular-2-node-js-and-socket-io/
The first tutorial is about how to use Laravel 5 and socket.io.
The second one is about how to use Angular 2 with Node.js and socket.io.
(When I say real time, I mean that both users see the same thing happening on the screen.)
My Backend and Frontend are totally divided and I have no setup with NodeJS anywhere in my app.
Both users need to see the actions happening during a battle in my app, and they need to go through my Laravel API and be shown via my Angular 2 battle component.
My question is -
What's the best approach to a real-time app (WebSockets, it seems) using Angular 2 and Laravel 5.2 to get the result I'm trying to achieve?
Laravel in this context is just templating and serving the client files, and acting as an interface in between the client and the socket.io server. It doesn't actually act as the socket.io server, and I don't believe it can.
So yes, you would still need something (node) to host the socket.io server to interact with the client, through PHP or otherwise. Personally, I'd skip Laravel/PHP altogether and just use node with koa/express/whatever to template your client (html/js/css/etc) files. Feels like an unnecessary abstraction to me.
The code below from socket.blade.php already has a connection to the actual socket.io server, so I don't see why the additional overhead of an HTTP POST through PHP/Laravel is a good idea. Security, perhaps, but you can handle that with the actual socket.io server as well.
// connect to the socket.io server and append incoming messages to the page
var socket = io.connect('http://localhost:8890');
socket.on('message', function (data) {
  $("#messages").append("<p>" + data + "</p>");
});
For the real-time character of your use-case, websockets are definitely the way to go. The players that should get the updates should be in the same 'room', so you can broadcast changes more easily. For the other functionality you can either use websockets or regular API calls to your backend directly from your client-side app code with some kind of communication between your api and the socket server, e.g. through Redis.
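On the node side, the room-per-battle idea might look like this minimal socket.io sketch; the event names ('join-battle', 'attack', 'state-changed') and the httpServer variable are invented for illustration:

// put both players of a battle in the same room and broadcast state changes to it
var io = require('socket.io')(httpServer);

io.on('connection', function (socket) {
  socket.on('join-battle', function (battleId) {
    socket.join(battleId);                               // both players end up in the same room
  });
  socket.on('attack', function (data) {
    // persist the mutation (via the Laravel API or a queue), then notify everyone in the battle
    io.to(data.battleId).emit('state-changed', { hp: data.newHp });
  });
});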
TL;DR:
Option 1: All data goes through sockets; the node server does the API calls and broadcasts changes to active players.
Option 2: Use the API from the app directly; use pub/sub queue foo for communication between Laravel and node to broadcast changes to active players.
Option 1:
- Angular frontend app
  - Sets up the websocket connection
  - Adds triggers for game foo which send data over the socket connection and are handled by your node server
  - Only talks to sockets
- Node server
  - Serves the frontend app
  - Handles socket connections, divides players per game
  - Handles socket calls and calls the Laravel API to do mutations on your data
  - Processes actions and broadcasts changes to players in game X
- Laravel REST API
  - Auth
  - Default CRUD foo
Option 2:
- Angular frontend app
  - Talks to the API directly
  - Uses sockets to listen for updates
- Node server
  - Serves the frontend app
  - Handles websocket data
  - Listens on a queue for data published by the API
  - Broadcasts changes to players in game X over the socket
- Laravel REST API
  - Auth
  - CRUD
  - A mutation triggers a publish in Redis or another queue, which the node server can/should listen on
I'm sure there are more ways you can set this up, you just have to decide where you want what. Maybe introducing Redis is something you do not want, in that case your node app will have more to do. If you do want to use something like Redis, you'll need to do API calls from either your frontend app or choose to do it through the node app anyway, combining the 2 options.
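For option 2, the piece connecting Laravel and node could be a Redis subscription on the node side: Laravel publishes after a mutation and node rebroadcasts over the socket. A sketch using the classic callback-style node redis client; the 'game-events' channel name and the message shape are assumptions:

// node side: subscribe to the channel Laravel publishes to and push updates into the game room
var redis = require('redis');
var sub = redis.createClient();

sub.subscribe('game-events');
sub.on('message', function (channel, message) {
  var event = JSON.parse(message);                       // e.g. { battleId: 42, hp: 17 }
  io.to(event.battleId).emit('state-changed', event);    // io is the socket.io server instance
});

On the Laravel side, the matching half would be something like Redis::publish('game-events', json_encode($payload)) once the mutation is saved.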
If you are planning to use websockets, then there is less use for Laravel, as a single socket server is quite capable of handling all the data exchanged between the frontend and the backend. So if you don't mind changing your stack, you can try Meteor: https://www.meteor.com/

What is the best way to communicate from Angular2 to Electron and back?

I have this simple app (Node.js, Electron, Angular2, TypeScript) like you can find in any tutorial.
What is the best way to communicate from Angular2 to Electron and back?
Let's say you want to call a system dialog. How would you do that?
These are my main source files:
My main Electron file
My main Angular2 index
My Angular2 bootstrap file
My Angular2 root component
You can treat the main Electron file like a server running in node. Meaning you can communicate with it any way you choose.
You can spin up an express http server and create some API endpoints to hit from your Angular code on the client-side.
You could fire up a socket.io server and use a websocket for communication.
You can also just straight up use those APIs right inside your Angular code if you don't care about mixing system code with client-side code. Only do this if your app will always be an Electron app and never be ported to a web app. If it's ever going to be a web app, then your client-side Angular app should stick to using only front-end JavaScript code and let the main Electron file act as a server.
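As a sketch of the HTTP option, an express server in Electron's main process could expose the dialog behind an endpoint; the port and the /api/open-dialog route are made up, and this assumes a recent Electron where dialog.showOpenDialog returns a promise:

// main process: tiny HTTP API that the Angular client can call with fetch/HttpClient
const { dialog } = require('electron');
const express = require('express');
const api = express();

api.post('/api/open-dialog', async (req, res) => {
  const result = await dialog.showOpenDialog({ properties: ['openFile'] });
  res.json(result.filePaths);                     // send the chosen paths back to Angular
});

api.listen(3100);                                 // Angular would POST to http://localhost:3100/api/open-dialog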
Another way to go is to communicate using IPC events. Use ipcRenderer on the browser side and ipcMain on the Electron side. That's pretty much what I have done in my app (work in progress): https://github.com/sumitkm/electricedit/
However, I used Knockout (KO), not Angular.
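For the system-dialog example from the question, the IPC route might look like this; it assumes a recent Electron with ipcMain.handle/ipcRenderer.invoke (older versions would use ipcMain.on plus a reply event), and the 'open-dialog' channel name is arbitrary:

// main process: answer dialog requests coming from the renderer
const { ipcMain, dialog } = require('electron');

ipcMain.handle('open-dialog', async () => {
  const result = await dialog.showOpenDialog({ properties: ['openFile'] });
  return result.filePaths;
});

// renderer (Angular) side: ask the main process and wait for the answer
const { ipcRenderer } = require('electron');
ipcRenderer.invoke('open-dialog').then(paths => console.log(paths));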

Proxying WebSockets & HTTP using custom path with nodeJS to a separate NodeJS app?

CONTEXT
I am trying to build a new web app in NodeJS. This webapp uses two main components.
1) Code that I am writing within NodeJS - AKA my own logic flow.
2) A 3rd party open source NodeJS app - EtherCalc - that uses both HTTP & Socket.io
EtherCalc is a completely standalone web application on its own. It is a spreadsheet-in-a-browser+server meant to be used as a standalone app. Of importance, it has its own namespace (as in it has various pathnames that route to different functions).
My app and EtherCalc each run on their own ports independently of each other.
For simplicity's sake, let the domain name be localhost.
CHALLENGE I AIMED TO SOLVE
My application will be using both the spreadsheet capabilities of EtherCalc, as well as the non-spreadsheet-related logic flow of my own code. Users will have access to both interfaces. However, I want this all to appear to come from one URL/port - I don't want non-programmers to be specifying different ports to access different functionality.
The easiest way for me to tackle this was to create a namespace for EtherCalc within my app. Any path that starts with /ethercalc/ automatically gets forwarded to the port that EtherCalc is running on (with the /ethercalc/ prefix removed from the request URL before it is forwarded). Anything that doesn't start with that stays within the standard server logic flow.
I'm using node-http-proxy to make this happen, using the following code:
// node-http-proxy 0.x: create one proxy and reuse it inside the HTTP request handler
var httpProxy = require('http-proxy');
var proxy = new httpProxy.RoutingProxy();
// req.url has already had the /ethercalc prefix stripped at this point
proxy.proxyRequest(req, res, {
  host: 'localhost',
  port: 8000
});
PROBLEM I HAVE COME ACROSS
This seems to function fine initially - the spreadsheet interface loads when I go to any /ethercalc/ url. However, EtherCalc uses either WebSockets or JSON Polling in order to keep the server & the browser on the same page. This doesn't seem to work properly. For some reason, it takes a good 10 seconds to actually get that connection going. This is especially problematic if I've worked on a spreadsheet, then load it again later - there's a 10 second window before I actually see data I've put in beforehand.
My first attempt at working around this issue was to remove the URL-changing functionality, so my app now simply forwards the requests to the port EtherCalc is running on. However, the problem remains. Connecting directly to the port that EtherCalc is on removes the problem.
Rewriting the code to use node-http-proxy's httpProxy.createServer() code didn't make a difference either - same exact outcome...
What's bizarre is that when connecting to my app port (the one that has a proxy/forwarding system in place), AFTER the lengthy wait period, it functions just fine - everything is completely synced up in real time.
Does anybody have any idea what it is that's going on?
TL;DR
I have a web app in NodeJS facing HTTP port 80.
I have a 3rd party web app in NodeJS using Socket.IO (EtherCalc) running on the same server on a non-public port.
I want to serve webpages from my own webapp AND webpages from EtherCalc to the same domain name on port 80.
I have a proxy (node-http-proxy) running on my webapp to make this possible.
For some reason when I do this, the handshake that takes place between the browser and the server for EtherCalc (now running through a proxy) takes FOREVER. (or rather 10 seconds, AKA unacceptable for a consumer facing webpage)
I have no idea why this happens.
Any clue/hints/suggestions/leads? Socket.IO is completely new to me - I have used Node for a long time (2 years), but it's been entirely in HTTP world. Proxies as well are new to me (this is my first time using one - using it for the namespace/port issues). As such, I really have no idea where to begin looking.
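One detail worth checking, offered only as a sketch and not a confirmed diagnosis: a plain HTTP proxy does not forward the WebSocket upgrade, so Socket.IO may be timing out on the websocket transport and falling back to polling, which would roughly match the ~10 second delay. With the node-http-proxy 0.x API, the upgrade event can be proxied explicitly alongside normal requests:

// forward WebSocket upgrades as well as plain HTTP requests to EtherCalc
var http = require('http');
var httpProxy = require('http-proxy');
var proxy = new httpProxy.RoutingProxy();

var server = http.createServer(function (req, res) {
  proxy.proxyRequest(req, res, { host: 'localhost', port: 8000 });
});

// without this, the socket.io handshake can't complete through the proxy
server.on('upgrade', function (req, socket, head) {
  proxy.proxyWebSocketRequest(req, socket, head, { host: 'localhost', port: 8000 });
});

server.listen(80);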
