Node.js Server Sent Events with Load Balancer

I am working on a node.js based Web application that needs to be able to push data down to the browser. The Web app will be sitting behind a load balancer.
Assuming the Web app has a POST REST API as:
/update_client
and assuming a third-party application calls this API to push some data to the Web app, which then pushes the data down to the browser.
Now assume I have two servers running the Web app behind the load balancer. A browser client connects to server 1 to listen for events. Then the third-party application hits the /update_client API on server 2. Since the two activities happen on two different servers, how can server 2 notify server 1 to send the data to its connected clients?
And what if I am using auto scaling, with a dynamic number of servers behind the load balancer?

You need to have some kind of shared resource behind the servers so they all know about updates. I show how to use Redis Pub / Sub for this in a blog post I wrote recently.
Server Sent Events with Node JS
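For illustration, here is a minimal sketch of that pattern, assuming an Express app and a locally reachable Redis instance (the channel name, port, and endpoint paths are placeholders): every server subscribes to a shared channel and forwards messages to the SSE clients it is holding, while /update_client on whichever server receives the POST simply publishes.

```js
// Sketch: each server keeps its own SSE clients and relays updates it
// receives over a shared Redis channel ("updates" is an arbitrary name).
const express = require('express');
const { createClient } = require('redis');

const app = express();
app.use(express.json());

const clients = new Set();                 // SSE responses open on THIS server
const publisher = createClient();          // assumes a reachable Redis instance
const subscriber = publisher.duplicate();

async function start() {
  await publisher.connect();
  await subscriber.connect();

  // Every server subscribes; whichever server received the POST publishes.
  await subscriber.subscribe('updates', (message) => {
    for (const res of clients) {
      res.write(`data: ${message}\n\n`);
    }
  });

  // Browser connects here (possibly to a different server than the POST hits).
  app.get('/events', (req, res) => {
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    });
    clients.add(res);
    req.on('close', () => clients.delete(res));
  });

  // Third-party application hits this on ANY server behind the balancer.
  app.post('/update_client', async (req, res) => {
    await publisher.publish('updates', JSON.stringify(req.body));
    res.sendStatus(202);
  });

  app.listen(3000);
}

start();
```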

Related

Deploy Node server that isn't a web application

I created a Node server that receives events through webhooks, handles them, and posts their data to one API endpoint. Currently I'm deploying it using AWS Elastic Beanstalk, but I don't know if it's the best option.
I don't need load balancers.
I don't need web servers like Apache/Nginx.
My Node server does not have any ports to receive requests, since it's a simple server that only handles webhook events. So the Elastic Beanstalk service will always be without request metrics (severe health status, because it doesn't handle any of the health-check requests).
Should I use another type of AWS service? Docker?
In the end, I went with the AWS App Runner service for running containers. No load balancers, just elastic scaling. No web servers.

Secured Socket.IO access to GCP VMs

I'm building a backend for a multiplayer game. It's Node.js based, and deployed to Google Compute Engine VMs in a managed instance group.
This backend manages many game instances; each game instance hosts several players, and since each game instance is stateful, it must be managed on the SAME VM.
The connection flow is as follows:
A client opens the game page with an ID of the game instance
The client requests the API for an available game-server IP
The client then connects to the game-server DIRECTLY (not via any load balancer)
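To make the flow concrete, here is a rough client-side sketch of those three steps, assuming Socket.IO and a hypothetical lookup endpoint (the URL, route, and field names are illustrative); step 3 is the direct, currently unsecured hop in question.

```js
// Sketch of the flow described above (endpoint and field names are assumptions).
const { io } = require('socket.io-client');

async function joinGame(gameId) {
  // Step 2: ask the API which game-server hosts this game instance
  const res = await fetch(`https://api.example.com/game-servers/${gameId}`);
  const { host, port } = await res.json();

  // Step 3: connect directly to that VM (this is the hop that needs securing)
  const socket = io(`https://${host}:${port}`, { query: { gameId } });
  socket.on('connect', () => console.log('connected to game server'));
  return socket;
}
```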
The problem is that the connection to the game-servers must be secured, and since the connection is not via a load balancer, I don't have any way to secure the connection.
How can I solve this problem?
Thanks

RabbitMQ security in mobile app

I am using a RabbitMQ broker in one of the mobile apps we are developing, and I am a bit puzzled about the security aspects. We are using cloud-hosted RabbitMQ; the hosting platform has given us a username and password (which have since been changed), and we are using an SSL connection, so I am not so worried about MITM attacks or eavesdropping.
My concern is that anybody who knows the host and port can make a connection to RabbitMQ. Since we have a mobile app, we are storing the RabbitMQ username and password on the device (although encrypted), so I guess that anybody who gets physical access to the device and somehow decrypts the username and password can log in to RabbitMQ, and once you are logged in you can pretty much do anything, like deleting queues etc.
How are message queues like RabbitMQ used in a mobile environment? Is there a better / more secure way of using RabbitMQ?
In my experience, it is best not to have your mobile app connect to RabbitMQ directly. Use a web server in between the app and RabbitMQ. Have your mobile app connect to your web server via HTTP-based API calls. The web server will connect to RabbitMQ, and you won't have to worry about the mobile app having the connection information in it.
There are several advantages to this, beyond the security problem:
better management of RabbitMQ connections
easier to scale the number of mobile users
ability to add more logic and processing to the back-end, as needed, without changing the mobile app
Creating a connection to RabbitMQ is an expensive operation: it requires a TCP/IP connection, and once that connection is open it stays open until you close it. If you open a connection from your mobile app and leave it open, you are reducing the number of available connections to RabbitMQ. If you open and close the connection quickly, you are incurring a lot of extra cost by constantly creating and closing connections.
With a web server in the middle, you can open a single connection and have it serve multiple mobile devices. The web server will handle the HTTP requests and use the one connection to RabbitMQ to push messages to it.
Since an HTTP web request is a short-lived connection, you'll be able to handle more users in a short period of time than you would with direct RabbitMQ connections.
This ultimately leads to better scalability, as you can add another web server to handle thousands more mobile app instances while only adding one new RabbitMQ connection.
This also lets you add middle-tier logic inside the web server. You can add additional layers of processing as needed, without changing the mobile app; just change the web server code and redeploy as needed.
If you must do this without a server in the middle, you likely won't be able to get around the security issue you're having: the mobile device will contain the information necessary to make the connection.
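As a rough sketch of that middle tier, assuming Express and amqplib (the queue name, route, and environment variable are placeholders): the server opens one long-lived RabbitMQ connection at startup, and every HTTP call from the mobile app reuses it.

```js
// Sketch: a thin HTTP API in front of RabbitMQ. The mobile app only talks
// HTTPS; the broker credentials stay on the server.
const express = require('express');
const amqp = require('amqplib');

const QUEUE = 'mobile-events'; // placeholder queue name

async function start() {
  // One long-lived connection/channel shared by all incoming HTTP requests
  const connection = await amqp.connect(process.env.AMQP_URL);
  const channel = await connection.createChannel();
  await channel.assertQueue(QUEUE, { durable: true });

  const app = express();
  app.use(express.json());

  // Mobile app POSTs here; no RabbitMQ credentials on the device
  app.post('/events', (req, res) => {
    channel.sendToQueue(QUEUE, Buffer.from(JSON.stringify(req.body)), {
      persistent: true,
    });
    res.sendStatus(202);
  });

  app.listen(3000);
}

start();
```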

Connection pooling for REST calls made from Bluemix Node.js apps into data-center services via DataPower

Hi, we have a UI component deployed to Bluemix on Node.js which makes REST service calls (JSON/XML) to services deployed in our data center. These calls go through the IBM DataPower gateway, which acts as a security proxy.
DataPower establishes an HTTPS mutual-authentication connection to the caller (using certs that are exchanged offline).
Although this method is secure, it is time-consuming to set up, and if the connection has to be established for each service request it will lead to slow responses for the end user.
To optimize response time, we are looking for any solution that can pool connections between the Node.js app deployed on Bluemix and the DataPower security proxy. Does anyone have experience in this area?
In regards to "it is time-consuming to set up", in datapower you can create a multi-protocol gateway (MPGW) in front of your services to act as router. The MPGW will match services calls based on their URI and route them accordingly. In this scenario, you will only need to configure a single endpoint in the Bluemix Cloud Integration service in order to work with all your services. One downside to this approach is that it will be harder to control access to specific on-premise services because they will all be exposed to your Bluemix app as a single service.
In regards to optimizing response times, where are you seeing the bottleneck?
If the establishment of the TCP connections is causing too much overhead, you should be able to configure your Node.js app to use or re-use persistent connections via keep-alive settings, or you can look into setting up a connection pool that manages that for you (e.g. https://www.npmjs.com/package/generic-pool seems to be a popular choice).
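For the keep-alive route, here is a minimal sketch using Node's built-in https agent, assuming client certificates are used for the mutual-authentication handshake (the host name, cert paths, and pool size are placeholders):

```js
// Sketch: reuse TLS connections to the DataPower endpoint with a keep-alive
// agent (host, cert paths, and pool size are assumptions).
const https = require('https');
const fs = require('fs');

const agent = new https.Agent({
  keepAlive: true,   // keep sockets open between requests
  maxSockets: 10,    // cap concurrent connections to the gateway
  // client certificate for the mutual-authentication handshake
  cert: fs.readFileSync('client-cert.pem'),
  key: fs.readFileSync('client-key.pem'),
});

function callService(path, body) {
  return new Promise((resolve, reject) => {
    const req = https.request(
      {
        host: 'datapower.example.com',
        port: 443,
        path,
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        agent, // all calls share the pooled, kept-alive sockets
      },
      (res) => {
        let data = '';
        res.on('data', (chunk) => (data += chunk));
        res.on('end', () => resolve(data));
      }
    );
    req.on('error', reject);
    req.end(JSON.stringify(body));
  });
}
```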
On the DataPower side, make sure the front/back persistent timeout is set according to your requirements: http://www-01.ibm.com/support/knowledgecenter/SS9H2Y_7.2.0/com.ibm.dp.doc/mpgw_availableproperties_serviceview.html?lang=en
Other timeout values in DataPower can be found at http://www-01.ibm.com/support/docview.wss?uid=swg21469404

Connectivity between Node.js applications behind a load balancer

I'm currently working on a Node.js application and I have a small issue.
My Node.js application consists of two parts:
An internal API called by our other applications. Let's call this part API.
A user-facing web server (Express + Socket.io). Let's call this Web.
We're receiving a lot of calls to API from our other internal applications. Some of these calls generate notifications for web users (let's imagine it's an online chat).
So if we have a message for client #1000 and they're online (connected to the Web application through Socket.io), we emit the message through Socket.io to this client. Everything works fine.
But there is an issue.
We're going to introduce a load balancer in front of our Node.js application (it's one application, so both parts, API and Web, would be behind the load balancer). Now let's imagine that we have the load balancer and two servers running this application: server1 and server2.
Thus some API calls are sent to server1 and some are sent to server2. So let's imagine we get an API call on server1, and this call should send a message to client #1000, but this client has an open connection to server2.
The question is: are there any best practices or common solutions for how these two servers should communicate? One possible solution would be to open socket connections between all servers running the Node.js application; whenever we need to send a message to a client, we just broadcast it, so every server can check whether the client is connected to it at that moment and deliver the message to the correct client.
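One common variant of that idea, in the spirit of the Redis Pub/Sub answer above, is to let whichever server handles the API call publish to a shared channel and have every Web instance deliver only to the sockets it actually holds. A rough sketch, where the channel, event, and query-parameter names are assumptions:

```js
// Sketch: the API part publishes to a shared Redis channel; every Web
// instance subscribes and delivers only to the sockets connected to it.
const { createClient } = require('redis');
const { Server } = require('socket.io');

const io = new Server(3000);
const pub = createClient();
const sub = pub.duplicate();

// Web part: remember which local socket belongs to which client id
const sockets = new Map();
io.on('connection', (socket) => {
  const clientId = socket.handshake.query.clientId;
  sockets.set(clientId, socket);
  socket.on('disconnect', () => sockets.delete(clientId));
});

// API part: whichever server receives the internal call just publishes
async function notifyClient(clientId, message) {
  await pub.publish('notifications', JSON.stringify({ clientId, message }));
}

async function start() {
  await pub.connect();
  await sub.connect();

  // Every server subscribes; only the one holding the client's socket emits
  await sub.subscribe('notifications', (raw) => {
    const { clientId, message } = JSON.parse(raw);
    const socket = sockets.get(clientId);
    if (socket) socket.emit('notification', message);
  });
}

start();
module.exports = { notifyClient };
```

Socket.IO also ships an official Redis adapter (@socket.io/redis-adapter) that performs essentially this cross-instance broadcast for io.emit and rooms, so you don't have to track the sockets yourself.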
