Implementing push notifications with a React UI, Node server and Python API

I am new to the JavaScript world and need to design a system where a React UI calls a Node server REST API, which internally triggers a Python REST API hosting a long-running (at least 5 minutes) optimization model. We don't intend to make a blocking call; instead we want to notify the React UI using push notifications (observer pattern). Something similar to the Stack Overflow page, which shows how many new questions, answers, or comments have been posted, so the page can be refreshed and reviewed.
Sifting through RxJS and Socket.IO, what I understood is that we can do the following:
Make an async call using an Observable; on the server, Socket.IO is used to subscribe and notify the client, handling the communication between the Node server and the React client. A similar push notification system at the Python server level notifies the Node server. The Node server also executes a DB operation once the optimization job is complete.
Another option is implementing a queuing service between Node and Python, which provides more durability and resilience, processing the data using a pub-sub model.
Our overall requirements are not enterprise oriented: we have low concurrency, and resilience is not a critical factor. Can someone verify whether the above design options are sound, or whether there are simpler options that can meet our push notification use case?
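For reference, a rough sketch of how the first option might look (Express + Socket.IO, with axios as the HTTP client; the Python URL, event names, and ports are illustrative placeholders, not part of any real design):

const express = require('express');
const http = require('http');
const axios = require('axios'); // any HTTP client works here
const { Server } = require('socket.io');

const app = express();
app.use(express.json());
const server = http.createServer(app);
const io = new Server(server);

app.post('/jobs', (req, res) => {
    // Acknowledge immediately instead of blocking for ~5 minutes
    res.status(202).json({ status: 'accepted' });
    // Trigger the long-running Python optimization, then push the result
    axios.post('http://localhost:5000/optimize', req.body)
        .then((result) => {
            // ...DB operation on job completion would go here...
            io.emit('job-finished', result.data); // push to subscribed clients
        })
        .catch((err) => io.emit('job-failed', { message: err.message }));
});

server.listen(3000);

The React client would subscribe with socket.on('job-finished', ...) and refresh its view when the event arrives.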

Related

Node.js, Socket.IO, Express: Should app logic be in socket handlers or REST api?

I'm planning a non-trivial realtime chat platform. The app has several types of resources: Users, Groups, Channels, Messages. There are roughly 20 types of realtime events having to do with these resources: for instance, submitting a message, a user connecting or disconnecting, a user joining a group, a moderator kicking a user from a group, etc...
Overall, I see two paths to organizing all this complexity.
The first is to build a REST API to manage the resources. For instance, to send a message, POST to /api/v1/messages. Or, to kick a user from a group, POST to /api/v1/group/:group_id/kick/. Then, from within the Express route handler, call io.emit (made accessible through res.locals) with the updated data to notify all related clients. In this case, clients talk to the server through HTTP and the server notifies clients through socket.io.
The other option is to not have a rest API at all, and handle all events through socket.IO. For instance, to send a message, emit a SEND_MESSAGE event. Or, to kick a user, emit a KICK_USER event. Then, from within the socket.io event handler, call io.emit with the updated data to notify all clients.
Yet another option is to have certain actions handled by a REST API, others by socket.IO. For instance, to get all messages, GET api/v1/channel/:id/messages. But to post a message, emit SEND_MESSAGE to the socket.
Which is the most suitable option? How do I determine which actions need to be sent through an API, and which need to be sent through socket.io? Is it better not to have a REST API for this type of application?
Some of my thoughts so far, nothing conclusive:
Advantages of REST API over the socket.io-only approach:
Easier to organize hierarchically, more modular
Easier to test
More robust and elegant
Simpler auth implementation with middleware
Disadvantages of REST API over the socket.io-only approach:
Slightly less performant
Since a socket connection needs to be open anyways, why not use it for everything?
Slightly harder to manage on the client side.
Thanks for reading!
This could be achieved using sockets.
That's because a chat application will have dozens of actions, like
'STARTS_TYPING', 'STOPS_TYPING', 'SEND_MESSAGE', 'RECEIVE_MESSAGE', ...
Accommodating all these features using REST APIs will generate a complex system which lacks performance.
Also, the concept of rooms in Socket.IO removes a lot of the headache of group chat implementation; see the sketch below.
So it's better to build everything based on sockets (Socket.IO or plain WebSockets).
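As a sketch of the rooms idea (the event and room names here are made up, not from the question):

const { Server } = require('socket.io');
const io = new Server(3000); // standalone Socket.IO server

io.on('connection', (socket) => {
    socket.on('JOIN_GROUP', (groupId) => {
        socket.join(groupId); // add this client to the group's room
    });
    socket.on('SEND_MESSAGE', ({ groupId, text }) => {
        io.to(groupId).emit('RECEIVE_MESSAGE', text); // broadcast to that room only
    });
});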
Here is the solution I found to solve this problem.
The key mistake in my question was that I assumed a REST API and websockets were mutually exclusive, because I intended to integrate the business and database logic directly in Express routes and socket.io handlers. Thus, choosing between socket.io and HTTP seemed important, because it would influence the core business logic of my app.
Instead, it shouldn't matter which transport to use. The business logic has to be independent from the transport logic, in its own module.
To do this, I developed a service layer that handles CRUD tasks, but also more specific tasks such as authentication. This service layer can then easily be consumed from Express routes, socket.io handlers, or both (see the sketch below).
In the end, this architecture allowed me to easily switch between transport technologies.
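To illustrate that separation, here is a minimal sketch; the messageService module and its functions are hypothetical, not the poster's actual code:

// services/messageService.js - transport-agnostic business logic (hypothetical)
async function sendMessage(userId, channelId, text) {
    // validation, auth checks and persistence live here, not in the transport
    return db.messages.insert({ userId, channelId, text });
}
module.exports = { sendMessage };

// Consumed from an Express route...
app.post('/api/v1/messages', async (req, res) => {
    const message = await messageService.sendMessage(req.user.id, req.body.channelId, req.body.text);
    io.emit('NEW_MESSAGE', message);
    res.status(201).json(message);
});

// ...and equally from a socket.io handler
socket.on('SEND_MESSAGE', async (data) => {
    const message = await messageService.sendMessage(socket.userId, data.channelId, data.text);
    io.emit('NEW_MESSAGE', message);
});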

Implementing push notifications using AWS Lambda

I am referring to this diagram (image not included here):
Node.js is used as the runtime in this case, and AWS Lambda is used as the event notifier (updates come from other Lambdas or the DB).
My challenge is that the "user browser" can also be a mobile client. The "API" should act as a service which allows clients (mobile or web) to subscribe, unsubscribe, or publish data, and nothing else.
Can Lambda work as an API with the capability of pushing event notifications directly to clients?
Is there any solution, and sample work/source code that can be used as a POC?
The next question is: how can I scale such an architecture, since it becomes stateful (it requires memory to remember the state of client connections)?
Or else, how possible is it to persist client connections in a DB (using frameworks like WebSocket or Socket.IO)?
AWS has the SNS service to send notifications, which you can use from Lambda.
You can also directly use the relevant platform's notification system, e.g. for iOS, Node has an "apn" module that is used to communicate with Apple's APNS service - it's straightforward to use and can be implemented in a Lambda function.
In brief:
Your iOS app registers for APNS which responds with an APNS device token. Your app should then send this to your API / server for storage.
Your API can then send notifications to APNS, referencing any device tokens, along with the private key file you create from the Apple Developer page.
APNS will send the notifications to the registered devices.
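A minimal sketch of that flow inside a Lambda handler, assuming the "apn" module; the key file, IDs, bundle id, and event shape are illustrative placeholders:

const apn = require('apn');

// Token-based auth with the .p8 key from the Apple Developer page
const provider = new apn.Provider({
    token: { key: 'AuthKey.p8', keyId: 'YOUR_KEY_ID', teamId: 'YOUR_TEAM_ID' },
    production: false,
});

exports.handler = async (event) => {
    const note = new apn.Notification();
    note.alert = event.message;          // text shown to the user
    note.topic = 'com.example.yourapp';  // your app's bundle id
    // event.deviceTokens: the APNS tokens your API stored earlier
    return provider.send(note, event.deviceTokens);
};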
Your other queries should perhaps be separate questions.
Can Lambda work as an API with the capability of pushing event notifications directly to clients?
Yes! As @AndyOS mentioned, SNS is a great service that is quite literally intended to send notifications. I won't go into details here to avoid duplication of response.
Is there any solution, and sample work/source code that can be used as a POC?
Or else, how possible is it to persist client connections in a DB (using frameworks like WebSocket or Socket.IO)?
If you are looking to use websockets, I'd encourage you to take a look at IoT (https://aws.amazon.com/iot). IoT supports the MQTT protocol (http://docs.aws.amazon.com/iot/latest/developerguide/protocols.html). This page also contains sample client-side code which might help you bootstrap your solution.
The next question is: how can I scale such an architecture, since it becomes stateful (it requires memory to remember the state of client connections)?
You can view the service limits of IoT at http://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html#limits_iot. You would need to decide if your app fits within these bounds, depending on the various metrics your app has (number of requests per second, number of concurrent connections, etc.).

Implementing Request/Response interaction in Node-RED

In my current project, we are trying to implement a piece of functionality using Node-RED, to experiment with and explore new technologies.
The functionality is as follows: the BadgeReader publishes its data using publish-subscribe to Proximity (this can be easily implemented using MQTT in Node-RED). The Proximity component receives data from the BadgeReader and, using that data, interacts with the ProfileDB using the request/response interaction mode. Now, my question is: how can we implement request/response interaction in Node-RED? (Note that request/response can be implemented using MQTT, but this question is about dedicated request/response functionality in Node-RED.)
All the available database nodes will allow you to send a query and receive a reply before moving on to the next node in the flow.
There is also the http-request node that will do the same for an HTTP call to a remote service.
You can't do this with the Node-RED MQTT nodes, because they either start a flow or end a flow. MQTT is asynchronous, and publishers should be totally decoupled from subscribers, so there is no way to know if a message ever reaches a subscriber, and therefore no way to handle error cases or timeouts properly. While it is possible to do request/response with MQTT, it is not best suited to this task.
If you want to do this with MQTT or something else, then you may have to look at writing your own node; there are no generic request/response capabilities built into Node-RED.
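For example, the usual pattern here is mqtt-in -> function -> http-request: the function node builds the request from the badge data, and the http-request node waits for the reply. The function node body might look like this (the ProfileDB endpoint and payload field are made-up examples):

// Body of a Node-RED function node wired into an http-request node.
// The http-request node uses msg.url when its own URL field is left blank.
msg.url = 'http://profiledb.local/profiles/' + msg.payload.badgeId;
return msg; // the http-request node performs the call and passes on the response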
P.S. given your flow of questions over the last few days you should probably look at the Node-RED mailing list here:
https://groups.google.com/forum/#!forum/node-red
It will be better suited to answering your questions than Stack Overflow

What is the best way to communicate between two servers?

I am building a web app which has two parts. In one part it uses a real-time connection between the server and the client, and in the other part it does some CPU-intensive task to provide relevant data.
I am implementing the real-time communication in Node.js and the CPU-intensive part in Python/Java. What is the best way for the Node.js server to participate in duplex communication with the other server?
For a basic solution you can use Socket.IO if you are already using it and know how it works; it will get the job done, since it allows communication between a client and a server where the client can be a different server in a different language.
If you want a more robust solution with additional options and controls, or one which can handle higher traffic throughput (though this shouldn't be an issue if you are ultimately just sending it through the relatively slow internet), you can look at something like ØMQ (ZeroMQ). It is a messaging queue which gives you more control and lots of different communication methods beyond just request/response.
When you set either up, I would recommend using your CPU-intensive server as the stable end (the server) and your web server(s) as the clients. This assumes that you are using a single server for your CPU-intensive tasks, and that you are running several Node.js server instances to take advantage of multiple cores for your web server. It simplifies your communication, since you want a single point to connect to.
If you foresee needing multiple CPU servers, you will want to set up a routing server that can route between multiple web servers and multiple CPU servers, and in this case I would recommend the extra work of learning ØMQ.
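On the Node side, that setup could look roughly like this, assuming the "zeromq" npm package (classic callback API); the endpoint and message format are made up:

const zmq = require('zeromq');

// The web server is the REQ client; the CPU server is the stable REP end
const sock = zmq.socket('req');
sock.connect('tcp://cpu-server.local:5555');

sock.on('message', (reply) => {
    console.log('result from CPU server:', reply.toString());
});

sock.send(JSON.stringify({ task: 'optimize', params: { n: 42 } }));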
You can use the http.request method to make a curl-like request from within Node's code.
The http.request method is also commonly used for calling an authentication API.
You can put your callback in the success handler of the request, and when you get the response data in Node, you can send it back to the user.
Meanwhile, in the background, the Java/Python server can handle the CPU-intensive task triggered by Node's request.
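For instance, with Node's built-in http module (host, port, and path are placeholders):

const http = require('http');

const req = http.request(
    { hostname: 'cpu-server.local', port: 8000, path: '/task', method: 'POST' },
    (res) => {
        let body = '';
        res.on('data', (chunk) => { body += chunk; });
        res.on('end', () => {
            // here you would send the result back to the user
            console.log('response:', body);
        });
    }
);
req.on('error', (err) => console.error(err));
req.end(JSON.stringify({ job: 'heavy-computation' }));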
I maintain a node.js application that intercommunicates among 34 tasks spread across 2 servers.
In your case, for communication between the web server and the app server you might consider mqtt.
I use mqtt for this kind of communication. There are MQTT clients for most languages, including Node/JavaScript, Python and Java. In my case I publish JSON messages using MQTT 'topics', and any task that has registered to subscribe to a 'topic' receives its data when it is published. If you google "pub sub", "mqtt" and "mosquitto" you'll find lots of references and examples. Mosquitto (now an Eclipse project) is only one of a number of MQTT brokers available. Another very good broker, written in Java, is called HiveMQ.
This is a very simple, reliable solution that scales well. In my case literally millions of messages reliably pass through mqtt every day.
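In Node this looks roughly like the following, assuming the "mqtt" npm package and a local Mosquitto broker; the topics and payloads are made-up examples:

const mqtt = require('mqtt');
const client = mqtt.connect('mqtt://localhost:1883');

client.on('connect', () => {
    // listen for results published by the other server
    client.subscribe('results/optimizer');
    // publish a job for it to pick up
    client.publish('jobs/optimizer', JSON.stringify({ id: 1, action: 'start' }));
});

client.on('message', (topic, payload) => {
    console.log(topic, JSON.parse(payload.toString()));
});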
You must be looking for Socket.IO.
Socket.IO enables real-time bidirectional event-based communication.
It works on every platform, browser or device, focusing equally on reliability and speed.
Sockets have traditionally been the solution around which most realtime systems are architected, providing a bi-directional communication channel between a client and a server.

How are Node.js+Socket.io+MongoDB webapps truly asynchronous?

I have a good old-style LAMP webapp. A week ago I needed to add a push notification mechanism to it.
Therefore, what I did was to add Node.js + Socket.IO on the server and poll the MySQL database every 10 seconds using Node.js to check whether there were new items: if so, I would send them to the client(s) with Socket.IO.
I was pretty happy with the result, even if that is not a proper realtime notification (as there is a lag of up to 10 secs).
Now, I am about to build a new webapp which will need push notifications, too. I am wondering whether to go with the same approach as the first one (that I believe is more stable and mature) or to go totally Node.js, without PHP and Apache. As for the database, I have already decided to go for MongoDB.
Finally, my question is: if I go for Node.js+Socket.io+MongoDB will I get a truly near-real-time webapp? I mean, as soon as a new record is inserted into MongoDB, will there be some sort of event triggered that I can catch via node.js, do some checking on it and, if relevant, send the notification to the client? Or will there be anyway some sort of polling on the db server-side and lag, as with my first LAMP webapp?
A related question: can you build a realtime webapp on MySQL without doing any polling like I did in my first app? Or do you need MongoDB (or Redis)?
I hope this question is not too silly - sorry, I am just starting with Node.js and co.
Thanks.
I understand your problem because I switched to node.js from php/apache/mysql too.
Generally Node.js is stable; modules and your own scripts are the main sources of errors.
Real-time has nothing to do with the database; it's all about the client and the server. You can query as much data as you want in your requests and push it to the other clients.
Choosing Node.js is very wise, but it's harder to implement.
When you insert a new record into your db, the event is the request itself; you will emit a push event along with the database query, something like:
// Please note this is not real code, just an example of the idea
app.get('/query', function(request, response) {
    // Query your database
    db.query('SELECT * FROM users', function(rows) {
        // Push notification to dan
        socket.emit('database_query_executed', 'to_dan', rows);
        // End request
        response.end('success');
    });
});
Of course you can use MySQL! And any database you want. As I said, real-time has nothing to do with databases, because the database is in the middle of the process and is totally optional.
If you want to use Node.js for push notifications and PHP/Apache for MySQL queries, then you will need to make two requests for each action, something like:
// this is JavaScript in the browser; ajax() stands in for any
// XHR helper (e.g. fetch or jQuery.ajax)
ajax('http://node.yoursite.com/push', node_options)
ajax('http://php.yoursite.com/mysql_query', php_options)
Or if you want just one request, or you want to use a form, you can call your PHP, and inside PHP you can make an HTTP or net request to Node.js, something like:
// this is PHP (HttpRequest comes from the pecl_http extension)
new HttpRequest('http://node.yoursite.com/push', HttpRequest::METH_GET);
Using:
A regular MongoDB Collection as the Store,
A MongoDB Capped Collection with Tailable Cursors as the Queue,
A Node worker with Socket.IO watching the Queue as the Worker,
A Node server to serve the page with the Socket.IO client, and to receive POSTed data (or however else the data gets added) as the Server
It goes like:
The new data gets sent to the Server,
The Server puts the data in the Store,
The Server adds the data's ObjectID to the Queue,
The Queue will send the newly arrived ObjectID to the open Tailable Cursor on the Worker,
The Worker goes and gets the actual data in the ObjectID from the Store,
The Worker emits the data through the socket,
The client receives the data from the socket.
This is 'push' from the initial addition of the data all the way to receipt at the client - no polling, so as real-time as you can get given the processing time at each step.
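A sketch of the Worker side, assuming the official "mongodb" driver (v4+) and Socket.IO; the collection names, the event name, and the ref field are made up:

const { MongoClient } = require('mongodb');

async function runWorker(io) {
    const client = await MongoClient.connect('mongodb://localhost:27017');
    const db = client.db('app');
    const queue = db.collection('queue'); // capped collection (the Queue)
    const store = db.collection('store'); // regular collection (the Store)

    // Tailable cursor: blocks awaiting new documents instead of polling
    const cursor = queue.find({}, { tailable: true, awaitData: true });
    for await (const entry of cursor) {
        const data = await store.findOne({ _id: entry.ref }); // ObjectID from the Queue
        io.emit('new-data', data); // push to connected clients
    }
}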
Re: triggers in MongoDB - please see this answer: https://stackoverflow.com/a/12405093/1651408
There are much more convenient triggers in MySQL, but to call Node.js from them would require a bit of work with MySQL UDFs (user-defined functions), for instance pushing data through a Unix socket. Please note that this is necessary only when other applications (besides your Node.js process) are updating the database, and be sure to choose InnoDB as storage in this case (row- vs. table-level locking).
I can see no big problem with your technology choice of Socket.IO: even if client-side WebSockets aren't supported, you'll fall back (gracefully, I hope) to polling.
Finally, your question is not silly at all, since push technology is definitely superior to a flood of polling requests - it scales better. EDIT: however, I would not describe either technology as real-time.
Another EDIT: for a quite well-known and successful setup of this kind please read this: http://blog.fogcreek.com/the-trello-tech-stack/
Have you discovered Chole? It works separately from your web server and interfaces with it by using HTTP POSTs. That way you can code your web app any way you want.
Actually, using push technology like Socket.IO helps you use the server's resources efficiently, and it also lets you bridge old browsers and modern browsers by making a WebSocket or WebSocket-like connection.
Polling every 10 seconds means repeated HTTP requests, which is expensive, especially when a lot of users are present.
Unlike polling, push technology is relatively cheap: the user's client opens a dedicated socket (i.e. a WebSocket) to listen for the server's push notifications.
And usually your client-side JavaScript performs some action when a push notification is received.
Using your LAMP stack and Socket.IO on a different port (other than 80) will be good enough to implement what you need.
But using Node.js + MongoDB + Socket.IO actually helps you manage your server's resources much more efficiently, because all three are non-blocking by nature.
If you understand the non-blocking concept correctly and implement your app appropriately, an identical app (same features, but a different language and a different database) would be able to handle a lot more requests than a general LAMP stack.
(Figure omitted: a famous chart comparing the non-blocking and threaded ways of handling concurrency, Apache (threaded) vs. Nginx (non-blocking).)
MySQL is a great database. I believe you won't need joins and transactions for realtime notifications.
MongoDB does not have those two features unless you implement something similar yourself.
Because it lacks those two features, and because of some characteristics of its own, MongoDB can store and fetch data much faster than traditional SQL databases.
So switching from MySQL to MongoDB will decrease the time taken to insert and fetch data.
With JS you can open a socket to your server (not in old browsers); the server will have an ad-hoc program (on an ad-hoc port, so you need permission to open the port and run a program on your server) that will send data to and from the client in (almost) real time, without the HTTP protocol's overhead. Old browsers will just fall back to a polling mechanism.
I can't see another way to do this (though there are probably already "cooked" frameworks that do it).
