Implementing Request/Response interaction in Node-RED - node.js

In my current project we are trying to implement some functionality in Node-RED as part of experimenting with and exploring new technologies.
The functionality is as follows: the BadgeReader publishes its data to Proximity using publish/subscribe (easily implemented with MQTT in Node-RED). The Proximity component receives the data from BadgeReader and uses it to interact with ProfileDB in request/response mode. My question is: how can we implement request/response interaction in Node-RED? (Note that request/response can be implemented over MQTT, but this question is about dedicated request/response functionality in Node-RED.)

All the available database nodes will allow you to send a query and receive a reply before moving on to the next node in the flow.
There is also the http-request node that will do the same for an HTTP call to a remote service.
You can't do this with the Node-RED MQTT nodes, because they either start a flow or end a flow. MQTT is asynchronous and publishers should be totally decoupled from subscribers, so there is no way to know whether a message ever reaches a subscriber, and therefore no way to handle error cases or timeouts properly. While it is possible to do request/response over MQTT, it is not best suited to this task.
If you want to do this with MQTT or something else then you may have to look at writing your own node; there is no generic request/response capability built into Node-RED.
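If you do go down the custom-node route, a minimal sketch might look like the following. This is only an illustration: the node name, the ProfileDB URL and the payload shape are hypothetical, and it assumes Node-RED 1.x or later (for the send/done input handler) and Node 18+ (for the global fetch).

```javascript
// Hypothetical "profile-lookup" node: takes a badge ID on msg.payload,
// does a request/response HTTP call to ProfileDB, and passes the reply downstream.
module.exports = function (RED) {
    function ProfileLookupNode(config) {
        RED.nodes.createNode(this, config);
        const node = this;

        node.on('input', async function (msg, send, done) {
            try {
                // Placeholder URL for the profile service.
                const res = await fetch(
                    `http://profiledb.example.com/profiles/${encodeURIComponent(msg.payload)}`
                );
                msg.payload = await res.json();
                send(msg);      // continue the flow with the response
                done();
            } catch (err) {
                done(err);      // surfaces the error to catch nodes / the editor
            }
        });
    }
    RED.nodes.registerType('profile-lookup', ProfileLookupNode);
};
```

Wired between the MQTT-in node and whatever comes next, this gives you a request/response hop in the middle of an otherwise publish/subscribe flow.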
P.S. Given your flow of questions over the last few days, you should probably look at the Node-RED mailing list here:
https://groups.google.com/forum/#!forum/node-red
It will be better suited to answering your questions than Stack Overflow.

Related

Data Aggregator/composition service in Microservices

I am developing an application where there is a dashboard for data insights.
The backend is a set of microservices written in Node.js with the Express framework, with a MySQL backend. The pattern used is the database-per-service pattern, with a message broker in between.
The problem I am facing is that this dashboard derives its data from multiple backend services (different databases altogether; some are SQL, some are NoSQL, and some are graph databases).
I want to avoid multiple round trips between the front end and the backend for this screen. However, I want to avoid a single point of failure as well. I have come up with the following solutions.
Use an API gateway aggregator/composer that makes multiple calls to backend services on behalf of a single frontend request, then composes all the responses and sends them to the client. However, scaling even one server would require scaling the gateway itself. It also makes the gateway a single point of contact.
Create a facade service, maybe called the dashboard service, that issues calls to multiple backend services, composes the responses, and sends a single payload back to the client. However, this creates a synchronous dependency.
I favor approach 2. However, I have a question there as well. Since the services are written in Node.js, is there a way to enforce a time-bound SLA for each service, so that if a service doesn't respond to the facade aggregator, the client is returned partial or cached data? Is there any mechanism for this?
GraphQL has been designed for this.
You start by defining a global GraphQL schema that covers all the schemas of your microservices. Then you implement the fetchers (resolvers) that populate the response by querying the appropriate microservices. You can run several instances so that you do not have a single point of failure. You can return partial responses if you hit a timeout (the response will include resolver errors). GraphQL also knows how to manage caching.
Honestly, it is a bit confusing at first, but once you get it, it is really simple to extend the schema and include new microservices in it.
I can't speak to the technical implementation in Node, but the second approach does indeed let you model the calls to remote services so that the answer is expected to arrive within some time boundary.
It depends on the way you interconnect the services. The easiest approach is to spawn an HTTP request from the aggregator service to the service that actually provides the data.
This HTTP request can be configured so that it won't wait longer than X seconds for a response. So you spawn multiple HTTP requests to different services simultaneously and wait for the responses. I come from the Java world, where these settings can be configured on the HTTP client making the connections; I'm sure the Node ecosystem has something similar.
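In Node, one way to sketch this (assuming Node 18+ for the global fetch and AbortSignal.timeout; the service names and URLs are placeholders) is to run the calls in parallel with a per-call timeout and fall back to cached or partial data for whatever fails:

```javascript
// Hypothetical dashboard aggregator: each backend call gets its own timeout,
// and failures/timeouts degrade to cached or partial data instead of failing the whole request.
const CACHE = new Map();   // very naive in-memory cache, keyed by service name

async function fetchWithSla(name, url, timeoutMs) {
    try {
        const res = await fetch(url, { signal: AbortSignal.timeout(timeoutMs) });
        const data = await res.json();
        CACHE.set(name, data);                       // refresh cache on success
        return { name, data, stale: false };
    } catch (err) {
        // Timed out or failed: return cached data if we have it, otherwise null.
        return { name, data: CACHE.get(name) ?? null, stale: true };
    }
}

async function getDashboard() {
    const results = await Promise.all([
        fetchWithSla('sales',  'http://sales-svc/summary',  500),
        fetchWithSla('users',  'http://users-svc/summary',  500),
        fetchWithSla('orders', 'http://orders-svc/summary', 800),
    ]);
    // Compose a single payload; the client can render stale sections differently.
    return Object.fromEntries(results.map(r => [r.name, { data: r.data, stale: r.stale }]));
}
```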
If you prefer an asynchronous style of communication between the services, the situation is somewhat more complicated. In this case you can design some kind of 'transactionId' into the message protocol. The requests from the aggregator service would include such a 'transactionId' (a UUID would work) and "demand" that the answer includes the same transactionId. The sender, having sent the messages, waits for the responses for a certain amount of time and then "quits waiting" after X seconds/milliseconds. Any responses that arrive after that time are discarded, because nothing on the aggregator side is expecting to handle them anymore.
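A rough Node sketch of that correlation-ID idea, with the broker client left abstract (publishFn and the message shape are illustrative, not a specific library's API):

```javascript
const { randomUUID } = require('crypto');

// Pending requests, keyed by transactionId.
const pending = new Map();

// Called for every reply message that arrives on the aggregator's reply topic/queue.
function onReply(reply) {                      // reply: { transactionId, payload }
    const entry = pending.get(reply.transactionId);
    if (!entry) return;                        // too late, we already gave up: discard
    clearTimeout(entry.timer);
    pending.delete(reply.transactionId);
    entry.resolve(reply.payload);
}

// publishFn is whatever your broker client exposes for sending a message (illustrative).
function request(publishFn, topic, payload, timeoutMs) {
    const transactionId = randomUUID();
    return new Promise((resolve) => {
        const timer = setTimeout(() => {
            pending.delete(transactionId);     // quit waiting; late replies will be discarded
            resolve(null);                     // caller treats null as "no data / use cache"
        }, timeoutMs);
        pending.set(transactionId, { resolve, timer });
        publishFn(topic, { transactionId, payload });
    });
}
```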
BTW, this "aggregator" approach is also good and simple from the front-end perspective, because the front end doesn't have to deal with many requests to the backend as in the gateway approach, but only with one request. So I completely agree that the aggregator approach is better here.

Implementing Push notification with React Ui, Node Server and Python API

I am new to the JavaScript world and need to design a system where a React UI calls a Node server REST API, which internally triggers a Python REST API hosting a long-running (at least 5 minutes) optimization model. We don't intend to make a blocking call; instead we want to notify the React UI using a push notification (observer pattern). Something similar to the Stack Overflow page, which tells you how many new questions, answers, or comments have been posted, so the page can then be refreshed and reviewed.
Sifting through RxJS and Socket.IO, what I understood is that we can do the following:
Make an async call using an Observable; on the server, Socket.IO is used to subscribe and notify the client, handling the communication between the Node server and the React client; and follow a similar push notification scheme at the Python server level to notify the Node server. The Node server also executes a DB operation once the optimization job is complete (a sketch of this is shown below).
Another option is implementing a queuing service between Node and Python, which provides more durability and resilience, processing the data with a pub/sub model.
Though our overall requirements are not enterprise-grade (we have low concurrency and resilience is not a critical factor), can someone verify whether the above design options are the right considerations, or whether there are simpler options that would meet our push notification use case?
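Here is a minimal sketch of the Socket.IO side of option 1 (Socket.IO v4 and Express; the /jobs/:id/complete callback that the Python API would call when the optimization finishes is hypothetical):

```javascript
const express = require('express');
const { createServer } = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());
const httpServer = createServer(app);
const io = new Server(httpServer, { cors: { origin: '*' } });

// The React client joins a room for the job it is interested in.
io.on('connection', (socket) => {
    socket.on('watch-job', (jobId) => socket.join(`job:${jobId}`));
});

// Hypothetical callback endpoint the Python API calls when the long-running model finishes.
app.post('/jobs/:id/complete', async (req, res) => {
    // ... persist results to the DB here ...
    io.to(`job:${req.params.id}`).emit('job-complete', req.body);   // push to the UI
    res.sendStatus(204);
});

httpServer.listen(3000);
```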

Node.js, Socket.IO, Express: Should app logic be in socket handlers or REST api?

I'm planning a non-trivial realtime chat platform. The app has several types of resources: Users, Groups, Channels, Messages. There are roughly 20 types of realtime events having to do with these resources: for instance, submitting a message, a user connecting or disconnecting, a user joining a group, a moderator kicking a user from a group, etc...
Overall, I see two paths to organizing all this complexity.
The first is to build a REST API to manage the resources. For instance, to send a message, POST to /api/v1/messages. Or, to kick a user from a group, POST to /api/v1/group/:group_id/kick/. Then, from within the Express route handler, call io.emit (made accessible through res.locals) with the updated data to notify all related clients. In this case, clients talk to the server through HTTP and the server notifies clients through socket.io.
The other option is not to have a REST API at all, and to handle all events through Socket.IO. For instance, to send a message, emit a SEND_MESSAGE event. Or, to kick a user, emit a KICK_USER event. Then, from within the Socket.IO event handler, call io.emit with the updated data to notify all clients.
Yet another option is to have certain actions handled by a REST API and others by Socket.IO. For instance, to get all messages, GET api/v1/channel/:id/messages. But to post a message, emit SEND_MESSAGE to the socket.
Which is the most suitable option? How do I determine which actions should be sent through an API, and which should be sent through Socket.IO? Is it better not to have a REST API for this type of application?
Some of my thoughts so far, nothing conclusive:
Advantages of REST API over the socket.io-only approach:
Easier to organize hierarchically, more modular
Easier to test
More robust and elegant
Simpler auth implementation with middleware
Disadvantages of REST API over the socket.io-only approach:
Slightly less performant (source)
Since a socket connection needs to be open anyways, why not use it for everything?
Slightly harder to manage on the client side.
Thanks for reading!
This could be achieved using sockets.
That's because a chat application will have dozens of actions, like:
'STARTS_TYPING', 'STOPS_TYPING', 'SEND_MESSAGE', 'RECEIVE_MESSAGE', ...
Accommodating all of these features using REST APIs would produce a complex system that lacks performance.
Also, the concept of rooms in Socket.IO removes a lot of the headache around implementing group chat.
So it's better to build everything on sockets (Socket.IO or a WebSocket cluster).
Here is the solution I found for this problem.
The key mistake in my question was that I assumed a REST API and WebSockets were mutually exclusive, because I intended to integrate the business and database logic directly into the Express routes and Socket.IO handlers. Choosing between Socket.IO and HTTP therefore seemed important, because it would influence the core business logic of my app.
Instead, it shouldn't matter which transport is used. The business logic has to be independent of the transport logic, in its own module.
To do this, I developed a service layer that handles CRUD tasks, as well as more specific tasks such as authentication. This service layer can then easily be consumed from Express routes, Socket.IO handlers, or both.
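As an illustration of that separation, a sketch (all names here are hypothetical):

```javascript
// A sketch of the service-layer idea: the same business logic behind two transports.
const express = require('express');
const { createServer } = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());
const httpServer = createServer(app);
const io = new Server(httpServer);

// --- service layer: transport-agnostic business logic ---
async function sendMessage({ channelId, userId, text }) {
    // validation, permission checks, persistence, etc. live here
    return { id: Date.now(), channelId, userId, text, sentAt: new Date() };
}

// --- HTTP transport (Express route) ---
app.post('/api/v1/messages', async (req, res) => {
    const message = await sendMessage(req.body);
    io.to(`channel:${message.channelId}`).emit('NEW_MESSAGE', message);
    res.status(201).json(message);
});

// --- WebSocket transport (Socket.IO handler): same business logic ---
io.on('connection', (socket) => {
    socket.on('SEND_MESSAGE', async (payload) => {
        const message = await sendMessage(payload);
        io.to(`channel:${payload.channelId}`).emit('NEW_MESSAGE', message);
    });
});

httpServer.listen(3000);
```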
In the end, this architecture allowed me to easily switch between transport technologies.

Implementing push notification using AWS lambda

I am referring to the following architecture.
Node.js is used as the runtime in this case, and AWS Lambda is used as the event notifier (updates come from other Lambdas or the DB).
My challenge is that the "user browser" can also be a mobile client. The "API" should act as a service that allows a client (mobile or web) to subscribe, unsubscribe, or publish data, nothing else.
Can Lambda work as an API that is capable of pushing event notifications directly to clients?
Is there any solution, and any sample work/source code, that can be used as a POC?
The next question is: how can I scale such an architecture, since it becomes stateful (it needs memory to remember the state of client connections)?
Or else, how feasible is it to persist client connections in a DB (using frameworks like WebSocket or Socket.IO)?
AWS has the SNS service to send notifications, which you can use from Lambda.
You can also use the relevant platform's notification system directly, e.g. for iOS, Node has an "apn" module that is used to communicate with Apple's APNS service - it's straightforward to use and can be implemented in a Lambda function.
In brief:
Your iOS app registers for APNS which responds with an APNS device token. Your app should then send this to your API / server for storage.
Your API can then send notifications to APNS, referencing any device tokens, along with the private key file you create from the Apple Developer page.
APNS will send the notifications to the registered devices.
Here is a good tutorial.
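For the Lambda side, a sketch using the node-apn package with token-based authentication (the key material in environment variables, the bundle id, and the event shape are assumptions):

```javascript
const apn = require('apn');   // node-apn

// Reusing the provider across invocations avoids reconnecting to APNS every time.
const provider = new apn.Provider({
    token: {
        key: process.env.APNS_KEY,        // contents of the .p8 key (assumed to be in env/secrets)
        keyId: process.env.APNS_KEY_ID,
        teamId: process.env.APNS_TEAM_ID,
    },
    production: false,
});

exports.handler = async (event) => {
    // Assumption: the triggering event carries the device tokens and the message to push.
    const note = new apn.Notification();
    note.alert = event.message;
    note.topic = 'com.example.myapp';     // your app's bundle id (placeholder)

    const result = await provider.send(note, event.deviceTokens);
    return { sent: result.sent.length, failed: result.failed.length };
};
```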
Your other queries should perhaps be separate questions.
Can Lambda work as an API that is capable of pushing event notifications directly to clients?
Yes! As @AndyOS mentioned, SNS is a great service that is quite literally intended for sending notifications. I won't go into details here to avoid duplicating that response.
Is there any solution, and any sample work/source code, that can be used as a POC?
Or else, how feasible is it to persist client connections in a DB (using frameworks like WebSocket or Socket.IO)?
If you are looking to use WebSockets, I'd encourage you to take a look at AWS IoT (https://aws.amazon.com/iot). IoT supports the MQTT protocol (http://docs.aws.amazon.com/iot/latest/developerguide/protocols.html). That page also contains sample client-side code which might help you bootstrap your solution.
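As a rough illustration, a Node client of AWS IoT that stays connected and receives whatever a Lambda publishes to its topic (using the aws-iot-device-sdk package; the certificate paths, endpoint and topic are placeholders):

```javascript
const awsIot = require('aws-iot-device-sdk');

// Client side: keeps an MQTT connection to AWS IoT open and receives pushed events.
const device = awsIot.device({
    keyPath: 'certs/private.pem.key',        // placeholder paths
    certPath: 'certs/certificate.pem.crt',
    caPath: 'certs/AmazonRootCA1.pem',
    clientId: 'dashboard-client-1',
    host: 'xxxxxxxxxxxx.iot.us-east-1.amazonaws.com',   // your IoT endpoint
});

device.on('connect', () => device.subscribe('notifications/user-123'));
device.on('message', (topic, payload) => {
    console.log('push received on', topic, JSON.parse(payload.toString()));
});
```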
The next question is: how can I scale such an architecture, since it becomes stateful (it needs memory to remember the state of client connections)?
You can view the IoT service limits at http://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html#limits_iot. You would need to decide whether your app fits within these bounds, depending on your app's various metrics (number of requests per second, number of concurrent connections, etc.).

What is the best way to communicate between two servers?

I am building a web app which has two parts. One part uses a real-time connection between the server and the client, and the other part does some CPU-intensive work to provide relevant data.
I am implementing the real-time communication in Node.js and the CPU-intensive part in Python/Java. What is the best way for the Node.js server to participate in duplex communication with the other server?
For a basic solution you can use Socket.IO if you are already using it and know how it works; it will get the job done, since it allows communication between a client and a server where the client can be a different server written in a different language.
If you want a more robust solution, with additional options and controls, or one that can handle higher traffic throughput (though this shouldn't be an issue if you are ultimately just sending it over the relatively slow internet), you can look at something like ØMQ (ZeroMQ). It is a messaging library that gives you more control and many different communication patterns beyond just request/response.
When you set either up, I would recommend using your CPU-intensive server as the stable end (the "server") and your web server(s) as the client(s). This assumes you are using a single server for your CPU-intensive tasks and running several Node.js server instances to take advantage of multiple cores for your web tier. It simplifies your communication, since you want a single point to connect to.
If you foresee needing multiple CPU servers, you will want to set up a routing server that can route between multiple web servers and multiple CPU servers; in that case I would recommend the extra work of learning ØMQ.
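For reference, the request/reply pattern on the Node side might look like this with the zeromq npm package (v6-style API; the address and message shape are placeholders, and the CPU server would bind the matching Reply socket):

```javascript
const zmq = require('zeromq');

// Node web server acting as the ZeroMQ client; the Python/Java CPU server binds a Reply socket.
async function run() {
    const sock = new zmq.Request();
    sock.connect('tcp://cpu-server.internal:5555');   // placeholder address

    await sock.send(JSON.stringify({ task: 'optimize', params: { n: 42 } }));
    const [reply] = await sock.receive();
    console.log('result from CPU server:', JSON.parse(reply.toString()));
}

run().catch(console.error);
```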
You can use Node's http.request method to make curl-style requests from within Node code.
The http.request method is also used for implementing an authentication API.
You can put your callback in the request's response handler, and when you receive the response data in Node, you can send it back to the user.
Meanwhile, in the background, the Java/Python server can handle the CPU-intensive task on behalf of Node's request.
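A sketch of that pattern with Node's built-in http module (the hostname, port and path of the Python/Java service are placeholders):

```javascript
const http = require('http');

// Ask the Python/Java service to run the CPU-intensive job and pass its answer back to the caller.
function runRemoteTask(taskPayload, callback) {
    const body = JSON.stringify(taskPayload);
    const req = http.request(
        {
            hostname: 'cpu-service.internal',   // placeholder host/port/path
            port: 8000,
            path: '/compute',
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
                'Content-Length': Buffer.byteLength(body),
            },
        },
        (res) => {
            let data = '';
            res.on('data', (chunk) => (data += chunk));
            res.on('end', () => callback(null, JSON.parse(data)));
        }
    );
    req.on('error', callback);
    req.write(body);
    req.end();
}
```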
I maintain a node.js application that intercommunicates among 34 tasks spread across 2 servers.
In your case, for communication between the web server and the app server you might consider mqtt.
I use MQTT for this kind of communication. There are MQTT clients for most languages, including Node/JavaScript, Python and Java. In my case I publish JSON messages to MQTT 'topics', and any task that has subscribed to a 'topic' receives its data when it is published. If you google "pub sub", "mqtt" and "mosquitto" you'll find lots of references and examples. Mosquitto (now an Eclipse project) is only one of a number of MQTT brokers that are available. Another very good broker, written in Java, is called HiveMQ.
This is a very simple, reliable solution that scales well. In my case literally millions of messages reliably pass through mqtt every day.
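A minimal version of that pub/sub exchange with the mqtt npm package (the broker URL and topic names are placeholders):

```javascript
const mqtt = require('mqtt');

const client = mqtt.connect('mqtt://localhost:1883');   // e.g. a local Mosquitto broker

client.on('connect', () => {
    client.subscribe('results/web-1');                   // replies addressed to this task
    // Ask the CPU-intensive task (subscribed to 'jobs/optimize') to do some work.
    client.publish('jobs/optimize', JSON.stringify({ replyTo: 'results/web-1', params: { n: 42 } }));
});

client.on('message', (topic, payload) => {
    console.log('reply on', topic, JSON.parse(payload.toString()));
});
```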
You must be looking for Socket.IO.
Socket.IO enables real-time bidirectional event-based communication.
It works on every platform, browser or device, focusing equally on reliability and speed.
Sockets have traditionally been the solution around which most realtime systems are architected, providing a bi-directional communication channel between a client and a server.
