MQTT server in Haskell

I would like to implement an MQTT server in Haskell.
I already have an HTTP REST server written in Haskell and would like to add some MQTT endpoints to it.
For instance, there is an endpoint POST /foo that allows users to send some information, which is then stored in a MongoDB database. I would like to add an MQTT endpoint: if someone performs a PUBLISH on the topic "/foo", the data should be stored in the same MongoDB database, using the same internal functions as the POST handler.
Similarly, for SUBSCRIBE, the data should come from the backend database.
I saw http://hackage.haskell.org/package/mqtt-0.1.1.0
and
https://github.com/lpeterse/haskell-hummingbird
But I'm not sure whether they are usable as libraries to create such endpoints with specific callbacks.
So this is a two-fold question:
Any feedback on implementing MQTT endpoints in Haskell?
Is merging an HTTP server and an MQTT server a good idea?

After some investigation, here are my findings on MQTT in Haskell:
The first library I found was http://hackage.haskell.org/package/mqtt-hs. However, it is buggy and no longer maintained.
I'm now using http://hackage.haskell.org/package/net-mqtt, which works well.
I also realized that I didn't need to implement an MQTT server: I just need to develop a client! My MQTT client subscribes to a standard MQTT broker (Mosquitto) and sinks the received data into my database.
Another pain point of MQTT is authentication/authorization. My server uses Keycloak for access control, while Mosquitto uses a static ACL file. I solved this by developing an authorization proxy for MQTT: the proxy sits in front of Mosquitto and filters requests based on Keycloak decisions.
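To make that architecture concrete, here is a minimal sketch of such a sink client. For consistency with the Node.js answers below it uses the Node mqtt package rather than net-mqtt, but the Haskell client follows the same connect/subscribe/message-callback shape. The broker URL, topic and storeFoo function are placeholders, not part of the original setup.

```typescript
import mqtt from "mqtt";

// Placeholder for the internal function the POST /foo handler already uses
// to write into MongoDB.
async function storeFoo(payload: unknown): Promise<void> {
  console.log("storing", payload);
}

// Connect to the standard broker (Mosquitto, or the authorization proxy in front of it).
const client = mqtt.connect("mqtt://localhost:1883");

client.on("connect", () => {
  // Mirror the REST resource: PUBLISHes on "foo" end up in the same database.
  client.subscribe("foo", { qos: 1 });
});

client.on("message", (topic, message) => {
  if (topic === "foo") {
    storeFoo(JSON.parse(message.toString())).catch(console.error);
  }
});
```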

Related

Node.js - Transfer data between two servers and a client

I started implementing an HTTP ping health monitor as a personal project with React and Node.js. My idea was for the monitor to run on an interval, send an axios request to the server to fetch all the URLs to check, and return the results to the server, which would later be shown on the client side.
I don't want to use a REST API to transfer data between the monitor and the server, or to show it live on the client side.
MONITOR <--> SERVER <--> CLIENT
What should I use instead of a REST API to communicate between the monitor and the server? I know socket.io is fine for communication between the client and the server, but it is not so good for scaling.
What would be a good, fast way to transfer data for this specific project that is not too hard to implement?
Thanks!
You can work with Server-Sent Events (SSE) in Node.js, which are a way of receiving events from the server. You can then use EventSource on the client to open a connection to the server and begin receiving events from it.
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
Take a look at this tutorial from DigitalOcean:
How To Use Server-Sent Events in Node.js to Build a Realtime App
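For a rough idea of what the SSE approach looks like, here is a minimal Express sketch (the route name, interval and payload shape are invented): the server keeps the response open with Content-Type: text/event-stream and writes data: frames, and the browser consumes them with EventSource.

```typescript
import express from "express";

const app = express();

// SSE endpoint: pushes monitoring results to every connected client.
app.get("/events", (req, res) => {
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  res.setHeader("Connection", "keep-alive");
  res.flushHeaders();

  const timer = setInterval(() => {
    const result = { url: "https://example.com", up: true, checkedAt: Date.now() };
    res.write(`data: ${JSON.stringify(result)}\n\n`); // one SSE frame per result
  }, 5000);

  req.on("close", () => clearInterval(timer)); // stop pushing when the client disconnects
});

app.listen(3000);

// Browser side:
//   const source = new EventSource("http://localhost:3000/events");
//   source.onmessage = (e) => console.log(JSON.parse(e.data));
```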
Also take a look at Socket.io:
https://socket.io/

Does the MQTT protocol support databases?

As we know, MQTT uses a publish/subscribe model. How can a user save data to a database using the MQTT protocol? Do HiveMQ or Mosquitto support a database, so that I can see previous data recorded from the sensor?
If MQTT can work with a database, what other methods are there besides using an Apache web server?
MQTT is a pub/sub protocol; it is purely concerned with delivering messages. What happens to those messages once delivered is not the concern of the protocol.
If you want to store all messages, you will need to implement that yourself.
You can do this in one of two ways:
A dedicated client that subscribes to # and stores the messages in a database (see the sketch below).
Some brokers have a plugin API that lets you register hooks to intercept every message and store it in a database.
You will have to research if any broker you want to use supports plugins of this nature.
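A rough sketch of the first option, a dedicated archiving client, using the Node mqtt and mongodb packages (the broker URL, database and collection names are invented):

```typescript
import mqtt from "mqtt";
import { MongoClient } from "mongodb";

async function main() {
  const mongo = await MongoClient.connect("mongodb://localhost:27017");
  const messages = mongo.db("mqtt").collection("messages");

  const client = mqtt.connect("mqtt://localhost:1883");

  client.on("connect", () => {
    client.subscribe("#"); // wildcard: every topic published on the broker
  });

  client.on("message", async (topic, payload) => {
    // One document per message, so historic sensor data can be queried later.
    await messages.insertOne({ topic, payload: payload.toString(), receivedAt: new Date() });
  });
}

main().catch(console.error);
```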

Subscribe/publish with Express js

I have an Express.js server that supplies weather forecast information to its clients. Right now, it's basically an RPC API where the client (a Python client in this case) requests different types of weather info from different endpoints. This is fine for now, but I was thinking about implementing a subscription model. Rather than connecting to different endpoints for different information, the client could just subscribe to the info that it wants, and the server would send it every so often (configurable by the client).
Is there a way to do this with Express? Would I have to set up a server on the client side as well to listen for the publish events?
A very good option is RabbitMQ; it is designed for exactly this purpose.
I tried to find a good tutorial for you as a reference; however, I could only find one in Portuguese, but if you search around Google you will find others in English.
It's very simple, just like you want: you publish to the queue on some topic, and other clients subscribe to that topic and process the messages.
Here is the link
https://medium.com/@programadriano/rabbitmq-publish-subscribe-com-node-js-9363848f58fe
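A rough sketch of that pattern with the amqplib package (the exchange name, routing keys and URLs are invented): the Express server publishes forecasts to a topic exchange, and each client binds its own queue to just the topics it wants.

```typescript
import amqp from "amqplib";

// Server side: publish a forecast to a topic exchange.
async function publishForecast(kind: string, forecast: object): Promise<void> {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertExchange("weather", "topic", { durable: false });
  ch.publish("weather", `forecast.${kind}`, Buffer.from(JSON.stringify(forecast)));
  await ch.close();
  await conn.close();
}

// Client side: subscribe to the forecasts it cares about, e.g. "forecast.daily" or "forecast.*".
async function subscribeForecasts(pattern: string, handle: (f: object) => void): Promise<void> {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertExchange("weather", "topic", { durable: false });
  const q = await ch.assertQueue("", { exclusive: true }); // private, auto-named queue
  await ch.bindQueue(q.queue, "weather", pattern);
  await ch.consume(
    q.queue,
    (msg) => {
      if (msg) handle(JSON.parse(msg.content.toString()));
    },
    { noAck: true }
  );
}
```

Opening a fresh connection per publish keeps the sketch short; a real server would reuse one connection and channel.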

REST API chat - endpoint for fetching messages in real time

I have a REST API server powered by express+mongodb. There are a couple of endpoints with different resources. One of them is chat API. I already have several basic endpoints like:
POST http://api.example.com/v1/chat - to create chat
POST http://api.example.com/v1/chat/:id/message - to send message to existing chat
GET http://api.example.com/v1/chat/:id/messages - to get messages in the specified chat
But I need to provide a way for API consumers to efficiently get new messages in real-time without reloading the page.
For now, as you can see, the client can only poll the GET endpoint, but that does not seem performant. For example, the client may have a UI that shows a count of new messages in the header (a kind of notification).
I was thinking about WebSockets. Is it possible, for example, to provide an endpoint like /chat/:id/subscribe that proxies the socket server and that the client can connect to?
Are there good examples of this kind of API design that I can draw inspiration from, or could you give me some advice? Thanks!
socket.io is the package you are looking for.
The namespace section in its documentation is a good solution, because namespaces can be protected by authorization. A namespace represents a pool of connected sockets.
Here is how I would do it:
Create a document for the chat between the two users with this route:
POST http://api.example.com/v1/chat
When a user sends a message to another connected user, create a namespace with socket.io and store the message in the user's document in your database. This route would create the namespace and/or emit the message:
POST http://api.example.com/v1/chat/:id/message
On the client, use socket.io again to listen for messages in the namespace.
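To make that concrete, here is one possible sketch. It uses a single authorization-protected /chat namespace with one socket.io room per chat id (rooms rather than one namespace per chat, which keeps the handshake simple); the routes, event names and auth check are placeholders.

```typescript
import { createServer } from "http";
import express from "express";
import { Server } from "socket.io";

const app = express();
app.use(express.json());
const httpServer = createServer(app);
const io = new Server(httpServer);

// Authorization-protected namespace for chat traffic.
const chatNs = io.of("/chat");
chatNs.use((socket, next) => {
  const token = socket.handshake.auth.token; // placeholder: verify a real token here
  token ? next() : next(new Error("unauthorized"));
});

chatNs.on("connection", (socket) => {
  // The client joins a room for each chat it wants to follow.
  socket.on("join", (chatId: string) => socket.join(chatId));
});

// Existing REST route: store the message, then push it to subscribers of that chat.
app.post("/v1/chat/:id/message", (req, res) => {
  const message = { chatId: req.params.id, text: req.body.text, sentAt: new Date() };
  // ...insert `message` into MongoDB here...
  chatNs.to(req.params.id).emit("message", message);
  res.status(201).json(message);
});

httpServer.listen(3000);

// Client side:
//   const socket = io("http://localhost:3000/chat", { auth: { token } });
//   socket.emit("join", chatId);
//   socket.on("message", (m) => showMessage(m));
```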
UPDATE FOR SCALABILITY:
Here is a good answered Stack Overflow question about implementing a scalable chat server: Strategy to implement a scalable chat server
As you can see in that post, MongoDB might not be the best solution for storing your messages.

RESTful backend and socket.io to sync

Today I had an idea for the following setup: create a Node.js server with Express and socket.io. With Express, I would create a RESTful API connected to a MongoDB. Backbone.js or similar would connect the client to that REST API.
Now, every time the MongoDB (i.e. the data in it I am interested in) changes, socket.io would fire an event to the client carrying a cursor to the data that has changed. The client would then trigger the appropriate AJAX requests to the REST API to get the new data where it needs it.
So the socket.io connection would behave like a synchronization trigger. It would be there for the entire visit and could also manage sessions that way. All the payload would be sent over HTTP.
Pros:
REST API for use with other clients than web
Auth could be done entirely over socket.io, only sending a token along with REST requests.
Use the benefits of REST.
Would also play nicely with a pub/sub service like Redis.
Cons:
Greater overhead than using pure socket.io.
What do you think? Are there any big disadvantages I did not think of?
I agree with @CharlieKey; you should send the updated data rather than re-requesting it.
This is exactly what Tower is doing:
save some data: https://github.com/viatropos/tower/blob/development/src/tower/model/persistence.coffee#L77
insert into mongodb (cursor is a query/persistence abstraction): https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L29
notify sockets: https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L68
emit updated records to client: https://github.com/viatropos/tower/blob/development/src/tower/server/net/connection.coffee#L62
The disadvantage of using sockets as a trigger to re-request with AJAX is that every connected client has to fetch the data, so if 100 people are on your site there will be 100 HTTP requests every time the data changes, when you could just reuse the socket connections.
I think that pushing the updated data with the socket.io event would be better than re-requesting the latest data. Even better, you could push only the modified pieces of data, decreasing the amount of data sent over the line. Overall, though, an interesting idea.
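Here is a small sketch of that "push the data itself" variant (the route, collection and event names are invented): the REST write happens as usual, and the updated document rides along on the socket.io event, so connected clients don't need a follow-up AJAX round trip.

```typescript
import { createServer } from "http";
import express from "express";
import { Server } from "socket.io";
import { MongoClient, ObjectId } from "mongodb";

const app = express();
app.use(express.json());
const httpServer = createServer(app);
const io = new Server(httpServer);

async function main() {
  const mongo = await MongoClient.connect("mongodb://localhost:27017");
  const items = mongo.db("app").collection("items");

  // Regular REST endpoint; after the write, broadcast the changed document itself
  // instead of a bare "something changed, go re-fetch" trigger.
  app.put("/items/:id", async (req, res) => {
    const _id = new ObjectId(req.params.id);
    await items.updateOne({ _id }, { $set: req.body });
    const updated = await items.findOne({ _id });
    io.emit("item:updated", updated); // every connected socket gets the new data
    res.json(updated);
  });

  httpServer.listen(3000);
}

main().catch(console.error);
```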
I'd look into Now.js, since it does pretty much exactly what you need.
It creates a namespace that is shared between the client and the server. The server can call functions on the client directly, and vice versa.
That is, if you insist on your current infrastructure decision to use MongoDB and Node.js; otherwise there is CouchDB, which is a full web server and document database with sophisticated replication mechanisms built in.
