Is it possible for Node.js to interact with a JMS queue? Context: I need to implement a solution that requires JMS and Node.js to work together. I can make the communication happen via Java, but I want to know if it is possible to do it from Node.js directly.
The answer to your question will depend on which JMS broker you use. If your JMS broker also supports STOMP then you can use the node-stomp client. For example, both the ActiveMQ 5.x and ActiveMQ Artemis brokers support STOMP & JMS.
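As a rough illustration, a minimal Node.js sketch using the stompit package (the node-stomp client mentioned above) to send to and consume from a queue over STOMP could look like the following. The broker host, port, and queue name are assumptions; adjust them for your broker.

// Minimal sketch (npm install stompit). Assumes a STOMP-enabled broker
// such as ActiveMQ listening on localhost:61613 and an example queue
// named /queue/example -- both are placeholders.
const stompit = require('stompit');

stompit.connect({ host: 'localhost', port: 61613 }, (error, client) => {
  if (error) {
    console.error('connect error: ' + error.message);
    return;
  }

  // Send a message to the queue.
  const frame = client.send({ destination: '/queue/example', 'content-type': 'text/plain' });
  frame.write('hello from node');
  frame.end();

  // Consume messages from the same queue.
  client.subscribe({ destination: '/queue/example', ack: 'client-individual' }, (subError, message) => {
    if (subError) {
      console.error('subscribe error: ' + subError.message);
      return;
    }
    message.readString('utf-8', (readError, body) => {
      if (readError) {
        console.error('read error: ' + readError.message);
        return;
      }
      console.log('received: ' + body);
      client.ack(message);
      client.disconnect();
    });
  });
});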
Related
As we know, MQTT uses a publish/subscribe model. How can messages be saved to a database when using the MQTT protocol? Do HiveMQ or Mosquitto support a database, so that I can see previous data recorded from the sensor?
If MQTT can work with a database, what other methods are there besides using an Apache web server?
MQTT is a Pub/Sub protocol, it is purely for delivering messages. What happens to those messages once delivered is not the concern of the protocol.
If you want to store all messages then you are going to need to implement that yourself.
This can be done in one of two ways:
A dedicated client that subscribes to # and then stores the messages to a database (a minimal sketch of this approach is shown below).
Some brokers have a plugin API that will allow you to register hooks that can intercept every message and store that to a database.
You will have to research if any broker you want to use supports plugins of this nature.
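For the dedicated-client approach, a minimal Node.js sketch using the mqtt package could look like the following; the broker URL is an assumption, and saveToDatabase is a hypothetical placeholder, since the actual persistence depends entirely on the database you choose.

// Minimal sketch (npm install mqtt). Subscribes to every topic ('#') and
// hands each message to a hypothetical saveToDatabase() function --
// replace it with real inserts for your database of choice.
const mqtt = require('mqtt');

const client = mqtt.connect('mqtt://localhost:1883'); // assumed broker address

function saveToDatabase(topic, payload, receivedAt) {
  // Placeholder: write the row to your database here.
  console.log(`[${receivedAt.toISOString()}] ${topic}: ${payload}`);
}

client.on('connect', () => {
  client.subscribe('#', (err) => {
    if (err) {
      console.error('subscribe failed: ' + err.message);
    }
  });
});

client.on('message', (topic, payload) => {
  saveToDatabase(topic, payload.toString(), new Date());
});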
I am working on a POC using Spring Integration and STOMP. The initial POC is successful.
I followed the adapters configuration mentioned in https://docs.spring.io/spring-integration/reference/html/stomp.html#stomp
In my POC, I did not include the last two @Bean definitions from the above link.
Inbound Channel Adapter and Message Handlers were sufficient to handle the incoming messages.
Now, my question is:
What is the difference between inbound channel adapters and application event listening message producers?
Is ApplicationListener used when we follow DSL as mentioned in an example here?
Thanks,
Mahesh
Well, as you noticed in the Spring Integration documentation about STOMP support, there is a bunch of ApplicationEvents emitted by the STOMP channel adapters. You can indeed handle them using a regular ApplicationListener (@EventListener) if your logic for handling those events is fairly simple and doesn't need further distribution. But if your logic is more complicated and you may need to store an event (or part of it) in a database, send it via email, do that in parallel after some aggregation, etc., then the ApplicationEventListeningMessageProducer is indeed a much better solution when Spring Integration is already on board.
However, if you are asking about the nature of the StompInboundChannelAdapter and its relationship with those events, you need to take a look at the StompIntegrationEvent implementations. You will quickly realize that there are no events for the payload of a STOMP frame. So that is what the StompInboundChannelAdapter really does: it produces messages based on the body of the STOMP frame.
All the mentioned events emitted from that channel adapter are more about sharing the adapter's state for possible management in your application.
I have a legacy app which uses Tibco RV to publish and subscribe. I am rewriting the app as a Spring Boot app and am now trying to remove the Tibco RV dependency. Is there a way to use the Spring Integration UDP adapter to publish and subscribe to/from Tibco RV subjects?
I don't think there is a way to reinvent a Tibco RV Java client... What is the point of getting rid of the Tibco RV dependency? They provide and support that API, and it should be fully sufficient to use in a Spring Integration Service Activator as a POJO method invocation.
You can use a standard UDP adapter to receive TibRV messages, but they will be presented to you just as byte buffers. You won't have an API to see the individual fields. How exactly Tibco chose to pack fields into the UDP buffer is part of a proprietary (unpublished) protocol.
So your best bet is probably to use the TibRv Java API from your Spring project and port your publishers/subscribers one-to-one to Kafka. Another, smoother transition might be to port all your apps to JMS first, using a JMS wrapper around TibRv, and move to Apache Kafka later.
I am currently exploring the possibility of using the MQTT protocol in my program, and I have found that there are several different MQTT brokers. So my question is: can you mix and match brokers for this communication? For instance, a Mosquitto broker on device 1 and an ActiveMQ broker on device 2. Will this work?
I think there might be a slight misunderstanding here.
In a simple deployment there would be only one MQTT broker; multiple MQTT clients (on one or many devices) would all connect to this one broker and exchange messages on any topics. As long as all the clients conform to the MQTT specification, they should be able to connect successfully to any broker implementation.
If you want a more complex deployment then it is possible to have multiple brokers and have groups of clients connect to different brokers. You can then set up what is known as a bridge between the brokers, which allows them to share some or all of the topics. This allows messages to be shared by all clients regardless of which broker they connect to.
Assuming all the brokers conform to the MQTT spec (which is very likely), it should all just work, but how you configure bridges differs between broker implementations.
Be aware that a new version of the MQTT spec (v5) just went live (end of 2017); brokers and client libraries will be updating to support it over the coming weeks/months. So check which versions you are trying to connect with.
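To illustrate what bridge configuration can look like (the exact mechanism differs per broker, as noted above), here is a sketch of a bridge section in mosquitto.conf; the connection name, remote address, and topic pattern are all assumptions for the example.

# Sketch of a bridge section in mosquitto.conf (names and addresses are examples).
# Forwards the sensors/# topic tree in both directions at QoS 0 to a remote
# broker, which could be another Mosquitto instance, ActiveMQ with MQTT
# enabled, or any other MQTT broker.
connection example-bridge
address remote-broker.example.com:1883
topic sensors/# both 0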
Usually there is a bridge mode to connect brokers together, even different kinds of brokers such as Mosquitto and ActiveMQ; this is a concept found not only in MQTT brokers but also in other message queues. Also, some brokers support clustering, such as RabbitMQ. Official Mosquitto only supports bridging, but there is a clustered Mosquitto implementation at hui6075/mosquitto-cluster that is easy to deploy.
Besides, the most significant difference between a "cluster" and a "bridge" is that with a cluster, the whole set of brokers looks like one logical broker to external clients, in terms of sessions, retained messages, QoS, and so on.
Background
I am very new to MQTT and ActiveMQ. I am trying to learn about both technologies, but their integration using Node.js is not clear.
Objective
The objective is to use MQTT with Node.js, with ActiveMQ as the broker.
Questions
1. If I publish a message on an MQTT topic, how can I transfer that message to an ActiveMQ queue?
2. If I have an MQTT topic named "Foo", does ActiveMQ need to have a queue named "Foo"?
3. Does Node.js support the MQTT protocol?
4. After publishing a message with content "Foo" on an MQTT topic using Node.js, how can I retrieve it from an ActiveMQ queue?
EDIT
My MQTT is running on a different server, so I have added the configuration below to my activemq.xml file. However, after adding it, ActiveMQ gives me an error on startup:
<transportConnectors>
  <transportConnector name="mqtt" uri="tcp://<myhostname>:1883?maximumConnections=1000&amp;wireFormat.maxFrameSize=104857600"/>
</transportConnectors>
So how can I get a message published on an MQTT topic into an ActiveMQ queue?
Do I need any other configuration, or do I need to first subscribe to the MQTT topic using Java or some other technology and then push the message onto an ActiveMQ queue? Or does ActiveMQ do this automatically?
1. By using a compositeTopic in the ActiveMQ configuration (activemq.xml).
2. No; ActiveMQ has a topic named FOO used by MQTT.
3. No, but there are extensions to Node.js which support MQTT.
4. By using a compositeTopic (see #1).
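For reference, a compositeTopic entry that forwards everything published on the topic Foo into a queue also named Foo might look roughly like the sketch below; the destination names come from the question, and forwardOnly="false" keeps the message available on the original topic as well. Treat this as a sketch, not a drop-in configuration.

<!-- Sketch: goes inside the <broker> element of activemq.xml. -->
<destinationInterceptors>
  <virtualDestinationInterceptor>
    <virtualDestinations>
      <compositeTopic name="Foo" forwardOnly="false">
        <forwardTo>
          <queue physicalName="Foo"/>
        </forwardTo>
      </compositeTopic>
    </virtualDestinations>
  </virtualDestinationInterceptor>
</destinationInterceptors>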
What do you mean by "My MQTT is running on different server"?
If you want a clustered setup, you should use a networkConnector instead of a transportConnector.
If you want ActiveMQ to accept MQTT connections, simply change the protocol in the URI from uri="tcp://<myhostname>:1883?maximumConnections=1000&amp;wireFormat.maxFrameSize=104857600" to uri="mqtt://<myhostname>:1883?maximumConnections=1000&amp;wireFormat.maxFrameSize=104857600"
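Putting the pieces together, a rough Node.js sketch of the whole flow could look like this: publish over MQTT to ActiveMQ, let the compositeTopic forward the message into the queue, and then consume that queue over STOMP with stompit. The hostnames, ports, and destination names are assumptions based on ActiveMQ defaults and the names used in the question.

// Sketch only: publish via MQTT, then read the forwarded message from the
// ActiveMQ queue over STOMP. Assumes ActiveMQ with the MQTT connector on
// port 1883, the STOMP connector on port 61613, and a compositeTopic that
// forwards topic "Foo" to queue "Foo" (as sketched earlier).
const mqtt = require('mqtt');
const stompit = require('stompit');

// 1. Publish a message on the MQTT topic "Foo".
const mqttClient = mqtt.connect('mqtt://localhost:1883');
mqttClient.on('connect', () => {
  mqttClient.publish('Foo', 'hello via MQTT', () => mqttClient.end());
});

// 2. Consume the forwarded message from the ActiveMQ queue "Foo" over STOMP.
stompit.connect({ host: 'localhost', port: 61613 }, (error, client) => {
  if (error) {
    console.error('connect error: ' + error.message);
    return;
  }
  client.subscribe({ destination: '/queue/Foo', ack: 'auto' }, (subError, message) => {
    if (subError) {
      console.error('subscribe error: ' + subError.message);
      return;
    }
    message.readString('utf-8', (readError, body) => {
      if (readError) {
        console.error('read error: ' + readError.message);
        return;
      }
      console.log('from queue Foo: ' + body);
      client.disconnect();
    });
  });
});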