SpringXD receive MQTT and publish to MQTT on different topic - spring-integration

I'm currently working on my first SpringXD stream. It should receive a message on an MQTT topic, do an HTTP POST to a service, and publish the result on another MQTT topic.
Currently I'm stuck at publishing to a different MQTT topic than the one the message arrived on.
This is my stream:
stream create test --definition "in:mqtt --url='tcp://hivemq:1883' --topics='+/+/+/my/downlink' --username='test' --password='test' --clientId='client_downlink'
| header-enricher --headers={\"mqtt_topic\":\"headers['mqtt_topic'].replace('/downlink', '/uplink')\"}
| out:mqtt --url='tcp://hivemq:1883' --username='test' --password='test' --clientId='client_uplink'" --deploy
The approach is to replace "/downlink" with "/uplink" in the 'mqtt_topic' header before publishing, but header-enricher doesn't overwrite existing header values, so the message is published on the same topic it was received on.
Any idea how I could achieve this?

I am working on an enhancement for this. Meanwhile, you can edit the header-enricher Groovy script modules/processor/header-enricher/config/header-enricher.groovy and make this change:
si.'header'(name:k,expression:v,overwrite:true)
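To see why the overwrite flag matters, the enricher's semantics can be sketched in Python (the enrich helper below is purely illustrative, not part of Spring XD or Spring Integration):

```python
def enrich(headers, new_headers, overwrite=False):
    # Mimics the header-enricher behavior: without overwrite, an
    # existing header value wins; with overwrite=True the enriched
    # value replaces it.
    result = dict(headers)
    for key, value in new_headers.items():
        if overwrite or key not in result:
            result[key] = value
    return result

received = {"mqtt_topic": "a/b/c/my/downlink"}
rewritten = {"mqtt_topic": received["mqtt_topic"].replace("/downlink", "/uplink")}

# Default behavior: the original header survives, so the publish
# goes back out on the downlink topic.
print(enrich(received, rewritten)["mqtt_topic"])                   # a/b/c/my/downlink
# With overwrite the uplink topic is used instead.
print(enrich(received, rewritten, overwrite=True)["mqtt_topic"])   # a/b/c/my/uplink
```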


EventHub Golang client error: amqp:internal-error

I'm trying to use the EventHub Go client to send a simple "hello world" event, but I got this error message:
*Error{Condition: amqp:internal-error, Description: The service was unable to process the request; please retry the operation. For more information on exception types and proper exception handling, please refer to http://go.microsoft.com/fwlink/?LinkId=761101 TrackingId:be0c66437a1447b7accdc113c84955dd_G5, SystemTracker:gateway5, Timestamp:2021-07-10T21:28:48, Info: map[]}
The code is exactly the same as this sample code here: https://github.com/Azure/azure-event-hubs-go
The closest SO thread I found with a similar error message is Getting "amqp:internal-error" when peeking messages from Azure Service Bus Queue using AMQP, rhea and Node, but it is about Service Bus and the Node client.
Any idea why this issue occurred?
This error is pretty non-descriptive.
One way to trigger it is to specify an Event Hubs connection string without an EntityPath=<event hub name> segment in it.
So if you're using a namespace-level (broker) connection string, you'll need to specify the Event Hub you're attempting to connect to by adding EntityPath=eventHubName. The readme snippet does show this, but the error is admittedly not great in that situation.
I've filed this issue to at least improve the error message in that case, as it doesn't really lead you to what's wrong.
https://github.com/Azure/azure-event-hubs-go/issues/222
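As a quick illustration of the check involved, here is a small Python sketch (the helper name and sample values are made up; it assumes the usual Key=Value;Key=Value connection-string layout):

```python
def has_entity_path(conn_str: str) -> bool:
    # Returns True if the connection string carries an
    # EntityPath=<event hub name> segment.
    parts = [p for p in conn_str.split(";") if p]
    return any(p.split("=", 1)[0].strip().lower() == "entitypath"
               for p in parts)

# A namespace-level connection string (sample values):
broker = ("Endpoint=sb://ns.servicebus.windows.net/;"
          "SharedAccessKeyName=key;SharedAccessKey=secret")

print(has_entity_path(broker))                          # False -> triggers the error
print(has_entity_path(broker + ";EntityPath=my-hub"))   # True
```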

Google Pub/Sub - No event data found from local function after published message to topic

I'm using the Functions Framework with Python alongside the Google Cloud Pub/Sub Emulator. I'm having issues with an event triggered by a message published to a topic: no event data reaches the function. See more details below.
Start the Pub/Sub Emulator at http://localhost:8085 with project_id local-test.
Spin up a function with signature-type: http at http://localhost:8006.
Given a background cloud function with signature-type: event:
The topic is created as test-topic.
The function is spun up at http://localhost:8007.
Create a push subscription test-subscription for test-topic with endpoint http://localhost:8007.
When I publish a message to test-topic from http://localhost:8006 via a POST request in Postman, I get back a 200 response confirming the message was published successfully. The function at http://localhost:8007 gets executed as an event, as shown in the logs from the functions-framework. However, there's no actual data in event when debugging the triggered function.
Has anyone encountered this? Any ideas/suggestions? Perhaps this is the cause: #23 Functions Framework does not work with the Pub/Sub emulator
Modules installed:
functions-framework==2.1.1
google-cloud-pubsub==2.2.0
Python version: 3.8.8
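To make the mismatch concrete, here is a Python sketch of the JSON body a push subscription posts to the endpoint, versus the top-level data key the event wrapper looks for (the payload values here are made up for illustration):

```python
import base64

# Shape of the JSON body a Pub/Sub push subscription delivers:
# the payload sits under "message", base64-encoded.
push_body = {
    "message": {
        "data": base64.b64encode(b"hello").decode(),
        "attributes": {"origin": "postman"},
    },
    "subscription": "projects/local-test/subscriptions/test-subscription",
}

# The event wrapper expects a top-level "data" key, which the push
# envelope does not have -- hence the empty event payload.
print("data" in push_body)  # False

# The actual payload is one level down.
decoded = base64.b64decode(push_body["message"]["data"]).decode()
print(decoded)  # hello
```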
I'll close this post, since the issue is an actual bug that was reported last year.
Update: As a workaround until this bug is fixed, I copied the code below into functions_framework/__init__.py locally, inside the view_func nested function within the _event_view_func_wrapper function.
if 'message' in event_data:
    if 'data' not in event_data:
        message = event_data['message']
        event_data['data'] = {
            'data': message.get('data'),
            'attributes': message.get('attributes')
        }

JMS Message body is null when publishing with jms:publish-subscribe-channel

I am trying to use jms:publish-subscribe-channel to pub/sub on a single ActiveMQ topic. I am able to receive messages from ActiveMQ on the channel just fine; however, when I publish to the channel, the message body is null when received by another application listening on the ActiveMQ topic. I was able to recreate the problem using spring-integration-samples->basic->jms: I modified outboundChannelAdapter.xml to use jms:publish-subscribe-channel instead of jms:outbound-channel-adapter. Is there another step needed in order to publish a simple string message? Here's my code change to outboundChannelAdapter.xml:
<stream:stdin-channel-adapter id="stdin" channel="stdinToJmsoutChannel"/>
<jms:publish-subscribe-channel id="stdinToJmsoutChannel" topic="requestTopic" />
<stream:stdout-channel-adapter id="stdout" channel="stdinToJmsoutChannel" append-newline="true"/>
I am not sure what you mean by "the message body is null".
I just made the exact same change to the sample and it worked fine for me...
Please type something and hit <enter>
foo
foo
I had to add -Dorg.apache.activemq.SERIALIZABLE_PACKAGES=* to the command line because ActiveMQ requires whitelisting of serializable classes (the whole message is serialized in JMS-backed channels).

How to add JMS properties via Spring Integration using ws:outbound-gateway (JMS transport)

I have a ws:outbound-gateway in place pointing to an org.springframework.ws.transport.jms.JmsMessageSender class in order to push a SOAP message into the queue.
The output message is generated okay and published to the queue normally, with the following JMS properties on it: SOAPJMS_soapAction, SOAPJMS_contentLength, SOAPJMS_contentType, etc.
My question is: how can I add a custom JMS property as part of the JMS properties generated by default? Is this possible? I'm using Spring Integration 4.3.5.RELEASE.
The JmsMessageSender can be supplied with a MessagePostProcessor.
There you can set any desired JMS property on the target Message.

How to configure a Kafka-bound Dead Letter Queue in Spring Cloud Data Flow

The Spring Cloud Data Flow documentation mentions a 'Dead Letter Queue' which can be used for exceptions in message processing, but I did not find anything further on this anywhere.
I am trying to configure a Kafka-bound Dead Letter Queue in my processing pipeline.
Can anyone help me, or point me to documentation, to understand more on this?
The Kafka consumer and the supported overrides are explained here. You'd have to specifically turn on the enableDlq boolean flag.
In your stream definition, at the consumer-application level, you can optionally configure DLQs and supply the necessary properties at stream-deployment time.
dataflow:>stream create foo --definition "http | log"
dataflow:>stream deploy foo --properties "app.http.spring.cloud.stream.bindings.output.destination=test,app.log.spring.cloud.stream.bindings.input.destination=test,app.log.spring.cloud.stream.kafka.bindings.test.consumer.enableDlq=true"
Here we are explicitly overriding the channelName to be "test", so we could use it at the consumer (log) application and enable DLQ flag.
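For orientation, the Kafka binder derives a default DLQ topic name from the destination and consumer group when enableDlq is on; a minimal sketch of that naming pattern in Python (assuming the documented error.&lt;destination&gt;.&lt;group&gt; convention, which can be overridden with the dlqName property):

```python
def default_dlq_topic(destination, group):
    # Default DLQ topic name pattern assumed for the Spring Cloud
    # Stream Kafka binder when enableDlq=true; override it with the
    # dlqName consumer property if you need a different topic.
    return "error.{0}.{1}".format(destination, group)

# For the "test" destination consumed by a group named "foo":
print(default_dlq_topic("test", "foo"))  # error.test.foo
```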
