Spring Integration - JMS outbound adapter post-send database update

We previously had a Spring Integration flow (XML configuration-based) where we would do an update in a database after sending a message to a JMS queue. To achieve this, the SI flow was configured with a publish-subscribe queue channel as an input to a JMS Outbound Channel Adapter (order 0) and a Service Activator (order 1). The idea here being that after a successful JMS send, the service activator would be called, thus updating the data in the database.
We are now in the process of updating our flows to the spring-integration:4.0.x APIs and wanted to use this opportunity to ask whether the described flow pattern is still a good/recommended way of doing a database update after a successful JMS send, or whether there is now a simpler/better way of achieving this. As a side note, our flows are now being implemented using the spring-integration-java-dsl:1.0.0.M3 APIs.
Thanks in advance for any input on this,
PM.

publish-subscribe queue channel
There's no such thing as a pub-sub queue channel; by definition it's a subscribable channel, so I assume that's what you mean.
It is one of the ways to do what you need, and perfectly fine; you can also achieve the same result with a RecipientListRouter. The DSL syntax is quite nice, especially with Java 8; see the SpringOne demo app for an example.
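For reference, a minimal sketch of the same pattern with the Java DSL (written against a recent DSL syntax; the destination name and the OrderRepository collaborator are hypothetical placeholders, not part of the original question):

```java
import javax.jms.ConnectionFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.jms.dsl.Jms;

@Configuration
public class PostSendUpdateConfig {

    // Hypothetical DAO standing in for the real database-update logic.
    interface OrderRepository {
        void markSent(Object payload);
    }

    @Bean
    public IntegrationFlow sendThenUpdateFlow(ConnectionFactory connectionFactory,
            OrderRepository orderRepository) {
        return flow -> flow
                .publishSubscribeChannel(pubSub -> pubSub
                        // First subscriber (order 0): the JMS send. Without an executor,
                        // subscribers run sequentially on the calling thread, so a failed
                        // send throws and the second subscriber is never invoked.
                        .subscribe(send -> send
                                .handle(Jms.outboundAdapter(connectionFactory)
                                        .destination("orders")))
                        // Second subscriber (order 1): runs only after a successful send.
                        .subscribe(update -> update
                                .handle(m -> orderRepository.markSent(m.getPayload()))));
    }
}
```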

Related

Is it possible to make a Poller (or PollableMessageSource) poll messages as a List?

Following the example found on GitHub (https://github.com/spring-cloud/spring-cloud-gcp/tree/master/spring-cloud-gcp-samples/spring-cloud-gcp-pubsub-polling-binder-sample) regarding polling messages from a Pub/Sub subscription, I was wondering:
Is it possible to make a PollableMessageSource retrieve List<Message<?>> instead of a single message per poll?
I've seen the @Poller annotation only being used on Source-typed objects, never on Processor or Sink. Is it possible to use it in such a context, for example with @StreamListener or with the functional approach?
The PollableMessageSource binding and the Source stream applications are fully based on the Poller and MessageSource abstractions from Spring Integration, whose contract is to produce a single message to the configured channel. The point of messaging is really to process a single message without affecting others: a failure for one message doesn't mean the others in the flow must fail.
On the other hand, you probably mean for the GCP Pub/Sub messages to be produced as a list in the Spring message payload. That is indeed possible, but only via some custom code in the Pub/Sub consumer and MessageSource implementation. Although I would think twice before expecting batches from the source. You could utilize an aggregator to build small windows if your downstream logic is about processing lists, but again: it is going to be a single Spring message.
It may be better to start thinking about a reactive function implementation, where you can indeed expect a Flux<Message<?>> as the input and the Spring Cloud Stream framework will take care of emitting the data from Pub/Sub into the reactive stream for you.
See more info in docs: https://docs.spring.io/spring-cloud-stream/docs/3.1.0/reference/html/spring-cloud-stream.html#_reactive_functions_support
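As a rough illustration of that reactive approach (the bean name, window size, and timeout below are arbitrary placeholders, not anything prescribed by the framework):

```java
import java.time.Duration;
import java.util.List;
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

import reactor.core.publisher.Flux;

@Configuration
public class BatchingConfig {

    // With a reactive Consumer the framework hands over the whole stream, so
    // we can window it into Lists ourselves. Note that with a reactive
    // Consumer<Flux<...>> we are responsible for subscribing to the stream.
    @Bean
    public Consumer<Flux<Message<?>>> batchingConsumer() {
        return flux -> flux
                .bufferTimeout(10, Duration.ofSeconds(1)) // up to 10 messages, or whatever arrived in 1s
                .doOnNext(this::process)                  // each emission is a List<Message<?>>
                .subscribe();
    }

    private void process(List<Message<?>> batch) {
        // Placeholder for the real batch-processing logic.
        batch.forEach(message -> System.out.println(message.getPayload()));
    }
}
```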

Spring Cloud Stream @StreamListener and Spring Integration's Resequencer Pattern

AFAIK the Spring Cloud Stream project is based on Spring Integration. Hence I was wondering if there is a nice way to resequence a subset of inbound messages before the @StreamListener handler is triggered, or do I need to assemble the whole IntegrationFlow from scratch using XML or Java DSL config from Spring Integration?
My use case is as follows. Most of the time I process inbound messages on a Kafka topic as they come. However, a few events have to be resequenced based on the CORRELATION_ID, SEQUENCE_NUMBER, and SEQUENCE_SIZE headers. In other words, I'd like to keep using @StreamListener as much as possible and simply plug in a resequencing strategy for some events.
Yes, you would need to use Spring Integration for it. In fact, Spring Cloud Stream is effectively just a binding framework: it binds message handlers to message brokers via binders. The message handlers themselves are provided by the users.
The @StreamListener annotation is pretty much an equivalent of Spring Integration's @ServiceActivator with a few extra features (e.g., conditional routing), but other than that it is just a message handler.
Now, as you alluded to, you can use Spring Integration (SI) to implement a message handler or an internal SI flow, and that is normal and recommended for complex cases.
That said, we do provide out-of-the-box apps that implement certain EIP components, and we do have, for example, an aggregator app which you can use as a starting point for implementing a resequencer. Furthermore, given that we have an aggregator app but no resequencer, we would be glad to accept a contribution for it if you're interested.
I hope this answers your question.
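For a sense of what plugging a resequencer in front of a binding could look like, here is a rough sketch against the @EnableBinding-era programming model (the wiring through Processor.INPUT/OUTPUT is illustrative only; the standard correlationId/sequenceNumber/sequenceSize headers that the default resequencer uses correspond to the CORRELATION_ID, SEQUENCE_NUMBER, and SEQUENCE_SIZE headers from the question):

```java
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@EnableBinding(Processor.class)
public class ResequencingConfig {

    // Resequence messages from the binder's input channel before they reach
    // the downstream handler. The no-arg resequence() keys on the standard
    // correlation headers carried by each message.
    @Bean
    public IntegrationFlow resequencingFlow() {
        return IntegrationFlows.from(Processor.INPUT)
                .resequence()
                .channel(Processor.OUTPUT)
                .get();
    }
}
```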

Message persistence in Spring Integration Aggregator without MessageStore by using AMQP?

I would like to know if I can have persistence in my Spring Integration setup when I use an aggregator that is not backed by a MessageStore, by leveraging the persistence of the AMQP (RabbitMQ) queues before and after the aggregator.
I imagine that this would use acks: the aggregator won't ack a message before it has collected all the parts and sent out the resulting message.
Additionally I would like to know if this is ever a good idea :)
I am new to working with queues, and am trying to get a good feel for which patterns to use.
My business logic for this is as follows:
I receive messages on one queue.
Each message must result in two unrelated web service calls (preferably in parallel).
The results of these two calls must be combined with details from the original message.
The combination must then be sent out as a new message on a queue.
Messages are important, so they must not be lost.
I was/am hoping to use only one 'persistent' system, namely RabbitMQ, and not have to add a database as well.
I've tried to keep the question specific, but any other suggestions on how to approach this are greatly appreciated :)
What you would like to do recalls the Scatter-Gather EI pattern.
So, you get a message from AMQP, send it into the ScatterGather endpoint, and wait for the aggregated reply. That's enough to stick with the default acknowledge mode.
Right, the scatterChannel can be a PublishSubscribeChannel with an executor to call the web services in parallel. Either way, the gatherer will wait for replies according to the release strategy, blocking the original AMQP listener thread so the message is not acked prematurely.
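A rough Java DSL sketch of that shape (the queue names, the two service calls, and the release strategy are all placeholders, not anything from the original question):

```java
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.amqp.dsl.Amqp;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class ScatterGatherConfig {

    @Bean
    public IntegrationFlow scatterGatherFlow(ConnectionFactory connectionFactory,
            RabbitTemplate rabbitTemplate) {
        return IntegrationFlows
                // The listener container's thread holds the delivery; it is only
                // acked once this whole flow completes without an exception.
                .from(Amqp.inboundAdapter(connectionFactory, "work.queue"))
                .scatterGather(
                        scatterer -> scatterer
                                .applySequence(true)
                                // Two hypothetical web-service calls; to run them in
                                // parallel, start each recipient flow with an executor channel.
                                .recipientFlow(f -> f.handle((payload, headers) -> callServiceA(payload)))
                                .recipientFlow(f -> f.handle((payload, headers) -> callServiceB(payload))),
                        gatherer -> gatherer.releaseStrategy(group -> group.size() == 2))
                // The aggregated result goes out as a new message.
                .handle(Amqp.outboundAdapter(rabbitTemplate).routingKey("result.queue"))
                .get();
    }

    private Object callServiceA(Object payload) { return payload; } // placeholder
    private Object callServiceB(Object payload) { return payload; } // placeholder
}
```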

Spring Integration Multiple Endpoints

Currently I am working on a Spring Integration application which has the following scenario.
There is a transformer which transforms an incoming message into a particular object type.
Once the transformation is done, we need to write it to a log file and to a database table, and then finally send it to a JMS outbound adapter.
I was reading the Spring Integration reference and found out there are two ways we can approach this scenario.
Introduce a pub-sub channel as the output channel of the above-mentioned transformer and have the file-outbound, DB-outbound, and JMS-outbound adapters as the subscribers.
Introduce a Recipient List Router just after the transformer and specify the file-outbound, DB-outbound, and JMS-outbound adapters as the recipients.
When it comes to Enterprise Integration Patterns what is the best way to handle this scenario? Any new suggestions and improvements are welcome
Thanks,
Keth
There is no "best way" - both solutions are equivalent and there is little difference at runtime. So it's your preference; I generally use pub/sub for the simple case and an RLR if the recipients are conditional (with selectors).

Transaction Spring XD

I'm working on a module which consumes some HTTP resources, writes to a Postgres database, and finally pushes a message to the message bus (RabbitMQ).
I would like to figure out how to deal with transactions inside a module: how do I encapsulate my Postgres operation and the push to RabbitMQ (i.e., if the message cannot be pushed to RabbitMQ, my DB operation should be rolled back)?
Thanks.
There are several techniques to wrap parts of a Spring Integration flow in a transaction; see this answer for some examples.
You must, of course, use direct channels throughout.
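One hedged illustration of the idea (the repository, template, and routing key are placeholders): start the transaction at the endpoint that performs the database write. Because direct channels keep everything on the calling thread, the downstream AMQP publish runs inside that transaction, so a publish failure rolls the DB work back.

```java
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.amqp.dsl.Amqp;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class TransactionalFlowConfig {

    // Hypothetical DAO abstraction standing in for the real Postgres access.
    interface MyRepository {
        Object save(Object payload);
    }

    @Bean
    public IntegrationFlow transactionalFlow(PlatformTransactionManager txManager,
            RabbitTemplate rabbitTemplate, MyRepository repository) {
        return f -> f
                // The transactional advice wraps this handler AND everything
                // downstream of it, since direct channels keep it all on one thread.
                .handle((payload, headers) -> repository.save(payload),
                        e -> e.transactional(txManager))
                // Publishing happens inside the transaction; an AMQP failure here
                // propagates back and rolls back the repository.save(...) above.
                .handle(Amqp.outboundAdapter(rabbitTemplate).routingKey("events"));
    }
}
```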
