Spring Integration Multiple Endpoints - spring-integration

Currently I am working on a Spring Integration application which has the following scenario.
There is a Transformer which transforms the incoming message into a particular object type.
Once the transformation is done, we need to write the message to a log file and to a database table, and then finally send it to a JMS outbound adapter.
I was reading the Spring Integration reference and found that there are two ways we can approach this scenario:
1. Introduce a pub-sub channel as the output channel of the above mentioned transformer and have the File-outbound, DB-outbound and JMS-outbound adapters as the subscribers.
2. Introduce a Recipient List Router just after the transformer and specify the File-outbound, DB-outbound and JMS-outbound adapters as the recipients.
When it comes to Enterprise Integration Patterns, what is the best way to handle this scenario? Any new suggestions and improvements are welcome.
Thanks,
Keth

There is no "best way" - both solutions are equivalent and there is little difference at runtime. So it's your preference; I generally use pub/sub for the simple case and an RLR if the recipients are conditional (with selectors).

Related

Is it possible to make a Poller (or PollableMessageSource) to poll messages as List?

Following the example found in GitHub https://github.com/spring-cloud/spring-cloud-gcp/tree/master/spring-cloud-gcp-samples/spring-cloud-gcp-pubsub-polling-binder-sample regarding polling messages from a PubSub subscription, I was wondering...
Is it possible to make a PollableMessageSource retrieve List<Message<?>> instead of a single message per poll?
I've seen the @Poller annotation only being used on Source-typed objects, never on a Processor or Sink. Is it possible to use it in such a context, for example with @StreamListener or with a functional approach?
The PollableMessageSource binding and the Source stream applications are fully based on the Poller and MessageSource abstractions from Spring Integration, whose contract is to produce a single message to the configured channel. The point of messaging is really to process one message at a time without affecting the others; a failure for one message doesn't fail the others in the flow.
On the other hand, you probably mean for the GCP Pub/Sub messages to be produced as a list in the Spring message payload. That is indeed possible, but only via some custom code in the Pub/Sub consumer and the MessageSource implementation, and I would think twice before expecting batches from the source. You could instead utilize an aggregator to build small windows if your downstream logic is about processing a list. But again: it is still going to be a single Spring message.
It may be better to start thinking about a reactive function implementation, where you can indeed expect a Flux<Message<?>> as an input, and the Spring Cloud Stream framework will take care of emitting the data from Pub/Sub into the reactive stream for you.
See more info in docs: https://docs.spring.io/spring-cloud-stream/docs/3.1.0/reference/html/spring-cloud-stream.html#_reactive_functions_support
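As a rough illustration of that reactive approach (a sketch only; the binding name process, the window size and the timeout are assumptions for the example, not something prescribed by the framework):

import java.time.Duration;
import java.util.function.Consumer;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

import reactor.core.publisher.Flux;

@Configuration
public class BatchingConfig {

    private static final Logger log = LoggerFactory.getLogger(BatchingConfig.class);

    // Reactive consumer: the framework calls this function once with the whole stream;
    // we build small windows ourselves and subscribe to drive the pipeline.
    @Bean
    public Consumer<Flux<Message<?>>> process() {
        return flux -> flux
                .bufferTimeout(10, Duration.ofSeconds(1)) // up to 10 messages or 1 second
                .doOnNext(batch -> log.info("Received a window of {} messages", batch.size()))
                .subscribe();
    }
}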

Spring Cloud Stream @StreamListener and Spring Integration's Resequencer Pattern

AFAIK the Spring Cloud Stream project is based on Spring Integration. Hence I was wondering if there is a nice way to resequence a subset of inbound messages before the StreamListener handler is triggered? Or do I need to assemble the whole IntegrationFlow from scratch using XML or Java DSL config from Spring Integration?
My use case is as follows. Most of the time I process inbound messages on a Kafka topic as they come. However, a few events have to be resequenced based on CORRELATION_ID, SEQUENCE_NUMBER, and SEQUENCE_SIZE headers. In other words I'd like to keep using StreamListener as much as possible and simply plug in resequencing strategy for some events.
Yes, you would need to use Spring Integration for it. In fact Spring Cloud Stream is effectively a binding framework only. It binds message handlers to the message brokers via binders. The message handlers themselves are provided by the users.
The @StreamListener annotation is pretty much an equivalent of Spring Integration's @ServiceActivator, with a few extra features (e.g., conditional routing), but other than that it is just a message handler.
Now, as you alluded to, you can use Spring Integration (SI) to implement a message handler or an internal SI flow, and that is normal and recommended for complex cases.
That said, we do provide out-of-the-box apps that implement certain EIP components, and we do have, for example, an aggregator app which you can use as a starting point for implementing a resequencer. Furthermore, given that we have an aggregator app and not a resequencer, we would be glad to accept a contribution for it if you're interested.
I hope this answers your question.
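For instance, a rough sketch of plugging a resequencer in front of an existing handler (the Sink binding, the handler bean name myHandler and its handle method are assumptions for the example):

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
@EnableBinding(Sink.class)
public class ResequencingConfig {

    // Consumes from the binder's "input" channel, resequences messages carrying the
    // CORRELATION_ID/SEQUENCE_NUMBER/SEQUENCE_SIZE headers, then calls the existing handler.
    @Bean
    public IntegrationFlow resequencingFlow() {
        return IntegrationFlows.from(Sink.INPUT)
                .resequence(r -> r.releasePartialSequences(true))
                .handle("myHandler", "handle")
                .get();
    }
}

Such a flow would replace the @StreamListener for the events that need reordering; the rest can keep using @StreamListener as before.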

How to unit test Spring Integration DSL code

I was unable to find a simple example of unit testing Spring Integration Java DSL code which involves picking up a message from a queue and making a REST call.
I looked at the examples at https://github.com/spring-projects/spring-integration-java-dsl but it was not clear to me what qualifiers etc. are needed for the code below, for which I want to write a unit test.
IntegrationFlows.from(Jms.inboundGateway(connectionFactory)
            .id("inputChannel")
            .destination(sourceQueue)
            .jmsMessageConverter(new MarshallingMessageConverter(jaxbMarshaller())))
        // ...something to validate and route...
        .handle(Http.outboundGateway("http://localhost:9999/create")
            .httpMethod(HttpMethod.POST)
            .expectedResponseType(String.class))
        .get();
Your question needs a bit more detail to explain the requirements.
Anyway, I'll try to answer based on my best understanding of the matter.
The Spring Integration Java DSL is nothing more than a coding tool to wire beans and build integration components into flows. In the end, at runtime, we just have a set of beans with which we can interact as with any other beans in the application context.
So, if the story is about consuming some destination from JMS and verifying what we get from there, it is enough to run ActiveMQ in embedded mode - it is as simple as a bean for:
new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false")
Then you use a JmsTemplate to send some test data to the desired destination (it will be created on demand) and consume an Integration message from the channel defined in the IntegrationFlow mentioned in your question.
Typically, for consuming test data, we use a QueueChannel and its receive(long timeout) method. This way we block the unit test until data arrives or the timeout elapses.
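Putting those pieces together, a minimal test sketch might look like this (assuming an embedded vm:// ActiveMQ ConnectionFactory bean, that the flow listens on a "sourceQueue" destination, and that a QueueChannel named testOutputChannel is wired as the flow's output for the test; all of these names and the payload are placeholders):

import static org.junit.Assert.assertNotNull;

import javax.jms.ConnectionFactory;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.integration.channel.QueueChannel;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.messaging.Message;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class JmsFlowTests {

    @Autowired
    private ConnectionFactory connectionFactory; // the embedded vm://localhost factory

    @Autowired
    private QueueChannel testOutputChannel;      // output channel of the flow under test

    @Test
    public void messageFromQueueTravelsThroughTheFlow() {
        // send test data; the destination is created on demand by the embedded broker
        new JmsTemplate(this.connectionFactory).convertAndSend("sourceQueue", "some test payload");

        // block until the flow produces a message or the timeout elapses
        Message<?> received = this.testOutputChannel.receive(10000);
        assertNotNull(received);
    }
}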
Another way to verify how a flow works is with the Spring Integration Testing Framework. From there you can use MockIntegration to replace a real MessageHandler in the application context and verify the interaction with the mock afterwards.
Hope that helps a bit.

Spring Integration - JMS outbound adapter post-send database update

We previously had a Spring Integration flow (XML configuration-based) where we would do an update in a database after sending a message to a JMS queue. To achieve this, the SI flow was configured with a publish-subscribe queue channel as an input to a JMS Outbound Channel Adapter (order 0) and a Service Activator (order 1). The idea here being that after a successful JMS send, the service activator would be called, thus updating the data in the database.
We are now in the process of updating our flows to work with spring-integration:4.0.x APIs and wanted to use this opportunity to see if the described flow pattern is still a good/recommended way of doing a database update after a successful JMS send or if there is now a simpler/better way of achieving this? As a side note, our flows are now being implemented using spring-integration-java-dsl:1.0.0.M3 APIs.
Thanks in advance for any input on this,
PM.
publish-subscribe queue channel
There's no such thing as a pub-sub queue channel; by definition, it's a subscribable channel; so I assume that's what you mean.
It is one of the ways to do what you need, and it is perfectly fine; you can also achieve the same result with a RecipientListRouter. The DSL syntax is quite nice, especially with Java 8; see the SpringOne demo app for an example.
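For example, a sketch of that pattern in the Java DSL (using the current package layout; the channel, queue and bean names here are placeholders):

import javax.jms.ConnectionFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.jms.dsl.Jms;

@Configuration
public class JmsThenDbConfig {

    // With no executor on the publish-subscribe channel, subscribers are invoked in
    // subscription order, so the database update runs only after a successful JMS send.
    @Bean
    public IntegrationFlow jmsThenDbFlow(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from("transformedMessages")
                .publishSubscribeChannel(c -> c
                        .subscribe(f -> f.handle(Jms.outboundAdapter(connectionFactory)
                                .destination("outboundQueue")))
                        .subscribe(f -> f.handle("dbUpdateService", "markAsSent")))
                .get();
    }
}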

How to process message in order with spring integration

Does the spring integration framework have any features I can use to guarantee message order?
We are needing to run our spring integration flow on two different nodes to keep up with the message volume. I have seen several solutions for solving this issue, but I wanted to see if the framework already has something.
This article might better explain what I'm trying to ask.
http://sleeplessinslc.blogspot.com/2010/02/message-ordering-in-jms-using-coherence.html
Yes, it does. It's called the Resequencer.
See the Spring Integration Reference Manual: http://docs.spring.io/spring-integration/docs/latest-ga/reference/html/messaging-routing-chapter.html#resequencer.
Each message to be reordered has to have a correlationId header, to group it with the other messages, and a sequenceNumber header, to determine that message's position in the order in which the group is emitted.
With release-partial-sequences="true", the default SequenceSizeReleaseStrategy releases messages as soon as the next in-sequence messages are available, rather than waiting for the whole sequence to arrive.
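A short Java DSL sketch of the same idea (the channel names and the correlation key are placeholders; in XML the equivalent is the resequencer element carrying the release-partial-sequences attribute quoted above):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.Message;

@Configuration
public class OrderingConfig {

    // Messages arriving out of order on "inboundChannel" are re-emitted to
    // "orderedChannel" in sequenceNumber order, grouped by correlationId.
    @Bean
    public IntegrationFlow orderingFlow() {
        return IntegrationFlows.from("inboundChannel")
                .resequence(r -> r.releasePartialSequences(true))
                .channel("orderedChannel")
                .get();
    }

    // Example of how a producer would stamp the required headers on a message.
    public static Message<String> withSequenceHeaders(String payload, int sequenceNumber) {
        return MessageBuilder.withPayload(payload)
                .setCorrelationId("my-correlation-key")
                .setSequenceNumber(sequenceNumber)
                .setSequenceSize(3)
                .build();
    }
}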
