How to handle subflows that are packaged in separate JARs - spring-integration

I have four subflows, each sitting in its own application and packaged as a JAR file.
Each subflow is built as a Spring Boot application. Each subflow has an input channel and an output channel.
I would like the main flow to sit in its own Spring Boot application and call those four subflows sequentially.
Is this possible with Spring Integration?
If it is not possible, what would be the best approach that is?
I searched the internet, looked at How to handle subflows,
https://github.com/spring-projects/spring-integration-flow, and everything else I could find, but I am still not sure how to proceed.
https://github.com/spring-projects/spring-integration-flow states that to use a subflow I need to have this in my XML:
<int-flow:flow id="subflow1"/>
How do I tie this subflow1 to my subflow packaged in the separate JAR? Then how would I invoke these subflows from my main flow, which is a Spring Boot application? Do I send a message to the input channel of each subflow to start it, or do something else?
Thanks, David
Example of a subflow XML file. I removed irrelevant parts to shorten it and put ... in place of the removed content.
<int:channel id="createTwo"/>

<int:service-activator input-channel="createOne" output-channel="createTwo"
                       ref="automationUtilities" method="createTwo"/>
...
<int:service-activator input-channel="createFive"
                       ref="automationUtilities" method="createSix"/>

<bean id="automationUtilities" class="package.BeanName"/>
<bean id="validator" class="package.anotherBeanName"/>

<util:properties id="config" location="classpath:application.properties"/>

If all your applications are Spring Boot applications, then they are microservices, and each of them lives in its own JVM. Therefore just having input and output channels isn't enough, because you can't simply send a message from one JVM to another.
Since all of them are Spring Boot, consider adding some REST capabilities to them: at least a simple <int-http:inbound-gateway> to receive messages from the external world, and an <int-http:outbound-gateway> to perform REST requests from one application to another.
On the other hand, consider the Spring Cloud Stream project, which was created exactly for messaging-based microservice communication. It is based on the Binder concept, currently Kafka or RabbitMQ.
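As a rough sketch of the HTTP approach (the path, host, port, and the main-flow channel names below are assumptions; createOne is the subflow's input channel from the question's example):

```xml
<!-- In a subflow application: expose the flow's input channel over REST -->
<int-http:inbound-gateway path="/subflow1"
                          supported-methods="POST"
                          request-channel="createOne"/>

<!-- In the main-flow application: invoke the subflow and wait for its reply -->
<int-http:outbound-gateway url="http://subflow1-host:8081/subflow1"
                           http-method="POST"
                           request-channel="toSubflow1"
                           reply-channel="fromSubflow1"
                           expected-response-type="java.lang.String"/>
```

Chaining four such outbound gateways (the reply channel of one feeding the request channel of the next) gives you the sequential invocation of the subflows from the main flow.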

Related

Sleuth, Spring Integration and ThreadPoolExecutor: how to have one span per spawned thread?

There is an ExecutorChannel in my integration flow that will spawn one thread per message. The delegation chain looks like this:
ExecutorChannel (Spring Integration) -> BlockingExecutor (my own) -> ThreadPoolExecutor (vanilla Java)
Everything that happens in the Spring Integration part is of no interest to me. Ideally, I'd like to turn off Spring Integration tracing with spring.sleuth.integration.enabled: false and simply annotate the method that will eventually be called by the Spring Integration part with @NewSpan.
But when I disable Spring Integration tracing, the span appears only once, created by the main thread.
I've tried setting spring.sleuth.integration.enabled: true and excluding all but the relevant outbound channel via spring.sleuth.integration.patterns, but the result is the same: only the main thread's span appears in Zipkin.
Am I coming at this from the wrong angle? What would be the best way of doing this?
Tracing propagation is done in Sleuth by the TracingChannelInterceptor. So, if you disable it, it is no longer applied to channels. If you see issues, you need to apply it manually on the channels which switch threads.
You can do that by adding an interceptor to the channel directly, or via @GlobalChannelInterceptor with the respective pattern matching:
https://docs.spring.io/spring-integration/docs/current/reference/html/core.html#global-channel-configuration-interceptors
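In XML the global-interceptor variant is a <int:channel-interceptor> element; a sketch only, where the channel name is a placeholder and the interceptor bean class must be checked against your Sleuth version (how TracingChannelInterceptor is constructed differs between releases):

```xml
<!-- Apply a tracing interceptor only to the channel(s) that switch threads.
     "myExecutorChannel" and the bean definition below are placeholders. -->
<int:channel-interceptor pattern="myExecutorChannel" order="0">
    <bean class="org.springframework.cloud.sleuth.instrument.messaging.TracingChannelInterceptor"/>
</int:channel-interceptor>
```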

Simple spring integration xml config to read ftp file list and split it further

I am studying Spring Integration and want to write a simple app that retrieves the file list from an FTP server on a schedule and splits it across a few channels for parallel handling.
But I couldn't understand how to run it from an XML-configured scheduler, whether it will work as an outbound gateway, and what should be in the inbound1 channel (code section below).
I searched for such examples but failed; reading the reference docs,
I found the following:
<int-ftp:outbound-gateway id="gateway1"
        session-factory="ftpSessionFactory"
        request-channel="inbound1"
        command="ls"
        command-options="-1"
        expression="payload"
        reply-channel="toSplitter"/>

<int:channel id="inbound1"/>

<int:inbound-channel-adapter id="i_hope_it_start_run_on_app_start"
        channel="inbound1"
        auto-startup="true">
    <int:poller fixed-rate="2000" max-messages-per-poll="10"/>
</int:inbound-channel-adapter>
I expect a Spring Integration XML config with a scheduled run retrieving the file list from FTP.
Actually, you are going the right way: the <int-ftp:outbound-gateway> with the LS command indeed returns a list of files in the remote directory provided by expression="payload".
Your understanding of <int:inbound-channel-adapter> is also correct: with it you really initiate a task to be called every time the trigger fires.
What you need here is something like expression="'/YOUR_REMOTE_DIR'". The result of that expression is sent as a payload to channel="inbound1". That's how your remote directory becomes available for listing in the FTP gateway via the mentioned expression="payload".
I wouldn't use fixed-rate="2000", though, because there is no reason to poll the remote directory concurrently; fixed-delay should be considered instead. Also, max-messages-per-poll="10" doesn't bring value here either: you would just send the /YOUR_REMOTE_DIR message 10 times within a single polling task. Configure it to 1, which is the default for <int:inbound-channel-adapter>.
Plus, with such a polling logic you will realize that you get the same list of files in toSplitter every time. I would guess that is not what you expect, and your goal is really to poll only new files. For this purpose you should consider the Idempotent Receiver approach to filter out files you have already processed: https://docs.spring.io/spring-integration/docs/current/reference/html/messaging-endpoints-chapter.html#idempotent-receiver
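Putting that advice together, the trigger adapter from the question could be sketched like this (the directory and the id are placeholders):

```xml
<int:inbound-channel-adapter id="ftpListTrigger"
        channel="inbound1"
        expression="'/YOUR_REMOTE_DIR'"
        auto-startup="true">
    <!-- fixed-delay instead of fixed-rate; one message per poll (the default) -->
    <int:poller fixed-delay="2000" max-messages-per-poll="1"/>
</int:inbound-channel-adapter>
```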

Spring Cloud Stream #StreamListener and Spring Integration's Resequencer Pattern

AFAIK the Spring Cloud Stream project is based on Spring Integration. Hence I was wondering if there is a nice way to resequence a subset of inbound messages before the @StreamListener handler is triggered? Or do I need to assemble the whole IntegrationFlow from scratch using XML or Java DSL config from Spring Integration?
My use case is as follows: most of the time I process inbound messages on a Kafka topic as they come. However, a few events have to be resequenced based on the CORRELATION_ID, SEQUENCE_NUMBER, and SEQUENCE_SIZE headers. In other words, I'd like to keep using @StreamListener as much as possible and simply plug in a resequencing strategy for some events.
Yes, you would need to use Spring Integration for it. In fact, Spring Cloud Stream is effectively only a binding framework: it binds message handlers to message brokers via binders. The message handlers themselves are provided by the users.
The @StreamListener annotation is pretty much an equivalent of Spring Integration's @ServiceActivator with a few extra features (e.g., conditional routing), but other than that it is just a message handler.
Now, as you alluded to, you can use Spring Integration (SI) to implement a message handler or an internal SI flow, and that is normal and recommended for complex cases.
That said, we do provide out-of-the-box apps that implement certain EIP components, and we do have, for example, an aggregator app which you can use as a starting point for implementing a resequencer. Furthermore, given that we have an aggregator app and not a resequencer, we would be glad to accept a contribution for it if you're interested.
I hope this answers your question.
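If you do drop down to Spring Integration, the resequencing step itself is a single endpoint; a minimal sketch (the channel names are placeholders), which by default correlates and orders on the standard CORRELATION_ID / SEQUENCE_NUMBER / SEQUENCE_SIZE headers mentioned in the question:

```xml
<!-- Buffers out-of-order messages and releases them in sequence order -->
<int:resequencer input-channel="unordered"
                 output-channel="ordered"
                 release-partial-sequences="false"/>
```

The @StreamListener handler would then consume from the ordered channel's side of the flow.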

Spring Batch Partitioned Job Using Durable Subscriber

We are using Spring Batch and partitioned jobs in a 10-server JBoss EAP 5.2 cluster. Because of a problem in JBoss Messaging, we needed to use a topic for the reply messages from the partitioned steps.
All had been working fine until we started seeing JBoss Messaging glitches (on the server that launches the job) that drop that server from the cluster. It recovers, but the main partition does not pick up the messages sent from the partition steps. I can see the messages in the topic in the JMX-Console, but I also see that the subscription and the messages are non-durable.
Therefore I would like to make the partition-step reply communication a durable subscription. I can't seem to find a documented way to do this. Below is my current configuration of the partitioned step and the associated beans.
Inbound Gateway Configuration
<int:channel id="springbatch.slave.jms.request"/>
<int:channel id="springbatch.slave.jms.response"/>

<int-jms:inbound-gateway
        id="springbatch.master.inbound.gateway"
        connection-factory="springbatch.listener.jmsConnectionFactory"
        request-channel="springbatch.slave.jms.request"
        request-destination="springbatch.partition.jms.requestsQueue"
        reply-channel="springbatch.slave.jms.response"
        concurrent-consumers="${springbatch.partition.concurrent.consumers}"
        max-concurrent-consumers="${springbatch.partition.concurrent.maxconsumers}"
        max-messages-per-task="${springbatch.partition.concurrent.maxmessagespertask}"
        reply-time-to-live="${springbatch.partition.reply.time.to.live}"/>
Outbound Gateway Configuration
<int:channel id="jms.requests">
    <int:dispatcher task-executor="springbatch.partitioned.jms.taskExecutor"/>
</int:channel>

<int:channel id="jms.reply"/>

<int-jms:outbound-gateway id="outbound-gateway"
        auto-startup="false"
        connection-factory="springbatch.jmsConnectionFactory"
        request-channel="jms.requests"
        request-destination="springbatch.partition.jms.requestsQueue"
        reply-channel="jms.reply"
        reply-destination="springbatch.partition.jms.repliesQueue"
        correlation-key="JMSCorrelationID">
    <int-jms:reply-listener/>
</int-jms:outbound-gateway>
Further to Michael's comment: there is currently no way to configure a topic for the <reply-listener/>. It's rather unusual to use a topic in a request/reply scenario, and we didn't anticipate that requirement.
Feel free to open a JIRA issue against Spring Integration.
An alternative would be to wire in an outbound-channel-adapter for the requests and an inbound-channel-adapter for the replies. However, some special handling of the replyChannel header is needed when doing that; see the docs for more information.
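A sketch of that adapter-based alternative, reusing the bean names from the question; the topic destination bean and subscription name are assumptions, and the required special handling of the replyChannel header is not shown:

```xml
<!-- Requests: one-way adapter instead of the outbound gateway -->
<int-jms:outbound-channel-adapter channel="jms.requests"
        connection-factory="springbatch.jmsConnectionFactory"
        destination="springbatch.partition.jms.requestsQueue"/>

<!-- Replies: message-driven adapter with a durable topic subscription -->
<int-jms:message-driven-channel-adapter channel="jms.reply"
        connection-factory="springbatch.jmsConnectionFactory"
        destination="springbatch.partition.jms.repliesTopic"
        pub-sub-domain="true"
        subscription-durable="true"
        subscription-name="partitionReplies"/>
```

The durable subscription is what survives the broker glitch: replies published while the launching server is out of the cluster are retained until the subscriber reconnects.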

Programmatically create message-driven-channel-adapter to process messages on a queue

I would like to process messages programmatically using a message-driven-channel-adapter. Here is the scenario I have to implement:
During startup my application reads its configuration from a service. The configuration provides information about the queues that will contain the messages. Hence I would like to create a message-driven-channel-adapter for each queue to listen for messages asynchronously.
Any example which initializes the whole Spring Integration context programmatically instead of using XML would be useful.
If you are going to do everything programmatically, I'd suggest you bypass the Spring Integration magic and just use DefaultMessageListenerContainer directly.
Afterwards you can send messages to an existing MessageChannel directly from the MessageListener implementation, or via a Messaging Gateway.
Please be careful with programmatic configuration: do not miss important attributes like the ApplicationContext or the invocation of afterPropertiesSet().
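A minimal Java sketch of that approach; connectionFactory, queueNames, and inputChannel are placeholders assumed to come from your configuration service and existing Spring Integration context:

```java
// Sketch: one DefaultMessageListenerContainer per queue read from the config service.
for (String queueName : queueNames) {
    DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
    container.setConnectionFactory(connectionFactory);
    container.setDestinationName(queueName);
    // Forward each JMS message into an existing Spring Integration channel
    container.setMessageListener((SessionAwareMessageListener<TextMessage>) (message, session) ->
            inputChannel.send(MessageBuilder.withPayload(message.getText()).build()));
    container.afterPropertiesSet(); // must be called explicitly outside the container
    container.start();
}
```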
