I have a JdbcPollingChannelAdapter defined as follows:
@Bean
public MessageSource<Object> jdbcMessageSource(DataSource dataSource) {
    JdbcPollingChannelAdapter jdbcPollingChannelAdapter = new JdbcPollingChannelAdapter(dataSource,
            "SELECT * FROM common_task where due_at <= NOW() and retries < order by due_at ASC FOR UPDATE SKIP LOCKED");
    jdbcPollingChannelAdapter.setMaxRowsPerPoll(1);
    jdbcPollingChannelAdapter.setUpdateSql("Update common_task set retries = :retries, due_at = due_at + interval '10 minutes' WHERE ID = (:id)");
    jdbcPollingChannelAdapter.setUpdatePerRow(true);
    jdbcPollingChannelAdapter.setRowMapper(this::mapRow);
    jdbcPollingChannelAdapter.setUpdateSqlParameterSourceFactory(this::updateParamSource);
    return jdbcPollingChannelAdapter;
}
The integration flow for this:
@Bean
public IntegrationFlow pollingFlow(MessageSource<Object> jdbcMessageSource) {
    return IntegrationFlows.from(jdbcMessageSource,
                    c -> c.poller(Pollers.fixedRate(250, TimeUnit.MILLISECONDS)
                            .maxMessagesPerPoll(1)
                            .transactional()))
            .split()
            .channel(taskSourceChannel())
            .get();
}
The service activator is defined as
@ServiceActivator(inputChannel = "taskSourceChannel")
public void doSomething(FooTask event) {
    // do something, but **not** within the transaction of the poller.
}
The poller in the integration flow is defined as transactional. Based on my understanding, this will:
1. Execute the select query and the update query in a transaction.
2. Also execute the doSomething() method in the same transaction.
Goal: I would like to do 1 but not 2. I would like to run the select and the update in a transaction to make sure both happen, but I don't want to execute doSomething() in the same transaction. In case of an exception in doSomething(), I still want to persist the updates made during polling. How can I achieve this?
This is done via simple thread shifting: what you need is to leave the polling thread, let it commit the transaction, and continue processing in a separate thread.
Given your logic with the .split(), it is even better to hand off to a new thread after splitting, so the items are processed by that doSomething() in parallel.
The goal can simply be achieved with an ExecutorChannel. Since you already have that taskSourceChannel(), just consider replacing it with an ExecutorChannel based on some managed ThreadPoolTaskExecutor.
See more info in the Reference Manual: https://docs.spring.io/spring-integration/reference/html/messaging-channels-section.html#channel-configuration-executorchannel
And its Javadocs.
The simple Java Configuration variant is like this:
@Bean
public MessageChannel taskSourceChannel() {
    return new ExecutorChannel(executor());
}

@Bean
public Executor executor() {
    return new ThreadPoolTaskExecutor();
}
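Note that a bare ThreadPoolTaskExecutor defaults to a core pool size of 1 with an unbounded queue, so it hands work to a single extra thread: enough to leave the poller's transaction, but not parallel. If you want the split items processed concurrently, size the pool; a minimal sketch (the sizes and thread-name prefix here are illustrative, not from the original answer):

@Bean
public ThreadPoolTaskExecutor executor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    // Illustrative sizing: tune to your workload
    executor.setCorePoolSize(4);
    executor.setMaxPoolSize(8);
    executor.setQueueCapacity(100);
    executor.setThreadNamePrefix("task-");
    return executor;
}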
Related
I have the following Spring Integration flow:
It gathers records from one database, converts them to JSON, and sends them to another database.
The idea is to have 10 pollers (channel 0 to 9), each one a pollingFlowChanN bean. But I suspect they are all sharing the same thread.
How do I make the polling multi-threaded in this scenario?
private IntegrationFlow getChannelPoller(final int channel, final int pollSize, final long delay) {
    return IntegrationFlows.from(jdbcMessageSource(channel, pollSize), c -> c.poller(Pollers.fixedDelay(delay)
                    .transactional(transactionManager)))
            .split()
            .handle(intControleToJson())
            .handle(pgsqlSink)
            .get();
}
@Bean
public IntegrationFlow pollingFlowChan0() {
    return getChannelPoller(0, properties.getChan0PollSize(), properties.getChan0Delay());
}

@Bean
public IntegrationFlow pollingFlowChan1() {
    return getChannelPoller(1, properties.getChan1PollSize(), properties.getChan1Delay());
}
....
I assume you use the latest Spring Boot, which has a TaskScheduler auto-configured with one thread: https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#features.spring-integration. That is the best guess as to why your tasks use the same thread.
See also answer here: Why does the SFTP Outbound Gateway not start working as soon as I start its Integration Flow?
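If the goal is to let each poller run on its own thread, one option (a sketch, not from the original answer) is to provide a bigger scheduler yourself. Spring Integration picks up the TaskScheduler bean named taskScheduler for its pollers, and recent Spring Boot versions also expose the spring.task.scheduling.pool.size property for the auto-configured one. The pool size and thread-name prefix below are illustrative:

@Bean
public TaskScheduler taskScheduler() {
    ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
    scheduler.setPoolSize(10); // e.g. one thread per polling flow
    scheduler.setThreadNamePrefix("poller-");
    return scheduler;
}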
I have this direct channel:
@Bean
public DirectChannel emailingChannel() {
    return MessageChannels
            .direct("emailingChannel")
            .get();
}
Can I define multiple flows for the same channel like this:
@Bean
public IntegrationFlow flow1FromEmailingChannel() {
    return IntegrationFlows.from("emailingChannel")
            .handle("myService", "handler1")
            .get();
}

@Bean
public IntegrationFlow flow2FromEmailingChannel() {
    return IntegrationFlows.from("emailingChannel")
            .handle("myService", "handler2")
            .get();
}
EDIT
@Service
public class MyService {

    public void handler1(Message<String> message) {
        ....
    }

    public void handler2(Message<List<String>> message) {
        ....
    }

}
Each flow's handle(...) method manipulates a different payload data type, but the goal is the same: read data from the channel and call the relevant handler. I would like to avoid many if...else checks of the data type in a single handler.
Additional question: what happens when multiple threads call the same channel (no matter its type: Direct, PubSub or Queue) at the same time (since by default a @Bean has singleton scope)?
Thanks a lot
With a direct channel, messages will be distributed round-robin to the consumers.
With a queue channel, only one consumer will get each message; the distribution will be based on their respective pollers.
With a pub/sub channel, both consumers will get each message.
You need to provide more information but it sounds like you need to add a payload type router to your flow to direct the messages to the right consumer.
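For example, a router keyed on the payload type could dispatch to the two handlers from a single flow (replacing the two separate flows). This is just a sketch; the flow bean name and the instanceof check are illustrative, assuming the MyService bean from the question:

@Bean
public IntegrationFlow routeByPayloadTypeFlow() {
    return IntegrationFlows.from("emailingChannel")
            .<Object, Boolean>route(p -> p instanceof List, mapping -> mapping
                    // List payloads go to handler2, everything else to handler1
                    .subFlowMapping(true, sf -> sf.handle("myService", "handler2"))
                    .subFlowMapping(false, sf -> sf.handle("myService", "handler1")))
            .get();
}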
EDIT
When the handler methods are in the same class you don't need two flows; the framework will examine the methods and, as long as there is no ambiguity, will call the method that matches the payload type.
.handle(myServiceBean())
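In other words, a single flow can hand the whole bean to .handle() and let the framework pick the matching method; a sketch assuming the MyService bean from the question (the flow bean name is made up):

@Bean
public IntegrationFlow emailingFlow(MyService myService) {
    return IntegrationFlows.from("emailingChannel")
            .handle(myService) // handler1 or handler2 is chosen by payload type
            .get();
}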
The last element in the following DSL flow is a service activator (the .handle method).
Is there a default output direct channel to which I can subscribe here? If I understand things correctly, the output channel must be present.
I know I can add .channel("name") at the end, but the question is: what happens if it is not written explicitly?
Here is the code:
@SpringBootApplication
@IntegrationComponentScan
public class QueueChannelResearch {

    @Bean
    public IntegrationFlow lambdaFlow() {
        return f -> f.channel(c -> c.queue(50))
                .handle(System.out::println);
    }

    public static void main(String[] args) {
        ConfigurableApplicationContext ctx = SpringApplication.run(QueueChannelResearch.class, args);
        MessageChannel inputChannel = ctx.getBean("lambdaFlow.input", MessageChannel.class);
        for (int i = 0; i < 1000; i++) {
            inputChannel.send(MessageBuilder.withPayload("w" + i)
                    .build());
        }
        ctx.close();
    }

}
Another question is about the QueueChannel. The program hangs if I comment out handle() and completes if I uncomment it. Does that mean that handle() adds a default Poller before it?
return f -> f.channel(c -> c.queue(50));
// .handle(System.out::println);
No, that doesn't work that way.
Just recall that an integration flow is a pipes-and-filters architecture: the result of the current step is sent to the next one. Since you use .handle(System.out::println), there is no output from that println() method call, so there is nothing from which to build a Message to send to the next channel, if any. So the flow stops here. A void return type or a null return value is a signal for the service activator to stop the flow. Consider your .handle(System.out::println) the equivalent of an <outbound-channel-adapter> in the XML configuration.
And yes: there are no default channels, unless you define one via the replyChannel header in advance. But again: your service method must return a value.
The output from a service activator is optional; that's why we didn't introduce an extra operator for the Outbound Channel Adapter.
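For comparison, here is a sketch where the handler does return a value, so the flow continues (the flow and channel names are illustrative):

@Bean
public IntegrationFlow continuingFlow() {
    return f -> f.<String>handle((payload, headers) -> payload.toUpperCase()) // non-void result
            .channel("upperCasedChannel"); // the returned value travels on as the next message
}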
The question about the QueueChannel would be better handled in a separate SO thread. There is no default poller unless you declare one as PollerMetadata.DEFAULT_POLLER. You might be using some library which declares that one for you.
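If you do want a polling consumer (for example on that QueueChannel) to work without an explicit poller on every endpoint, you can declare a global default poller; a minimal sketch in the usual Java DSL configuration style:

@Bean(name = PollerMetadata.DEFAULT_POLLER)
public PollerMetadata defaultPoller() {
    return Pollers.fixedDelay(100).get();
}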
I have a JdbcPollingChannelAdapter which reads data via JDBC. I want to make it poll manually (using a commandChannel). It should never poll automatically, and it should run immediately when I trigger a manual poll.
Below I am using a poller which runs every 24 hours just to get the channel running at all. I cannot use a cron expression that never fires (as in Quartz: Cron expression that will never execute), since Pollers.cronExpression() takes no year field.
@Bean
public MessageSource<Object> jdbcMessageSource() {
    return new JdbcPollingChannelAdapter(this.dataSource, "SELECT...");
}

@Bean
public IntegrationFlow jdbcFlow() {
    return IntegrationFlows
            .from(jdbcMessageSource(),
                    spec -> spec.poller(
                            Pollers.fixedRate(24, TimeUnit.HOURS)))
            .handle(System.out::println)
            .get();
}
Well, you are going the right way with the JdbcPollingChannelAdapter and commandChannel, but you don't have to configure a SourcePollingChannelAdapter as you do with that IntegrationFlows.from(jdbcMessageSource()).
What you really need is the jdbcMessageSource(), but to poll it manually you should configure a command-based flow:
@Bean
public IntegrationFlow jdbcFlow() {
    return IntegrationFlows
            .from("commandChannel")
            .handle(jdbcMessageSource(), "receive")
            .handle(System.out::println)
            .get();
}
It is exactly that receive() method which the SourcePollingChannelAdapter would otherwise call on a timing basis.
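Triggering the poll on demand is then just a matter of sending any message to commandChannel; a sketch (the payload is irrelevant because receive() takes no arguments, and the injection point is assumed to live in some Spring-managed component):

@Autowired
@Qualifier("commandChannel")
private MessageChannel commandChannel;

public void pollNow() {
    // any payload will do: the flow calls jdbcMessageSource().receive() regardless
    this.commandChannel.send(new GenericMessage<>("runQuery"));
}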
I need to get a file daily via SFTP. I would like to use Spring Integration with Java config. The file is generally available at a specific time each day. The application should try to get the file near that time each day. If the file is not available, it should continue to retry for x attempts. After x attempts, it should send an email to let the admin know that the file is still not available on the SFTP site.
One option is to use SftpInboundFileSynchronizingMessageSource. In the MessageHandler, I can kick off a job to process the file. However, I really don't need synchronization with the remote file system; after all, it is a scheduled delivery of the file. Plus, I need to delay at most 15 minutes before the next retry, and polling every 15 minutes seems a bit of overkill for a daily file. I guess that I could use this, but I would need some mechanism to send an email after a certain time has elapsed and no file has been received.
The other option seems to be using the get command of the SFTP Outbound Gateway, but the only examples I can find use XML config.
Update
Adding code after using help provided by Artem Bilan's answer below:
Configuration class:
@Bean
@InboundChannelAdapter(autoStartup = "true", channel = "sftpChannel", poller = @Poller("pollerMetadata"))
public SftpInboundFileSynchronizingMessageSource sftpMessageSource(ApplicationProperties applicationProperties, PropertiesPersistingMetadataStore store) {
    SftpInboundFileSynchronizingMessageSource source =
            new SftpInboundFileSynchronizingMessageSource(sftpInboundFileSynchronizer(applicationProperties));
    source.setLocalDirectory(new File("ftp-inbound"));
    source.setAutoCreateLocalDirectory(true);
    FileSystemPersistentAcceptOnceFileListFilter local = new FileSystemPersistentAcceptOnceFileListFilter(store, "test");
    source.setLocalFilter(local);
    source.setCountsEnabled(true);
    return source;
}

@Bean
public PollerMetadata pollerMetadata() {
    PollerMetadata pollerMetadata = new PollerMetadata();
    List<Advice> adviceChain = new ArrayList<Advice>();
    adviceChain.add(retryCompoundTriggerAdvice());
    pollerMetadata.setAdviceChain(adviceChain);
    pollerMetadata.setTrigger(compoundTrigger());
    return pollerMetadata;
}

@Bean
public RetryCompoundTriggerAdvice retryCompoundTriggerAdvice() {
    return new RetryCompoundTriggerAdvice(compoundTrigger(), secondaryTrigger());
}

@Bean
public CompoundTrigger compoundTrigger() {
    CompoundTrigger compoundTrigger = new CompoundTrigger(primaryTrigger());
    return compoundTrigger;
}

@Bean
public Trigger primaryTrigger() {
    return new CronTrigger("*/60 * * * * *");
}

@Bean
public Trigger secondaryTrigger() {
    return new PeriodicTrigger(10000);
}

@Bean
@ServiceActivator(inputChannel = "sftpChannel")
public MessageHandler handler(PropertiesPersistingMetadataStore store) {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            System.out.println(message.getPayload());
            store.flush();
        }

    };
}
RetryCompoundTriggerAdvice class:
public class RetryCompoundTriggerAdvice extends AbstractMessageSourceAdvice {

    private final CompoundTrigger compoundTrigger;

    private final Trigger override;

    private int count = 0;

    public RetryCompoundTriggerAdvice(CompoundTrigger compoundTrigger, Trigger overrideTrigger) {
        Assert.notNull(compoundTrigger, "'compoundTrigger' cannot be null");
        this.compoundTrigger = compoundTrigger;
        this.override = overrideTrigger;
    }

    @Override
    public boolean beforeReceive(MessageSource<?> source) {
        return true;
    }

    @Override
    public Message<?> afterReceive(Message<?> result, MessageSource<?> source) {
        if (result == null && count <= 5) {
            count++;
            this.compoundTrigger.setOverride(this.override);
        }
        else {
            this.compoundTrigger.setOverride(null);
            if (count > 5) {
                // send email
            }
            count = 0;
        }
        return result;
    }

}
Since Spring Integration 4.3 there is CompoundTrigger:
* A {@link Trigger} that delegates the {@link #nextExecutionTime(TriggerContext)}
* to one of two Triggers. If the {@link #setOverride(Trigger) override} trigger is
* {@code null}, the primary trigger is invoked; otherwise the override trigger is
* invoked.
With the combination of CompoundTriggerAdvice:
* An {@link AbstractMessageSourceAdvice} that uses a {@link CompoundTrigger} to adjust
* the poller - when a message is present, the compound trigger's primary trigger is
* used to determine the next poll. When no message is present, the override trigger is
* used.
it can be used to achieve your goal:
The primaryTrigger can be a CronTrigger to run the task only once a day.
The override could be a PeriodicTrigger with desired short period to retry.
The retry logic can be implemented with one more Advice for the poller, or by just extending that CompoundTriggerAdvice to add count logic and eventually send an email.
Since there is no file, there is no message to kick off the flow, and we have no choice but to dance around the poller infrastructure.