Spring Integration: use PollableChannel simultaneously

In my project, I have the following List of AccountWithFiles objects.
@AllArgsConstructor
@Getter
class AccountWithFiles {
    private String account;
    private List<S3FileInfo> s3FileInfoList;
}
I want to process each AccountWithFiles separately, on its own thread. Within each account I then want to split s3FileInfoList with split() and process the files one by one with a 20-minute delay between them, while the accounts themselves run in parallel.
So I have the following DSL definition:
@Bean
public IntegrationFlow s3DownloadFlowEnhanced() {
    return IntegrationFlows.fromSupplier(s3FileInfoRepository::findAllGroupByAccount,
                    c -> c.poller(Pollers.cron(time, TimeZone.getTimeZone(zone))).id("s3TEMPO"))
            .channel("manualS3EnhancedFlow")
            .split()
            .channel("myChannel")
            .get();
}
s3FileInfoRepository::findAllGroupByAccount returns the list of AccountWithFiles objects; I split it and send the items to a publish-subscribe channel backed by an executor (with a defined number of threads):
@Bean
public MessageChannel myChannel() {
    return MessageChannels.publishSubscribe(Executors.newFixedThreadPool(10)).get();
}
After that:
@Bean
public IntegrationFlow processEachAccountSeparately() {
    return IntegrationFlows.from("myChannel")
            .<AccountWithFiles, Object>transform(m -> m.getS3FileInfoList().stream()
                    .sorted(Comparator.comparing(i -> i.getOperationType() == FILE_OPERATION_TYPE.ADD))
                    .collect(Collectors.toList()))
            .log()
            //.resequence()
            .split()
            .channel("bookItemsChannel")
            .get();
}
@Bean
public PollableChannel bookItemsChannel() {
    return new QueueChannel();
}
@Bean
public IntegrationFlow test() {
    return IntegrationFlows.from("bookItemsChannel")
            .delay("delayer.messageGroupId", d -> d
                    .defaultDelay(25000L)
                    .delayExpression("headers['delay']"))
            .log()
            .get();
}
@Bean(name = PollerMetadata.DEFAULT_POLLER)
public PollerMetadata defaultPoller() {
    PollerMetadata pollerMetadata = new PollerMetadata();
    ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
    taskExecutor.setCorePoolSize(5);
    taskExecutor.initialize();
    pollerMetadata.setTaskExecutor(taskExecutor);
    pollerMetadata.setTrigger(new PeriodicTrigger(15000L));
    pollerMetadata.setMaxMessagesPerPoll(3);
    return pollerMetadata;
}
When messages are received by the pollable channel they are processed one by one with the delay. I want the messages to be processed one by one per account, yet in parallel across accounts, following the splitter in the s3DownloadFlowEnhanced flow.
I know that a pollable channel hands a message from the sender to a receiver on a different thread. Maybe there is a workaround here?
In the processEachAccountSeparately flow I can see that each account has its own thread:
2021-06-24 15:33:34.585 INFO 56174 --- [pool-4-thread-1] o.s.integration.handler.LoggingHandler : GenericMessage [payload=[S3FileInfo(fileName=sdfsdf, timeStamp=null, serviceName=null, accountLogin=login2, operationType=ADD)], headers={sequenceNumber=1, correlationId=36b132ac-7c5b-af66-96c0-2334a757c960, id=3bafdaa7-ed4c-087f-5f2c-cc114eae42cd, sequenceSize=2, timestamp=1624538014577}]
2021-06-24 15:33:34.585 INFO 56174 --- [pool-4-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=[S3FileInfo(fileName=sdfjsjfj, timeStamp=null, serviceName=null, accountLogin=login1, operationType=DELETE), S3FileInfo(fileName=s3/outgoing/file2, timeStamp=null, serviceName=IPASS, accountLogin=login1, operationType=DELETE), S3FileInfo(fileName=outgoing/s3/ipass.xlsx, timeStamp=null, serviceName=IPASS, accountLogin=login1, operationType=ADD), S3FileInfo(fileName=dsfsdf, timeStamp=null, serviceName=null, accountLogin=login1, operationType=ADD)], headers={sequenceNumber=2, correlationId=36b132ac-7c5b-af66-96c0-2334a757c960, id=d8506721-2cfd-b6da-d353-4fb8bd5744fb, sequenceSize=2, timestamp=1624538014577}]
However, the PollableChannel processes them one by one:
2021-06-24 15:33:46.328 INFO 56174 --- [lTaskExecutor-3] o.s.integration.handler.LoggingHandler : GenericMessage [payload=S3FileInfo(fileName=sdfjsjfj, timeStamp=null, serviceName=null, accountLogin=login1, operationType=DELETE), headers={sequenceNumber=1, sequenceDetails=[[36b132ac-7c5b-af66-96c0-2334a757c960, 2, 2]], correlationId=d8506721-2cfd-b6da-d353-4fb8bd5744fb, id=7f8bd9a6-25ce-0bb2-c3f3-581d823d8fce, sequenceSize=4, timestamp=1624538014585}]
2021-06-24 15:33:46.329 INFO 56174 --- [lTaskExecutor-3] o.s.integration.handler.LoggingHandler : GenericMessage [payload=S3FileInfo(fileName=sdfsdf, timeStamp=null, serviceName=null, accountLogin=login2, operationType=ADD), headers={sequenceNumber=1, sequenceDetails=[[36b132ac-7c5b-af66-96c0-2334a757c960, 1, 2]], correlationId=3bafdaa7-ed4c-087f-5f2c-cc114eae42cd, id=f697b52b-1053-51aa-232f-88bb602dc1c9, sequenceSize=1, timestamp=1624538014585}]
2021-06-24 15:33:46.329 INFO 56174 --- [lTaskExecutor-3] o.s.integration.handler.LoggingHandler : GenericMessage [payload=S3FileInfo(fileName=s3/outgoing/file2, timeStamp=null, serviceName=IPASS, accountLogin=login1, operationType=DELETE), headers={sequenceNumber=2, sequenceDetails=[[36b132ac-7c5b-af66-96c0-2334a757c960, 2, 2]], correlationId=d8506721-2cfd-b6da-d353-4fb8bd5744fb, id=a6754c98-fce0-f132-664a-65d61f553ae2, sequenceSize=4, timestamp=1624538014585}]
2021-06-24 15:34:01.333 INFO 56174 --- [lTaskExecutor-4] o.s.integration.handler.LoggingHandler : GenericMessage [payload=S3FileInfo(fileName=outgoing/s3/ipass.xlsx, timeStamp=null, serviceName=IPASS, accountLogin=login1, operationType=ADD), headers={sequenceNumber=3, sequenceDetails=[[36b132ac-7c5b-af66-96c0-2334a757c960, 2, 2]], correlationId=d8506721-2cfd-b6da-d353-4fb8bd5744fb, id=71fa915a-fcaa-3d00-023b-5cf51be3b183, sequenceSize=4, timestamp=1624538014585}]
2021-06-24 15:34:01.333 INFO 56174 --- [lTaskExecutor-4] o.s.integration.handler.LoggingHandler : GenericMessage [payload=S3FileInfo(fileName=dsfsdf, timeStamp=null, serviceName=null, accountLogin=login1, operationType=ADD), headers={sequenceNumber=4, sequenceDetails=[[36b132ac-7c5b-af66-96c0-2334a757c960, 2, 2]], correlationId=d8506721-2cfd-b6da-d353-4fb8bd5744fb, id=7c513e23-5484-4f61-b7d3-362648c7b89c, sequenceSize=4, timestamp=1624538014585}]
What I want is to have something like this:
[pool-4-thread-1] simultaneously
[pool-4-thread-2] simultaneously
[pool-4-thread-2] 20 min delay
[pool-4-thread-2] 20 min delay
[pool-4-thread-2] 20 min delay

Your first step, splitting the list of AccountWithFiles, is correct. Then you say that you'd like to split the s3FileInfoList and process its items sequentially, one by one, but why do you place them into a QueueChannel? The regular, default DirectChannel would be enough in this case.
However, you then go to delay(), which does not block the current thread but schedules a task for the future on a separate thread. So you probably need to rethink your solution: with the current approach, even if you get rid of that bookItemsChannel as a queue, you still are not going to process the delayed messages sequentially. There is simply no guarantee from the TaskScheduler as to which task runs first when they all have the same scheduled time.
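Not part of the answer above, but to make the "blocking vs. scheduling" point concrete: if a hard pause between one account's files is acceptable, a minimal sketch (assuming bookItemsChannel is switched to a plain DirectChannel and the rest of the question's configuration stays as-is; perFileFlow is a hypothetical replacement for the test() flow) is to block the per-account executor thread instead of using delay():
@Bean
public IntegrationFlow perFileFlow() {
    return IntegrationFlows.from("bookItemsChannel") // now a plain DirectChannel
            .handle((payload, headers) -> {
                try {
                    TimeUnit.MINUTES.sleep(20); // blocks only this account's thread from myChannel's pool
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                return payload;
            })
            .log()
            .get();
}
Each account then waits between its own files while the other accounts keep running on their own threads from the fixed pool of ten.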

Related

Spring Integration TCP Inbound Adapter client disconnect issue

We have created an IntegrationFlow for an inbound TCP server. Whenever we test this, the integration is fine until the client disconnects; then we get a continuous flow of empty messages.
Spring Integration 5.2.5.RELEASE
Flow:
@Bean
public IntegrationFlow inboundFlow(MyService myService) {
    return IntegrationFlows.from(
                    Tcp.inboundAdapter(
                            Tcp.netServer(12345)
                                    .deserializer(TcpCodecs.lengthHeader4())))
            .transform(Transformers.objectToString())
            .log(INFO) // Logging only for testing
            .handle(myService, "process") // Do some processing of the message
            .get();
}
Test method:
public void sendPackets() throws Exception {
    List<Path> files = getAllFilesSorted("/some/location");
    try (Socket socket = new Socket("localhost", 12345)) {
        log.info("local port: {}", socket.getLocalPort());
        try (OutputStream outputStream = socket.getOutputStream()) {
            for (int i = 0; i < 10; i++) {
                Path path = files.get(i);
                log.info("{} ({} of {})", path.toString(), i, files.size());
                byte[] bytes = Files.readAllBytes(path);
                outputStream.write(bytes);
                outputStream.flush();
                Thread.sleep(200);
            }
        }
    }
}
Partial Log after test completes:
2020-05-05 11:00:52.963 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=813ccf6b-9eb7-2ff3-0f8d-f2810966d8d7, ip_hostname=localhost, timestamp=1588672852963}]
2020-05-05 11:00:52.963 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=1fd5c4c7-b2e0-6a89-7de6-894c8e42e242, ip_hostname=localhost, timestamp=1588672852963}]
2020-05-05 11:00:52.963 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=78b852e8-0094-f8b9-8d05-37bc4912d096, ip_hostname=localhost, timestamp=1588672852963}]
2020-05-05 11:00:52.963 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=4e4a7ee0-c66e-9501-9ef7-4d25143531b3, ip_hostname=localhost, timestamp=1588672852963}]
2020-05-05 11:00:52.963 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=c90abe17-7036-94cc-af59-3b0cac5e75e4, ip_hostname=localhost, timestamp=1588672852963}]
2020-05-05 11:00:52.963 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=425d2e2a-75c5-0184-06da-6d27eb546931, ip_hostname=localhost, timestamp=1588672852963}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=4c55d1d0-7421-1832-23ca-03744419c847, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=05b6fea6-8f76-5c74-e4cc-33bd8099d8e8, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=9b8fe16c-07f4-b64b-faec-794eb743559c, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=222725e8-19c6-62b6-8ba7-103892191c96, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=951403b6-18a9-ffe9-29b7-67b8c01ea311, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=2ac3002b-ccd1-9506-f3aa-60f4b63e26b8, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=dbbf12cf-0a27-407b-0a96-5b27ab0539db, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.965 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=3282d09e-9508-b053-3763-cc23d3c817a8, ip_hostname=localhost, timestamp=1588672852965}]
2020-05-05 11:00:52.966 INFO 31793 --- [pool-1-thread-2] o.s.integration.handler.LoggingHandler : GenericMessage [payload=, headers={ip_tcp_remotePort=59407, ip_connectionId=localhost:59407:44745:df23388b-b8b0-4250-b926-4199030b2ff6, ip_localInetAddress=/127.0.0.1, ip_address=127.0.0.1, id=549b2326-341f-19c0-db98-6b31da1901d8, ip_hostname=localhost, timestamp=1588672852966}]
There is nothing in the log about the client disconnecting; we would see socket close activity in the log.
The log implies we are receiving packets containing 0x00000000.
Use Wireshark or similar to look at the activity on the wire.
EDIT
Your custom deserializer is causing the problem:
if (read < 0) {
    return 0;
}
You need to throw a SoftEndOfStreamException when you detect a socket close.
This is explained in the documentation.
When the deserializer detects a closed input stream between messages, it must throw a SoftEndOfStreamException; this is a signal to the framework to indicate that the close was "normal". If the stream is closed while decoding a message, some other exception should be thrown instead.
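For illustration only, a minimal sketch of that check inside a custom deserializer (this is not the asker's actual code; readRemainder is a hypothetical stand-in for the existing decode logic):
@Override
public byte[] deserialize(InputStream inputStream) throws IOException {
    int first = inputStream.read();
    if (first < 0) {
        // The peer closed the socket between messages; signal a "normal"
        // close to the framework instead of returning a fabricated value.
        throw new SoftEndOfStreamException("stream closed between payloads");
    }
    return readRemainder(first, inputStream); // hypothetical: the rest of the original decode logic
}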

Possible issue with Publish subscribe Channel

I have an application that monitors an FTP folder for a specific CSV file, foo.csv. Once the file is located, the application pulls it to my local folder and generates a new output file, bar.csv; it then sends the new file finalBEY.csv back to the FTP folder and erases it from local.
Now I have introduced a process using publishSubscribeChannel: it transforms the file to a message and then uses a jobLaunchingGateway that reads finalBEY.csv with Spring Batch and prints it to the console. It is not working, because finalBEY.csv is deleted from the local folder after being sent back to FTP. I'm using .channel("nullChannel") on the jobLaunchingGateway within the first subscriber, which is supposed to hold the message until the batch job replies and only then move to the next subscriber, which sends the file to the FTP server and removes it from local. But that does not seem to be the case: the file is removed from local first, so the batch job cannot find finalBEY.csv and throws the error I'm pasting below along with the code.
If I remove the advice from the second subscriber it works fine, as the file is no longer deleted from local.
Can you please assist on this matter?
public IntegrationFlow localToFtpFlow(Branch myBranch) {
    return IntegrationFlows.from(Files.inboundAdapter(new File(myBranch.getBranchCode()))
                    .filter(new ChainFileListFilter<File>()
                            .addFilter(new RegexPatternFileListFilter("final" + myBranch.getBranchCode() + ".csv"))
                            .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(metadataStore(dataSource), "foo"))),
            e -> e.poller(Pollers.fixedDelay(10_000)))
            .enrichHeaders(h -> h.headerExpression("file_originalFile",
                    "new java.io.File('" + myBranch.getBranchCode() + "/FEFOexport" + myBranch.getBranchCode() + ".csv')", true))
            .transform(p -> {
                LOG.info("Sending file " + p + " to FTP branch " + myBranch.getBranchCode());
                return p;
            })
            .log()
            .transform(m -> {
                this.defaultSessionFactoryLocator.addSessionFactory(myBranch.getBranchCode(), createNewFtpSessionFactory(myBranch));
                LOG.info("Adding factory to delegation");
                return m;
            })
            .publishSubscribeChannel(s -> s
                    .subscribe(f -> f.transform(fileMessageToJobRequest())
                            .handle(jobLaunchingGateway()).channel("nullChannel"))
                    .subscribe(h -> h.handle(Ftp.outboundAdapter(createNewFtpSessionFactory(myBranch), FileExistsMode.REPLACE)
                                    .useTemporaryFileName(true)
                                    .autoCreateDirectory(false)
                                    .remoteDirectory(myBranch.getFolderPath()),
                            e -> e.advice(expressionAdvice()))))
            .get();
}
@Bean
public FileMessageToJobRequest fileMessageToJobRequest() {
    FileMessageToJobRequest fileMessageToJobRequest = new FileMessageToJobRequest();
    fileMessageToJobRequest.setFileParameterName("file_path");
    fileMessageToJobRequest.setJob(orderJob);
    return fileMessageToJobRequest;
}
@Bean
public JobLaunchingGateway jobLaunchingGateway() {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepository);
    simpleJobLauncher.setTaskExecutor(new SyncTaskExecutor());
    JobLaunchingGateway jobLaunchingGateway = new JobLaunchingGateway(simpleJobLauncher);
    return jobLaunchingGateway;
}
/**
 * Creating the advice for routing the payload of the outbound message on different expressions (success, failure).
 * @return Advice
 */
@Bean
public Advice expressionAdvice() {
    ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
    advice.setSuccessChannelName("success.input");
    advice.setOnSuccessExpressionString("payload.delete() + ' was successful'");
    //advice.setOnSuccessExpressionString("inputMessage.headers['file_originalFile'].renameTo(new java.io.File(payload.absolutePath + '.success.to.send'))");
    //advice.setFailureChannelName("failure.input");
    advice.setOnFailureExpressionString("payload + ' was bad, with reason: ' + #exception.cause.message");
    advice.setTrapException(true);
    return advice;
}
Here is the error. As the first lines show, the file is transferred to FTP first and only then is the batch job initiated, while the first subscriber should run the batch first...
INFO 10452 --- [ask-scheduler-2] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing to /ftp/erbranch/EDMS/FEFO/finalBEY.csv
Caused by: java.lang.IllegalStateException: Input resource must exist (reader is in 'strict' mode): file [C:\Java Programs\spring4ftpappftp\BEY\finalBEY.csv]
at org.springframework.batch.item.file.FlatFileItemReader.doOpen(FlatFileItemReader.java:251) ~[spring-batch-infrastructure-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.open(AbstractItemCountingItemStreamItemReader.java:146) ~[spring-batch-infrastructure-4.0.1.RELEASE.jar:4.0.1.RELEASE]
... 123 common frames omitted
Debugging code:
2019-07-15 10:43:02.838 INFO 4280 --- [ask-scheduler-2] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing
2019-07-15 10:43:02.845 INFO 4280 --- [ask-scheduler-2] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing to /ftp/erbranch/EDMS/FEFO/finalBEY.csv
2019-07-15 10:43:02.845 DEBUG 4280 --- [ask-scheduler-2] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'integrationEvaluationContext'
2019-07-15 10:43:02.848 DEBUG 4280 --- [ask-scheduler-2] o.s.b.f.s.DefaultListableBeanFactory : Returning cached instance of singleton bean 'success.input'
2019-07-15 10:43:02.849 DEBUG 4280 --- [ask-scheduler-2] o.s.integration.channel.DirectChannel : preSend on channel 'success.input', message: AdviceMessage [payload=true was successful, headers={id=eca55e1d-918e-3334-afce-66f8ab650748, timestamp=1563176582848}, inputMessage=GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\FEFOexportBEY.csv, id=a2f029b0-2609-1a11-67ef-4f56c7dd0752, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1563176582787}]]
2019-07-15 10:43:02.849 DEBUG 4280 --- [ask-scheduler-2] o.s.i.t.MessageTransformingHandler : success.org.springframework.integration.transformer.MessageTransformingHandler#0 received message: AdviceMessage [payload=true was successful, headers={id=eca55e1d-918e-3334-afce-66f8ab650748, timestamp=1563176582848},
inputMessage=GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\FEFOexportBEY.csv, id=a2f029b0-2609-1a11-67ef-4f56c7dd0752, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1563176582787}]]
2019-07-15 10:43:02.951 DEBUG 4280 --- [ask-scheduler-2] o.s.b.i.launch.JobLaunchingGateway : jobLaunchingGateway received message: GenericMessage [payload=JobLaunchRequest: orderJob, parameters={file_path=C:\Java Programs\spring4ftpappftp\BEY\finalBEY.csv, dummy=1563176582946}, headers={file_originalFile=BEY\FEFOexportBEY.csv, id=c98ad6cb-cced-c911-1b93-9d054baeb9d0, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1563176582951}]
2019-07-16 08:35:29.442 INFO 10208 --- [nio-8081-exec-3] o.s.i.channel.PublishSubscribeChannel : Channel 'application.FTPOutp' has 1 subscriber(s).
2019-07-16 08:35:29.442 INFO 10208 --- [nio-8081-exec-3] o.s.i.endpoint.EventDrivenConsumer : started 1o.org.springframework.integration.config.ConsumerEndpointFactoryBean#3
2019-07-16 08:35:29.442 INFO 10208 --- [nio-8081-exec-3] o.s.i.endpoint.EventDrivenConsumer : Adding {message-handler} as a subscriber to the '1o.subFlow#1.channel#0' channel
2019-07-16 08:35:29.442 INFO 10208 --- [nio-8081-exec-3] o.s.integration.channel.DirectChannel : Channel 'application.1o.subFlow#1.channel#0' has 1 subscriber(s).
2019-07-16 08:35:29.442 INFO 10208 --- [nio-8081-exec-3] o.s.i.endpoint.EventDrivenConsumer : started 1o.subFlow#1.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-07-16 08:35:29.442 INFO 10208 --- [nio-8081-exec-3] o.s.i.endpoint.EventDrivenConsumer : Adding {bridge} as a subscriber to the 'FTPOutp' channel
2019-07-16 08:35:29.442 INFO 10208 --- [nio-8081-exec-3] o.s.i.channel.PublishSubscribeChannel : Channel 'application.FTPOutp' has 2 subscriber(s).
Since you are using a SyncTaskExecutor, the batch job should run on the calling thread and be followed by the FTP adapter.
Use DEBUG logging and follow the message flow to see why that's not happening.

Handling a method in integration flow

I have created an application that uses Spring MVC to add FTP servers through a web page. When I add a server with its details, the application starts a flow that polls a directory on that server for a specific CSV file. Once the file is there, the flow pulls it to my local folder, then a method reads that CSV file and generates a new CSV that is sent back to the server. This much works. My issue: when I add a second FTP server to poll and it finds a CSV file on that second server, the application pulls the file to my local folder as it should, but the handling method is triggered again for both CSV files, while it should only run for the second server since it already ran for the first one. So when I add the second server, another file for server 1 is generated and sent back to its FTP, plus a new one for the second server is sent to the second FTP. In the end I have two files sent to the first server and one file sent to the second server. Below are the code and the console output of the app; please help me pinpoint my issue.
@Configuration
@EnableIntegration
@ComponentScan
public class FTIntegration {

    public static final String TIMEZONE_UTC = "UTC";
    public static final String TIMESTAMP_FORMAT_OF_FILES = "yyyyMMddHHmmssSSS";
    public static final String TEMPORARY_FILE_SUFFIX = ".part";
    public static final int POLLER_FIXED_PERIOD_DELAY = 5000;
    public static final int MAX_MESSAGES_PER_POLL = 100;

    //private static final Logger LOG = LoggerFactory.getLogger(FTIntegration.class);
    private static final Logger LOG1 = Logger.getLogger(FTIntegration.class);
    private static final String CHANNEL_INTERMEDIATE_STAGE = "intermediateChannel";
    private static final String OUTBOUND_CHANNEL = "outboundChannel";

    /* pulling the server config from postgres DB */
    private final BranchRepository branchRepository;

    @Autowired
    private CSVToCSVNoQ csvToCSVNoQ;

    @Value("${app.temp-dir}")
    private String localTempPath;

    public FTIntegration(BranchRepository branchRepository) {
        this.branchRepository = branchRepository;
    }

    @Bean
    public Branch myBranch() {
        return new Branch();
    }

    /**
     * The default poller with 5s, 100 messages, RotatingServerAdvice and transaction.
     *
     * @return default poller.
     */
    @Bean(name = PollerMetadata.DEFAULT_POLLER)
    public PollerMetadata poller() {
        return Pollers
                .fixedDelay(POLLER_FIXED_PERIOD_DELAY)
                .maxMessagesPerPoll(MAX_MESSAGES_PER_POLL)
                .transactional()
                .get();
    }

    /**
     * The direct channel for the flow.
     *
     * @return MessageChannel
     */
    @Bean
    public MessageChannel stockIntermediateChannel() {
        return new DirectChannel();
    }

    /**
     * Get the files from a remote directory. Add a timestamp to the filename
     * and write them to a local temporary folder.
     *
     * @return IntegrationFlow
     */
    @Bean
    public PropertiesPersistingMetadataStore store() {
        PropertiesPersistingMetadataStore store = new PropertiesPersistingMetadataStore();
        return store;
    }

    public IntegrationFlow fileInboundFlowFromFTPServer(Branch myBranch) throws IOException {
        final FtpInboundChannelAdapterSpec sourceSpecFtp = Ftp.inboundAdapter(createNewFtpSessionFactory(myBranch))
                .preserveTimestamp(true)
                //.patternFilter("*.csv")
                .maxFetchSize(MAX_MESSAGES_PER_POLL)
                .remoteDirectory(myBranch.getFolderPath())
                .regexFilter("FEFOexport" + myBranch.getBranchCode() + ".csv")
                .deleteRemoteFiles(true)
                .localDirectory(new File(myBranch.getBranchCode()))
                .temporaryFileSuffix(TEMPORARY_FILE_SUFFIX)
                /*.localFilenameExpression(new FunctionExpression<String>(s -> {
                    final int fileTypeSepPos = s.lastIndexOf('.');
                    return DateTimeFormatter
                            .ofPattern(TIMESTAMP_FORMAT_OF_FILES)
                            .withZone(ZoneId.of(TIMEZONE_UTC))
                            .format(Instant.now())
                            + "_"
                            + s.substring(0, fileTypeSepPos)
                            + s.substring(fileTypeSepPos);
                }))*/;

        // Poller definition
        final Consumer<SourcePollingChannelAdapterSpec> stockInboundPoller = endpointConfigurer -> endpointConfigurer
                .id("stockInboundPoller")
                .autoStartup(true)
                .poller(poller());

        IntegrationFlow flow = IntegrationFlows
                .from(sourceSpecFtp, stockInboundPoller)
                .transform(File.class, p -> {
                    // log step
                    LOG1.info("flow=stockInboundFlowFromAFT, message=incoming file: " + p);
                    return p;
                })
                .channel(CHANNEL_INTERMEDIATE_STAGE)
                .handle(m -> {
                    try {
                        this.csvToCSVNoQ.writeCSVfinal("test", myBranch.getBranchCode() + "/final" + myBranch.getBranchCode() + ".csv", myBranch.getBranchCode() + "/FEFOexport" + myBranch.getBranchCode() + ".csv");
                        LOG1.info("Writing final file .csv " + m);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                })
                //.handle(m -> this.csvToCSVNoQ.writeCSVfinal(m.getPayload(), m.getHeaders().get("csv", "FEFOexport" + myBranch.getBranchCode() + ".csv")))
                .get();
        return flow;
    }

    @Bean
    public IntegrationFlow stockIntermediateStageChannel() {
        IntegrationFlow flow = IntegrationFlows
                .from(CHANNEL_INTERMEDIATE_STAGE)
                .transform(p -> {
                    // log step
                    LOG1.info("flow=stockIntermediateStageChannel, message=rename file: " + p);
                    return p;
                })
                //TODO
                .channel(new NullChannel())
                .get();
        return flow;
    }

    /*
     * Creating the outbound adapter to send files from local to the FTP server.
     */
    public IntegrationFlow localToFtpFlow(Branch myBranch) {
        return IntegrationFlows.from(Files.inboundAdapter(new File(myBranch.getBranchCode()))
                        .filter(new ChainFileListFilter<File>()
                                .addFilter(new RegexPatternFileListFilter("final" + myBranch.getBranchCode() + ".csv"))
                                .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "foo"))),
                e -> e.poller(Pollers.fixedDelay(10_000)))
                .transform(p -> {
                    LOG1.info("Sending file " + p + " to FTP branch " + myBranch.getBranchCode());
                    return p;
                })
                .log()
                .handle(Ftp.outboundAdapter(createNewFtpSessionFactory(myBranch), FileExistsMode.REPLACE)
                        .useTemporaryFileName(true)
                        .autoCreateDirectory(false)
                        .remoteDirectory(myBranch.getFolderPath()))
                .get();
    }

    public DefaultFtpSessionFactory createNewFtpSessionFactory(Branch branch) {
        final DefaultFtpSessionFactory factory = new DefaultFtpSessionFactory();
        factory.setHost(branch.getHost());
        factory.setUsername(branch.getUsern());
        factory.setPort(branch.getFtpPort());
        factory.setPassword(branch.getPassword());
        return factory;
    }
}
Console Output:
Saved Branch : BEY
Hibernate: select branch0_._id as _id1_0_0_, branch0_.branch_code as branch_c2_0_0_, branch0_.folder_path as folder_p3_0_0_, branch0_.ftp_port as ftp_port4_0_0_, branch0_.host as host5_0_0_, branch0_.password as password6_0_0_, branch0_.usern as usern7_0_0_ from branch branch0_ where branch0_._id=?
2018-12-20 07:58:39.218 INFO 6668 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.intermediateChannel' has 2 subscriber(s).
2018-12-20 07:58:39.218 INFO 6668 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2018-12-20 07:58:39.218 INFO 6668 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '1.channel#0' channel
2018-12-20 07:58:39.218 INFO 6668 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1.channel#0' has 1 subscriber(s).
2018-12-20 07:58:39.218 INFO 6668 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2018-12-20 07:58:39.229 INFO 6668 --- [nio-8081-exec-5] o.s.i.e.SourcePollingChannelAdapter : started stockInboundPoller
2018-12-20 07:58:39.417 INFO 6668 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {message-handler} as a subscriber to the '1o.channel#2' channel
2018-12-20 07:58:39.417 INFO 6668 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1o.channel#2' has 1 subscriber(s).
2018-12-20 07:58:39.418 INFO 6668 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1o.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2018-12-20 07:58:39.418 INFO 6668 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '1o.channel#0' channel
2018-12-20 07:58:39.418 INFO 6668 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1o.channel#0' has 1 subscriber(s).
2018-12-20 07:58:39.418 INFO 6668 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1o.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2018-12-20 07:58:39.419 INFO 6668 --- [nio-8081-exec-5] o.s.i.e.SourcePollingChannelAdapter : started 1o.org.springframework.integration.config.SourcePollingChannelAdapterFactoryBean#0
2018-12-20 07:59:41.165 INFO 6668 --- [sk-scheduler-10] o.s.integration.ftp.session.FtpSession : File has been successfully transferred from: /ftp/erbranch/EDMS/FEFO/FEFOexportBEY.csv
2018-12-20 07:59:49.446 INFO 6668 --- [ask-scheduler-1] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\finalBEY.csv, id=7f0cccb7-a070-bd4c-d468-977a265ceb2e, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1545285589446}]]
2018-12-20 07:59:49.448 INFO 6668 --- [ask-scheduler-1] o.s.integration.handler.LoggingHandler : GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\finalBEY.csv, id=d857130a-d4a0-eaeb-19ea-f819924d94e2, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1545285589447}]
2018-12-20 07:59:50.488 INFO 6668 --- [ask-scheduler-1] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing
2018-12-20 07:59:50.899 INFO 6668 --- [ask-scheduler-1] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing to /ftp/erbranch/EDMS/FEFO/finalBEY.csv
Hibernate: select branch0_._id as _id1_0_, branch0_.branch_code as branch_c2_0_, branch0_.folder_path as folder_p3_0_, branch0_.ftp_port as ftp_port4_0_, branch0_.host as host5_0_, branch0_.password as password6_0_, branch0_.usern as usern7_0_ from branch branch0_
Hibernate: insert into branch (branch_code, folder_path, ftp_port, host, password, usern) values (?, ?, ?, ?, ?, ?)
Hibernate: select currval('branch__id_seq')
Saved Branch : JNB
Hibernate: select branch0_._id as _id1_0_0_, branch0_.branch_code as branch_c2_0_0_, branch0_.folder_path as folder_p3_0_0_, branch0_.ftp_port as ftp_port4_0_0_, branch0_.host as host5_0_0_, branch0_.password as password6_0_0_, branch0_.usern as usern7_0_0_ from branch branch0_ where branch0_._id=?
2018-12-20 08:02:24.966 INFO 6668 --- [nio-8081-exec-8] o.s.integration.channel.DirectChannel : Channel 'application.intermediateChannel' has 3 subscriber(s).
2018-12-20 08:02:24.966 INFO 6668 --- [nio-8081-exec-8] o.s.i.endpoint.EventDrivenConsumer : started 2.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2018-12-20 08:02:24.966 INFO 6668 --- [nio-8081-exec-8] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '2.channel#0' channel
2018-12-20 08:02:24.966 INFO 6668 --- [nio-8081-exec-8] o.s.integration.channel.DirectChannel : Channel 'application.2.channel#0' has 1 subscriber(s).
2018-12-20 08:02:24.966 INFO 6668 --- [nio-8081-exec-8] o.s.i.endpoint.EventDrivenConsumer : started 2.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2018-12-20 08:02:24.966 INFO 6668 --- [nio-8081-exec-8] o.s.i.e.SourcePollingChannelAdapter : started stockInboundPoller
2018-12-20 08:02:24.992 INFO 6668 --- [nio-8081-exec-8] o.s.i.endpoint.EventDrivenConsumer : Adding {message-handler} as a subscriber to the '2o.channel#2' channel
2018-12-20 08:02:24.992 INFO 6668 --- [nio-8081-exec-8] o.s.integration.channel.DirectChannel : Channel 'application.2o.channel#2' has 1 subscriber(s).
2018-12-20 08:02:24.992 INFO 6668 --- [nio-8081-exec-8] o.s.i.endpoint.EventDrivenConsumer : started 2o.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2018-12-20 08:02:24.992 INFO 6668 --- [nio-8081-exec-8] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '2o.channel#0' channel
2018-12-20 08:02:24.992 INFO 6668 --- [nio-8081-exec-8] o.s.integration.channel.DirectChannel : Channel 'application.2o.channel#0' has 1 subscriber(s).
2018-12-20 08:02:24.992 INFO 6668 --- [nio-8081-exec-8] o.s.i.endpoint.EventDrivenConsumer : started 2o.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2018-12-20 08:02:24.992 INFO 6668 --- [nio-8081-exec-8] o.s.i.e.SourcePollingChannelAdapter : started 2o.org.springframework.integration.config.SourcePollingChannelAdapterFactoryBean#0
Hibernate: select branch0_._id as _id1_0_, branch0_.branch_code as branch_c2_0_, branch0_.folder_path as folder_p3_0_, branch0_.ftp_port as ftp_port4_0_, branch0_.host as host5_0_, branch0_.password as password6_0_, branch0_.usern as usern7_0_ from branch branch0_
2018-12-20 08:03:00.225 INFO 6668 --- [ask-scheduler-8] o.s.integration.ftp.session.FtpSession : File has been successfully transferred from: /ftp/erbranch/EDMS/FEFO/FEFOexportJNB.csv
2018-12-20 08:03:00.929 INFO 6668 --- [ask-scheduler-5] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\finalBEY.csv, id=6ed554eb-f553-0293-f042-d633155357c0, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1545285780929}]]
2018-12-20 08:03:00.930 INFO 6668 --- [ask-scheduler-5] o.s.integration.handler.LoggingHandler : GenericMessage [payload=BEY\finalBEY.csv, headers={file_originalFile=BEY\finalBEY.csv, id=b2d76ac5-fb85-1313-a37e-8849714a545e, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1545285780930}]
2018-12-20 08:03:01.958 INFO 6668 --- [ask-scheduler-5] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing
2018-12-20 08:03:02.373 INFO 6668 --- [ask-scheduler-5] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing to /ftp/erbranch/EDMS/FEFO/finalBEY.csv
2018-12-20 08:03:05.033 INFO 6668 --- [ask-scheduler-7] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=JNB\finalJNB.csv, headers={file_originalFile=JNB\finalJNB.csv, id=4514e132-5684-9e82-28e7-f75c5c3dcf91, file_name=finalJNB.csv, file_relativePath=finalJNB.csv, timestamp=1545285785033}]]
2018-12-20 08:03:05.034 INFO 6668 --- [ask-scheduler-7] o.s.integration.handler.LoggingHandler : GenericMessage [payload=JNB\finalJNB.csv, headers={file_originalFile=JNB\finalJNB.csv, id=59e62375-f1da-461d-ee61-d105ac3159a0, file_name=finalJNB.csv, file_relativePath=finalJNB.csv, timestamp=1545285785034}]
2018-12-20 08:03:07.530 INFO 6668 --- [ask-scheduler-7] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalJNB.csv.writing
2018-12-20 08:03:08.539 INFO 6668 --- [ask-scheduler-7] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalJNB.csv.writing to /ftp/erbranch/EDMS/FEFO/finalJNB.csv
.addFilter(new FileSystemPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "foo"))),
The SimpleMetadataStore() is in-memory; using a new store for each flow means each one starts empty. You need to use a shared metadata store across all the flows.
Also, if you need to prevent duplicates across server restarts, you need to use a persistent shared store, e.g. Redis, JDBC, etc.
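As a hedged sketch of that suggestion (the bean name is illustrative, not from the answer): declare a single store bean and reference it from every flow instead of calling new SimpleMetadataStore() per flow:
@Bean
public ConcurrentMetadataStore sharedMetadataStore() {
    // One store shared by all dynamically registered flows; swap in e.g. a
    // RedisMetadataStore or JdbcMetadataStore to survive server restarts.
    return new SimpleMetadataStore();
}
Each localToFtpFlow would then build its filter with new FileSystemPersistentAcceptOnceFileListFilter(sharedMetadataStore(), "foo"), so all flows consult the same state.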

Spring Integration Bridge with poller not working as expected for JMS

Using spring-integration 5.0.7 to throttle the bridging of messages between two JMS queues.
The docs at: https://docs.spring.io/spring-integration/docs/5.0.7.RELEASE/reference/html/messaging-channels-section.html#bridge-namespace
suggest:
<int:bridge input-channel="pollable" output-channel="subscribable">
<int:poller max-messages-per-poll="10" fixed-rate="5000"/>
</int:bridge>
But the schema validator complains "no nested poller allowed for subscribable input channel" on the bridge element.
However, if I put the poller on the inbound channel adapter, as in:
<?xml version="1.0" encoding="UTF-8"?>
<beans:beans xmlns:int="http://www.springframework.org/schema/integration"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:int-jms="http://www.springframework.org/schema/integration/jms"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration
http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/jms
http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd
">
<int:channel id="inChannel" />
<int:channel id="outChannel" />
<int-jms:inbound-channel-adapter id="jmsIn" connection-factory="jmsConnectionFactory" destination-name="_dev.inQueue" channel="inChannel">
<int:poller fixed-delay="5000" max-messages-per-poll="2"/>
</int-jms:inbound-channel-adapter>
<int-jms:outbound-channel-adapter id="jmsOut" connection-factory="jmsConnectionFactory" destination-name="_dev.outQueue" channel="outChannel"/>
<int:bridge input-channel="inChannel" output-channel="outChannel">
</int:bridge>
</beans:beans>
Nothing is ever moved from input to output.
How can I bridge from one JMS queue to another with a rate-limit?
Update:
Turning on logging confirms that nothing is getting picked up from the input channel, but it is otherwise not helpful:
2018-08-10 15:36:33.345 DEBUG 112066 --- [ask-scheduler-1] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
2018-08-10 15:36:38.113 DEBUG 112066 --- [ask-scheduler-2] o.s.integration.jms.DynamicJmsTemplate : Executing callback on JMS Session: ActiveMQSession {id=ID:whitechapel-35247-1533940593148-3:2:1,started=true} java.lang.Object#5c278302
2018-08-10 15:36:38.116 DEBUG 112066 --- [ask-scheduler-2] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
2018-08-10 15:36:43.115 DEBUG 112066 --- [ask-scheduler-1] o.s.integration.jms.DynamicJmsTemplate : Executing callback on JMS Session: ActiveMQSession {id=ID:whitechapel-35247-1533940593148-3:3:1,started=true} java.lang.Object#1c09a81e
2018-08-10 15:36:43.118 DEBUG 112066 --- [ask-scheduler-1] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
Here is a Spring Boot app, using Java DSL configuration which is the exact equivalent of what you have in XML (minus the bridge); it works fine.
@SpringBootApplication
public class So51792909Application {

    private static final Logger logger = LoggerFactory.getLogger(So51792909Application.class);

    public static void main(String[] args) {
        SpringApplication.run(So51792909Application.class, args);
    }

    @Bean
    public ApplicationRunner runner(JmsTemplate template) {
        return args -> {
            for (int i = 0; i < 10; i++) {
                template.convertAndSend("foo", "test");
            }
        };
    }

    @Bean
    public IntegrationFlow flow(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from(Jms.inboundAdapter(connectionFactory)
                        .destination("foo"), e -> e
                        .poller(Pollers
                                .fixedDelay(5000)
                                .maxMessagesPerPoll(2)))
                .handle(Jms.outboundAdapter(connectionFactory)
                        .destination("bar"))
                .get();
    }

    @JmsListener(destination = "bar")
    public void listen(String in) {
        logger.info(in);
    }
}
and the output:
2018-08-10 19:38:52.534 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:38:52.543 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:38:57.566 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:38:57.582 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:02.608 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:02.622 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:07.640 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:07.653 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:12.672 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
2018-08-10 19:39:12.687 INFO 13408 --- [enerContainer-1] com.example.So51792909Application : test
As you can see, the consumer gets 2 messages every 5 seconds.
Your debug log implies there are no messages in the queue.
EDIT
I figured it out; the XML parser sets the JmsTemplate receiveTimeout to no-wait (-1). Since you are not using a caching connection factory, we'll never get a message, because the ActiveMQ client returns immediately if a message is not already present in the client (see this answer). And since there's no caching going on, we get a new consumer on every poll (and do a no-wait receive each time).
The DSL leaves the JmsTemplate at its default (infinite wait), which is actually wrong, since it blocks the poller thread indefinitely if there are no messages.
Adding receive-timeout="1000" fixes the XML version.
However, it's better to use a CachingConnectionFactory to avoid creating a new connection/session/consumer on each poll.
Unfortunately, configuring a CachingConnectionFactory turns off Spring Boot's auto-configuration. This is fixed in Boot 2.1.
I have opened an issue to resolve the inconsistency between the DSL and XML here.
If you stick with the DSL, I would recommend setting the receive timeout to something reasonable, rather than indefinite:
@Bean
public IntegrationFlow flow(ConnectionFactory connectionFactory) {
    return IntegrationFlows.from(Jms.inboundAdapter(connectionFactory)
                    .configureJmsTemplate(t -> t.receiveTimeout(1000))
                    .destination("foo"), e -> e
                    .poller(Pollers
                            .fixedDelay(5000)
                            .maxMessagesPerPoll(2)))
            .handle(Jms.outboundAdapter(connectionFactory)
                    .destination("bar"))
            .get();
}
But, the best solution is to use a CachingConnectionFactory.
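A minimal sketch of that recommendation (assuming the broker's own ConnectionFactory is available to wrap; the bean name is illustrative):
@Bean
public CachingConnectionFactory cachingConnectionFactory(ConnectionFactory targetConnectionFactory) {
    CachingConnectionFactory ccf = new CachingConnectionFactory(targetConnectionFactory);
    ccf.setCacheConsumers(true); // reuse the session and consumer across polls instead of recreating them
    return ccf;
}
The flow bean above would then be given this caching factory instead of the raw one.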

Error sending message to a QueueChannel from chain in spring integration

I am using Spring Integration in one of my projects, and the following is the flow within my application. Whenever I publish a message to "domainEventChannel", it goes through the transformer and is then sent to the inputFromTransformer channel; from there, because it is in a chain, it goes through a splitter, a transformer, and a filter. I have configured output-channel="domainEventQueueChannel", which is a queue channel.
I am able to track messages up to the filter, but they never reach the QueueChannel.
Can I not send a message to a queue channel from a chain?
<int:transformer ref="partitionlessTransformer" method="transform"
input-channel="domainEventChannel" output-channel="inputFromTransformer" />
<!-- Chain which has the output channel as queue channel-->
<int:chain input-channel="inputFromTransformer"
output-channel="domainEventQueueChannel">
<int:splitter ref="messageSplitter" method="split" />
<int:transformer ref="jsonToObjectTransformer" />
<int:filter ref="autopayModelProcessTransFilter"></int:filter>
</int:chain>
<!-- Queue Channel-->
<int:channel id="domainEventQueueChannel">
<int:queue capacity="10" />
</int:channel>
<!-- service activator which polls queueChannel -->
<int:service-activator id="domainEventReconProcessServiceActivator"
input-channel="domainEventQueueChannel" ref="domainEventReconProcessServiceActivator" method="intiateAutopayProcessTransRecon">
<int:poller task-executor="domainEventReconProcessServicetaskExecutor" fixed-rate="10" >
</int:poller>
</int:service-activator>
<task:executor id="domainEventReconProcessServicetaskExecutor"
pool-size="10" queue-capacity="10" />
Update #2:
When I add the following piece of config, everything works fine; messages flow from the queue channel to the service activator:
<int:chain input-channel="domainEventQueueChannel"
output-channel="nullChannel">
<int:service-activator id="domainEventReconProcessServiceActivator"
ref="domainEventReconProcessServiceActivator" method="intiateAutopayProcessTransRecon" />
<int:poller task-executor="domainEventReconProcessServicetaskExecutor" fixed-rate="10"></int:poller>
</int:chain>
Update #3:
With the chain in place, I see the service activator's poller polling the queue channel. But without the chain the poller is not starting at all; I don't see any poller logs.
With Chain in place:
2015-09-08 09:29:31.438 DEBUG 12817 --- [ask-scheduler-8] o.s.integration.channel.QueueChannel : preSend on channel 'domainEventQueueChannel', message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#6ac4fcc0, headers={sequenceNumber=1, correlationId=7a91b5fb-b1aa-464a-2e1b-5309652a0520, id=25ae85aa-e0df-6c72-9a54-2e8dac3c5ddc, sequenceSize=1, timestamp=1441718971437}]
2015-09-08 09:29:31.438 DEBUG 12817 --- [ask-scheduler-8] o.s.integration.channel.QueueChannel : postSend (sent=true) on channel 'domainEventQueueChannel', message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#6ac4fcc0, headers={sequenceNumber=1, correlationId=7a91b5fb-b1aa-464a-2e1b-5309652a0520, id=25ae85aa-e0df-6c72-9a54-2e8dac3c5ddc, sequenceSize=1, timestamp=1441718971437}]
2015-09-08 09:29:31.438 DEBUG 12817 --- [ask-scheduler-8] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'inputFromTransformer', message: GenericMessage [payload=[{"eventInfo":{"eventId":"4535345353","eventName":"AutopayModelsExtracted","parentEventId":"4535345353"},"numOfAutopayModels":20,"message":"Published autopaymodels needed to be processed"}], headers={id=7a91b5fb-b1aa-464a-2e1b-5309652a0520, timestamp=1441718971410}]
2015-09-08 09:29:31.438 DEBUG 12817 --- [ask-scheduler-8] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'domainEventChannel', message: GenericMessage [payload={domainevents={0=[{"eventInfo":{"eventId":"4535345353","eventName":"AutopayModelsExtracted","parentEventId":"4535345353"},"numOfAutopayModels":20,"message":"Published autopaymodels needed to be processed"}]}}, headers={id=438c63c8-1ed1-6516-110e-c3478aca35c8, timestamp=1441718971409}]
2015-09-08 09:29:31.438 DEBUG 12817 --- [etaskExecutor-1] o.s.integration.channel.QueueChannel : postReceive on channel 'domainEventQueueChannel', message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#6ac4fcc0, headers={sequenceNumber=1, correlationId=7a91b5fb-b1aa-464a-2e1b-5309652a0520, id=25ae85aa-e0df-6c72-9a54-2e8dac3c5ddc, sequenceSize=1, timestamp=1441718971437}]
2015-09-08 09:29:31.438 DEBUG 12817 --- [etaskExecutor-1] o.s.i.endpoint.PollingConsumer : Poll resulted in Message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#6ac4fcc0, headers={sequenceNumber=1, correlationId=7a91b5fb-b1aa-464a-2e1b-5309652a0520, id=25ae85aa-e0df-6c72-9a54-2e8dac3c5ddc, sequenceSize=1, timestamp=1441718971437}]
2015-09-08 09:29:31.438 DEBUG 12817 --- [etaskExecutor-1] o.s.i.handler.MessageHandlerChain : org.springframework.integration.handler.MessageHandlerChain#1 received message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#6ac4fcc0, headers={sequenceNumber=1, correlationId=7a91b5fb-b1aa-464a-2e1b-5309652a0520, id=25ae85aa-e0df-6c72-9a54-2e8dac3c5ddc, sequenceSize=1, timestamp=1441718971437}]
2015-09-08 09:29:31.438 DEBUG 12817 --- [etaskExecutor-1] o.s.i.handler.ServiceActivatingHandler : ServiceActivator for [org.springframework.integration.handler.MethodInvokingMessageProcessor#72193fb5] (org.springframework.integration.handler.MessageHandlerChain#1$child.domainEventReconProcessServiceActivator) received message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#6ac4fcc0, headers={sequenceNumber=1, correlationId=7a91b5fb-b1aa-464a-2e1b-5309652a0520, id=25ae85aa-e0df-6c72-9a54-2e8dac3c5ddc, sequenceSize=1, timestamp=1441718971437}]
Without Chain:
2015-09-08 09:48:02.572 DEBUG 13043 --- [ask-scheduler-1] o.s.i.splitter.MethodInvokingSplitter : org.springframework.integration.splitter.MethodInvokingSplitter#53abfc07 received message: GenericMessage [payload=[{"eventInfo":{"eventId":"4535345353","eventName":"AutopayModelsExtracted","parentEventId":"4535345353"},"numOfAutopayModels":20,"message":"Published autopaymodels needed to be processed"}], headers={id=f477f025-e92f-f896-9ac4-2ce1b91fb895, timestamp=1441720082572}]
2015-09-08 09:48:02.574 DEBUG 13043 --- [ask-scheduler-1] o.s.i.t.MessageTransformingHandler : org.springframework.integration.transformer.MessageTransformingHandler#2c8c16c0 received message: GenericMessage [payload={"eventInfo":{"eventId":"4535345353","eventName":"AutopayModelsExtracted","parentEventId":"4535345353"},"numOfAutopayModels":20,"message":"Published autopaymodels needed to be processed"}, headers={sequenceNumber=1, correlationId=f477f025-e92f-f896-9ac4-2ce1b91fb895, id=1364077a-6d5c-87f9-7d7c-3d144e45d661, sequenceSize=1, timestamp=1441720082574}]
2015-09-08 09:48:02.599 DEBUG 13043 --- [ask-scheduler-1] o.s.integration.filter.MessageFilter : org.springframework.integration.filter.MessageFilter#80bfa9d received message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#fd57501, headers={sequenceNumber=1, correlationId=f477f025-e92f-f896-9ac4-2ce1b91fb895, id=c2a324c8-9eb0-66e0-73a1-5595ed87b3ae, sequenceSize=1, timestamp=1441720082599}]
hey I am in filter and my result would betrue
2015-09-08 09:48:02.599 DEBUG 13043 --- [ask-scheduler-1] o.s.integration.channel.QueueChannel : preSend on channel 'domainEventQueueChannel', message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#fd57501, headers={sequenceNumber=1, correlationId=f477f025-e92f-f896-9ac4-2ce1b91fb895, id=c2a324c8-9eb0-66e0-73a1-5595ed87b3ae, sequenceSize=1, timestamp=1441720082599}]
2015-09-08 09:48:02.599 DEBUG 13043 --- [ask-scheduler-1] o.s.integration.channel.QueueChannel : postSend (sent=true) on channel 'domainEventQueueChannel', message: GenericMessage [payload=com.autopayprocesstransrecon.resource.DomainEvent#fd57501, headers={sequenceNumber=1, correlationId=f477f025-e92f-f896-9ac4-2ce1b91fb895, id=c2a324c8-9eb0-66e0-73a1-5595ed87b3ae, sequenceSize=1, timestamp=1441720082599}]
2015-09-08 09:48:02.599 DEBUG 13043 --- [ask-scheduler-1] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'inputFromTransformer', message: GenericMessage [payload=[{"eventInfo":{"eventId":"4535345353","eventName":"AutopayModelsExtracted","parentEventId":"4535345353"},"numOfAutopayModels":20,"message":"Published autopaymodels needed to be processed"}], headers={id=f477f025-e92f-f896-9ac4-2ce1b91fb895, timestamp=1441720082572}]
2015-09-08 09:48:02.599 DEBUG 13043 --- [ask-scheduler-1] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'domainEventChannel', message: GenericMessage [payload={domainevents={0=[{"eventInfo":{"eventId":"4535345353","eventName":"AutopayModelsExtracted","parentEventId":"4535345353"},"numOfAutopayModels":20,"message":"Published autopaymodels needed to be processed"}]}}, headers={id=bf6f4f20-52bb-27e4-df97-4112ad4be3b5, timestamp=1441720082571}]
2015-09-08 09:48:03.616 DEBUG 13043 --- [ask-scheduler-3] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
2015-09-08 09:48:04.634 DEBUG 13043 --- [ask-scheduler-4] o.s.i.e.SourcePollingChannelAdapter : Received no Message during the poll, returning 'false'
That simply means the message is not passing the filter: the filter is dropping the message. Setting a discard-channel on the filter (or raising its log level to DEBUG) lets you see the dropped messages.
