Spring Integration preventDuplicates(false) not working - spring-integration

Given the code below, I would expect preventDuplicates(false) to allow files with the same name, size, and last-modified time to be processed. However, the code still acts as if this flag were true, in that only a file with a new name is accepted.
The API says: "Configure an AcceptOnceFileListFilter if preventDuplicates == true, otherwise - AcceptAllFileListFilter."
I take this to mean that AcceptAllFileListFilter is used automatically; however, when I set a breakpoint on this filter, it is never hit.
The patternFilter is also not working: *.csv files are being processed as well.
Removing either preventDuplicates() or patternFilter() makes no difference (I assumed a filter chain was causing the problem).
Can anyone explain why this is happening?
@SpringBootApplication
public class ApplicationTest {

    public static void main(String[] args) {
        ConfigurableApplicationContext ctx =
                new SpringApplicationBuilder(ApplicationTest.class)
                        .web(WebApplicationType.NONE)
                        .run(args);
    }

    @Bean
    public IntegrationFlow readBackUpFlow() {
        return IntegrationFlows
                .from(
                        Files
                                .inboundAdapter(new File("d:/temp/in"))
                                .patternFilter("*.txt")
                                .preventDuplicates(false)
                                .get(),
                        e -> e.poller(Pollers.fixedDelay(5000))
                )
                .log("File", m -> m)
                .transform(Files.toStringTransformer())
                .log("Content", m -> m)
                .handle((p, h) -> h.get(FileHeaders.ORIGINAL_FILE, File.class))
                .handle(Files.outboundAdapter(new File("d:/temp/done")).deleteSourceFiles(true))
                .get();
    }
}

Don't call get() on that Files.inboundAdapter() spec. It does some extra work in getComponentsToRegister(), which the framework calls at a specific phase; calling get() yourself bypasses that work, which is why your filters are not applied to the target channel adapter.
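Applied to the flow in the question, the fix is simply to pass the spec itself and drop that inner .get() call; a minimal sketch of the corrected bean (same directories and filters as above):

```java
@Bean
public IntegrationFlow readBackUpFlow() {
    return IntegrationFlows
            .from(Files.inboundAdapter(new File("d:/temp/in"))
                            .patternFilter("*.txt")
                            .preventDuplicates(false), // pass the spec, not spec.get()
                    e -> e.poller(Pollers.fixedDelay(5000)))
            .log("File", m -> m)
            .transform(Files.toStringTransformer())
            .log("Content", m -> m)
            .handle((p, h) -> h.get(FileHeaders.ORIGINAL_FILE, File.class))
            .handle(Files.outboundAdapter(new File("d:/temp/done")).deleteSourceFiles(true))
            .get();
}
```

This way the framework itself calls getComponentsToRegister() on the spec at the right phase, and both the pattern filter and preventDuplicates(false) take effect.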

Related

Split/Aggregate never releases the groups (Spring Integration with Java DSL)

I am trying to group a list of GeoJSON Features by a shared ID, in order to aggregate a single field of those Features, using split/aggregate like so:
@Bean
IntegrationFlow myFlow() {
    return IntegrationFlows.from(MY_DIRECT_CHANNEL)
            .handle(Http.outboundGateway(myRestUrl)
                    .httpMethod(HttpMethod.GET)
                    .expectedResponseType(FeatureCollection.class)
                    .mappedResponseHeaders(""))
            .split(FeatureCollection.class, FeatureCollection::getFeatures)
            .aggregate(aggregator -> aggregator
                    .outputProcessor(agg -> {
                        final List<String> collected = agg
                                .getMessages()
                                .stream()
                                .map(m -> ((Number) ((Feature) m.getPayload()).getProperties().get("my_field")).intValue() + "")
                                .collect(Collectors.toList());
                        return MyPojo.builder()
                                .myId(((Number) agg.getGroupId()).longValue())
                                .myListString(String.join(",", collected))
                                .build();
                    })
                    .correlationStrategy(m -> ((Feature) m.getPayload()).getProperties().get("shared_id"))
                    // .sendPartialResultOnExpiry(true)
                    // .groupTimeout(10000) // there's got to be a better way ...
                    // .expireGroupsUponTimeout(false)
            )
            .handle(Jpa.updatingGateway(myEntityManagerFactory).namedQuery(MyPojo.QUERY_UPDATE),
                    spec -> spec.transactional(myTransactionManager))
            .nullChannel();
}
Unless I uncomment those three lines, the aggregator never releases the groups and the database never receives any updates. If I set groupTimeout to less than 5 seconds, I am missing partial results.
I expected the default releaseStrategy, SimpleSequenceSizeReleaseStrategy, to automatically release all the groups after all of the (split) Features had been processed (there are only 129 Features in total from the REST service message). Manually setting this as the releaseStrategy doesn't help.
What is the proper way to release the groups once all 129 messages have been processed?
I got it to work using a transformer instead of split/aggregate:
@Bean
IntegrationFlow myFlow(MyTransformer myTransformer) {
    return IntegrationFlows.from(MY_DIRECT_CHANNEL)
            .handle(Http.outboundGateway(myRestUrl)
                    .httpMethod(HttpMethod.GET)
                    .expectedResponseType(FeatureCollection.class)
                    .mappedResponseHeaders(""))
            .transform(myTransformer)
            .split()
            .handle(Jpa.updatingGateway(myEntityManagerFactory).namedQuery(MyEntity.QUERY_UPDATE),
                    spec -> spec.transactional(myTransactionManager))
            .nullChannel();
}
And the signature of the transformer is:
@Component
public class MyTransformer implements GenericTransformer<FeatureCollection, List<MyEntity>> {

    @Override
    public List<MyEntity> transform(FeatureCollection featureCollection) {
        ...
    }
}
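The transformer body is elided above; purely as an illustration, the grouping it has to perform (collect the my_field values per shared_id) can be sketched with plain collections, using Maps as simplified stand-ins for the GeoJSON Feature properties:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupFeatures {

    // Stand-in for the real transform(): each "feature" is reduced to its properties map.
    static Map<Long, String> groupMyField(List<Map<String, Object>> features) {
        return features.stream()
                .collect(Collectors.groupingBy(
                        f -> ((Number) f.get("shared_id")).longValue(),  // correlation key
                        LinkedHashMap::new,
                        Collectors.mapping(
                                f -> String.valueOf(((Number) f.get("my_field")).intValue()),
                                Collectors.joining(","))));              // e.g. "10,20"
    }
}
```

The real transformer would then wrap each joined string and its key into a MyEntity.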

Spring Integration - Stop Inbound Adapter after files have been copied

I am doing a local file copy using Spring integration. My class is presented later in this post.
I initiate the file copy by issuing an adapter.start() from another class. That works fine, and the adapter (localFileTransferAdapter) runs once (based on the FireOnceTrigger) copying all the files I expect. But I then want to stop the adapter after all the files have been copied.
What can I do to detect that the adapter has finished copying the files so I can then stop it? The FireOnceTrigger will never fire again, but the adapter still shows as running when I query it from the class I use to start it. I could wait some number of seconds and then stop the adapter, but if there are many large files to copy that could stop the adapter in the middle of copying, which I don't want to happen.
I've reviewed How to stop polling after a message is received? Spring Integration but it does not appear to match my use case.
Thanks in advance for any assistance.
public class LocalFileTransfer {

    @Value("${source.directory:c:/source}")
    private String sourceDirectory;

    @Value("${target.directory:c:/target}")
    private String targetDirectory;

    @Bean
    public MessageSource<File> sourceDirectory() {
        FileReadingMessageSource messageSource = new FileReadingMessageSource();
        messageSource.setDirectory(new File(sourceDirectory));
        return messageSource;
    }

    @Bean
    public IntegrationFlow fileMover() {
        return IntegrationFlows.from(sourceDirectory(),
                c -> c.autoStartup(false)
                        .id("localFileTransferAdapter")
                        .poller(Pollers.trigger(new FireOnceTrigger())
                                .maxMessagesPerPoll(-1)))
                .filter(source -> ((File) source).getName().endsWith(".txt"))
                .log(LoggingHandler.Level.ERROR, "localfile.category", m -> m.getPayload())
                .log(LoggingHandler.Level.ERROR, "localfile.category", m -> m.getHeaders())
                .handle(targetDirectory())
                .get();
    }

    @Bean
    public MessageHandler targetDirectory() {
        FileWritingMessageHandler handler = new FileWritingMessageHandler(new File(targetDirectory));
        handler.setFileExistsMode(FileExistsMode.REPLACE);
        handler.setExpectReply(false);
        return handler;
    }
}
You are on the right track with the FireOnceTrigger and maxMessagesPerPoll(-1). Also, that answer with an AbstractMessageSourceAdvice sample fully fits your scenario.
In MessageSourceMutator.afterReceive() you just check whether the result argument is null and call stop() on that localFileTransferAdapter.
Since everything happens on the same thread, you are safe to call stop() as soon as you meet a null: no need to introduce a delay, and the channel adapter doesn't exit until it finishes producing messages in that one poll cycle.
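A sketch of such an advice, assuming Spring Integration 5.3+ where MessageSourceMutator is available (on earlier versions you would extend AbstractMessageSourceAdvice instead); the bean name localFileTransferAdapter is the id from the question:

```java
@Component
public class StopWhenDrainedAdvice implements MessageSourceMutator {

    private final ApplicationContext applicationContext;

    public StopWhenDrainedAdvice(ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
    }

    @Override
    public Message<?> afterReceive(Message<?> result, MessageSource<?> source) {
        if (result == null) { // the source produced no file: this poll cycle is drained
            this.applicationContext
                    .getBean("localFileTransferAdapter", Lifecycle.class)
                    .stop();
        }
        return result;
    }
}
```

The advice is then attached to the poller, e.g. Pollers.trigger(new FireOnceTrigger()).maxMessagesPerPoll(-1).advice(stopWhenDrainedAdvice).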

How to register the integration flows in runtime?

I'm building a microservice for multiple properties, and each property has a different configuration. To do that, I've implemented something like this:
@Autowired
IntegrationFlowContext flowContext;

@Bean
public void setFlowContext() {
    List<Login> loginList = DAO.getLoginList(); // a web service
    loginList.forEach(e -> {
        IntegrationFlow flow = IntegrationFlows
                .from(() -> e, c -> c.poller(Pollers.fixedRate(e.getPeriod(), TimeUnit.SECONDS, 5)))
                .channel("X_CHANNEL")
                .get();
        flowContext.registration(flow).register();
    });
}
With this implementation, I'm fetching the loginList before the application has started, so after startup I'm no longer able to refresh the loginList from the web service, since there is no poller configured for it. The problem is that the loginList can change: new login credentials could be added or deleted. Therefore, I want something that runs every X period to fetch the loginList from the web service and then registers a flow for each entry in the list. To achieve this, I've implemented something like this:
@Bean
public IntegrationFlow setFlowContext() {
    return IntegrationFlows
            .from(this::getSpecification, p -> p.poller(Pollers.fixedRate(X))) // the specification is constant
            .transform(payload -> DAO.getLoginList(payload))
            .split()
            .<Login>handle((payload, header) -> {
                IntegrationFlow flow = IntegrationFlows
                        .from(() -> payload, c -> c.poller(Pollers.fixedRate(payload.getPeriod(), TimeUnit.SECONDS, 5)))
                        .channel("X_CHANNEL")
                        .get();
                flowContext.registration(flow).register().start();
                return null;
            })
            .get();
}
Basically, I've used the start() method, but this is not working as expected. See this:
flowContext.registration(flow).register().start();
Lastly, I've read Dynamic and Runtime Integration Flows, but I still couldn't implement this feature.
Dynamic flow registration cannot be used within a @Bean definition.
It is designed to be used at runtime AFTER the application context is fully initialized.
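One way to honor that constraint, sketched below, is to move the registration into an ApplicationReadyEvent listener; the Login, DAO, and X_CHANNEL names come from the question, while the flow id scheme is hypothetical:

```java
@Component
public class LoginFlowRegistrar {

    private final IntegrationFlowContext flowContext;

    public LoginFlowRegistrar(IntegrationFlowContext flowContext) {
        this.flowContext = flowContext;
    }

    @EventListener(ApplicationReadyEvent.class)
    public void registerLoginFlows() {
        for (Login login : DAO.getLoginList()) {
            IntegrationFlow flow = IntegrationFlows
                    .from(() -> login,
                            c -> c.poller(Pollers.fixedRate(login.getPeriod(), TimeUnit.SECONDS, 5)))
                    .channel("X_CHANNEL")
                    .get();
            this.flowContext.registration(flow)
                    .id("loginFlow." + login.hashCode()) // hypothetical unique id per login
                    .register(); // registered flows are started automatically by default
        }
    }
}
```

Because registration happens after the context is fully initialized, no explicit start() call is needed; the same code could run from the split/handle step of a polling flow to pick up added or removed logins.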

Is there a default output channel if a DSL flow ends with an endpoint? - spring-integration

The last element in the code for the following DSL flow is a Service Activator (the .handle method).
Is there a default output direct channel to which I can subscribe here? If I understand things correctly, an output channel must be present.
I know I can add .channel("name") at the end, but the question is what happens if it's not written explicitly.
Here is the code:
@SpringBootApplication
@IntegrationComponentScan
public class QueueChannelResearch {

    @Bean
    public IntegrationFlow lambdaFlow() {
        return f -> f.channel(c -> c.queue(50))
                .handle(System.out::println);
    }

    public static void main(String[] args) {
        ConfigurableApplicationContext ctx = SpringApplication.run(QueueChannelResearch.class, args);
        MessageChannel inputChannel = ctx.getBean("lambdaFlow.input", MessageChannel.class);
        for (int i = 0; i < 1000; i++) {
            inputChannel.send(MessageBuilder.withPayload("w" + i)
                    .build());
        }
        ctx.close();
    }
}
Another question is about QueueChannel. The program hangs if I comment out handle() and completes if I uncomment it. Does that mean that handle() adds a default Poller before it?
return f -> f.channel(c -> c.queue(50));
// .handle(System.out::println);
No, it doesn't work that way.
Recall that an integration flow is a pipes-and-filters architecture: the result of the current step is sent to the next one. Since you use .handle(System.out::println), there is no output from that println() method call, therefore nothing is returned from which to build a Message to send to the next channel, if any. So the flow stops there. A void return type or a null return value is a signal for the service activator to stop the flow. Consider your .handle(System.out::println) the equivalent of an <outbound-channel-adapter> in the XML configuration.
And yes: there are no default channels, unless you define one via the replyChannel header in advance. But again: your service method must return something valuable.
The output from a service activator is optional; that's why we didn't introduce an extra operator for the Outbound Channel Adapter.
The QueueChannel question would be better handled in a separate SO thread. There is no default poller unless you declare one as PollerMetadata.DEFAULT_POLLER. You might be using some library which declares that one for you.
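For completeness, declaring such a global default poller looks like the sketch below; any polling endpoint without an explicit poller then falls back to it (the interval here is arbitrary):

```java
@Bean(name = PollerMetadata.DEFAULT_POLLER)
public PollerMetadata defaultPoller() {
    return Pollers.fixedDelay(1000)
            .maxMessagesPerPoll(1)
            .get();
}
```

With such a bean in place, a flow ending in a QueueChannel followed by .handle(...) would be polled without wiring a poller on the consumer explicitly.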

Spring Integration DSL Dynamic Inbound Channel

Is it possible to register MessageSources at runtime with spring-integration-dsl?
In my case I want to create multiple FileReadingMessageSources (based on input from the UI) and then send the payload to a specific channel/JMS route (which is read from metadata or user input).
Another question: is it possible to dynamically register IntegrationFlows?
It's a bit tricky and requires some Spring infrastructure knowledge, but yes, it is possible:
@Service
public static class MyService {

    @Autowired
    private AutowireCapableBeanFactory beanFactory;

    @Autowired
    @Qualifier("dynamicAdaptersResult")
    PollableChannel dynamicAdaptersResult;

    public void pollDirectories(File... directories) {
        for (File directory : directories) {
            StandardIntegrationFlow integrationFlow = IntegrationFlows
                    .from(s -> s.file(directory),
                            e -> e.poller(p -> p.fixedDelay(1000))
                                    .id(directory.getName() + ".adapter"))
                    .transform(Transformers.fileToString(),
                            e -> e.id(directory.getName() + ".transformer"))
                    .channel(this.dynamicAdaptersResult)
                    .get();
            this.beanFactory.initializeBean(integrationFlow, directory.getName());
            this.beanFactory.getBean(directory.getName() + ".transformer", Lifecycle.class).start();
            this.beanFactory.getBean(directory.getName() + ".adapter", Lifecycle.class).start();
        }
    }
}
Please investigate my sample and let me know if anything is unclear.
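Since Spring Integration 5.0, the IntegrationFlowContext mentioned in the earlier answers wraps that initializeBean()/start() dance for you; as a sketch, the same service could be reduced to the following (same hypothetical dynamicAdaptersResult channel, and the directory name doubles as the flow id):

```java
@Service
public class DynamicDirectoryPoller {

    private final IntegrationFlowContext flowContext;
    private final PollableChannel dynamicAdaptersResult;

    public DynamicDirectoryPoller(IntegrationFlowContext flowContext,
            @Qualifier("dynamicAdaptersResult") PollableChannel dynamicAdaptersResult) {
        this.flowContext = flowContext;
        this.dynamicAdaptersResult = dynamicAdaptersResult;
    }

    public void pollDirectories(File... directories) {
        for (File directory : directories) {
            IntegrationFlow flow = IntegrationFlows
                    .from(Files.inboundAdapter(directory),
                            e -> e.poller(p -> p.fixedDelay(1000)))
                    .transform(Transformers.fileToString())
                    .channel(this.dynamicAdaptersResult)
                    .get();
            // registers all the flow's beans and starts the adapter
            this.flowContext.registration(flow).id(directory.getName()).register();
        }
    }
}
```

flowContext.registration(flow).destroy() (via the registration id) can later be used to remove a flow again, which the manual beanFactory approach does not give you for free.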
