Spring Integration DSL Dynamic Inbound Channel

Is it possible to register MessageSources at runtime with spring-integration-dsl?
In my case I want to create multiple FileReadingMessageSources (based on input from the UI) and then send the payload to a specific channel/JMS route (which is read from metadata or user input).
Another question: is it possible to dynamically register IntegrationFlows?

It's a bit tricky and requires some Spring infrastructure knowledge, but yes, it is possible:
@Service
public static class MyService {

    @Autowired
    private AutowireCapableBeanFactory beanFactory;

    @Autowired
    @Qualifier("dynamicAdaptersResult")
    PollableChannel dynamicAdaptersResult;

    public void pollDirectories(File... directories) {
        for (File directory : directories) {
            StandardIntegrationFlow integrationFlow = IntegrationFlows
                    .from(s -> s.file(directory),
                            e -> e.poller(p -> p.fixedDelay(1000))
                                    .id(directory.getName() + ".adapter"))
                    .transform(Transformers.fileToString(),
                            e -> e.id(directory.getName() + ".transformer"))
                    .channel(this.dynamicAdaptersResult)
                    .get();
            this.beanFactory.initializeBean(integrationFlow, directory.getName());
            this.beanFactory.getBean(directory.getName() + ".transformer", Lifecycle.class).start();
            this.beanFactory.getBean(directory.getName() + ".adapter", Lifecycle.class).start();
        }
    }

}
Please investigate my sample and let me know what isn't clear to you.
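Note that later Spring Integration versions add a dedicated IntegrationFlowContext for exactly this scenario (it appears in a question further below). A minimal sketch of the same loop rewritten with it, assuming the same surrounding beans as above:

@Autowired
private IntegrationFlowContext flowContext;

public void pollDirectories(File... directories) {
    for (File directory : directories) {
        IntegrationFlow flow = IntegrationFlows
                .from(Files.inboundAdapter(directory),
                        e -> e.poller(p -> p.fixedDelay(1000)))
                .transform(Transformers.fileToString())
                .channel(this.dynamicAdaptersResult)
                .get();
        // registration() adds the flow's components to the application context
        // and starts them, replacing the manual initializeBean() and
        // Lifecycle.start() calls from the sample above
        this.flowContext.registration(flow)
                .id(directory.getName() + ".flow")
                .register();
    }
}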

Related

Spring Integration preventDuplicates(false) not working

Given the code below, I would expect preventDuplicates(false) to allow files with the same name, size, and last-modified time to be processed. However, the code still acts as if this flag were true, in that only a file with a new name is accepted.
The API says Configure an AcceptOnceFileListFilter if preventDuplicates == true, otherwise - AcceptAllFileListFilter.
Which I take to mean that AcceptAllFileListFilter is used automatically; however, when setting a break-point on this filter, it is never hit.
The patternFilter is also not working: *.csv files are also being processed.
Removing either preventDuplicates() or patternFilter() doesn't make any difference (I assumed that a filter chain was causing the problem).
Can anyone explain why this is happening?
@SpringBootApplication
public class ApplicationTest {

    public static void main(String[] args) {
        ConfigurableApplicationContext ctx =
                new SpringApplicationBuilder(ApplicationTest.class)
                        .web(WebApplicationType.NONE)
                        .run(args);
    }

    @Bean
    public IntegrationFlow readBackUpFlow() {
        return IntegrationFlows
                .from(
                        Files
                                .inboundAdapter(new File("d:/temp/in"))
                                .patternFilter("*.txt")
                                .preventDuplicates(false)
                                .get(),
                        e -> e.poller(Pollers.fixedDelay(5000))
                )
                .log("File", m -> m)
                .transform(Files.toStringTransformer())
                .log("Content", m -> m)
                .handle((p, h) -> h.get(FileHeaders.ORIGINAL_FILE, File.class))
                .handle(Files.outboundAdapter(new File("d:/temp/done")).deleteSourceFiles(true))
                .get();
    }

}
Don't call get() on that Files.inboundAdapter() spec. The spec does some extra work in getComponentsToRegister(), which is called by the framework at a specific phase; calling get() yourself bypasses that, which is why your filters are not applied to the target channel adapter.
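A sketch of the corrected flow, built from the question's own code: pass the spec itself (without get()) so the framework can apply the configured filters.

@Bean
public IntegrationFlow readBackUpFlow() {
    return IntegrationFlows
            .from(Files.inboundAdapter(new File("d:/temp/in"))
                            .patternFilter("*.txt")
                            .preventDuplicates(false), // no .get() on the spec
                    e -> e.poller(Pollers.fixedDelay(5000)))
            .log("File", m -> m)
            .transform(Files.toStringTransformer())
            .log("Content", m -> m)
            .handle((p, h) -> h.get(FileHeaders.ORIGINAL_FILE, File.class))
            .handle(Files.outboundAdapter(new File("d:/temp/done")).deleteSourceFiles(true))
            .get(); // the final .get() on the flow builder itself is still correct
}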

Spring Integration read and process a file without polling

I'm currently trying to write an integration flow that reads a CSV file and processes it in chunks (calling an API for enrichment), then writes it back out as a new CSV. I currently have an example working perfectly, except that it polls a directory. What I would like to do is pass the file path and file name to the integration flow in the headers and then perform the operation on just that one file.
Here is my code for the polling example, which works great except for the polling.
@Bean
@SuppressWarnings("unchecked")
public IntegrationFlow getUIDsFromTTDandOutputToFile() {
    Gson gson = new GsonBuilder().disableHtmlEscaping().create();
    return IntegrationFlows
            .from(Files.inboundAdapter(new File(inputFilePath))
                            .filter(getFileFilters())
                            .preventDuplicates(true)
                            .autoCreateDirectory(true),
                    c -> c
                            .poller(Pollers.fixedRate(1000)
                                    .maxMessagesPerPoll(1)
                            )
            )
            .log(Level.INFO, m -> "TTD UID 2.0 Integration Start")
            .split(Files.splitter())
            .channel(c -> c.executor(Executors.newFixedThreadPool(7)))
            .handle((p, h) -> new CSVUtils().csvColumnSelector((String) p, ttdColNum))
            .channel("chunkingChannel")
            .get();
}

@Bean
@ServiceActivator(inputChannel = "chunkingChannel")
public AggregatorFactoryBean chunker() {
    log.info("Initializing Chunker");
    AggregatorFactoryBean aggregator = new AggregatorFactoryBean();
    aggregator.setReleaseStrategy(new MessageCountReleaseStrategy(batchSize));
    aggregator.setExpireGroupsUponCompletion(true);
    aggregator.setGroupTimeoutExpression(new ValueExpression<>(100L));
    aggregator.setOutputChannelName("chunkingOutput");
    aggregator.setProcessorBean(new DefaultAggregatingMessageGroupProcessor());
    aggregator.setSendPartialResultOnExpiry(true);
    aggregator.setCorrelationStrategy(new CorrelationStrategyIml());
    return aggregator;
}

@Bean
public IntegrationFlow enrichFlow() {
    return IntegrationFlows.from("chunkingOutput")
            .handle((p, h) -> gson.toJson(new TradeDeskUIDRequestPayloadBean((Collection<String>) p)))
            .enrichHeaders(eh -> eh.async(false)
                    .header("accept", "application/json")
                    .header("contentType", "application/json")
                    .header("Authorization", "Bearer [TOKEN]")
            )
            .log(Level.INFO, m -> "Sending request of size " + batchSize + " to: " + TTD_UID_IDENTITY_MAP)
            .handle(Http.outboundGateway(TTD_UID_IDENTITY_MAP)
                    .requestFactory(
                            alliantPooledHttpConnection.get_httpComponentsClientHttpRequestFactory())
                    .httpMethod(HttpMethod.POST)
                    .expectedResponseType(TradeDeskUIDResponsePayloadBean.class)
                    .extractPayload(true)
            )
            .log(Level.INFO, m -> "Writing response to output file")
            .handle((p, h) -> ((TradeDeskUIDResponsePayloadBean) p).printMappedBodyAsCSV2())
            .handle(Files.outboundAdapter(new File(outputFilePath))
                    .autoCreateDirectory(true)
                    .fileExistsMode(FileExistsMode.APPEND)
                    //.appendNewLine(true)
                    .fileNameGenerator(m -> m.getHeaders().getOrDefault("file_name", "outputFile") + "_out.csv")
            )
            .get();
}

public class CorrelationStrategyIml implements CorrelationStrategy {

    @Override
    public Object getCorrelationKey(Message<?> message) {
        return message.getHeaders().getOrDefault("", 1);
    }

}

@Component
public class CSVUtils {

    @ServiceActivator
    String csvColumnSelector(String inputStr, Integer colNum) {
        return StringUtils.commaDelimitedListToStringArray(inputStr)[colNum];
    }

}

private FileListFilter<File> getFileFilters() {
    ChainFileListFilter<File> cflf = new ChainFileListFilter<>();
    cflf.addFilter(new LastModifiedFileListFilter(30));
    cflf.addFilter(new AcceptOnceFileListFilter<>());
    cflf.addFilter(new SimplePatternFileListFilter(fileExtention));
    return cflf;
}
If you know the file, then there is no need for any special component from the framework. You just start your flow from a channel and send a message to it with the File object as a payload. That message is going to be carried on to the splitter in your flow, and everything is going to work OK.
If you really want a high-level API on the matter, you can expose a @MessagingGateway as the beginning of that flow, and the end user is going to call your gateway method with the desired file as an argument. The framework will create a message on your behalf and send it to the message channel in the flow for processing.
See more info in docs about gateways:
https://docs.spring.io/spring-integration/docs/current/reference/html/messaging-endpoints.html#gateway
https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#integration-flow-as-gateway
And also a DSL definition starting from some explicit channel:
https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-channels
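For example, a minimal sketch of the gateway approach (the interface name and the "fileProcessingChannel" channel name are illustrative, not from the original code); the rest of the flow from the question stays the same, it just starts from a channel instead of a poller:

@MessagingGateway
public interface FileProcessingGateway {

    @Gateway(requestChannel = "fileProcessingChannel")
    void process(File file);

}

@Bean
public IntegrationFlow getUIDsFromTTDandOutputToFile() {
    return IntegrationFlows.from("fileProcessingChannel")
            .log(Level.INFO, m -> "TTD UID 2.0 Integration Start")
            .split(Files.splitter())
            .channel(c -> c.executor(Executors.newFixedThreadPool(7)))
            .handle((p, h) -> new CSVUtils().csvColumnSelector((String) p, ttdColNum))
            .channel("chunkingChannel")
            .get();
}

Calling fileProcessingGateway.process(new File(...)) then drives exactly one file through the flow, with no directory polling involved.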

How to register integration flows at runtime?

I'm building a microservice for multiple properties, so each property has a different configuration. To do that, I've implemented something like this:
@Autowired
IntegrationFlowContext flowContext;

@Bean
public void setFlowContext() {
    List<Login> loginList = DAO.getLoginList(); // a web service
    loginList.forEach(e -> {
        IntegrationFlow flow = IntegrationFlows
                .from(() -> e, c -> c.poller(Pollers.fixedRate(e.getPeriod(), TimeUnit.SECONDS, 5)))
                .channel("X_CHANNEL")
                .get();
        flowContext.registration(flow).register();
    });
}
With this implementation, I get the loginList before the application starts. So, after the application has started, I'm not able to get the loginList from the web service, since there is no poller configured. The problem is that the loginList could change: new login credentials could be added or deleted. Therefore, I want something that runs every X time period to get the loginList from the web service and then registers the flows created for each login. To achieve this, I've implemented something like this:
@Bean
public IntegrationFlow setFlowContext() {
    return IntegrationFlows
            .from(this::getSpecification, p -> p.poller(Pollers.fixedRate(X))) // the specification is constant
            .transform(payload -> DAO.getLoginList(payload))
            .split()
            .<Login>handle((payload, header) -> {
                IntegrationFlow flow = IntegrationFlows
                        .from(() -> payload, c -> c.poller(Pollers.fixedRate(payload.getPeriod(), TimeUnit.SECONDS, 5)))
                        .channel("X_CHANNEL")
                        .get();
                flowContext.registration(flow).register().start();
                return null;
            })
            .get();
}
Basically, I've used the start() method, but this is not working as expected. See this:
flowContext.registration(flow).register().start();
Lastly, I've read the Dynamic and Runtime Integration Flows documentation, but still couldn't implement this feature.
Dynamic flow registration cannot be used within a @Bean definition.
It is designed to be used at runtime, AFTER the application context is fully initialized.
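A minimal sketch of that idea (the class and method names are illustrative): move the registration out of the @Bean method into a listener that runs once the context is ready, reusing the flow definition from the question:

@Component
public class LoginFlowRegistrar {

    @Autowired
    IntegrationFlowContext flowContext;

    // runs after the application context is fully initialized
    @EventListener(ApplicationReadyEvent.class)
    public void registerLoginFlows() {
        DAO.getLoginList().forEach(login -> {
            IntegrationFlow flow = IntegrationFlows
                    .from(() -> login,
                            c -> c.poller(Pollers.fixedRate(login.getPeriod(), TimeUnit.SECONDS, 5)))
                    .channel("X_CHANNEL")
                    .get();
            this.flowContext.registration(flow).register();
        });
    }

}

To pick up added or removed logins later, the same method could also be invoked periodically (for example from a @Scheduled task), taking care to skip flow ids that are already registered and to remove registrations for logins that have disappeared.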

Spring Integration Java DSL bridge to poll two different directories

I have set up a file poller/channel adapter which polls a directory, plus a handler integration flow, using the Java DSL. But I cannot find any reference for how to add another directory/channel adapter and bridge it to the same handler. Here is my code.
@Bean
public IntegrationFlow integrationFlow(JobLaunchingGateway jobLaunchingGateway) {
    return IntegrationFlows.from(Files.inboundAdapter(new File(incomingDir)).
                    filter(new SimplePatternFileListFilter("*.csv")).
                    filter(new AcceptOnceFileListFilter<>()),
                    c -> c.poller(Pollers.fixedRate(500).maxMessagesPerPoll(1))).
            handle(fileMessageToJobRequest()).
            handle(jobLaunchingGateway).
            log(LoggingHandler.Level.WARN, "headers.id + ': ' + payload").
            get();
}
Thanks @Artem. How about the following?
@Bean
public IntegrationFlow integrationFlowUi(JobLaunchingGateway jobLaunchingGateway) {
    return IntegrationFlows.from(Files.inboundAdapter(new File(incomingDirUi)).
                    filter(new SimplePatternFileListFilter("*.csv")).
                    filter(new AcceptOnceFileListFilter<>()),
                    c -> c.poller(Pollers.fixedRate(500).maxMessagesPerPoll(1))).
            channel("to-bridge").
            handle(fileMessageToJobRequest()).
            handle(jobLaunchingGateway).
            log(LoggingHandler.Level.WARN, "headers.id + ': ' + payload").
            get();
}

@Bean
public IntegrationFlow integrationFlowSftp(JobLaunchingGateway jobLaunchingGateway) {
    return IntegrationFlows.from(Files.inboundAdapter(new File(incomingDirSftp)).
                    filter(new SimplePatternFileListFilter("*.csv")).
                    filter(new AcceptOnceFileListFilter<>()),
                    c -> c.poller(Pollers.fixedRate(500).maxMessagesPerPoll(1))).
            channel("to-bridge").get();
}
One of the first-class citizens in Spring Integration is the MessageChannel entity. You can always set explicit channels between endpoints in your IntegrationFlow definition and explicitly send messages to them.
For your “merging” use case I would suggest placing a .channel() before the .handle() and declaring a second flow for the second directory, but ending that second flow with the same .channel() to “bridge” messages from it into the middle of the first one.
See more information in the reference manual: https://docs.spring.io/spring-integration/docs/5.0.0.RELEASE/reference/html/java-dsl.html#java-dsl-channels
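For instance, a sketch of that arrangement based on your follow-up (the bean names here are illustrative): both adapter flows end at the shared "to-bridge" channel, and the common handlers live in one downstream flow instead of being repeated:

@Bean
public IntegrationFlow uiAdapterFlow() {
    return IntegrationFlows.from(Files.inboundAdapter(new File(incomingDirUi))
                    .filter(new SimplePatternFileListFilter("*.csv"))
                    .filter(new AcceptOnceFileListFilter<>()),
                    c -> c.poller(Pollers.fixedRate(500).maxMessagesPerPoll(1)))
            .channel("to-bridge")
            .get();
}

@Bean
public IntegrationFlow sftpAdapterFlow() {
    return IntegrationFlows.from(Files.inboundAdapter(new File(incomingDirSftp))
                    .filter(new SimplePatternFileListFilter("*.csv"))
                    .filter(new AcceptOnceFileListFilter<>()),
                    c -> c.poller(Pollers.fixedRate(500).maxMessagesPerPoll(1)))
            .channel("to-bridge")
            .get();
}

// a single consumer flow handles messages bridged from both adapters
@Bean
public IntegrationFlow sharedHandlerFlow(JobLaunchingGateway jobLaunchingGateway) {
    return IntegrationFlows.from("to-bridge")
            .handle(fileMessageToJobRequest())
            .handle(jobLaunchingGateway)
            .log(LoggingHandler.Level.WARN, "headers.id + ': ' + payload")
            .get();
}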

Spring integration FTP recursively read files with java DSL

I want to configure a gateway with the Java DSL to read all the files from an FTP server recursively, because they are in different folders.
How can I do it? I'd appreciate code examples, please.
Something like this:
@Bean
public FtpOutboundGatewaySpec ftpOutboundGateway() {
    return Ftp.outboundGateway(this.ftpSessionFactory,
                    AbstractRemoteFileOutboundGateway.Command.MGET, "payload")
            .options(AbstractRemoteFileOutboundGateway.Option.RECURSIVE)
            .regexFileNameFilter("(subFtpSource|.*1.txt)")
            .localDirectoryExpression("@ftpServer.targetLocalDirectoryName + #remoteDirectory")
            .localFilenameExpression("#remoteFileName.replaceFirst('ftpSource', 'localTarget')");
}

@Bean
public IntegrationFlow ftpMGetFlow(AbstractRemoteFileOutboundGateway<FTPFile> ftpOutboundGateway) {
    return f -> f
            .handle(ftpOutboundGateway)
            .channel(remoteFileOutputChannel());
}

@Bean
public PollableChannel remoteFileOutputChannel() {
    return new QueueChannel();
}
This is a copy/paste from the project's test cases.
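To trigger the MGET, send a message whose payload is the remote path to the flow's auto-created input channel (lambda-style flows get a channel named after the bean plus ".input"). A hypothetical usage sketch; the "ftpSource/*" path follows the directory naming used in the filter above:

@Autowired
@Qualifier("ftpMGetFlow.input")
private MessageChannel ftpMGetInputChannel;

public void fetchRecursively() {
    // the trailing '*' requests all entries; the RECURSIVE option
    // makes the gateway descend into subdirectories as well
    this.ftpMGetInputChannel.send(new GenericMessage<>("ftpSource/*"));
}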
