In the current release (1.0.2) of the Spring Integration Java DSL, I can see that some of the basic channel adapters are not present, such as REST/HTTP, TCP/UDP, JDBC, MQTT, etc.
I just wanted to know whether these protocols/channels are on the roadmap or have been excluded deliberately.
PS: I might sound stupid with this question, but I just wanted to know the reason.
On the one hand, you should understand that it is quite a big job to address them all. For example, the HTTP module is on our radar for the 1.1 release.
On the other hand, the Spring Integration Java DSL is just an addition to the existing Spring Java & annotation configuration, so any @Bean definition is valid there, too.
For the protocols you need, you can go ahead, configure their components as @Bean definitions, and reference them from the .handle() or .from() EIP methods.
For example:
@Bean
public MessageSource<Object> jdbcMessageSource() {
    return new JdbcPollingChannelAdapter(this.dataSource, "SELECT * FROM foo");
}

@Bean
public IntegrationFlow myFlow() {
    return IntegrationFlows.from(jdbcMessageSource())
            .split(...)
            .transform(...)
            .handle(new MqttPahoMessageHandler("tcp://localhost:1883", "si-test-out"))
            .get();
}
How do we use a custom CqlSession in a Spring WebFlux application combined with the Spring Boot reactive Cassandra starter?
I am currently doing the following, which works perfectly:
public class BaseCassandraConfiguration extends AbstractReactiveCassandraConfiguration {

    @Bean
    @NonNull
    @Override
    public CqlSessionFactoryBean cassandraSession() {
        final CqlSessionFactoryBean cqlSessionFactoryBean = new CqlSessionFactoryBean();
        cqlSessionFactoryBean.setContactPoints(contactPoints);
        cqlSessionFactoryBean.setKeyspaceName(keyspace);
        cqlSessionFactoryBean.setLocalDatacenter(datacenter);
        cqlSessionFactoryBean.setPort(port);
        cqlSessionFactoryBean.setUsername(username);
        cqlSessionFactoryBean.setPassword(passPhrase);
        return cqlSessionFactoryBean;
    }
}
However, I would like to use a custom session, something like:
CqlSession session = CqlSession.builder().build();
How do we tell this configuration to use it?
Thank you
Option 1:
If you are looking to completely override the auto-configured CqlSession bean, you can do so by providing your own CqlSession bean, i.e.:
@Bean
public CqlSession cassandraSession() {
    return CqlSession.builder().withClientId(MyClientId).build();
}
The downside of overriding the entire bean is that you lose the ability to configure the session via application properties, as well as the defaults Spring Boot ships with.
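For illustration, overriding the bean means supplying such connection details yourself on the builder, since they are no longer bound from the spring.data.cassandra.* application properties (spring.cassandra.* on newer Boot versions); the contact point, datacenter, keyspace and credentials below are placeholders:

import java.net.InetSocketAddress;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.datastax.oss.driver.api.core.CqlSession;

@Configuration
public class CassandraSessionConfig {

    @Bean
    public CqlSession cassandraSession() {
        // All values below are placeholders; with a fully overridden bean they are
        // no longer driven by the Cassandra application properties.
        return CqlSession.builder()
                .addContactPoint(new InetSocketAddress("127.0.0.1", 9042))
                .withLocalDatacenter("datacenter1")
                .withKeyspace("my_keyspace")
                .withAuthCredentials("user", "secret")
                .build();
    }
}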
Option 2:
If you want to keep the default values provided by Spring Boot and retain the ability to configure the session via application properties, you can use a CqlSessionBuilderCustomizer to apply specific customizations to the CqlSession. This can be achieved by defining a bean of that type, i.e.:
@Bean
public CqlSessionBuilderCustomizer myCustomiser() {
    return cqlSessionBuilder -> cqlSessionBuilder.withClientId(MyClientId);
}
My personal preference is option 2, as it retains the functionality provided by Spring Boot, which in my opinion results in an application that is easier to maintain over time.
Is there any working example of the file-writing support in the Spring Integration Java DSL? I cannot find anything about the DSL implementation (e.g. a handle() step in the integration flow, etc.).
Thanks.
There is a sample in the Reference Manual:
@Bean
public IntegrationFlow fileWritingFlow() {
    return IntegrationFlows.from("fileWritingInput")
            .enrichHeaders(h -> h.header(FileHeaders.FILENAME, "foo.txt")
                    .header("directory", new File(tmpDir.getRoot(), "fileWritingFlow")))
            .handle(Files.outboundAdapter(m -> m.getHeaders().get("directory")))
            .channel(MessageChannels.queue("fileWritingResultChannel"))
            .get();
}
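To exercise that flow, you can send a message into the fileWritingInput channel. The FileWritingClient class below is only a hypothetical sketch; if the writing endpoint produces a reply (for example Files.outboundGateway() instead of the adapter), the result could additionally be polled from the fileWritingResultChannel queue:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.MessageChannel;
import org.springframework.stereotype.Component;

@Component
public class FileWritingClient {

    // The channel created implicitly by IntegrationFlows.from("fileWritingInput").
    @Autowired
    @Qualifier("fileWritingInput")
    private MessageChannel fileWritingInput;

    public void writeFoo(String content) {
        // The payload becomes the file content; the FILENAME and "directory" headers
        // are added by the enrichHeaders() step of the flow above.
        this.fileWritingInput.send(MessageBuilder.withPayload(content).build());
    }
}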
The File-Split-FTP sample may give you some insights, too.
I am using Spring Boot 2.0 with Spring Integration 5.0.3 and have an issue with my Http.inboundGateway. My goal is to validate the JSON posted to the gateway, because the request POJO contains mandatory fields.
@Bean
public IntegrationFlow notifyUpdateVehicleFlow() {
    return IntegrationFlows.from(Http.inboundGateway("/update")
                    .requestMapping(r -> r.methods(HttpMethod.POST))
                    .requestPayloadType(RequestPojo.class)
                    .requestChannel(updateChannel())
                    .replyChannel(updateReplyChannel()))
            .get();
}
Is there an easy way to validate that the fields in the POJO have been set? What I have already tested is using @NotNull with Spring Validation, but it does not seem to be supported by Spring Integration.
Greetings,
smoothny
There is no such functionality in Spring Integration. You can use a .filter() downstream of the Http.inboundGateway() and perform Validator.validate() there on the payload.
If you think it must be done somehow on the Http.inboundGateway() and you have strong requirements and a clear description, feel free to raise a JIRA issue on the matter and we will discuss what can be done from the Framework perspective.
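As a rough sketch of that suggestion (not an official recipe), the flow below runs a Bean Validation Validator inside a .filter() step; the validationErrorChannel and updateChannel names are placeholders, and the requestChannel/replyChannel wiring from the question is omitted for brevity:

import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validator;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.HttpMethod;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.http.dsl.Http;

@Configuration
public class ValidatingFlowConfig {

    // Spring Boot auto-configures a javax.validation.Validator when a Bean Validation
    // provider (e.g. hibernate-validator) is on the classpath.
    @Bean
    public IntegrationFlow notifyUpdateVehicleFlow(Validator validator) {
        return IntegrationFlows.from(Http.inboundGateway("/update")
                        .requestMapping(r -> r.methods(HttpMethod.POST))
                        .requestPayloadType(RequestPojo.class))
                .filter(RequestPojo.class,
                        payload -> {
                            Set<ConstraintViolation<RequestPojo>> violations = validator.validate(payload);
                            return violations.isEmpty();            // only valid payloads continue
                        },
                        e -> e.discardChannel("validationErrorChannel"))  // hypothetical channel for rejects
                .channel("updateChannel")                                  // continue with the valid request
                .get();
    }
}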
For my SFTP client project, I am using Spring Integration. We have different clients and have to connect to different SFTP servers, but all of the logic is the same, so I have abstracted it out into AbstractSFTPEndPoint. Each client-specific class implements getClientId(), which is used by AbstractSFTPEndPoint to get client-specific details like SFTP credentials.
However, even though the entire logic is the same for all clients, I am still having to implement a specific class for each client. This is mainly because we need a separate "MessageSource" for each client.
How can I get rid of this duplication?
public class SFTPEndPointForClientAAAA extends AbstractSFTPEndPoint {

    public String getClientId() {
        return "clientAAAA";
    }

    @Bean(name = "channelForClientAAAA")
    public QueueChannel inputFileChannel() {
        return super.inputFileChannel();
    }

    @ServiceActivator(inputChannel = "channelForClientAAAA", poller = @Poller(fixedDelay = "500"))
    public void serviceActivator(Message message) {
        super.serviceActivator(message);
    }

    @Bean(name = "messageSourceForClientAAAA")
    @InboundChannelAdapter(value = "channelForClientAAAA",
            poller = @Poller(fixedDelay = "50", maxMessagesPerPoll = "2"))
    public MessageSource messageSource() {
        return super.messageSource();
    }
}
Basically, I have a bunch of SFTP hosts to connect to and the same logic to apply to each. I want that to happen automatically, without having to implement a class for each SFTP host.
See the dynamic FTP sample. It uses XML, but the same techniques apply to Java configuration. It uses outbound adapters; inbound adapters are a little more complicated because you might need to hook them into a common context. There are links in the README for how to do that.
However, I recently answered a similar question for multiple IMAP mail adapters using Java configuration, and then a follow-up question.
You should be able to use the technique used there.
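For a rough idea of what the Java-configuration route can look like on current Spring Integration versions, the sketch below registers one polling flow per host at runtime through IntegrationFlowContext. The host lookup, credentials, directories and processing logic are all placeholders, and this is not necessarily the exact technique from the linked answers:

import java.io.File;
import java.util.List;

import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.dsl.context.IntegrationFlowContext;
import org.springframework.integration.file.remote.session.SessionFactory;
import org.springframework.integration.sftp.dsl.Sftp;
import org.springframework.integration.sftp.session.DefaultSftpSessionFactory;
import org.springframework.messaging.Message;
import org.springframework.stereotype.Component;

import com.jcraft.jsch.ChannelSftp.LsEntry;

@Component
public class SftpFlowRegistrar {

    private final IntegrationFlowContext flowContext;

    public SftpFlowRegistrar(IntegrationFlowContext flowContext) {
        this.flowContext = flowContext;
    }

    // clientIds stands in for however you look up your client-specific details.
    public void registerFlows(List<String> clientIds) {
        for (String clientId : clientIds) {
            IntegrationFlow flow = IntegrationFlows
                    .from(Sftp.inboundAdapter(sessionFactoryFor(clientId))
                                    .remoteDirectory("/outbox")                    // assumed remote dir
                                    .localDirectory(new File("sftp/" + clientId)),
                            e -> e.poller(Pollers.fixedDelay(500).maxMessagesPerPoll(2)))
                    .handle(message -> process(clientId, message))                 // same logic for every client
                    .get();
            this.flowContext.registration(flow)
                    .id("sftpFlow-" + clientId)
                    .register();
        }
    }

    private SessionFactory<LsEntry> sessionFactoryFor(String clientId) {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory();
        factory.setHost("host-for-" + clientId);   // placeholder lookup
        factory.setUser("user");                   // placeholder credentials
        factory.setPassword("secret");
        factory.setAllowUnknownKeys(true);
        return factory;
    }

    private void process(String clientId, Message<?> message) {
        // shared per-client logic goes here
    }
}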
Can we somehow use "subflows" in Spring Integration?
I have many different processes that would use the same "subflow". These processes always contain the same part, which it would be good to put into a separate file.
What would be the correct way to implement these flows?
I tried to find a solution for using subflows in Spring Integration, but I could not find anything.
One simple technique is to put the subflow in a separate file with "well-known" input and output channels (the subflow starts with one channel and ends with another). Then, simply <import/> the subflow and send/consume to/from the input/output channel.
Or, instead of an import, you can use the Java DSL to define the subflow and add it to any application context that needs it...
@Configuration
public class MySubflowDefinition {

    @Bean
    public IntegrationFlow subflow() {
        return IntegrationFlows.from("someInChannel")
                .transform(...)
                ...
                .channel("someOutChannel")
                .get();
    }
}
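For illustration, a flow in another configuration class can then hand messages to the subflow simply by routing through those well-known channels; MyMainFlowDefinition, the mainInput channel and the println handler below are hypothetical placeholders:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class MyMainFlowDefinition {

    // Hands messages over to the subflow by sending them to its well-known input channel.
    @Bean
    public IntegrationFlow mainFlow() {
        return IntegrationFlows.from("mainInput")
                .channel("someInChannel")
                .get();
    }

    // Continues processing with whatever the subflow emits on its output channel.
    @Bean
    public IntegrationFlow afterSubflow() {
        return IntegrationFlows.from("someOutChannel")
                .handle(m -> System.out.println("from subflow: " + m.getPayload()))  // placeholder
                .get();
    }
}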
For a more formal "subflow" definition, see the spring-integration-flow extension. This solution also allows the same subflow to be invoked from multiple places in the same application context.
spring-integration-java-dsl and spring-integration-flow are both available in the Spring repository and Maven Central, currently at version 1.0.0.RELEASE.