I accept an HTTP request that triggers a time-consuming task, so I want to reply to the requester first and then hand the work off to another flow asynchronously.
I tried using CompletableFuture.runAsync() in a handler() method and then MessageChannel.send() to do it, but I think there should be a more elegant way.
@Bean
public IntegrationFlow testFlow() {
    return IntegrationFlows.from("requestChannel") // accept a request
            .handle(handlerSomeThing()) // do something
            .handle(?) // how to send the message to mainFlow and reply quickly?
            .enrichHeaders(c -> c.header(HttpHeaders.STATUS_CODE, HttpStatus.OK))
            .get();
}
@Bean
public IntegrationFlow mainFlow() {
    return IntegrationFlows.from("mainChannel") // accept the hand-off
            .handle(handleMainLogic()) // handle the main logic
            .get();
}
Use a publishSubscribeChannel.
Add a task executor so that the two subflows run in parallel.
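A rough sketch of that idea, reusing the channel names from the question (the executor bean and the "accepted" reply payload are assumptions, not part of the original code):
@Bean
public ThreadPoolTaskExecutor asyncTaskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(4);
    return executor;
}

@Bean
public IntegrationFlow testFlow() {
    return IntegrationFlows.from("requestChannel")
            .publishSubscribeChannel(asyncTaskExecutor(), s -> s
                    // this subscriber hands the time-consuming work to mainFlow
                    .subscribe(f -> f.channel("mainChannel")))
            // the rest of the flow becomes the last subscriber and builds the quick reply
            .enrichHeaders(c -> c.header(HttpHeaders.STATUS_CODE, HttpStatus.OK))
            .handle((payload, headers) -> "accepted")
            .get();
}
Both subscribers are dispatched on the executor, so the reply is not blocked by the long-running mainFlow; the reply still finds its way back to the HTTP caller via the replyChannel header.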
I have to listen to a queue using Spring Integration Flow and Spring Integration AWS (SQS). Once a message is received from the queue, it should trigger an integration flow. Below is what I am trying; everything starts fine, but after a test message is received it does not trigger any integration flow. Please let me know where I am going wrong:
UPDATED as per comment from Artem
Adapter for SQS.
@Bean
public MessageProducerSupport sqsMessageDrivenChannelAdapter() {
    SqsMessageDrivenChannelAdapter adapter = new SqsMessageDrivenChannelAdapter(amazonSQSAsync, "Main");
    adapter.setOutputChannel(inputChannel());
    adapter.setAutoStartup(true);
    adapter.setMessageDeletionPolicy(SqsMessageDeletionPolicy.NEVER);
    adapter.setMaxNumberOfMessages(10);
    adapter.setVisibilityTimeout(5);
    return adapter;
}
Channel configured:
@Bean
public MessageChannel inputChannel() {
    return new DirectChannel();
}
Now the main integration flow trigger point:
@Bean
public IntegrationFlow inbound() {
    return IntegrationFlows.from("inputChannel").transform(i -> "TEST_FLOW").get();
}
Appreciate any type of help.
The sqsMessageDrivenChannelAdapter() must be declared as a @Bean.
The inbound() must be declared as a @Bean.
This one does not make sense at all: IntegrationFlows.from(MessageChannels.queue()). What is the point of starting the flow from an anonymous channel? Who is going to produce messages to that channel, and how?
Make yourself familiar with different channels: https://docs.spring.io/spring-integration/docs/current/reference/html/core.html#channel-implementations
Pay attention that a QueueChannel must be consumed via a polling endpoint.
Right, there is a default poller auto-configured by Spring Boot, but it is based on a single thread in the TaskScheduler and has a polling period of 10 millis.
I wouldn't recommend handing off SQS messages to a QueueChannel: if the consumer fails, you lose the data. It is better to process those messages on the consumer thread.
Otherwise your intention is not clear from the provided code.
Could you please share the error you get, or any other details?
You can also turn on DEBUG logging for org.springframework.integration to see how your messages are processed.
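For reference, a minimal sketch that avoids the intermediate channel entirely by starting the flow from the adapter itself, so messages are processed on the SQS consumer thread (the ON_SUCCESS deletion policy and the logging handler are illustrative assumptions):
@Bean
public MessageProducerSupport sqsMessageDrivenChannelAdapter() {
    SqsMessageDrivenChannelAdapter adapter = new SqsMessageDrivenChannelAdapter(amazonSQSAsync, "Main");
    // no output channel set here: the flow below wires it up
    adapter.setMessageDeletionPolicy(SqsMessageDeletionPolicy.ON_SUCCESS);
    return adapter;
}

@Bean
public IntegrationFlow inbound() {
    return IntegrationFlows.from(sqsMessageDrivenChannelAdapter())
            .transform(p -> "TEST_FLOW")
            // terminal handler so the transformed message has somewhere to go
            .handle(m -> System.out.println("Received: " + m.getPayload()))
            .get();
}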
I am using following to define my integration flow:
@Bean
public IntegrationFlow pollingFlow(MessageSource<Object> jdbcMessageSource) {
    return IntegrationFlows.from(jdbcMessageSource,
            c -> c.poller(Pollers.fixedRate(250, TimeUnit.MILLISECONDS)
                    .maxMessagesPerPoll(1)
                    .transactional()))
            .split()
            .channel(taskSourceChannel())
            .get();
}
I would like to make the call to the service activator that reads from taskSourceChannel transactional. Also, I want to use the following with my transaction:
@Bean
public TransactionSynchronizationFactory transactionSynchronizationFactory() {
    ExpressionEvaluatingTransactionSynchronizationProcessor syncProcessor
            = new ExpressionEvaluatingTransactionSynchronizationProcessor();
    syncProcessor.setAfterCommitChannel(successChannel());
    syncProcessor.setAfterRollbackChannel(failureChannel());
    return new DefaultTransactionSynchronizationFactory(syncProcessor);
}
The taskSourceChannel is an executor channel.
@Bean
public MessageChannel taskSourceChannel() {
    return new ExecutorChannel(executor());
}
How can I add transaction support after the split while using TransactionSynchronizationFactory? I don't want to make the polling transactional. The only solution I can think of is putting @Transactional on the activator, but that won't solve my problem: I would like it to apply to any service activator that uses this channel.
Your question is not entirely clear, but you definitely need to consider adding the transaction to the service activator. You don't show the subscriber for that taskSourceChannel, but make sure you don't have several subscribers on it.
Nevertheless, I think your point is to apply the TX to the service activator on this taskSourceChannel and to everything after it.
For this purpose Spring Integration provides a TransactionHandleMessageAdvice. See more info in the Reference Manual: https://docs.spring.io/spring-integration/reference/html/messaging-endpoints-chapter.html#tx-handle-message-advice.
The TransactionSynchronizationFactory is only used from the AbstractPollingEndpoint implementations. However, you can still utilize it in your transactional context by relying on TransactionSynchronizationManager.registerSynchronization().
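A rough sketch of the advice-based approach (the taskService bean, its process method, and the injected PlatformTransactionManager are assumptions for illustration; TransactionInterceptorBuilder is available in Spring Integration 5.x, and on older versions you can construct the TransactionHandleMessageAdvice directly):
@Bean
public TransactionInterceptor transactionInterceptor(PlatformTransactionManager transactionManager) {
    // 'true' builds a TransactionHandleMessageAdvice, so the whole handleMessage()
    // call, and everything downstream of the endpoint, runs in one transaction
    return new TransactionInterceptorBuilder(true)
            .transactionManager(transactionManager)
            .build();
}

@Bean
public IntegrationFlow taskFlow(TransactionInterceptor transactionInterceptor) {
    return IntegrationFlows.from(taskSourceChannel())
            .handle("taskService", "process", e -> e.advice(transactionInterceptor))
            .get();
}
Since taskSourceChannel is an ExecutorChannel, the transaction then starts on the executor thread that invokes the handler.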
I have an integration flow to poll JSON files from a folder and send them to a REST HTTP endpoint. Below is my code:
@Bean
public IntegrationFlow jsonFileToRestFlow() {
    return IntegrationFlows
            .from("fileInputChannel")
            .transform(new FileToStringTransformer())
            .enrichHeaders(s -> s.header("Content-Type", "application/json; charset=utf8"))
            .handle(httpRequestExecutingMessageHandler())
            .get();
}
After receiving the response from the HTTP endpoint I will move my files to a success or failure channel. Now I want to test my code. What is the best way to test it? My idea is to put some JSON files into my input channel, then mock the HTTP response and check whether the expected message lands in the success or failure channel. But I don't know how to start. Can anyone give me some tips? Thanks.
Please take a look at the Spring Integration Testing Framework. Your httpRequestExecutingMessageHandler can be replaced with a MockIntegration.mockMessageHandler(), where you can produce any possible mocked reply in handleNextAndReply().
Another option is to use MockMvc and its MockMvcClientHttpRequestFactory, injected into the RestTemplate of the HttpRequestExecutingMessageHandler.
Success or failure can be achieved with an ExpressionEvaluatingRequestHandlerAdvice applied on the .handle(httpRequestExecutingMessageHandler()) endpoint.
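A rough test sketch for the mockMessageHandler() route (requires the spring-integration-test module from Spring Integration 5.0+; the endpoint id "httpHandlerEndpoint", the test file path, and the canned JSON reply are assumptions, and the .handle(...) in the flow needs an explicit id, e.g. e -> e.id("httpHandlerEndpoint")):
@RunWith(SpringRunner.class)
@SpringBootTest
@SpringIntegrationTest
public class JsonFileToRestFlowTests {

    @Autowired
    private MockIntegrationContext mockIntegrationContext;

    @Autowired
    private MessageChannel fileInputChannel;

    @Test
    public void testFlowWithMockedHttpHandler() {
        // replace the real HTTP handler with a mock that returns a canned reply
        this.mockIntegrationContext.substituteMessageHandlerFor("httpHandlerEndpoint",
                MockIntegration.mockMessageHandler()
                        .handleNextAndReply(m -> "{\"status\":\"OK\"}"));

        this.fileInputChannel.send(
                new GenericMessage<>(new File("src/test/resources/test.json")));

        // then assert that the expected message arrived on the success (or failure) channel,
        // e.g. by subscribing a QueueChannel or another mock handler to it in the test config
    }
}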
This question is more of a design question than an actual problem. Given the following basic flow:
@Bean
public DirectChannel getFileToSftpChannel() {
    return new DirectChannel();
}
@Bean
public IntegrationFlow sftpOutboundFlow() {
    return IntegrationFlows.from(getFileToSftpChannel())
            .handle(Sftp.outboundAdapter(this.sftpSessionFactory)
                    .useTemporaryFileName(false)
                    .remoteDirectory("test"))
            .get();
}
@Bean
public IntegrationFlow filePollingInboundFlow() {
    return from(s -> s.file(new File("path")).patternFilter("*.ext"),
            e -> e.poller(fixedDelay(60, SECONDS)))
            .channel(getFileToSftpChannel())
            .get();
}
There is an inbound file polling flow which publishes messages via a DirectChannel to an outbound SFTP flow uploading the file.
After the entire flow finishes, I want to execute a "success" action: move the original file (locally) to an archive folder.
Using the DirectChannel, I understand that the upload will happen in the same thread as the file polling.
In other words, the file poller blocks until the upload completes (or an error is returned, which is then pushed to the error channel).
Knowing this, I want to place the 'success' action (= moving the original file) on the inbound flow. Things I already know about and don't want to use:
Another 'handle' on the sftpOutboundFlow. Reason: moving the file is tied to the inbound flow, not the outbound flow. For example, if I later introduced a second producer (e.g. a JMS inbound flow) publishing to the same channel, there would be no 'file' to move.
Adding an interceptor on the DirectChannel and using 'afterSendCompletion'. Reason: same as above, I want the logic to be tied to the inbound flow.
Add transaction semantics on the inbound flow and react on 'commit'. Reason: as all of this is non transactional (file system/SFTP based) I want to avoid using this.
Another thing I tried was adding a 'handle' on the inbound flow. However, I learned that because the inbound flow has no real 'reply', the handle is executed before the message is sent, so this doesn't work: the move has to happen after the message has been processed successfully.
Question in short: what is the standard way of executing an action supplied by the producer (=inbound flow) after the message was successfully processed by a consumer (=outbound flow) via the DirectChannel?
Well, the standard way to do something similar is a transaction, and that's why we introduced the PseudoTransactionManager some time ago. The XML sample for a similar task looks like this:
<int-file:inbound-channel-adapter id="realTx" channel="txInput" auto-startup="false"
directory="${java.io.tmpdir}/si-test2">
<int:poller fixed-rate="500">
<int:transactional synchronization-factory="syncFactory"/>
</int:poller>
</int-file:inbound-channel-adapter>
<bean id="transactionManager" class="org.springframework.integration.transaction.PseudoTransactionManager"/>
<int:transaction-synchronization-factory id="syncFactory">
<int:after-commit expression="payload.delete()"/>
</int:transaction-synchronization-factory>
As you see, we remove the file at the end of the transaction, which really happens after your move to SFTP.
I'd say it is the best way to keep this tied only to the producer.
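Since the question uses the Java DSL, roughly the same arrangement might look like the sketch below (a sketch only; the exact PollerSpec method names can vary between versions, and the synchronization processor setup mirrors the XML sample above):
@Bean
public TransactionSynchronizationFactory syncFactory() {
    ExpressionEvaluatingTransactionSynchronizationProcessor processor =
            new ExpressionEvaluatingTransactionSynchronizationProcessor();
    // the "commit" of the pseudo transaction happens after the poll has been fully
    // processed, i.e. after the SFTP upload, so the local file is removed only then
    processor.setAfterCommitExpression(
            new SpelExpressionParser().parseExpression("payload.delete()"));
    return new DefaultTransactionSynchronizationFactory(processor);
}

@Bean
public IntegrationFlow filePollingInboundFlow() {
    return from(s -> s.file(new File("path")).patternFilter("*.ext"),
            e -> e.poller(fixedDelay(60, SECONDS)
                    .transactionSynchronizationFactory(syncFactory())
                    .transactional(new PseudoTransactionManager())))
            .channel(getFileToSftpChannel())
            .get();
}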
Another way is to introduce one more channel before getFileToSftpChannel() and apply ChannelInterceptor.afterSendCompletion, which will be invoked at the end too, for the same single-thread reason. With this approach you just bridge all your producers, each with its own specific DirectChannel, to that single getFileToSftpChannel() for the SFTP adapter.
So, it's up to you what to choose. You have a good argument from the architectural perspective for dividing the logic by responsibility, but as you see there is not so much choice...
Any other ideas are welcome!
You can try something like the following:
@Bean
public DirectChannel getFileToSftpChannel() {
    DirectChannel directChannel = new DirectChannel();
    directChannel.addInterceptor(new ChannelInterceptorAdapter() {

        @Override
        public void afterSendCompletion(final Message<?> message,
                final MessageChannel channel, final boolean sent, final Exception ex) {
            if (ex == null) {
                new Archiver().archive((File) message.getPayload());
            }
        }

    });
    return directChannel;
}
I have a JMS Outbound Gateway which sends messages out via a request queue and receives messages in via a response queue. I would like to know the simplest way to apply throttling to the consumption of messages from the response queue. I have tried setting a Poller on the Outbound Gateway, but when I set it, the response messages are not consumed at all. Can a Poller be used on an Outbound Gateway to throttle message consumption? If so, how? If not, how can I best throttle response consumption instead?
My stack is:
o.s.i:spring-integration-java-dsl:1.0.0.RC1
o.s.i:spring-integration-jms:4.0.4.RELEASE
My IntegrationConfig class:
@Configuration
@EnableIntegration
public class IntegrationConfig {

    ...

    @Bean
    public IntegrationFlow testFlow() {
        return IntegrationFlows
                .from("test.request.ch")
                .handle(Jms.outboundGateway(connectionFactory)
                        .receiveTimeout(45000)
                        .requestDestination("REQUEST_QUEUE")
                        .replyDestination("RESPONSE_QUEUE")
                        .correlationKey("JMSCorrelationID"), e -> {
                            e.requiresReply(true);
                            e.poller(Pollers.fixedRate(1000).maxMessagesPerPoll(2)); // when this poller is set, response messages are not consumed at all...
                        })
                .handle("testService", "testMethod")
                .channel("test.response.ch").get();
    }

    ...
}
Cheers,
PM
Since you are going to fetch messages from the response queue, the .poller() doesn't help you.
We need a poller only when the endpoint's input channel (in your case test.request.ch) is a PollableChannel. See the docs on the matter.
There is a .replyContainer() option on the Jms.outboundGateway for you. With that you can configure the concurrentConsumers options to achieve better throughput on the response queue.
Otherwise the JmsOutboundGateway creates a MessageConsumer for each request message.
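For example, a hedged sketch of that option based on the flow from the question (the concurrency numbers are arbitrary; tune them to get the throttling you need):
@Bean
public IntegrationFlow testFlow() {
    return IntegrationFlows
            .from("test.request.ch")
            .handle(Jms.outboundGateway(connectionFactory)
                    .receiveTimeout(45000)
                    .requestDestination("REQUEST_QUEUE")
                    .replyDestination("RESPONSE_QUEUE")
                    .correlationKey("JMSCorrelationID")
                    // a shared listener container for replies instead of a consumer per request;
                    // its concurrency effectively limits how fast responses are consumed
                    .replyContainer(c -> c.concurrentConsumers(1).maxConcurrentConsumers(2)),
                    e -> e.requiresReply(true))
            .handle("testService", "testMethod")
            .channel("test.response.ch")
            .get();
}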