How to read files and trigger an HTTP REST multipart endpoint using Spring Integration

I am following this spring integration example - https://github.com/iainporter/spring-file-poller
@Bean
public IntegrationFlow writeToFile(@Qualifier("fileWritingMessageHandler") MessageHandler fileWritingMessageHandler) {
    return IntegrationFlows.from(ApplicationConfiguration.INBOUND_CHANNEL)
            .transform(m -> new StringBuilder((String) m).reverse().toString())
            .handle(fileWritingMessageHandler)
            .log(LoggingHandler.Level.INFO)
            .get();
}

@Bean(name = FILE_WRITING_MESSAGE_HANDLER)
public MessageHandler fileWritingMessageHandler(@Qualifier(OUTBOUND_FILENAME_GENERATOR) FileNameGenerator fileNameGenerator) {
    FileWritingMessageHandler handler = new FileWritingMessageHandler(inboundOutDirectory);
    handler.setAutoCreateDirectory(true);
    handler.setFileNameGenerator(fileNameGenerator);
    return handler;
}
Controller example
@PostMapping(value = "/data/{id}")
public String load(@RequestParam("jsonFile") MultipartFile jsonFile,
        @PathVariable("id") Long id) throws JsonMappingException, JsonProcessingException {
    // some business logic
    return "Controller is called";
}
Instead of simply writing the file, I want to call a REST endpoint that expects a file, i.e. call a REST API in a handler similar to fileWritingMessageHandler, as in this sample:
https://github.com/spring-projects/spring-integration-samples/blob/261648bed136a076f76ed15b1017f5e5b6d8b9ae/intermediate/multipart-http/src/main/resources/META-INF/spring/integration/http-outbound-config.xml
How can I create a map like
Map<String, Object> multipartMap = new HashMap<String, Object>();
multipartMap.put("jsonFile", ????);
and call a gateway method like
HttpStatus postMultipartRequest(Map<String, Object> multipartRequest);

To send a multipart request you need a Map<String, Object> payload. You can read files from a directory using a FileReadingMessageSource and the respective poller configuration: https://docs.spring.io/spring-integration/docs/current/reference/html/file.html#file-reading. This emits messages with a java.io.File payload. To create a Map from it you just need a simple transformer in the Java DSL:
.<File, Map<String, File>>transform(file -> Collections.singletonMap("jsonFile", file))
and then use the standard .handle(Http.outboundChannelAdapter("/data/{id}").uriVariable("id", "headers.someId")): https://docs.spring.io/spring-integration/docs/current/reference/html/http.html#http-java-config
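Putting the pieces together, a minimal sketch of such a flow could look like the following; the inbound directory, the target URL, and the someId header are placeholder assumptions, and Files, Pollers and Http come from the Spring Integration Java DSL:

@Bean
public IntegrationFlow jsonFileUploadFlow() {
    return IntegrationFlows
            // poll a directory for files (directory and interval are assumptions)
            .from(Files.inboundAdapter(new File("/tmp/inbound")),
                    e -> e.poller(Pollers.fixedDelay(5000)))
            // wrap the java.io.File payload into the multipart map expected by the controller
            .<File, Map<String, Object>>transform(file -> Collections.singletonMap("jsonFile", file))
            // hypothetical header used to resolve the {id} URI variable
            .enrichHeaders(h -> h.header("someId", 1L))
            .handle(Http.outboundChannelAdapter("http://localhost:8080/data/{id}")
                    .uriVariable("id", "headers.someId")
                    .httpMethod(HttpMethod.POST))
            .get();
}

If you need the controller's response back in the flow, use Http.outboundGateway(...) with an expectedResponseType instead of the one-way channel adapter.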

Related

Retry handler in Spring Java DSL

Currently I have a Spring Integration flow that reads a payload from a JMS queue, transforms it to XML format, and then sends the XML payload to the core app. At the RecordSenderHandler there is logic that makes a REST API call to my core app and stores the response in Redis, according to the response I receive. If my core app is not accessible or something is wrong with my backend, I flag the error as HTTP 500. But I want to retry the execution a certain number of times and limit the maximum number of errors I get. Below is my code. Any suggestions?
@Bean
public IntegrationFlow jmsMessageDrivenFlowWithContainer() {
    return IntegrationFlows
            .from(Jms.messageDrivenChannelAdapter(
                    Jms.container(this.jmsConnectionFactory, recordDestinationQueue)
                            .concurrentConsumers(xmlConcurrentConsumers)
                            .maxConcurrentConsumers(xmlMaxConcurrentConsumers))
                    .errorChannel("errorChannel"))
            .handle(payloadSender(), e -> e.advice(circuitBreakerAdvice()))
            .get();
}

@Bean
@ServiceActivator(inputChannel = "handleChannel")
public PayloadSender payloadSender() {
    return new PayloadSender();
}

@Bean
public RequestHandlerCircuitBreakerAdvice circuitBreakerAdvice() {
    RequestHandlerCircuitBreakerAdvice requestHandlerCircuitBreakerAdvice = new RequestHandlerCircuitBreakerAdvice();
    requestHandlerCircuitBreakerAdvice.setThreshold(3);
    requestHandlerCircuitBreakerAdvice.setHalfOpenAfter(15000);
    return requestHandlerCircuitBreakerAdvice;
}
See Adding Behavior to Endpoints and in particular the RequestHandlerRetryAdvice.
.handle(..., e -> e.advice(retryAdvice()))
...

@Bean
public RequestHandlerRetryAdvice retryAdvice() {
    ...
}
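A minimal sketch of what that bean might contain, assuming three attempts with a two-second fixed back-off (the RetryTemplate settings are placeholders to tune):

@Bean
public RequestHandlerRetryAdvice retryAdvice() {
    RequestHandlerRetryAdvice advice = new RequestHandlerRetryAdvice();
    RetryTemplate retryTemplate = new RetryTemplate();
    // give up after 3 attempts
    retryTemplate.setRetryPolicy(new SimpleRetryPolicy(3));
    // wait 2 seconds between attempts
    FixedBackOffPolicy backOff = new FixedBackOffPolicy();
    backOff.setBackOffPeriod(2000);
    retryTemplate.setBackOffPolicy(backOff);
    advice.setRetryTemplate(retryTemplate);
    return advice;
}

You can then apply it with .handle(payloadSender(), e -> e.advice(retryAdvice())), either instead of or alongside the circuit breaker advice.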

spring-integration-kafka: Annotation-driven handling of KafkaProducerMessageHandler result?

Is there a way to achieve the behavior of the code below using annotation-driven code?
@Bean
@ServiceActivator(inputChannel = "toKafka")
public MessageHandler handler() throws Exception {
    KafkaProducerMessageHandler<String, String> handler =
            new KafkaProducerMessageHandler<>(kafkaTemplate());
    handler.setTopicExpression(new LiteralExpression("someTopic"));
    handler.setMessageKeyExpression(new LiteralExpression("someKey"));
    handler.setSendSuccessChannel(success());
    handler.setSendFailureChannel(failure());
    return handler;
}

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}

@Bean
public ProducerFactory<String, String> producerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
    // set more properties
    return new DefaultKafkaProducerFactory<>(props);
}
Can I specify the send success/failure channels using Spring Integration annotations?
I'd like as much as possible to keep a consistent pattern of doing things (e.g., specifying the flow of messages) throughout my app, and I like the Spring Integration diagrams (e.g., of how channels are connected) IntelliJ automatically generates when you configure your Spring Integration app with XML or Java annotations.
No, it is not possible; the success/failure channels have to be set explicitly when using Java configuration.
This configuration is specific to the Kafka handler, and @ServiceActivator is a generic annotation for all types of message handler.
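That said, the success()/failure() channels referenced in the question are ordinary channel beans, and nothing prevents you from consuming them in the usual annotation-driven style downstream. A sketch, assuming those beans are MessageChannels named "success" and "failure":

@ServiceActivator(inputChannel = "success")
public void onSendSuccess(Message<?> result) {
    // inspect the successful send result here
}

@ServiceActivator(inputChannel = "failure")
public void onSendFailure(Message<?> error) {
    // handle the failed send here
}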

spring-integration-kafka: KafkaTemplate#setMessageConverter(RecordMessageConverter) has no effect

I'm trying to set a custom message converter for my Spring Integration Kafka message handler (yes, I know I can supply serializer configs; I'm trying to do something a little different).
I have the following:
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    final KafkaTemplate<String, String> kafkaTemplate = new KafkaTemplate<>(producerFactory());
    kafkaTemplate.setMessageConverter(new MessagingMessageConverter() {

        @Override
        public ProducerRecord<?, ?> fromMessage(final Message<?> message, final String s) {
            LOGGER.info("fromMessage({}, {})", message, s);
            return super.fromMessage(message, s);
        }

    });
    return kafkaTemplate;
}

@Bean
@ServiceActivator(inputChannel = "kafkaMessageChannel")
public MessageHandler kafkaMessageHandler() {
    final KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<>(kafkaTemplate());
    handler.setTopicExpression(new LiteralExpression(getTopic()));
    handler.setSendSuccessChannel(kafkaSuccessChannel());
    return handler;
}
When a message is sent to kafkaMessageChannel, the handler sends it and the result shows up in kafkaSuccessChannel, but the RecordMessageConverter I set on the template was never called.
The template's message converter is only used by template.send(Message<?>), which is not used by the outbound channel adapter.
The outbound adapter maps the headers itself using its header mapper; there is no conversion performed on the message payload.
What documentation leads you to believe the converter is used in this context?
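To see the converter in action, the KafkaTemplate has to be invoked with a spring-messaging Message directly, for example from a plain service activator instead of the KafkaProducerMessageHandler. A rough sketch under that assumption (the channel name is reused from the question; the default topic is a placeholder):

@Bean
@ServiceActivator(inputChannel = "kafkaMessageChannel")
public MessageHandler kafkaTemplateSendingHandler() {
    KafkaTemplate<String, String> template = kafkaTemplate();
    // the topic can also be supplied per message via the KafkaHeaders.TOPIC header
    template.setDefaultTopic(getTopic());
    // KafkaTemplate.send(Message<?>) goes through the configured
    // MessagingMessageConverter, so the fromMessage(...) override is invoked
    return message -> template.send(message);
}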

Spring Integration DSL creating JMS MessageDriven Channel Adapter in Java 1.7

I am trying to create an integration flow with a JMS message-driven channel adapter, through which I need to send messages to a Kafka server. I am stuck trying to convert the XML configuration into the equivalent DSL code. Can anyone please provide a pointer, as I am not able to proceed?
I have created a MessageListenerContainer like this:
String brokerUrl = "tcp://101.11.102.125:31316";
String topic = "sometpoic";
String kafkaBrokerUrl = "101.11.102.125:1012";
String kafkaTopic = "kafka_Topic";
@Bean
public DefaultMessageListenerContainer listenerContainer() {
    DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
    ActiveMQConnectionFactory conFactory = new ActiveMQConnectionFactory();
    ActiveMQTopic mqTopic = new ActiveMQTopic(topic);
    conFactory.setBrokerURL(brokerUrl);
    container.setConnectionFactory(conFactory);
    container.setDestination(mqTopic);
    container.setSessionTransacted(true);
    return container;
}
These are my input and output channels:
@Bean
public MessageChannel jmsInChannel() {
    return MessageChannels.publishSubscribe().get();
}

@Bean
public MessageChannel jmsOutChannel() {
    return MessageChannels.publishSubscribe().get();
}
And this is my JMS adapter flow:
@Bean
public IntegrationFlow jmsMessageDrivenFlow() {
    return IntegrationFlows
            .from(Jms.messageDriverChannelAdapter(listenerContainer())
                    .autoStartup(true))
            .channel(jmsInChannel())
            .get();
}
Now I need to create a header enricher like this, but I am not able to convert it to the DSL:
<int:header-enricher input-channel="jmsInChannel" output-channel="jmsOutChannel">
    <int:header name="kafkaBrokerUrl" value="${kafka.url}"/>
    <int:header name="kafkaTopic" value="${kafka.topic}"/>
</int:header-enricher>
And I need to create a service activator that calls a Kafka producer method from a different class, like this in XML:
<int:service-activator input-channel="jmsOutChannel" ref="KafkaProducer" method="produceToJmsKafka"/>
<bean id="KafkaProducer" class="com.david.jms.JmsKafkaProducer"/>
So how can I convert the XML above into equivalent DSL code?
After getting a compilation error, I tried this:
@SuppressWarnings("unchecked")
@Bean
public IntegrationFlow jmsMessageDrivenFlow() {
    return IntegrationFlows
            .from(Jms.messageDriverChannelAdapter(listenerContainer())
                    .autoStartup(true))
            .channel(jmsInChannel())
            .enrichHeaders(new MapBuilder()
                    .put("brokerid", brokerid)
                    .put("topic", topic)
                    .put("source", source)
                    .put("fileType", fileType))
            .handle("KafkaProducer", "produceToJmsKafka")
            .get();
}

@Bean
public JmsProducer KafkaProducer() {
    return new JmsProducer();
}
It may look like this:
@Value("${kafka.url}")
private String kafkaBrokerUrl;

@Value("${kafka.topic}")
private String kafkaTopic;

....

@Bean
public IntegrationFlow jmsMessageDrivenFlow() {
    return IntegrationFlows
            .from(Jms.messageDriverChannelAdapter(listenerContainer())
                    .autoStartup(true))
            .channel(jmsInChannel())
            .enrichHeaders(new StringStringMapBuilder()
                    .put("kafkaBrokerUrl", kafkaBrokerUrl)
                    .put("kafkaTopic", kafkaTopic))
            .handle("KafkaProducer", "produceToJmsKafka")
            .get();
}
From here, I don't see a reason to have those MessageChannel beans, especially as publishSubscribe() channels.
On the other hand, since DSL 1.1 we provide an implementation for the Spring Integration Kafka adapters.
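With those adapters, the custom KafkaProducer class could in principle be replaced by a Kafka outbound channel adapter in the flow itself. A rough sketch using the Kafka DSL factory from current spring-integration-kafka versions, assuming a kafkaTemplate() bean exists (the topic property is reused from above):

@Bean
public IntegrationFlow jmsToKafkaFlow() {
    return IntegrationFlows
            .from(Jms.messageDriverChannelAdapter(listenerContainer())
                    .autoStartup(true))
            // publish the JMS payload straight to Kafka instead of calling a custom producer class
            .handle(Kafka.outboundChannelAdapter(kafkaTemplate())
                    .topic(kafkaTopic))
            .get();
}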

Spring Integration 4 - configuring a LoadBalancingStrategy in Java DSL

I have a simple Spring Integration 4 Java DSL flow which uses a DirectChannel's LoadBalancingStrategy to round-robin Message requests to a number of possible REST Services (i.e. calls a REST service from one of two possible service endpoint URIs).
How my flow is currently configured:
@Bean(name = "test.load.balancing.ch")
public DirectChannel testLoadBalancingCh() {
    LoadBalancingStrategy loadBalancingStrategy = new RoundRobinLoadBalancingStrategy();
    DirectChannel directChannel = new DirectChannel(loadBalancingStrategy);
    return directChannel;
}

@Bean
public IntegrationFlow testLoadBalancing0Flow() {
    return IntegrationFlows.from("test.load.balancing.ch")
            .handle(restHandler0())
            .channel("test.result.ch")
            .get();
}

@Bean
public IntegrationFlow testLoadBalancing1Flow() {
    return IntegrationFlows.from("test.load.balancing.ch")
            .handle(restHandler1())
            .channel("test.result.ch")
            .get();
}

@Bean
public HttpRequestExecutingMessageHandler restHandler0() {
    return createRestHandler(endpointUri0, 0);
}

@Bean
public HttpRequestExecutingMessageHandler restHandler1() {
    return createRestHandler(endpointUri1, 1);
}

private HttpRequestExecutingMessageHandler createRestHandler(String uri, int order) {
    HttpRequestExecutingMessageHandler handler = new HttpRequestExecutingMessageHandler(uri);
    // handler configuration goes here..
    handler.setOrder(order);
    return handler;
}
My configuration works, but I am wondering whether there is a simpler/better way of configuring this flow using Spring Integration's Java DSL.
Cheers,
PM
First of all, the RoundRobinLoadBalancingStrategy is the default one for a DirectChannel, so you can get rid of the testLoadBalancingCh() bean definition entirely.
Further, to avoid duplicating .channel("test.result.ch"), you can configure it on the HttpRequestExecutingMessageHandler via setOutputChannel().
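If you prefer to stay with the DSL, both simplifications together shrink each flow to a single .handle(). A sketch, assuming the handlers set their own output channel via setOutputChannel() as in the createRestHandler() variant shown further below:

@Bean
public IntegrationFlow testLoadBalancing0Flow() {
    // "test.load.balancing.ch" is auto-created as a DirectChannel,
    // which load-balances its subscribers round-robin by default
    return IntegrationFlows.from("test.load.balancing.ch")
            .handle(restHandler0())
            .get();
}

@Bean
public IntegrationFlow testLoadBalancing1Flow() {
    return IntegrationFlows.from("test.load.balancing.ch")
            .handle(restHandler1())
            .get();
}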
On the other hand, your configuration is so simple that I don't see a reason to use the DSL at all. You can achieve the same thing with plain annotation configuration:
@Bean(name = "test.load.balancing.ch")
public DirectChannel testLoadBalancingCh() {
    return new DirectChannel();
}

@Bean(name = "test.result.ch")
public DirectChannel testResultCh() {
    return new DirectChannel();
}

@Bean
@ServiceActivator(inputChannel = "test.load.balancing.ch")
public HttpRequestExecutingMessageHandler restHandler0() {
    return createRestHandler(endpointUri0, 0);
}

@Bean
@ServiceActivator(inputChannel = "test.load.balancing.ch")
public HttpRequestExecutingMessageHandler restHandler1() {
    return createRestHandler(endpointUri1, 1);
}

private HttpRequestExecutingMessageHandler createRestHandler(String uri, int order) {
    HttpRequestExecutingMessageHandler handler = new HttpRequestExecutingMessageHandler(uri);
    // handler configuration goes here..
    handler.setOrder(order);
    handler.setOutputChannel(testResultCh());
    return handler;
}
On the other hand, there is the MessageChannels builder factory, which simplifies the loadBalancer configuration for your case:
@Bean(name = "test.load.balancing.ch")
public DirectChannel testLoadBalancingCh() {
    return MessageChannels.direct()
            .loadBalancer(new RoundRobinLoadBalancingStrategy())
            .get();
}
However, I can guess that you want to avoid duplication within the DSL flow definition to stay DRY, but that isn't possible right now, because an IntegrationFlow is linear: it ties endpoints together while hiding the boilerplate code for creating the standard objects.
As you see, to achieve round-robin we have to duplicate at least the inputChannel in order to subscribe several MessageHandlers to the same channel. We do that in XML, via annotations and, of course, from the DSL.
I'm not sure it would be useful for real applications to provide a hook that configures several handlers through a single .handle() on the same round-robin channel, because the further downstream flow may not be as simple as your .channel("test.result.ch").
Cheers
