spring-integration-kafka: KafkaTemplate#setMessageConverter(RecordMessageConverter) has no effect

I'm trying to set a custom message converter for my Spring Integration Kafka message handler (yes, I know I can supply serializer configs—I'm trying to do something a little different).
I have the following:
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    final KafkaTemplate<String, String> kafkaTemplate = new KafkaTemplate<>(producerFactory());
    kafkaTemplate.setMessageConverter(new MessagingMessageConverter() {
        @Override
        public ProducerRecord<?, ?> fromMessage(final Message<?> message, final String s) {
            LOGGER.info("fromMessage({}, {})", message, s);
            return super.fromMessage(message, s);
        }
    });
    return kafkaTemplate;
}
@Bean
@ServiceActivator(inputChannel = "kafkaMessageChannel")
public MessageHandler kafkaMessageHandler() {
    final KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<>(kafkaTemplate());
    handler.setTopicExpression(new LiteralExpression(getTopic()));
    handler.setSendSuccessChannel(kafkaSuccessChannel());
    return handler;
}
When a message is sent to kafkaMessageChannel, the handler sends it and the result shows up in kafkaSuccessChannel, but the RecordMessageConverter I set on the template is never called.

The template's message converter is only used by template.send(Message<?>), which the outbound channel adapter does not call.
The outbound adapter maps the headers itself using its header mapper; no conversion is performed on the message payload.
What documentation leads you to believe the converter is used in this context?
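For completeness, a minimal sketch of the send path that does go through the converter (MessageBuilder is from spring-messaging, KafkaHeaders.TOPIC from spring-kafka; this is not what the outbound adapter does):
// Hedged sketch: KafkaTemplate.send(Message<?>) is the send variant that
// consults the RecordMessageConverter configured on the template.
Message<String> message = MessageBuilder.withPayload("some payload")
        .setHeader(KafkaHeaders.TOPIC, "someTopic")
        .build();
kafkaTemplate.send(message); // fromMessage(...) is invoked on this path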

Related

How to Read Files and trigger http rest multipart endpoint using Spring Integration

I am following this Spring Integration example: https://github.com/iainporter/spring-file-poller
@Bean
public IntegrationFlow writeToFile(@Qualifier("fileWritingMessageHandler") MessageHandler fileWritingMessageHandler) {
    return IntegrationFlows.from(ApplicationConfiguration.INBOUND_CHANNEL)
            .transform(m -> new StringBuilder((String) m).reverse().toString())
            .handle(fileWritingMessageHandler)
            .log(LoggingHandler.Level.INFO)
            .get();
}
@Bean(name = FILE_WRITING_MESSAGE_HANDLER)
public MessageHandler fileWritingMessageHandler(@Qualifier(OUTBOUND_FILENAME_GENERATOR) FileNameGenerator fileNameGenerator) {
    FileWritingMessageHandler handler = new FileWritingMessageHandler(inboundOutDirectory);
    handler.setAutoCreateDirectory(true);
    handler.setFileNameGenerator(fileNameGenerator);
    return handler;
}
Controller example:
@PostMapping(value = "/data/{id}")
public String load(@RequestParam("jsonFile") MultipartFile jsonFile,
        @PathVariable("id") Long id) throws JsonMappingException, JsonProcessingException {
    // some business logic
    return "Controller is called";
}
Instead of simply writing to a file, I want to call a REST endpoint that expects a file, i.e., call a REST API from a handler similar to fileWritingMessageHandler:
https://github.com/spring-projects/spring-integration-samples/blob/261648bed136a076f76ed15b1017f5e5b6d8b9ae/intermediate/multipart-http/src/main/resources/META-INF/spring/integration/http-outbound-config.xml
How can I create the Map
Map<String, Object> multipartMap = new HashMap<String, Object>();
multipartMap.put("jsonFile", ????);
and call a gateway method like
HttpStatus postMultipartRequest(Map<String, Object> multipartRequest);
To send a multipart request you need to have a payload as a Map<String, Object>. You can read files from a directory using FileReadingMessageSource and a respective poller configuration: https://docs.spring.io/spring-integration/docs/current/reference/html/file.html#file-reading. This one emits messages with a java.io.File payload. To create a Map from it you just need a simple transformer in the Java DSL:
.<File, Map<String, File>>transform(file -> Collections.singletonMap("jsonFile", file))
and then you use standard .handle(Http.outboundChannelAdapter("/data/{id}").uriVariable("id", "headers.someId")): https://docs.spring.io/spring-integration/docs/current/reference/html/http.html#http-java-config
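Putting those pieces together, a hedged sketch of such a flow (the directory, URL, and header name are illustrative, not from the question):
@Bean
public IntegrationFlow multipartUploadFlow() {
    return IntegrationFlows
            .from(Files.inboundAdapter(new File("/tmp/in")), // hypothetical directory
                    e -> e.poller(p -> p.fixedDelay(5000)))
            .<File, Map<String, File>>transform(file -> Collections.singletonMap("jsonFile", file))
            .handle(Http.outboundChannelAdapter("http://localhost:8080/data/{id}") // hypothetical endpoint
                    .httpMethod(HttpMethod.POST)
                    .uriVariable("id", "headers['someId']")) // hypothetical header
            .get();
}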

Spring Kafka outboundChannelAdapter's control does not return back in the integration flow

After a message is sent, it gets published to the Kafka topic, but the Message from KafkaSuccessTransformer does not return to the REST controller. I am trying to return the message as-is if it is sent successfully, but nothing after the Kafka handler seems to be invoked.
@MessagingGateway
public interface MyGateway<String, Message<?>> {
    @Gateway(requestChannel = "enrollChannel")
    Message<?> sendMsg(@Payload String payload);
}
------------------------
@RestController
public class Controller {
    MyGateway<String, Message<?>> myGateway;
    @PostMapping
    public Message<?> send(@RequestBody String request) throws Exception {
        Message<?> resp = myGateway.sendMsg(request);
        log.info("I am back"); // control doesn't come to this point
        return resp;
    }
}
--------------------------
@Component
public class MyIntegrationFlow {
    KafkaSuccessTransformer stransformer;
    @Bean
    public MessageChannel enrollChannel() {
        return new DirectChannel();
    }
    @Bean
    public MessageChannel kafkaSuccessChannel() {
        return new DirectChannel();
    }
    @Bean
    public IntegrationFlow enrollIntegrationFlow() {
        return IntegrationFlows.from("enrollChannel")
                // another transformer which turns the string to Message<?>
                .handle(Kafka.outboundChannelAdapter(kafkaTemplate) // kafkaTemplate has the necessary config
                        .topic("topic1")
                        .messageKey(messageKeyFunction -> messageKeyFunction.getHeaders()
                                .get("key1"))
                        .sendSuccessChannel("kafkaSuccessChannel"))
                .get();
    }
    @Bean
    public IntegrationFlow successfulKafkaSends() {
        return f -> IntegrationFlows.from("kafkaSuccessChannel").transform(stransformer);
    }
}
--------------
@Component
public class KafkaSuccessTransformer {
    @Transformer
    public Message<?> transform(Message<?> message) {
        log.info("Message is sent to Kafka");
        return message; // control comes here but does not return to REST controller
    }
}
Channel adapters are for one-way traffic; there is no result.
Add a publish-subscribe channel with two subflows; the second one can be just a bridge to nowhere - .bridge() ends the flow. It will then return the outbound message to the gateway.
See https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-subflows
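A hedged sketch of that suggestion applied to the success flow above (bean and channel names taken from the question):
@Bean
public IntegrationFlow successfulKafkaSends(KafkaSuccessTransformer stransformer) {
    return IntegrationFlows.from("kafkaSuccessChannel")
            .publishSubscribeChannel(s -> s
                    // first subflow: the existing transformer
                    .subscribe(f -> f.transform(stransformer, "transform")))
            // main flow continues as the second subscriber; a bare bridge ends
            // the flow and routes the message to the gateway's replyChannel header
            .bridge()
            .get();
}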
Per Artem:
Something is off in the configuration or code. The logic is like this: processSendResult(message, producerRecord, sendFuture, getSendSuccessChannel());. Then: getMessageBuilderFactory().fromMessage(message). So the replyChannel header is present in this "success" message, and therefore that transform(stransformer) should really send its result to the replyChannel of the gateway at the beginning of the flow. The only problem could be in the KafkaSuccessTransformer code, if it does not copy the request message headers to the reply message. Please share its whole code.
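If header loss is indeed the issue, a hedged sketch of a transformer that preserves the request headers (including replyChannel) when it builds a new reply:
@Transformer
public Message<?> transform(Message<?> message) {
    // copyHeaders(...) carries the replyChannel/errorChannel headers over to
    // the reply, so the gateway can correlate it; returning the original
    // message as-is (as in the question) also keeps the headers intact
    return MessageBuilder.withPayload(message.getPayload())
            .copyHeaders(message.getHeaders())
            .build();
}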

Spring Integration AWS Kinesis, message aggregator, Release Strategy

This is a follow-up question to Spring Integration AWS RabbitMQ Kinesis.
I have the following configuration. When I send a message to the input channel named kinesisSendChannel for the first time, the aggregator and release strategy are invoked and messages are sent to Kinesis Streams. I put debug breakpoints at different places and could verify this behavior. But when I publish messages to the same input channel again, the release strategy and the outbound processor are not invoked and messages are not sent to Kinesis. I am not sure why the aggregator flow is invoked only the first time and not for subsequent messages. For testing purposes, the TimeoutCountSequenceSizeReleaseStrategy is set with a count of 1 and a time of 60 seconds. No specific MessageStore is used. Could you help identify the issue?
@Bean(name = "kinesisSendChannel")
public MessageChannel kinesisSendChannel() {
    return MessageChannels.direct().get();
}
@Bean(name = "resultChannel")
public MessageChannel resultChannel() {
    return MessageChannels.direct().get();
}
@Bean
@ServiceActivator(inputChannel = "kinesisSendChannel")
public MessageHandler aggregator(TestMessageProcessor messageProcessor,
        MessageChannel resultChannel,
        TimeoutCountSequenceSizeReleaseStrategy timeoutCountSequenceSizeReleaseStrategy) {
    AggregatingMessageHandler handler = new AggregatingMessageHandler(messageProcessor);
    handler.setCorrelationStrategy(new ExpressionEvaluatingCorrelationStrategy("headers['foo']"));
    handler.setReleaseStrategy(timeoutCountSequenceSizeReleaseStrategy);
    handler.setOutputProcessor(messageProcessor);
    handler.setOutputChannel(resultChannel);
    return handler;
}
@Bean
@ServiceActivator(inputChannel = "resultChannel")
public MessageHandler kinesisMessageHandler1(@Qualifier("successChannel") MessageChannel successChannel,
        @Qualifier("errorChannel") MessageChannel errorChannel, final AmazonKinesisAsync amazonKinesis) {
    KinesisMessageHandler kinesisMessageHandler = new KinesisMessageHandler(amazonKinesis);
    kinesisMessageHandler.setSync(true);
    kinesisMessageHandler.setOutputChannel(successChannel);
    kinesisMessageHandler.setFailureChannel(errorChannel);
    return kinesisMessageHandler;
}
public class TestMessageProcessor extends AbstractAggregatingMessageGroupProcessor {
    @Override
    protected Object aggregatePayloads(MessageGroup group, Map<String, Object> defaultHeaders) {
        final PutRecordsRequest putRecordsRequest = new PutRecordsRequest().withStreamName("test-stream");
        final List<PutRecordsRequestEntry> putRecordsRequestEntry = group.getMessages().stream()
                .map(message -> (PutRecordsRequestEntry) message.getPayload()).collect(Collectors.toList());
        putRecordsRequest.withRecords(putRecordsRequestEntry);
        return putRecordsRequestEntry;
    }
}
I believe the problem is here: handler.setCorrelationStrategy(new ExpressionEvaluatingCorrelationStrategy("headers['foo']"));. All your messages come with the same foo header, so all of them form the same message group. As long as you release the group but don't remove it, all new messages are going to be discarded.
Please review the aggregator documentation to familiarize yourself with all the possible behaviors: https://docs.spring.io/spring-integration/docs/current/reference/html/message-routing.html#aggregator
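As a hedged illustration, one of the options described there is to expire completed groups so that repeated foo values start fresh groups:
// Sketch only; see the documentation above for the other strategies.
AggregatingMessageHandler handler = new AggregatingMessageHandler(messageProcessor);
handler.setCorrelationStrategy(new ExpressionEvaluatingCorrelationStrategy("headers['foo']"));
handler.setReleaseStrategy(timeoutCountSequenceSizeReleaseStrategy);
// Remove the group from the store once it is released, so later messages with
// the same 'foo' header form a new group instead of being discarded.
handler.setExpireGroupsUponCompletion(true);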

Vaadin and receiving email asynchronously

My Vaadin 14 application should receive emails in the background. If emails with a certain subject have been received, the user should be informed about this via a PUSH message in the UI.
For the entire email handling I implemented the email/message handling from Spring Integration, and that works. Two beans (an IntegrationFlow and a ServiceActivator) are registered in the Spring application context via @Configuration and @Bean annotations, like so:
@Configuration
public class EmailReceiver {
    @Bean
    public HeaderMapper<MimeMessage> mailHeaderMapper() {
        return new DefaultMailHeaderMapper();
    }
    @Bean
    public IntegrationFlow imapMailFlow() {
        IntegrationFlow flow = IntegrationFlows
                .from(Mail.imapInboundAdapter("imaps://user:pass@imap.ionos.de/INBOX")
                                .userFlag("testSIUserFlag")
                                .javaMailProperties(new Properties()),
                        e -> e.autoStartup(true)
                                .poller(p -> p.fixedDelay(5000)))
                .transform(Mail.toStringTransformer())
                .channel(MessageChannels.queue("imapChannel"))
                .get();
        return flow;
    }
    @Bean(name = PollerMetadata.DEFAULT_POLLER)
    public PollerMetadata defaultPoller() {
        PollerMetadata pollerMetadata = new PollerMetadata();
        pollerMetadata.setTrigger(new PeriodicTrigger(1000));
        return pollerMetadata;
    }
    @Bean
    @ServiceActivator(inputChannel = "imapChannel")
    public MessageHandler processNewEmail() {
        MessageHandler messageHandler = new MessageHandler() {
            @Override
            public void handleMessage(org.springframework.messaging.Message<?> message) throws MessagingException {
                System.out.println("new email received");
            }
        };
        return messageHandler;
    }
}
See also here: https://docs.spring.io/spring-integration/docs/current/reference/html/mail.html#mail-java-dsl-configuration
With such a @Configuration-annotated class, the emails are received in the background of the Vaadin app. Check.
But how can I integrate a callback into a Vaadin view in the method EmailReceiver.processNewEmail?
@Bean
@ServiceActivator(inputChannel = "imapChannel")
public MessageHandler processNewEmail(UI ui) {
This always throws an error at application start: Scope vaadin-ui is not active for the current thread; consider defining a scoped proxy for this bean.
There is an example for asynchronous updates with Vaadin: https://vaadin.com/docs/v14/flow/advanced/tutorial-push-access.
In contrast to this, I have to create a @Bean for the @ServiceActivator handling. As soon as that is the case, there is always the error: There is no UI available. The UI scope is not active.
If I move the method processNewEmail() into a separate class, I still cannot reference a Vaadin UI:
@MessageEndpoint
class EmailMessageHandler {
    private UI ui;
    public EmailMessageHandler(UI ui) {
        this.ui = ui;
    }
    @Bean
    @ServiceActivator(inputChannel = "imapChannel")
    public MessageHandler processNewEmail() {
        MessageHandler messageHandler = new MessageHandler() {
            @Override
            public void handleMessage(org.springframework.messaging.Message<?> message) throws MessagingException {
                System.out.println("new email received" + message);
            }
        };
        return messageHandler;
    }
}
How can I combine Vaadin asynchronous handling and Spring-Integration Email/ServiceActivator processing?
The point is that your mail-receiving functionality is a singleton per application. On the other hand, you are going to have as many UIs as there are users making HTTP requests to your application. So you need to think about some intermediary where emails are dumped and from which they are fetched when a UI request happens.
You already have that imapChannel as a QueueChannel, so you can take it from your UI-scoped code and call its receive() API to pull the next message. The only problem is that it is a queue: as soon as one consumer calls receive(), the others won't see the same message. Probably this is OK for you so far, but it is better to think about something that could be treated as a topic in messaging terms. A good, easy-to-use candidate is Reactor's Sinks.Many: https://projectreactor.io/docs/core/release/reference/#sinks
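A hedged sketch of that idea (the bean name and the best-effort multicast spec are assumptions, not from the answer):
@Bean
public Sinks.Many<Message<?>> emailSink() {
    // Multicast: every subscriber (every open UI) sees every email;
    // directBestEffort drops emissions while there are no subscribers.
    return Sinks.many().multicast().directBestEffort();
}
@Bean
@ServiceActivator(inputChannel = "imapChannel")
public MessageHandler processNewEmail(Sinks.Many<Message<?>> emailSink) {
    return emailSink::tryEmitNext;
}
// In UI-scoped code (with @Push enabled), subscribe and update the view:
// emailSink.asFlux().subscribe(msg -> ui.access(() -> showNotification(msg)));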

spring-integration-kafka: Annotation-driven handling of KafkaProducerMessageHandler result?

Is there a way to achieve the behavior of the code below using annotation-driven code?
@Bean
@ServiceActivator(inputChannel = "toKafka")
public MessageHandler handler() throws Exception {
    KafkaProducerMessageHandler<String, String> handler =
            new KafkaProducerMessageHandler<>(kafkaTemplate());
    handler.setTopicExpression(new LiteralExpression("someTopic"));
    handler.setMessageKeyExpression(new LiteralExpression("someKey"));
    handler.setSendSuccessChannel(success());
    handler.setSendFailureChannel(failure());
    return handler;
}
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
@Bean
public ProducerFactory<String, String> producerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
    // set more properties
    return new DefaultKafkaProducerFactory<>(props);
}
Can I specify the send success/failure channels using Spring Integration annotations?
I'd like as much as possible to keep a consistent pattern for doing things (e.g., specifying the flow of messages) throughout my app, and I like the Spring Integration diagrams (e.g., of how channels are connected) that IntelliJ automatically generates when you configure your Spring Integration app with XML or Java annotations.
No, it is not possible; the success/failure channels have to be set explicitly when using Java configuration.
This configuration is specific to the Kafka handler, and @ServiceActivator is a generic annotation for all types of message handlers.
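If the goal is declarative wiring that IntelliJ can still diagram, a hedged sketch using the Java DSL (a different style from the annotations, not something the answer recommends) expresses the same settings fluently:
@Bean
public IntegrationFlow toKafkaFlow(KafkaTemplate<String, String> kafkaTemplate) {
    return IntegrationFlows.from("toKafka")
            .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                    .topic("someTopic")
                    .messageKey("someKey")
                    .sendSuccessChannel("success")   // channel names assumed to
                    .sendFailureChannel("failure"))  // match the success()/failure() beans
            .get();
}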
