Spring Integration Kafka Manual Acknowledgment - spring-integration

I'm having issues using manual acknowledgements with the KafkaTopicOffsetManager. When acknowledge() is called, the topic begins to get spammed repeatedly. Kafka has log.cleaner.enable set to true and the topic is using cleanup.policy=compact. Thanks for any help.
Config:
@Bean
public ZookeeperConfiguration zookeeperConfiguration() {
    ZookeeperConfiguration zookeeperConfiguration = new ZookeeperConfiguration(kafkaConfig.getZookeeperAddress());
    zookeeperConfiguration.setClientId("clientId");
    return zookeeperConfiguration;
}

@Bean
public ConnectionFactory connectionFactory() {
    return new DefaultConnectionFactory(zookeeperConfiguration());
}

@Bean
public TestMessageHandler messageListener() {
    return new TestMessageHandler();
}

@Bean
public OffsetManager offsetManager() {
    ZookeeperConnect zookeeperConnect = new ZookeeperConnect(kafkaConfig.getZookeeperAddress());
    OffsetManager offsetManager = new KafkaTopicOffsetManager(zookeeperConnect, kafkaConfig.getTopic() + "_OFFSET");
    return offsetManager;
}

@Bean
public KafkaMessageListenerContainer kafkaMessageListenerContainer() {
    KafkaMessageListenerContainer kafkaMessageListenerContainer = new KafkaMessageListenerContainer(connectionFactory(), kafkaConfig.getTopic());
    kafkaMessageListenerContainer.setMessageListener(messageListener());
    kafkaMessageListenerContainer.setOffsetManager(offsetManager());
    return kafkaMessageListenerContainer;
}
Listener:
public class TestMessageHandler implements AcknowledgingMessageListener {

    private static final Logger logger = LoggerFactory.getLogger(TestMessageHandler.class);

    @Override
    public void onMessage(KafkaMessage message, Acknowledgment acknowledgment) {
        logger.info(message.toString());
        acknowledgment.acknowledge();
    }
}

The KafkaTopicOffsetManager needs its own topic to maintain the offset of the actual topic being consumed.

If you don't want to deal with decoding the message payload yourself (it's painful, in my opinion), extend your listener from the abstract class AbstractDecodingAcknowledgingMessageListener and provide org.springframework.integration.kafka.serializer.common.StringDecoder as the decoder.
public class TestMessageHandlerDecoding extends AbstractDecodingAcknowledgingMessageListener {

    private static final Logger LOGGER = LoggerFactory.getLogger(TestMessageHandlerDecoding.class);

    public TestMessageHandlerDecoding(Decoder keyDecoder, Decoder payloadDecoder) {
        super(keyDecoder, payloadDecoder);
    }

    @Override
    public void doOnMessage(Object key, Object payload, KafkaMessageMetadata metadata, Acknowledgment acknowledgment) {
        LOGGER.info("payload={}", payload);
        acknowledgment.acknowledge();   // acknowledge here as well, as in the plain listener above
    }
}
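To wire it in, replace the plain listener on the container with the decoding variant. A minimal sketch against the configuration above, assuming StringDecoder exposes a no-argument constructor (check the constructors available in your version):

// Hedged sketch: swap the decoding listener into the existing container bean.
// The StringDecoder no-arg constructor is an assumption; adjust for your version.
@Bean
public TestMessageHandlerDecoding decodingMessageListener() {
    return new TestMessageHandlerDecoding(new StringDecoder(), new StringDecoder());
}

@Bean
public KafkaMessageListenerContainer kafkaMessageListenerContainer() {
    KafkaMessageListenerContainer container = new KafkaMessageListenerContainer(connectionFactory(), kafkaConfig.getTopic());
    container.setMessageListener(decodingMessageListener());   // instead of messageListener()
    container.setOffsetManager(offsetManager());
    return container;
}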

Related

How to get Azure Service Bus message id when sending a message to a topic using Spring Integration

After I send a message to a topic on Azure Service Bus using Spring Integration, I would like to get the message id that Azure generates. I can do this using JMS. Is there a way to do this using Spring Integration? The code I'm working with:
@Service
public class ServiceBusDemo {

    private static final String OUTPUT_CHANNEL = "topic.output";
    private static final String TOPIC_NAME = "my_topic";

    @Autowired
    TopicOutboundGateway messagingGateway;

    public String send(String message) {
        // How can I get the Azure message id after sending here?
        this.messagingGateway.send(message);
        return message;
    }

    @Bean
    @ServiceActivator(inputChannel = OUTPUT_CHANNEL)
    public MessageHandler topicMessageSender(ServiceBusTopicOperation topicOperation) {
        DefaultMessageHandler handler = new DefaultMessageHandler(TOPIC_NAME, topicOperation);
        handler.setSendCallback(new ListenableFutureCallback<>() {

            @Override
            public void onSuccess(Void result) {
                System.out.println("Message was sent successfully to service bus.");
            }

            @Override
            public void onFailure(Throwable ex) {
                System.out.println("There was an error sending the message to service bus.");
            }
        });
        return handler;
    }

    @MessagingGateway(defaultRequestChannel = OUTPUT_CHANNEL)
    public interface TopicOutboundGateway {
        void send(String text);
    }
}
You could use a ChannelInterceptor to get at the message headers:
public class CustomChannelInterceptor implements ChannelInterceptor {

    @Override
    public Message<?> preSend(Message<?> message, MessageChannel channel) {
        // The key of the message-id header is not stable; add logic here to check
        // which header key should actually be used.
        // ref: https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/spring/azure-spring-cloud-starter-servicebus#support-for-service-bus-message-headers-and-properties
        String messageId = message.getHeaders().get("message-id-header-key").toString();
        // do something with messageId here (log it, store it, etc.)
        return ChannelInterceptor.super.preSend(message, channel);
    }
}
Then, in the configuration, set this interceptor on your channel:
@Bean(name = OUTPUT_CHANNEL)
public BroadcastCapableChannel pubSubChannel() {
    PublishSubscribeChannel channel = new PublishSubscribeChannel();
    channel.setInterceptors(Arrays.asList(new CustomChannelInterceptor()));
    return channel;
}
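Since the header key is not stable, a quick way to discover it is to dump all headers once in the interceptor. A debugging sketch of my own, not part of the original answer:

// Hypothetical debugging variant of the interceptor: print every header once
// to see which key actually carries the message id in your setup.
public class HeaderDumpingInterceptor implements ChannelInterceptor {

    @Override
    public Message<?> preSend(Message<?> message, MessageChannel channel) {
        message.getHeaders().forEach((key, value) -> System.out.println(key + " = " + value));
        return message;
    }
}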

Using EmbeddedKafkaBroker with spring integration and spring kafka

I want to use EmbeddedKafkaBroker to test my flow that involves a KafkaMessageDrivenChannelAdapter.
It looks like the consumer starts correctly and subscribes to the topic, but the handler is not triggered after pushing a message to the EmbeddedKafkaBroker.
@SpringBootTest(properties = {"...."}, classes = {....class})
@EmbeddedKafka
class IntTests {

    @BeforeAll
    static void setup() {
        embeddedKafka = new EmbeddedKafkaBroker(1, true, TOPIC);
        embeddedKafka.kafkaPorts(57412);
        embeddedKafka.afterPropertiesSet();
    }

    @Test
    void testit() throws InterruptedException {
        String ip = embeddedKafka.getBrokersAsString();
        Map<String, Object> configs = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafka));
        Producer<String, String> producer = new DefaultKafkaProducerFactory<>(configs, new StringSerializer(), new StringSerializer()).createProducer();
        // Act
        producer.send(new ProducerRecord<>(TOPIC, "key", "{\"name\":\"Test\"}"));
        producer.flush();
        ....
    }
    ...
}
And the main class:
@Configuration
public class Kafka {

    @Bean
    public KafkaMessageDrivenChannelAdapter<String, String> adapter(KafkaMessageListenerContainer<String, String> container) {
        KafkaMessageDrivenChannelAdapter<String, String> kafkaMessageDrivenChannelAdapter = ..
        kafkaMessageDrivenChannelAdapter.setOutputChannelName("kafkaChannel");
        return kafkaMessageDrivenChannelAdapter;
    }

    @Bean
    public KafkaMessageListenerContainer<String, String> container() {
        ContainerProperties properties = new ContainerProperties(TOPIC);
        KafkaMessageListenerContainer<String, String> kafkaContainer = ...;
        return kafkaContainer;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:57412");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group12");
        ...
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public PublishSubscribeChannel kafkaChannel() {
        return new PublishSubscribeChannel();
    }

    @Bean
    @ServiceActivator(inputChannel = "kafkaChannel")
    public MessageHandler handler() {
        return new MessageHandler() {

            @Override
            public void handleMessage(Message<?> message) throws MessagingException {
            }
        };
    }
    ...
}
In the log I do see:
clients.consumer.KafkaConsumer : [Consumer clientId=consumer-group12-1, groupId=group12] Subscribed to topic(s): TOPIC
ThreadPoolTaskScheduler : Initializing ExecutorService
KafkaMessageDrivenChannelAdapter : started bean 'adapter'; defined in: 'com.example.demo.demo.Kafka';
Having embeddedKafka = new EmbeddedKafkaBroker(1, true, TOPIC); and @EmbeddedKafka, you essentially start two separate Kafka clusters. See the ports option of @EmbeddedKafka if you want to change the random port of the embedded broker. But at the same time it is better to rely on what Spring Boot provides for us with its auto-configuration.
See the documentation for more info: https://docs.spring.io/spring-boot/docs/current/reference/html/spring-boot-features.html#boot-features-embedded-kafka. Pay attention to the bootstrapServersProperty = "spring.kafka.bootstrap-servers" property.
UPDATE
In your test you have this @SpringBootTest(classes = {Kafka.class}). When I removed that classes attribute, everything started to work. The problem is that your config class is not auto-configuration aware, so you don't have Spring Integration initialized properly and the message is not consumed from the channel. There might be some other effect as well. But still: it is better to rely on the auto-configuration, so let your test see your @SpringBootApplication annotation.
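A minimal sketch of what the corrected test header could look like, reusing the TOPIC constant from the original test (the attribute values here are placeholders to adjust):

// Assumed sketch: no classes attribute, so Spring Boot auto-configuration
// (including Spring Integration and spring-kafka) is applied; @EmbeddedKafka
// manages the single broker and exposes its address via
// spring.kafka.bootstrap-servers, which the consumerFactory should use instead
// of the hard-coded 127.0.0.1:57412.
@SpringBootTest
@EmbeddedKafka(partitions = 1, topics = TOPIC,
        bootstrapServersProperty = "spring.kafka.bootstrap-servers")
class IntTests {
    // no manual new EmbeddedKafkaBroker(...) in @BeforeAll; autowire it if needed:
    // @Autowired EmbeddedKafkaBroker embeddedKafka;
}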

Spring Integration - #InboundChannelAdapter polling

I am new to Spring Integration. We are creating our application using Spring Integration Annotations.
I have configured an @InboundChannelAdapter with a poller fixed delay of 5 seconds. But the problem is that as soon as I start my application on WebLogic, the adapter starts polling and hits the endpoint with practically no message.
We need to call a REST service first and only then trigger this adapter.
Is there a way to implement this?
TIA!
Set the autoStartup property to false and use a control bus to start/stop it.
@SpringBootApplication
@IntegrationComponentScan
public class So59469573Application {

    public static void main(String[] args) {
        SpringApplication.run(So59469573Application.class, args);
    }
}

@Component
class Integration {

    @Autowired
    private ApplicationContext context;

    @InboundChannelAdapter(channel = "channel", autoStartup = "false",
            poller = @Poller(fixedDelay = "5000"))
    public String foo() {
        return "foo";
    }

    @ServiceActivator(inputChannel = "channel")
    public void handle(String in) {
        System.out.println(in);
    }

    @ServiceActivator(inputChannel = "controlChannel")
    @Bean
    public ExpressionControlBusFactoryBean controlBus() {
        return new ExpressionControlBusFactoryBean();
    }
}

@MessagingGateway(defaultRequestChannel = "controlChannel")
interface Control {

    void send(String control);
}

@RestController
class Rest {

    @Autowired
    Control control;

    @PostMapping("/foo/{command}")
    public void trigger(@PathVariable String command) {
        if ("start".equals(command)) {
            control.send("@'integration.foo.inboundChannelAdapter'.start()");
        }
    }
}
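To stop it again, the same control bus expression can invoke stop(); a sketch of the controller with an assumed "stop" branch added (not part of the original answer):

@PostMapping("/foo/{command}")
public void trigger(@PathVariable String command) {
    if ("start".equals(command)) {
        control.send("@'integration.foo.inboundChannelAdapter'.start()");
    }
    else if ("stop".equals(command)) {
        // assumed counterpart to the start branch above
        control.send("@'integration.foo.inboundChannelAdapter'.stop()");
    }
}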

How to implement distributed lock around poller in Spring Integration using ZooKeeper

Spring Integration has ZooKeeper support as documented in https://docs.spring.io/spring-integration/reference/html/zookeeper.html
However, this document is quite vague.
It suggests adding the bean below, but does not give details on how to start/stop a poller when the node is granted leadership.
@Bean
public LeaderInitiatorFactoryBean leaderInitiator(CuratorFramework client) {
    return new LeaderInitiatorFactoryBean()
            .setClient(client)
            .setPath("/siTest/")
            .setRole("cluster");
}
Is there any example of how to ensure the poller below is run by only one node in the cluster at any time, using ZooKeeper?
@Component
public class EventsPoller {

    public void pullEvents() {
        // pull events should be run by only one node in the cluster at any time
    }
}
The LeaderInitiator emits an OnGrantedEvent when it becomes leader and an OnRevokedEvent when its leadership is revoked.
See https://docs.spring.io/spring-integration/reference/html/messaging-endpoints-chapter.html#endpoint-roles and the next https://docs.spring.io/spring-integration/reference/html/messaging-endpoints-chapter.html#leadership-event-handling for more info about those events handling and how it affects your components in the particular role.
Although I agree that the Zookeeper chapter should link to that SmartLifecycleRoleController chapter. Feel free to raise a JIRA on the matter; contributions are welcome!
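For the poller in the question, that means giving its endpoint the same "cluster" role and autoStartup = "false", so it only runs while this node holds leadership. A minimal sketch with placeholder names (the test below shows the same pattern with role "foo"):

// Hedged sketch (placeholder channel name and payload): the
// SmartLifecycleRoleController starts this endpoint on OnGrantedEvent and
// stops it on OnRevokedEvent for the "cluster" role.
@Bean
@InboundChannelAdapter(channel = "eventsChannel", autoStartup = "false",
        poller = @Poller(fixedDelay = "5000"))
@Role("cluster")
public Supplier<String> eventsSource() {
    return () -> "pullEvents";
}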
UPDATE
This is what I did in our test:
@RunWith(SpringRunner.class)
@DirtiesContext
public class LeaderInitiatorFactoryBeanTests extends ZookeeperTestSupport {

    private static CuratorFramework client;

    @Autowired
    private PollableChannel stringsChannel;

    @BeforeClass
    public static void getClient() throws Exception {
        client = createNewClient();
    }

    @AfterClass
    public static void closeClient() {
        if (client != null) {
            client.close();
        }
    }

    @Test
    public void test() {
        assertNotNull(this.stringsChannel.receive(10_000));
    }

    @Configuration
    @EnableIntegration
    public static class Config {

        @Bean
        public LeaderInitiatorFactoryBean leaderInitiator(CuratorFramework client) {
            return new LeaderInitiatorFactoryBean()
                    .setClient(client)
                    .setPath("/siTest/")
                    .setRole("foo");
        }

        @Bean
        public CuratorFramework client() {
            return LeaderInitiatorFactoryBeanTests.client;
        }

        @Bean
        @InboundChannelAdapter(channel = "stringsChannel", autoStartup = "false", poller = @Poller(fixedDelay = "100"))
        @Role("foo")
        public Supplier<String> inboundChannelAdapter() {
            return () -> "foo";
        }

        @Bean
        public PollableChannel stringsChannel() {
            return new QueueChannel();
        }
    }
}
And in the logs I have something like this:
2018-12-14 10:12:33,542 DEBUG [Curator-LeaderSelector-0] [org.springframework.integration.support.SmartLifecycleRoleController] - Starting [leaderInitiatorFactoryBeanTests.Config.inboundChannelAdapter.inboundChannelAdapter] in role foo
2018-12-14 10:12:33,578 DEBUG [Curator-LeaderSelector-0] [org.springframework.integration.support.SmartLifecycleRoleController] - Stopping [leaderInitiatorFactoryBeanTests.Config.inboundChannelAdapter.inboundChannelAdapter] in role foo

ServiceActivator does not receive message from ImapIdleChannelAdapter

The ServiceActivator does not receive messages from the ImapIdleChannelAdapter.
JavaMail logs a successful FETCH, but the MIME messages do not get delivered to the service activator endpoint. I want to understand what is wrong in my code.
A7 FETCH 1:35 (ENVELOPE INTERNALDATE RFC822.SIZE FLAGS BODYSTRUCTURE)
* 1 FETCH (ENVELOPE ("Fri....
Code snippet below:
@Autowired
EmailConfig emailCfg;

@Bean
public SubscribableChannel mailChannel() {
    return MessageChannels.direct().get();
}

@Bean
public ImapIdleChannelAdapter getMailAdapter() {
    ImapMailReceiver mailReceiver = new ImapMailReceiver(emailCfg.getImapUrl());
    mailReceiver.setJavaMailProperties(javaMailProperties());
    mailReceiver.setShouldDeleteMessages(false);
    mailReceiver.setShouldMarkMessagesAsRead(true);
    ImapIdleChannelAdapter imapIdleChannelAdapter = new ImapIdleChannelAdapter(mailReceiver);
    imapIdleChannelAdapter.setOutputChannel(mailChannel());
    imapIdleChannelAdapter.setAutoStartup(true);
    imapIdleChannelAdapter.afterPropertiesSet();
    return imapIdleChannelAdapter;
}

@ServiceActivator(inputChannel = "mailChannel")
public void receive(String mail) {
    log.warn(mail);
}

private Properties javaMailProperties() {
    Properties javaMailProperties = new Properties();
    javaMailProperties.setProperty("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
    javaMailProperties.setProperty("mail.imap.socketFactory.fallback", "false");
    javaMailProperties.setProperty("mail.store.protocol", "imaps");
    javaMailProperties.setProperty("mail.debug", "true");
    javaMailProperties.setProperty("mail.imap.ssl", "true");
    return javaMailProperties;
}
The problem was due to wrong bean initialization. Full version that works OK:
@Slf4j
@Configuration
@EnableIntegration
public class MyMailAdapter {

    @Autowired
    EmailConfig emailCfg;

    @Bean
    public SubscribableChannel mailChannel() {
        log.info("Channel ready");
        return MessageChannels.direct().get();
    }

    @Bean
    public ImapMailReceiver receiver() {
        ImapMailReceiver mailReceiver = new ImapMailReceiver(emailCfg.getImapUrl());
        mailReceiver.setJavaMailProperties(javaMailProperties());
        mailReceiver.setShouldDeleteMessages(false);
        mailReceiver.setShouldMarkMessagesAsRead(true);
        return mailReceiver;
    }

    @Bean
    public ImapIdleChannelAdapter adapter() {
        ImapIdleChannelAdapter imapIdleChannelAdapter = new ImapIdleChannelAdapter(receiver());
        imapIdleChannelAdapter.setOutputChannel(mailChannel());
        imapIdleChannelAdapter.afterPropertiesSet();
        return imapIdleChannelAdapter;
    }

    @ServiceActivator(inputChannel = "mailChannel")
    public void receive(Message<MimeMessage> mail) throws MessagingException {
        log.info(mail.getPayload().toString());
    }

    private Properties javaMailProperties() {
        Properties javaMailProperties = new Properties();
        javaMailProperties.setProperty("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
        javaMailProperties.setProperty("mail.imap.socketFactory.fallback", "false");
        javaMailProperties.setProperty("mail.store.protocol", "imaps");
        javaMailProperties.setProperty("mail.debug", "true");
        javaMailProperties.setProperty("mail.imap.ssl", "true");
        return javaMailProperties;
    }
}
I don't know what exactly is wrong with your code, but I will suggest a few approaches that could help you.
Firstly, I suggest you use the Java DSL in your Java-based configuration. It provides a nice way to directly specify the flow of your integration application (and avoid simple mistakes). For example, for a splitter and a service activator:
@Bean
public IntegrationFlow yourFlow(AbstractMessageSplitter splitter,
        MessageHandler handler) {
    return IntegrationFlows
            .from(CHANNEL)
            .split(splitter)
            .handle(handler)
            .get();
}
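Applied to the mail flow above, a minimal hedged sketch (assuming it lives in the @Slf4j config class so log is available, and reusing the receiver() bean; the from(...) call wires the adapter's output for you, so no separate mailChannel bean is needed in this variant):

// Hedged sketch, not the original answer's code: the working mail configuration
// expressed with the Java DSL, reusing the receiver() bean from above.
@Bean
public IntegrationFlow mailFlow(ImapMailReceiver receiver) {
    return IntegrationFlows
            .from(new ImapIdleChannelAdapter(receiver))   // adapter registered by the flow
            .handle(message -> log.info(message.getPayload().toString()))
            .get();
}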
Secondly, it's generally a bad idea to directly restrict the payload type to String in the handler signature. Try something like this instead (why String?):
@ServiceActivator(inputChannel = "mailChannel")
public void receive(Message<?> message) {
    /* (String) message.getPayload() */
}
Maybe that's not the case, but let's check it.
