I want to use EmbeddedKafkaBroker to test my flow that involves KafkaMessageDrivenChannelAdapter.
It looks like the consumer starts correctly and subscribes to the topic, but the handler is not triggered after pushing a message to the EmbeddedKafkaBroker.
@SpringBootTest(properties = {"...."}, classes = {....class})
@EmbeddedKafka
class IntTests {

    @BeforeAll
    static void setup() {
        embeddedKafka = new EmbeddedKafkaBroker(1, true, TOPIC);
        embeddedKafka.kafkaPorts(57412);
        embeddedKafka.afterPropertiesSet();
    }

    @Test
    void testit() throws InterruptedException {
        String ip = embeddedKafka.getBrokersAsString();

        Map<String, Object> configs = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafka));
        Producer<String, String> producer =
                new DefaultKafkaProducerFactory<>(configs, new StringSerializer(), new StringSerializer()).createProducer();

        // Act
        producer.send(new ProducerRecord<>(TOPIC, "key", "{\"name\":\"Test\"}"));
        producer.flush();
        ....
    }

    ...
}
And the main class:
@Configuration
public class Kafka {

    @Bean
    public KafkaMessageDrivenChannelAdapter<String, String> adapter(KafkaMessageListenerContainer<String, String> container) {
        KafkaMessageDrivenChannelAdapter<String, String> kafkaMessageDrivenChannelAdapter =
                new KafkaMessageDrivenChannelAdapter<>(container);
        kafkaMessageDrivenChannelAdapter.setOutputChannelName("kafkaChannel");
        return kafkaMessageDrivenChannelAdapter;
    }

    @Bean
    public KafkaMessageListenerContainer<String, String> container() {
        ContainerProperties properties = new ContainerProperties(TOPIC);
        KafkaMessageListenerContainer<String, String> kafkaContainer =
                new KafkaMessageListenerContainer<>(consumerFactory(), properties);
        return kafkaContainer;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:57412");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group12");
        ...
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public PublishSubscribeChannel kafkaChannel() {
        return new PublishSubscribeChannel();
    }

    @Bean
    @ServiceActivator(inputChannel = "kafkaChannel")
    public MessageHandler handler() {
        return new MessageHandler() {

            @Override
            public void handleMessage(Message<?> message) throws MessagingException {
            }

        };
    }

    ...
}
In the log I do see:
clients.consumer.KafkaConsumer : [Consumer clientId=consumer-group12-1, groupId=group12] Subscribed to topic(s): TOPIC
ThreadPoolTaskScheduler : Initializing ExecutorService
KafkaMessageDrivenChannelAdapter : started bean 'adapter'; defined in: 'com.example.demo.demo.Kafka';
Having embeddedKafka = new EmbeddedKafkaBroker(1, true, TOPIC); and @EmbeddedKafka, you essentially start two separate Kafka clusters. See the ports option of @EmbeddedKafka if you want to change the random port of the embedded broker. But at the same time, it is better to rely on what Spring Boot provides for us with its auto-configuration.
See the documentation for more info: https://docs.spring.io/spring-boot/docs/current/reference/html/spring-boot-features.html#boot-features-embedded-kafka. Pay attention to the bootstrapServersProperty = "spring.kafka.bootstrap-servers" property.
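For example, a hedged sketch of the annotation replacing the hand-built broker (the attribute values mirror the question's setup):

// Sketch: @EmbeddedKafka starts the one and only broker; ports pins the port,
// and bootstrapServersProperty publishes the broker address as
// spring.kafka.bootstrap-servers, so Boot's auto-configured factories use it
// and no EmbeddedKafkaBroker has to be created by hand in @BeforeAll.
@EmbeddedKafka(partitions = 1, topics = TOPIC, ports = 57412,
        bootstrapServersProperty = "spring.kafka.bootstrap-servers")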
UPDATE
In your test you have this @SpringBootTest(classes = {Kafka.class}). When I remove that classes attribute, everything starts to work. The problem is that your config class alone is not auto-configuration aware, therefore Spring Integration is not initialized properly and the message is not consumed from the channel. There might be some other effect as well. But still: better to rely on the auto-configuration, so let your test see the @SpringBootApplication annotation.
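In other words, a sketch with the classes attribute dropped, so component scanning finds your @SpringBootApplication class:

// Sketch: no classes attribute; @SpringBootTest locates the
// @SpringBootApplication class by scanning up from the test's package,
// so auto-configuration (including Spring Integration's infrastructure) runs.
@SpringBootTest(properties = {"...."})
@EmbeddedKafka(bootstrapServersProperty = "spring.kafka.bootstrap-servers")
class IntTests {
    ...
}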
Related
While using Spring Kafka, I am able to read messages from the topic based on a timestamp with the code below:
ConsumerRecords<String, String> records = consumer.poll(100);
if (flag) {
    Map<TopicPartition, Long> query = new HashMap<>();
    query.put(new TopicPartition(kafkaTopic, 0), millisecondsFromEpochToReplay);
    Map<TopicPartition, OffsetAndTimestamp> result = consumer.offsetsForTimes(query);
    if (result != null) {
        records = ConsumerRecords.empty();
    }
    result.entrySet().stream()
            .forEach(entry -> consumer.seek(entry.getKey(), entry.getValue().offset()));
    flag = false;
}
How can the same functionality be achieved using the Spring Integration DSL, with KafkaMessageDrivenChannelAdapter?
How can we set up the integration flow and read messages from the topic based on the timestamp?
Configure the adapter's listener container with a ConsumerAwareRebalanceListener and perform the lookup/seeks when the partitions are assigned.
EDIT
Using Spring Boot (but you can configure the container the same way no matter how you create it)...
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.group-id=so54664761
and
@SpringBootApplication
public class So54664761Application {

    public static void main(String[] args) {
        SpringApplication.run(So54664761Application.class, args);
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
        return args -> template.send("so54664761", "foo");
    }

    @Bean
    public NewTopic topic() {
        return new NewTopic("so54664761", 1, (short) 1);
    }

    @Bean
    public IntegrationFlow flow(ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {
        ConcurrentMessageListenerContainer<String, String> container = container(containerFactory);
        return IntegrationFlows.from(new KafkaMessageDrivenChannelAdapter<>(container))
                .handle(System.out::println)
                .get();
    }

    @Bean
    public ConcurrentMessageListenerContainer<String, String> container(
            ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {

        ConcurrentMessageListenerContainer<String, String> container = containerFactory.createContainer("so54664761");
        container.getContainerProperties().setConsumerRebalanceListener(new ConsumerAwareRebalanceListener() {

            @Override
            public void onPartitionsAssigned(Consumer<?, ?> consumer, Collection<TopicPartition> partitions) {
                System.out.println("Partitions assigned - do the lookup/seeks here");
            }

        });
        return container;
    }

}
and
Partitions assigned - do the lookup/seeks here
GenericMessage [payload=foo, headers={kafka_offset=0, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer#2f5b2297, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=so54664761, kafka_receivedTimestamp=1550241100112}]
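As a hedged sketch, the lookup/seek inside that callback could reuse the offsetsForTimes() logic from the question (millisecondsFromEpochToReplay is assumed to come from your own configuration):

@Override
public void onPartitionsAssigned(Consumer<?, ?> consumer, Collection<TopicPartition> partitions) {
    // Ask the broker for the earliest offset whose record timestamp is >= the replay time
    Map<TopicPartition, Long> query = new HashMap<>();
    partitions.forEach(tp -> query.put(tp, millisecondsFromEpochToReplay));
    Map<TopicPartition, OffsetAndTimestamp> result = consumer.offsetsForTimes(query);
    // Seek each assigned partition there; a null value means no record exists
    // at or after that timestamp, so that partition is left where it is
    result.forEach((tp, offsetAndTimestamp) -> {
        if (offsetAndTimestamp != null) {
            consumer.seek(tp, offsetAndTimestamp.offset());
        }
    });
}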
ServiceActivator does not receive messages from ImapIdleChannelAdapter.
JavaMail logs a successful FETCH, but the MIME messages do not get delivered to the service activator endpoint. I want to understand what is wrong in my code.
A7 FETCH 1:35 (ENVELOPE INTERNALDATE RFC822.SIZE FLAGS BODYSTRUCTURE)
* 1 FETCH (ENVELOPE ("Fri....
Code snippet below:
@Autowired
EmailConfig emailCfg;

@Bean
public SubscribableChannel mailChannel() {
    return MessageChannels.direct().get();
}

@Bean
public ImapIdleChannelAdapter getMailAdapter() {
    ImapMailReceiver mailReceiver = new ImapMailReceiver(emailCfg.getImapUrl());
    mailReceiver.setJavaMailProperties(javaMailProperties());
    mailReceiver.setShouldDeleteMessages(false);
    mailReceiver.setShouldMarkMessagesAsRead(true);

    ImapIdleChannelAdapter imapIdleChannelAdapter = new ImapIdleChannelAdapter(mailReceiver);
    imapIdleChannelAdapter.setOutputChannel(mailChannel());
    imapIdleChannelAdapter.setAutoStartup(true);
    imapIdleChannelAdapter.afterPropertiesSet();
    return imapIdleChannelAdapter;
}

@ServiceActivator(inputChannel = "mailChannel")
public void receive(String mail) {
    log.warn(mail);
}

private Properties javaMailProperties() {
    Properties javaMailProperties = new Properties();
    javaMailProperties.setProperty("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
    javaMailProperties.setProperty("mail.imap.socketFactory.fallback", "false");
    javaMailProperties.setProperty("mail.store.protocol", "imaps");
    javaMailProperties.setProperty("mail.debug", "true");
    javaMailProperties.setProperty("mail.imap.ssl", "true");
    return javaMailProperties;
}
The problem was due to wrong bean initialization. The full version that works OK:
@Slf4j
@Configuration
@EnableIntegration
public class MyMailAdapter {

    @Autowired
    EmailConfig emailCfg;

    @Bean
    public SubscribableChannel mailChannel() {
        log.info("Channel ready");
        return MessageChannels.direct().get();
    }

    @Bean
    public ImapMailReceiver receiver() {
        ImapMailReceiver mailReceiver = new ImapMailReceiver(emailCfg.getImapUrl());
        mailReceiver.setJavaMailProperties(javaMailProperties());
        mailReceiver.setShouldDeleteMessages(false);
        mailReceiver.setShouldMarkMessagesAsRead(true);
        return mailReceiver;
    }

    @Bean
    public ImapIdleChannelAdapter adapter() {
        ImapIdleChannelAdapter imapIdleChannelAdapter = new ImapIdleChannelAdapter(receiver());
        imapIdleChannelAdapter.setOutputChannel(mailChannel());
        imapIdleChannelAdapter.afterPropertiesSet();
        return imapIdleChannelAdapter;
    }

    @ServiceActivator(inputChannel = "mailChannel")
    public void receive(Message<MimeMessage> mail) throws MessagingException {
        log.info(mail.getPayload().toString());
    }

    private Properties javaMailProperties() {
        Properties javaMailProperties = new Properties();
        javaMailProperties.setProperty("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
        javaMailProperties.setProperty("mail.imap.socketFactory.fallback", "false");
        javaMailProperties.setProperty("mail.store.protocol", "imaps");
        javaMailProperties.setProperty("mail.debug", "true");
        javaMailProperties.setProperty("mail.imap.ssl", "true");
        return javaMailProperties;
    }

}
I don't know exactly what's wrong with your code, but I can suggest a few approaches that could help you.
Firstly, I suggest you use the Java DSL in your Java-based configuration. It provides a nice way to directly specify the flow of your integration application (and to avoid simple mistakes). For example, for a splitter and a service activator:
@Bean
public IntegrationFlow yourFlow(AbstractMessageSplitter splitter,
        MessageHandler handler) {

    return IntegrationFlows
            .from(CHANNEL)
            .split(splitter)
            .handle(handler)
            .get();
}
Secondly, it's generally a bad idea to directly declare the message payload type as String. Try something like this (why String?):
@ServiceActivator(inputChannel = "mailChannel")
public void receive(Message<?> message) {
    /* (String) message.getPayload() */
}
Maybe this is not the case here, but let's check it.
I'd like to unit test some Spring Kafka listeners. This works fine in production, but I have some problems with the unit tests. I defined the configuration completely with Spring configuration beans, but the listener is never called. Did I miss something?
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = {KafkaSpringBootTest.class})
@Configuration
@DirtiesContext
public class KafkaSpringBootTest {

    @ClassRule
    public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1);

    @BeforeClass
    public static void setup() {
        System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
        System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
        System.setProperty("spring.kafka.consumer.auto-offset-reset", "earliest");
    }

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public DefaultKafkaConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public KafkaTransferListener kafkaTransferListener() {
        return new KafkaTransferListener();
    }

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Test
    public void testSendMessage() throws InterruptedException {
        System.out.println("now sending");
        kafkaTemplate.send("test", "hello");
        Thread.sleep(5000);
    }

}

class KafkaTransferListener {

    @KafkaListener(topics = "test")
    public void listen(String test) {
        System.out.println("received message via kafka: " + test);
    }

}
Versions:
org.springframework.kafka:spring-kafka-test:2.0.0.RELEASE
org.apache.kafka:kafka_2.11:0.11.0.0
Thanks in advance
I don't see a @SpringBootApplication configuration to be sure that Spring Kafka is auto-configured.
You use KafkaEmbedded and its properties to configure the ProducerConfig, but at the same time I don't see how you configure the ConsumerConfig. Essentially, you should use the same properties from the embedded Kafka.
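If you do stay with manual configuration, a hedged sketch of that consumer side, using KafkaTestUtils.consumerProps() from spring-kafka-test (the group id here is illustrative):

// Sketch: build the consumer factory against the same embedded broker,
// mirroring the producer side; consumerProps() supplies the broker address,
// group id, and auto-commit setting in one call.
@Bean
public DefaultKafkaConsumerFactory<String, String> consumerFactory() {
    Map<String, Object> props = new HashMap<>(
            KafkaTestUtils.consumerProps("testGroup", "true", embeddedKafka));
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    return new DefaultKafkaConsumerFactory<>(props);
}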
With the Boot auto-configuration, though, you really don't need all that kung fu. The simplest way to configure everything against the embedded Kafka:
@BeforeClass
public static void setup() {
    System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
}
And you definitely must have a @SpringBootApplication class in the same package as this test. Everything else will be done by Boot.
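For example, a minimal sketch of such a class (the name is illustrative):

// Sketch: an empty boot application class in the test's package is enough
// for @SpringBootTest to find it and auto-configure Spring Kafka.
@SpringBootApplication
public class KafkaTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaTestApplication.class, args);
    }

}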
See this sample on the matter.
In the XD stream, messages are consumed from a Kafka topic through a source module and then sent to a sink Kafka module. The reason behind developing custom source and sink Kafka modules is that I want to update the offsets from the source module only when I get an acknowledgement from the sink module downstream, on successfully sent messages.
I am using Spring Integration Kafka 2.0.1.RELEASE and Spring Kafka 1.0.3.RELEASE with topics in a Kafka 0.10.0.0 environment. I have tried the following:
Source Module Configuration:
@Configuration
public class ModuleConfiguration {

    @Value("${topic}")
    private String topic;

    @Value("${brokerList}")
    private String brokerAddress;

    @Bean
    public SubscribableChannel output() {
        DirectChannel output = new DirectChannel();
        return output;
    }

    @Autowired
    TopicPartitionInitialOffset topicPartition;

    @Bean
    public TopicPartitionInitialOffset topicPartition() {
        return new TopicPartitionInitialOffset(this.topic, 0, (long) 0);
    }

    @Bean
    public KafkaMessageListenerContainer<String, String> container() throws Exception {
        ContainerProperties containerProps = new ContainerProperties(topicPartition);
        containerProps.setAckMode(AckMode.MANUAL);
        KafkaMessageListenerContainer<String, String> kafkaMessageListenerContainer =
                new KafkaMessageListenerContainer<>(consumerFactory(), containerProps);
        return kafkaMessageListenerContainer;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test-consumer-group");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, 15000);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        DefaultKafkaConsumerFactory<String, String> consumerFactory = new DefaultKafkaConsumerFactory<>(props);
        return consumerFactory;
    }

}
Source Module: InboundKafkaMessageDrivenAdapter
@MessageEndpoint
@Import(ModuleConfiguration.class)
public class InboundKafkaMessageDrivenAdapter {

    @Autowired
    KafkaMessageListenerContainer<String, String> container;

    @Autowired
    SubscribableChannel output;

    @Bean
    public KafkaMessageDrivenChannelAdapter<String, String> adapter(KafkaMessageListenerContainer<String, String> container) {
        KafkaMessageDrivenChannelAdapter<String, String> kafkaMessageDrivenChannelAdapter = new KafkaMessageDrivenChannelAdapter<>(container);
        kafkaMessageDrivenChannelAdapter.setOutputChannel(output);
        return kafkaMessageDrivenChannelAdapter;
    }

}
Sink Module: Configuration
@Configuration
@EnableIntegration
public class ModuleConfiguration {

    @Value("${topic}")
    private String topic;

    @Value("${brokerList}")
    private String brokerAddress;

    @Bean
    public KafkaProducerMessageHandler<String, String> handler() throws Exception {
        KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<>(kafkaTemplate());
        handler.setTopicExpression(new LiteralExpression(this.topic));
        return handler;
    }

    @Bean
    public SubscribableChannel input() {
        return new DirectChannel();
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
        props.put(ProducerConfig.RETRIES_CONFIG, 0);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
        props.put(ProducerConfig.LINGER_MS_CONFIG, 1);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

}
Sink Module: SinkActivator
@Import(ModuleConfiguration.class)
@MessageEndpoint
public class SinkActivator {

    @Autowired
    KafkaProducerMessageHandler<String, String> handler;

    @Autowired
    SubscribableChannel input;

    @ServiceActivator(inputChannel = "input")
    public void sendMessage(Message<?> msg) throws Exception {
        Acknowledgment acknowledgment = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
        handler.handleMessage(msg);
        acknowledgment.acknowledge();
    }

}
The source successfully receives the messages and sends them to the sink; however, when I try to get the Acknowledgment in the sink:
Acknowledgment acknowledgment = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
The following exception is thrown:
Caused by: java.lang.IllegalArgumentException: Incorrect type specified for header 'kafka_acknowledgment'. Expected [interface org.springframework.kafka.support.Acknowledgment] but actual type is [class org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer$ConsumerAcknowledgment]
In the source code of spring-integration-kafka 2.0.1.RELEASE, the KafkaMessageListenerContainer class adds a kafka_acknowledgment header to the message when AckMode=MANUAL; however, the type is the static inner class ConsumerAcknowledgment.
So how do I get the Acknowledgment from the sink module on the message sent from the source?
Unless you are using the local transport, you can't do that; the Acknowledgment is a "live" object and can't be sent over the wire to another module.
If you are using the local transport it will work, but you will have class loader problems, because each module runs in its own class loader and the Acknowledgment interfaces are different instances of the class.
You would have to move spring-integration-kafka and spring-kafka to the xd/lib folder so the classes are loaded from a common classloader.
I'm having issues using manual acknowledgements with the KafkaTopicOffsetManager. When acknowledge() is called, the topic begins to get spammed repeatedly. Kafka has log.cleaner.enable set to true and the topic is using cleanup.policy=compact. Thanks for any help.
Config:
@Bean
public ZookeeperConfiguration zookeeperConfiguration() {
    ZookeeperConfiguration zookeeperConfiguration = new ZookeeperConfiguration(kafkaConfig.getZookeeperAddress());
    zookeeperConfiguration.setClientId("clientId");
    return zookeeperConfiguration;
}

@Bean
public ConnectionFactory connectionFactory() {
    return new DefaultConnectionFactory(zookeeperConfiguration());
}

@Bean
public TestMessageHandler messageListener() {
    return new TestMessageHandler();
}

@Bean
public OffsetManager offsetManager() {
    ZookeeperConnect zookeeperConnect = new ZookeeperConnect(kafkaConfig.getZookeeperAddress());
    OffsetManager offsetManager = new KafkaTopicOffsetManager(zookeeperConnect, kafkaConfig.getTopic() + "_OFFSET");
    return offsetManager;
}

@Bean
public KafkaMessageListenerContainer kafkaMessageListenerContainer() {
    KafkaMessageListenerContainer kafkaMessageListenerContainer = new KafkaMessageListenerContainer(connectionFactory(), kafkaConfig.getTopic());
    kafkaMessageListenerContainer.setMessageListener(messageListener());
    kafkaMessageListenerContainer.setOffsetManager(offsetManager());
    return kafkaMessageListenerContainer;
}
Listener:
public class TestMessageHandler implements AcknowledgingMessageListener {

    private static final Logger logger = LoggerFactory.getLogger(TestMessageHandler.class);

    @Override
    public void onMessage(KafkaMessage message, Acknowledgment acknowledgment) {
        logger.info(message.toString());
        acknowledgment.acknowledge();
    }

}
The KafkaTopicOffsetManager needs its own topic to maintain the offsets of the actual topic being consumed.
If you don't want to deal with decoding the message payload yourself (it's painful, in my opinion), extend your listener from the abstract class AbstractDecodingAcknowledgingMessageListener and provide org.springframework.integration.kafka.serializer.common.StringDecoder as the decoder.
public class TestMessageHandlerDecoding extends AbstractDecodingAcknowledgingMessageListener {

    public TestMessageHandlerDecoding(Decoder keyDecoder, Decoder payloadDecoder) {
        super(keyDecoder, payloadDecoder);
    }

    @Override
    public void doOnMessage(Object key, Object payload, KafkaMessageMetadata metadata, Acknowledgment acknowledgment) {
        LOGGER.info("payload={}", payload);
    }

}