I have a model object that is populated after several transformations and parsing steps. Now I need to send the message attribute within the model to Kafka using Spring Integration.
I am able to construct the key using the messageKey() method, but how can I get the actual message from the model, like m.getPayload().getMessage(), and send it to Kafka?
.publishSubscribeChannel(pubSub -> pubSub
        .subscribe(flow -> flow
                .bridge(e -> e.order(Ordered.HIGHEST_PRECEDENCE))
                .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                        .messageKey(m -> ((AcarsFlightInformation) m.getPayload()).getFlightNbr())
                        .topic(acarsKafkaTopic))))
It's not entirely clear what you are asking. The payload of the message sent to the adapter becomes the value of the producer record.
I think what you are asking is that you only want to send part of the payload.
Use a header enricher and a transformer before the .handle():
.enrichHeaders(h -> h.headerExpression(KafkaHeaders.MESSAGE_KEY, "payload.flightNbr"))
.transform("payload.message")
.handle(Kafka.outboundChannelAdapter(kafkaTemplate)
        .topic(acarsKafkaTopic))
.get();
The adapter will look for that header for the key.
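Putting it together with the flow from the question, a sketch might look like this (payload.flightNbr and payload.message are SpEL paths implied by the question's getFlightNbr()/getMessage() getters):
.publishSubscribeChannel(pubSub -> pubSub
        .subscribe(flow -> flow
                .bridge(e -> e.order(Ordered.HIGHEST_PRECEDENCE))
                // set the Kafka record key from the model
                .enrichHeaders(h -> h.headerExpression(KafkaHeaders.MESSAGE_KEY, "payload.flightNbr"))
                // replace the payload with just the message attribute
                .transform("payload.message")
                .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                        .topic(acarsKafkaTopic))))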
How can I consume Server-Sent Events with Spring Integration? I am aware that Spring supports SSE with WebFlux, but how do I convert the incoming Flux into separate Message instances, and possibly wrap this code in some Spring-Integration-lifecycle-aware component (MessageProducerSupport?)?
WebClient client = WebClient.create("http://myhost:8080/sse");
ParameterizedTypeReference<ServerSentEvent<String>> type =
        new ParameterizedTypeReference<ServerSentEvent<String>>() {};
Flux<ServerSentEvent<String>> eventStream = client.get()
        .uri("/stream-sse")
        .retrieve()
        .bodyToFlux(type);
eventStream.subscribe(
        content -> { /* here I believe the message should be produced/sent to a channel */ });
See Spring Integration WebFlux Outbound Gateway: https://docs.spring.io/spring-integration/docs/current/reference/html/webflux.html#webflux-outbound:
The setExpectedResponseType(Class<?>) or setExpectedResponseTypeExpression(Expression) identifies the target type of the response body element conversion. If replyPayloadToFlux is set to true, the response body is converted to a Flux with the provided expectedResponseType for each element, and this Flux is sent as the payload downstream. Afterwards, you can use a splitter to iterate over this Flux in a reactive manner.
WebFlux.outboundGateway("http://myhost:8080/sse/stream-sse")
        .httpMethod(HttpMethod.GET)
        .replyPayloadToFlux(true)
        .expectedResponseType(new ParameterizedTypeReference<ServerSentEvent<String>>() {})
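Wired into a flow, that could look like the sketch below (the channel names sseRequestChannel and sseEventsChannel are placeholders I made up); the .split() step iterates over the returned Flux and emits each ServerSentEvent as its own message:
@Bean
public IntegrationFlow sseFlow() {
    return IntegrationFlows.from("sseRequestChannel")
            .handle(WebFlux.outboundGateway("http://myhost:8080/sse/stream-sse")
                    .httpMethod(HttpMethod.GET)
                    .replyPayloadToFlux(true)
                    .expectedResponseType(new ParameterizedTypeReference<ServerSentEvent<String>>() {}))
            .split() // one message per ServerSentEvent, handled reactively
            .channel("sseEventsChannel")
            .get();
}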
To make it start working just after the application is ready, you can implement an ApplicationRunner to send a "void" message into the channel for the flow with that WebFlux.outboundGateway(). I don't think we need a special, dedicated component just for SSE requesting and producing; the combination of existing components is fully enough.
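A minimal sketch, again assuming the sseRequestChannel name from above:
@Bean
public ApplicationRunner sseStarter(@Qualifier("sseRequestChannel") MessageChannel sseRequestChannel) {
    // the payload is irrelevant; the message merely triggers the gateway
    return args -> sseRequestChannel.send(new GenericMessage<>(""));
}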
I am using Spring Integration's IntegrationFlows to define the message flow, and Jms.messageDrivenChannelAdapter to get the message from the MQ. Now I need to parse it, send it to Kafka, and update Couchbase.
IntegrationFlows
        .from(Jms.messageDrivenChannelAdapter(this.acarsMqListener)) // MQ listener with session transacted=true
        .wireTap(ACARS_WIRE_TAP_CHNL) // log the message
        .transform(agmTransformer, "parseXMLMessage") // parse the XML message
        .filter(acarsFilter, "filterMessageOnSmiImi") // filter the message based on condition
        .handle(acarsProcessor, "processEvent") // create the message
        .handle(Kafka.outboundChannelAdapter(kafkaTemplate).messageKey(MESSAGE_KEY).topic(acarsKafkaTopic)) // send it to Kafka
        .handle(updateCouchbase, "saveToDB") // update Couchbase
        .get();
For each message received, we want to log it using MDC to help us collect/aggregate log entries based on a UUID.
Please suggest how to put the UUID into the MDC, and then clear the MDC, for each message in the above flow.
You can just configure a global WireTap and do an appropriate transformation in that wire-tapped flow before logging the message.
On the other hand, there might be no need to play with MDC at all: you can inject something like a correlation header (the UUID) into the message and log() messages as usual, so you will see them in the logs and will be able to correlate them using that header.
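If MDC is still preferred, one option is a ChannelInterceptor along these lines (my sketch, not part of the original answer) that populates the MDC from the message's id header and clears it when the send completes:
import org.slf4j.MDC;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.ChannelInterceptor;

public class MdcChannelInterceptor implements ChannelInterceptor {

    @Override
    public Message<?> preSend(Message<?> message, MessageChannel channel) {
        // the id header is a UUID generated for each message
        MDC.put("uuid", String.valueOf(message.getHeaders().getId()));
        return message;
    }

    @Override
    public void afterSendCompletion(Message<?> message, MessageChannel channel, boolean sent, Exception ex) {
        MDC.remove("uuid"); // clear the MDC once the send (and handling) completes
    }
}
Registered globally (for example with @GlobalChannelInterceptor), it applies to every channel in the flow; on direct channels afterSendCompletion() runs after the downstream handler returns, so the MDC entry covers the handler's logging as well.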
I have a Spring Integration Kafka listener (DSL) which consumes messages from a Kafka topic and de-serializes the message payload to a JSON POJO. The consumer is configured to receive an object of type A, but when the message is consumed by the Spring Integration Kafka topic listener, it is de-serialized as an object of type B. Most of the fields in object A and object B are the same, except for a few fields.
When I consume the message on the Kafka topic using the console consumer tool, it is correctly consumed as the type B object and shows all the fields.
Please find the version and broker details below.
springKafkaVersion = 2.1.5.RELEASE
springIntegrationVersion = 2.0.3.RELEASE
springIntegrationKafkaVersion = 3.0.3.RELEASE
kafka client 1.0.1
I am refactoring some of my Spring Integration RSS feed code, which uses the feed inbound channel adapter, into a microservice. I want the feeds to be stored internally in a MongoDB database (in case of failures, for audit, etc.) and also to write the feed (in JSON) to a Kafka topic for onward processing.
How do I do this with Spring Integration? Do I need a pub/subscribe channel with two handlers?
Any example code using Java DSL would be most helpful.
Not sure if you need pub/sub for such a use-case, but you can definitely use a QueueChannel between the Feed Inbound Channel Adapter and the Kafka Outbound Channel Adapter. That QueueChannel can be supplied with a MongoDbChannelMessageStore for persistence in case of failures.
A Java DSL sample looks like this:
@Bean
public IntegrationFlow feedFlow(MongoDbChannelMessageStore messageStore) {
    return IntegrationFlows
            .from(Feed.inboundAdapter(...),
                    e -> e.poller(p -> p.fixedDelay(1000)))
            .channel(c -> c.queue(messageStore, "entries"))
            .handle(Kafka.outboundChannelAdapter(...),
                    e -> e.poller(p -> p.fixedDelay(1000)))
            .get();
}
https://docs.spring.io/spring-integration/docs/5.0.3.RELEASE/reference/html/feed.html#feed-java-configuration
https://docs.spring.io/spring-integration/docs/5.0.3.RELEASE/reference/html/mongodb.html#mongodb-priority-channel-message-store
https://docs.spring.io/spring-kafka/docs/2.1.4.RELEASE/reference/html/_spring_integration.html#si-outbound
I have a flow that is similar to this:
IntegrationFlows.from(
        Http.inboundGateway("/events")
                .requestMapping(requestMappingSpec -> {
                    requestMappingSpec.methods(HttpMethod.POST);
                    requestMappingSpec.consumes(MediaType.APPLICATION_JSON_VALUE);
                    requestMappingSpec.produces(MediaType.APPLICATION_JSON_VALUE);
                })
                .requestPayloadType(PushEvent.class)
                .errorChannel(ERROR_CHANNEL))
        .channel(ReleaseFlow.REQUEST_CHANNEL)
        .enrichHeaders(h -> h
                .header(HttpHeaders.STATUS_CODE, HttpStatus.ACCEPTED))
        .get();
When submitting multiple requests, one request is processed by the flow attached to REQUEST_CHANNEL, and the following request is processed only by enrichHeaders(). My understanding is that the endpoints in this example should be processed serially:
A request arrives at the /events endpoint
The request is processed by the flow listening to REQUEST_CHANNEL
The response from the previous flow will then have its headers enriched
The flow ends and the response is returned to the remote requestor
I would appreciate help in understanding why request n is processed by the channel (and not enrichHeaders()), request n + 1 by enrichHeaders() (and not the flow listening to REQUEST_CHANNEL), request n + 2 by the channel again, and so on.
UPDATE 1
I am new to Spring Integration, but thought it was appropriate to collect events from a GitHub server and then create a release using an external service. The integration service would be responsible for determining the appropriate workflow based upon the data associated with the commit. The endpoint in question would receive a push event and forward it to the flow attached to the subscribable request channel (REQUEST_CHANNEL). This second flow will make a number of outbound requests to collect the appropriate release template, then construct and start the pipeline.
UPDATE 2
I have not developed the second flow completely at this point, but here is a first version that simply performs a transformation based upon data associated with the commit.
return f -> f
        .route(branchRouter(), mapping -> mapping
                .subFlowMapping(true, t -> t
                        .transform(pushEventToEventTransformer()))
                .subFlowMapping(false, n -> n
                        .transform(skip -> "")));
When the code has been submitted to a "monitored" branch the actions described in the first update will be performed. I am attempting to build the flows incrementally given my limited knowledge of the framework.
Subscribable channels are point-to-point by default, which means if there are two subscribers, messages will be distributed in round-robin fashion.
If you have another flow "...attached to the REQUEST_CHANNEL", then you have two subscribers: that flow and the header enricher.
Perhaps if you can describe what you are trying to do we can help.
With the header enricher after the channel, all that happens is the headers are enriched and the inbound message is returned to the gateway.
Perhaps you want this... ?
IntegrationFlows.from(
        Http.inboundGateway("/events")
                .requestMapping(requestMappingSpec -> {
                    requestMappingSpec.methods(HttpMethod.POST);
                    requestMappingSpec.consumes(MediaType.APPLICATION_JSON_VALUE);
                    requestMappingSpec.produces(MediaType.APPLICATION_JSON_VALUE);
                })
                .requestPayloadType(PushEvent.class)
                .errorChannel(ERROR_CHANNEL))
        .enrichHeaders(h -> h
                .header(HttpHeaders.STATUS_CODE, HttpStatus.ACCEPTED))
        .channel(ReleaseFlow.REQUEST_CHANNEL)
        .get();
Which means all messages with the enriched headers will be sent to the channel.
EDIT
If you want the message to go to both places, you need to use a publish-subscribe channel.
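For example (a sketch based on the code above, not part of the original answer), moving the REQUEST_CHANNEL flow into a subscriber of a publish-subscribe channel lets the main line continue to the header enricher and return the reply to the gateway:
IntegrationFlows.from(
        Http.inboundGateway("/events")
                .requestPayloadType(PushEvent.class)
                .errorChannel(ERROR_CHANNEL))
        // every message goes both to the subscriber below and to the rest of this flow
        .publishSubscribeChannel(p -> p
                .subscribe(f -> f.channel(ReleaseFlow.REQUEST_CHANNEL)))
        .enrichHeaders(h -> h
                .header(HttpHeaders.STATUS_CODE, HttpStatus.ACCEPTED))
        .get();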